Download Google.Professional-Cloud-Architect.Dump4Pass.2024-11-26.245q.tqb

Download Exam

File Info

Exam: Professional Cloud Architect on Google Cloud Platform
Number: Professional-Cloud-Architect
File Name: Google.Professional-Cloud-Architect.Dump4Pass.2024-11-26.245q.tqb
Size: 2 MB
Posted: Nov 26, 2024


How to open VCEX & EXAM Files?

Files with VCEX & EXAM extensions can be opened by ProfExam Simulator.

Purchase

Coupon: MASTEREXAM
With discount: 20%






Demo Questions

Question 1

The JencoMart security team requires that all Google Cloud Platform infrastructure is deployed using a least privilege model with separation of duties for administration between production and development resources.  
What Google domain and project structure should you recommend? 
 


  1. Create two G Suite accounts to manage users: one for development/test/staging and one for production. Each account should contain one project for every application 
  2. Create two G Suite accounts to manage users: one with a single project for all development applications and one with a single project for all production applications 
  3. Create a single G Suite account to manage users with each stage of each application in its own project 
  4. Create a single G Suite account to manage users with one project for the development/test/staging environment and one project for the production environment  
Correct answer: C
Explanation:
Note: The principle of least privilege and separation of duties are concepts that, although semantically different, are intrinsically related from the standpoint of security. The intent behind both is to prevent people from having higher privilege levels than they actually need.
  • Principle of Least Privilege: Users should only have the least amount of privileges required to perform their job and no more. This reduces authorization exploitation by limiting access to resources such as targets, jobs, or monitoring templates for which they are not authorized. 
  • Separation of Duties: Beyond limiting user privilege level, you also limit user duties, or the specific jobs they can perform. No user should be given responsibility for more than one related function. This limits the ability of a user to perform a malicious action and then cover up that action.  
Reference: https://cloud.google.com/kms/docs/separation-of-duties  
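To make the distinction concrete, the hedged Python sketch below grants a development group a role only on a development project while a separate group administers production, which is one way the recommended structure enforces separation of duties. The project IDs, group addresses, and role are illustrative placeholders, and the snippet assumes the google-api-python-client library with Application Default Credentials.

```python
# Hypothetical sketch: grant a development group access only to the dev
# project, while a separate group administers production (separation of
# duties). Requires google-api-python-client and Application Default
# Credentials with permission to set IAM policy.
from googleapiclient import discovery

crm = discovery.build("cloudresourcemanager", "v1")


def grant_role(project_id: str, member: str, role: str) -> None:
    """Append one IAM binding to the project's policy and save it."""
    policy = crm.projects().getIamPolicy(resource=project_id, body={}).execute()
    policy.setdefault("bindings", []).append({"role": role, "members": [member]})
    crm.projects().setIamPolicy(
        resource=project_id, body={"policy": policy}
    ).execute()


# Placeholder project IDs and groups: developers touch only the dev project...
grant_role("jencomart-dev", "group:dev-team@example.com", "roles/editor")
# ...and a separate operations group touches only production.
grant_role("jencomart-prod", "group:prod-ops@example.com", "roles/editor")
```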



Question 2

A few days after JencoMart migrates the user credentials database to Google Cloud Platform and shuts down the old server, the new database server stops responding to SSH connections. It is still serving database requests to the application servers correctly.  
What three steps should you take to diagnose the problem? (Choose three.) 
 


  1. Delete the virtual machine (VM) and disks and create a new one 
  2. Delete the instance, attach the disk to a new VM, and investigate 
  3. Take a snapshot of the disk and connect to a new machine to investigate 
  4. Check inbound firewall rules for the network the machine is connected to 
  5. Connect the machine to another network with very simple firewall rules and investigate 
  6. Print the Serial Console output for the instance for troubleshooting, activate the interactive console, and investigate  
Correct answer: CDF
Explanation:
D: Handling "Unable to connect on port 22" error message 
Possible causes include:  
  • There is no firewall rule allowing SSH access on the port. SSH access on port 22 is enabled on all Compute Engine instances by default. If you have disabled access, SSH from the Browser will not work. If you run sshd on a port other than 22, you need to enable the access to that port with a custom firewall rule. 
  • The firewall rule allowing SSH access is enabled, but is not configured to allow connections from GCP Console services. Source IP addresses for browser-based SSH sessions are dynamically allocated by GCP Console and can vary from session to session. 
 
F: Handling "Could not connect, retrying..." error 
You can verify that the daemon is running by navigating to the serial console output page and looking for output lines prefixed with the accounts-from-metadata: string. If you are using a standard image but you do not see these output prefixes in the serial console output, the daemon might be stopped. Reboot the instance to restart the daemon. 
 
Reference: 
https://cloud.google.com/compute/docs/ssh-in-browser 
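The two diagnostic steps above can also be scripted. The sketch below, assuming the google-cloud-compute client library, lists ingress firewall rules that allow tcp:22 (answer D) and prints the instance's serial console output (answer F); the project, zone, and instance names are hypothetical.

```python
# Hedged sketch of answers D and F, assuming the google-cloud-compute
# client library. Project, zone, and instance names are placeholders.
from google.cloud import compute_v1

PROJECT, ZONE, INSTANCE = "jencomart-prod", "us-central1-a", "user-db-1"

# D: check ingress firewall rules that allow SSH (tcp:22) on the network.
for rule in compute_v1.FirewallsClient().list(project=PROJECT):
    allows_ssh = any(
        a.I_p_protocol == "tcp" and (not a.ports or "22" in a.ports)
        for a in rule.allowed
    )
    if rule.direction == "INGRESS" and allows_ssh:
        print(f"{rule.name}: sources={list(rule.source_ranges)}")

# F: print the serial console output to look for boot or sshd errors.
output = compute_v1.InstancesClient().get_serial_port_output(
    project=PROJECT, zone=ZONE, instance=INSTANCE
)
print(output.contents)
```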



Question 3

JencoMart has built a version of their application on Google Cloud Platform that serves traffic to Asia. You want to measure success against their business and technical goals. 
Which metrics should you track? 


  1. Error rates for requests from Asia
  2. Latency difference between US and Asia
  3. Total visits, error rates, and latency from Asia
  4. Total visits and average latency for users from Asia
  5. The number of character sets present in the database
Correct answer: D
Explanation:
From scenario: 
Business Requirements include: Expand services into Asia 
Technical Requirements include: Decrease latency in Asia 



Question 4

JencoMart wants to move their User Profiles database to Google Cloud Platform. 
Which Google Database should they use? 


  1. Cloud Spanner 
  2. Google BigQuery 
  3. Google Cloud SQL 
  4. Google Cloud Datastore  
Correct answer: D
Explanation:
Common workloads for Google Cloud Datastore:  
  • User profiles 
  • Product catalogs 
  • Game state  
Reference: https://cloud.google.com/storage-options/ 
https://cloud.google.com/datastore/docs/concepts/overview  
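As a small illustration of the "user profiles" workload, the sketch below writes and reads one profile entity with the google-cloud-datastore client. The project ID, kind, and property names are placeholders, not taken from the case study.

```python
# Minimal sketch of a user-profile workload in Cloud Datastore (Firestore in
# Datastore mode), using the google-cloud-datastore client. The project ID,
# kind, and property names are illustrative only.
from google.cloud import datastore

client = datastore.Client(project="jencomart-prod")  # placeholder project ID

key = client.key("UserProfile", "user-12345")
profile = datastore.Entity(key=key)
profile.update({
    "display_name": "A. Shopper",
    "locale": "en-US",
    "last_login": "2024-11-26T00:00:00Z",
})
client.put(profile)     # upsert the entity
print(client.get(key))  # read it back by key
```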



Question 5

Mountkirk Games has deployed their new backend on Google Cloud Platform (GCP). You want to create a thorough testing process for new versions of the backend before they are released to the public. You want the testing environment to scale in an economical way. How should you design the process? 
 


  1. Create a scalable environment in GCP for simulating production load 
  2. Use the existing infrastructure to test the GCP-based backend at scale 
  3. Build stress tests into each component of your application using resources internal to GCP to simulate load 
  4. Create a set of static environments in GCP to test different levels of load – for example, high, medium, and low  
Correct answer: A
Explanation:
From scenario: Requirements for Game Backend Platform  
  1. Dynamically scale up or down based on game activity 
  2. Connect to a managed NoSQL database service 
  3. Run a customized Linux distro 



Question 6

Mountkirk Games’ gaming servers are not automatically scaling properly. Last month, they rolled out a new feature, which suddenly became very popular. A record number of users are trying to use the service, but many of them are getting 503 errors and very slow response times. What should they investigate first? 
 


  1. Verify that the database is online 
  2. Verify that the project quota hasn’t been exceeded 
  3. Verify that the new feature code did not introduce any performance bugs 
  4. Verify that the load-testing team is not running their tool against production  
Correct answer: B
Explanation:
A 503 is a "service unavailable" error. If the database were offline, every user would be getting errors, not just many of them, so the first thing to verify is whether the project quota has been exceeded and is preventing the servers from scaling.
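A quick way to act on this is to inspect the project's quotas programmatically. The sketch below, assuming the google-cloud-compute client library and a hypothetical project ID, flags any project-level Compute Engine quota above 90% utilization; regional quotas would be checked separately.

```python
# Hedged sketch for answer B, assuming the google-cloud-compute client
# library: flag any project-level Compute Engine quota above 90% usage.
# The project ID is a placeholder; regional quotas are checked separately
# via RegionsClient.
from google.cloud import compute_v1

project = compute_v1.ProjectsClient().get(project="mountkirk-games-prod")
for quota in project.quotas:
    if quota.limit and quota.usage / quota.limit > 0.9:
        print(f"{quota.metric}: {quota.usage}/{quota.limit} (near limit)")
```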
 



Question 7

Mountkirk Games wants to set up a real-time analytics platform for their new game. The new platform must meet their technical requirements.  
Which combination of Google technologies will meet all of their requirements? 
 


  1. Kubernetes Engine, Cloud Pub/Sub, and Cloud SQL
  2. Cloud Dataflow, Cloud Storage, Cloud Pub/Sub, and BigQuery
  3. Cloud SQL, Cloud Storage, Cloud Pub/Sub, and Cloud Dataflow
  4. Cloud Dataproc, Cloud Pub/Sub, Cloud SQL, and Cloud Dataflow
  5. Cloud Pub/Sub, Compute Engine, Cloud Storage, and Cloud Dataproc
Correct answer: B
Explanation:
Ingest millions of streaming events per second from anywhere in the world with Cloud Pub/Sub, powered by Google's unique, high-speed private network. Process the streams with Cloud Dataflow to ensure reliable, exactly-once, low-latency data transformation. Stream the transformed data into BigQuery, the cloud-native data warehousing service, for immediate analysis via SQL or popular visualization tools. 
From scenario: They plan to deploy the game’s backend on Google Compute Engine so they can capture streaming metrics and run intensive analytics. 
Requirements for Game Analytics Platform 
  1. Dynamically scale up or down based on game activity
  2. Process incoming data on the fly directly from the game servers
  3. Process data that arrives late because of slow mobile networks
  4. Allow SQL queries to access at least 10 TB of historical data
  5. Process files that are regularly uploaded by users’ mobile devices
  6. Use only fully managed services
Reference: https://cloud.google.com/solutions/big-data/stream-analytics/ 
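For the ingestion end of that pipeline, the sketch below publishes one telemetry event to Cloud Pub/Sub with the google-cloud-pubsub client; a Dataflow pipeline would then transform the stream and write it to BigQuery. The project ID, topic name, and event fields are hypothetical, and the topic is assumed to already exist.

```python
# Minimal sketch of the ingestion step, assuming the google-cloud-pubsub
# client: a game server publishes one telemetry event to a Pub/Sub topic,
# which a Dataflow pipeline would stream into BigQuery. The project ID,
# topic name, and event fields are placeholders; the topic must exist.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("mountkirk-games-prod", "game-telemetry")

event = {"player_id": "p-42", "event": "level_complete", "latency_ms": 87}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message id:", future.result())
```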



Question 8

For this question, refer to the Mountkirk Games case study. You need to analyze and define the technical architecture for the compute workloads for your company, Mountkirk Games. Considering the Mountkirk Games business and technical requirements, what should you do? 
 


  1. Create network load balancers. Use preemptible Compute Engine instances. 
  2. Create network load balancers. Use non-preemptible Compute Engine instances. 
  3. Create a global load balancer with managed instance groups and autoscaling policies. Use preemptible Compute Engine instances. 
  4. Create a global load balancer with managed instance groups and autoscaling policies. Use non-preemptible Compute Engine instances.  
Correct answer: D
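A hedged sketch of the autoscaling half of answer D follows: it attaches a CPU-based autoscaler to an existing zonal managed instance group using the google-cloud-compute client. The project, zone, MIG name, and thresholds are placeholders, and the instance template, managed instance group, and global load balancer are assumed to be configured elsewhere.

```python
# Hedged sketch of the autoscaling part of answer D, assuming the
# google-cloud-compute client library. Project, zone, MIG name, and the
# CPU target are placeholders; the instance template, managed instance
# group, and global load balancer are assumed to exist already.
from google.cloud import compute_v1

PROJECT, ZONE = "mountkirk-games-prod", "us-central1-a"

autoscaler = compute_v1.Autoscaler(
    name="game-backend-autoscaler",
    target=(
        f"https://www.googleapis.com/compute/v1/projects/{PROJECT}"
        f"/zones/{ZONE}/instanceGroupManagers/game-backend-mig"
    ),
    autoscaling_policy=compute_v1.AutoscalingPolicy(
        min_num_replicas=3,
        max_num_replicas=50,
        cpu_utilization=compute_v1.AutoscalingPolicyCpuUtilization(
            utilization_target=0.6
        ),
    ),
)

operation = compute_v1.AutoscalersClient().insert(
    project=PROJECT, zone=ZONE, autoscaler_resource=autoscaler
)
operation.result()  # block until the autoscaler is created
```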



Question 9

You need to optimize batch file transfers into Cloud Storage for Mountkirk Games’ new Google Cloud solution. 
The batch files contain game statistics that need to be staged in Cloud Storage and be processed by an extract transform load (ETL) tool. What should you do? 
 


  1. Use gsutil to batch move files in sequence. 
  2. Use gsutil to batch copy the files in parallel. 
  3. Use gsutil to extract the files as the first part of ETL. 
  4. Use gsutil to load the files as the last part of ETL.  
Correct answer: B
Explanation:
Reference: https://cloud.google.com/storage/docs/gsutil/commands/cp 
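A minimal sketch of answer B follows, assuming gsutil is installed and authenticated: the -m flag tells gsutil to copy the batch of files in parallel. The local directory and bucket name are placeholders.

```python
# Minimal sketch of answer B: use gsutil's -m flag to copy the batch of
# files to Cloud Storage in parallel. Assumes gsutil is installed and
# authenticated; the local directory and bucket name are placeholders.
import subprocess

subprocess.run(
    ["gsutil", "-m", "cp", "-r", "./game-stats/", "gs://mountkirk-batch-staging/"],
    check=True,
)
```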
 



Question 10

You are implementing Firestore for Mountkirk Games. Mountkirk Games wants to give a new game programmatic access to a legacy game's Firestore database. Access should be as restricted as possible. What should you do? 
 


  1. Create a service account (SA) in the legacy game’s Google Cloud project, add a second SA in the new game’s IAM page, and then give the Organization Admin role to both SAs. 
  2. Create a service account (SA) in the legacy game’s Google Cloud project, give the SA the Organization Admin role, and then give it the Firebase Admin role in both projects. 
  3. Create a service account (SA) in the legacy game’s Google Cloud project, add this SA in the new game’s IAM page, and then give it the Firebase Admin role in both projects. 
  4. Create a service account (SA) in the legacy game’s Google Cloud project, give it the Firebase Admin role, and then migrate the new game to the legacy game’s project.  
Correct answer: D
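Whichever option is chosen, the new game ends up reaching Firestore through a dedicated service account. The sketch below shows what that programmatic access might look like with the google-cloud-firestore client; the key file path, project ID, collection, and document ID are hypothetical placeholders.

```python
# Minimal sketch of programmatic Firestore access with a dedicated service
# account, assuming the google-cloud-firestore client. The key file path,
# project ID, collection, and document ID are hypothetical; in practice the
# role granted to the service account should be as narrow as possible.
from google.cloud import firestore
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "legacy-game-sa-key.json"
)
db = firestore.Client(project="legacy-game-project", credentials=credentials)

doc = db.collection("players").document("p-42").get()
print(doc.to_dict() if doc.exists else "player not found")
```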








