Download Google.Professional-Cloud-Architect.Dump4Pass.2024-11-09.177q.tqb

Download Exam

File Info

Exam: Professional Cloud Architect on Google Cloud Platform
Number: Professional-Cloud-Architect
File Name: Google.Professional-Cloud-Architect.Dump4Pass.2024-11-09.177q.tqb
Size: 1 MB
Posted: Nov 09, 2024
Download Google.Professional-Cloud-Architect.Dump4Pass.2024-11-09.177q.tqb

How to open VCEX & EXAM Files?

Files with VCEX & EXAM extensions can be opened by ProfExam Simulator.

Purchase

Coupon: MASTEREXAM
Discount: 20%






Demo Questions

Question 1

Your company’s test suite is a custom C++ application that runs tests throughout each day on Linux virtual machines. The full test suite takes several hours to complete, running on a limited number of on-premises servers reserved for testing. Your company wants to move the testing infrastructure to the cloud, to reduce the amount of time it takes to fully test a change to the system, while changing the tests as little as possible.  
Which cloud infrastructure should you recommend? 
 


  1. Google Compute Engine unmanaged instance groups and Network Load Balancer 
  2. Google Compute Engine managed instance groups with auto-scaling 
  3. Google Cloud Dataproc to run Apache Hadoop jobs to process each test 
  4. Google App Engine with Google StackDriver for logging  
Correct answer: B
Explanation:
Google Compute Engine enables users to launch virtual machines (VMs) on demand. VMs can be launched from the standard images or custom images created by users.  
Managed instance groups offer autoscaling capabilities that allow you to automatically add or remove instances from a managed instance group based on increases or decreases in load. Autoscaling helps your applications gracefully handle increases in traffic and reduces cost when the need for resources is lower. 
 
Incorrect Answers: 
A: A Network Load Balancer adds no value here; the custom C++ test suite does not serve incoming IP traffic that needs to be distributed. 
C: Apache Hadoop is not suited to running C++ test jobs. It is an open-source framework for distributed storage and processing of large datasets using the MapReduce programming model. 
D: Google App Engine is intended to be used for web applications. 
Google App Engine (often referred to as GAE or simply App Engine) is a web framework and cloud computing platform for developing and hosting web applications in Google-managed data centers. 
 
Reference: https://cloud.google.com/compute/docs/autoscaler/  
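
For reference, a minimal gcloud sketch of the managed instance group approach in answer B. The template, image, project, group, zone, machine type, and size limits below are placeholders, not values from the question.

  # Create an instance template from a custom image that contains the C++ test suite
  # (image, project, and machine type are hypothetical).
  gcloud compute instance-templates create test-runner-template \
      --machine-type=e2-standard-4 \
      --image=cpp-test-runner-image --image-project=my-test-project

  # Create a managed instance group from the template.
  gcloud compute instance-groups managed create test-runner-mig \
      --template=test-runner-template --size=2 --zone=us-central1-a

  # Enable autoscaling so the group grows while tests are queued and shrinks afterwards.
  gcloud compute instance-groups managed set-autoscaling test-runner-mig \
      --zone=us-central1-a --min-num-replicas=2 --max-num-replicas=20 \
      --target-cpu-utilization=0.75 --cool-down-period=90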



Question 2

A lead software engineer tells you that his new application design uses websockets and HTTP sessions that are not distributed across the web servers. You want to help him ensure his application will run properly on Google Cloud Platform.  
What should you do? 
 


  1. Help the engineer to convert his websocket code to use HTTP streaming 
  2. Review the encryption requirements for websocket connections with the security team 
  3. Meet with the cloud operations team and the engineer to discuss load balancer options 
  4. Help the engineer redesign the application to use a distributed user session service that does not rely on websockets and HTTP sessions.  
Correct answer: C
Explanation:
Google Cloud Platform (GCP) HTTP(S) load balancing provides global load balancing for HTTP(S) requests destined for your instances. 
The HTTP(S) load balancer has native support for the WebSocket protocol. 
 
Incorrect Answers: 
A: HTTP server push, also known as HTTP streaming, is a client-server communication pattern that sends information from an HTTP server to a client asynchronously, without a client request. A server push architecture is especially effective for highly interactive web or mobile applications, where one or more clients need to receive continuous information from the server. 
 
Reference: https://cloud.google.com/compute/docs/load-balancing/http/  
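
As one hedged illustration of the load balancer discussion in answer C: the HTTP(S) load balancer's backend service can be configured so that WebSocket connections and non-distributed HTTP sessions keep reaching the same backend. The backend service name and timeout value below are placeholders.

  # Pin each client to one backend with cookie-based session affinity and raise the
  # backend service timeout so long-lived WebSocket connections are not cut off.
  gcloud compute backend-services update my-web-backend-service \
      --global \
      --session-affinity=GENERATED_COOKIE \
      --timeout=3600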



Question 3

The application reliability team at your company has added a debug feature to their backend service to send all server events to Google Cloud Storage for eventual analysis. The event records are at least 50 KB and at most 15 MB and are expected to peak at 3,000 events per second. You want to minimize data loss. 
Which process should you implement? 
 


  1. Append metadata to file body 
     Compress individual files 
     Name files with serverName-Timestamp 
     Create a new bucket if bucket is older than 1 hour and save individual files to the new bucket. Otherwise, save files to existing bucket. 
  2. Batch every 10,000 events with a single manifest file for metadata 
     Compress event files and manifest file into a single archive file 
     Name files using serverName-EventSequence 
     Create a new bucket if bucket is older than 1 day and save the single archive file to the new bucket. Otherwise, save the single archive file to existing bucket. 
  3. Compress individual files 
     Name files with serverName-EventSequence 
     Save files to one bucket 
     Set custom metadata headers for each object after saving 
  4. Append metadata to file body 
     Compress individual files 
     Name files with a random prefix pattern 
     Save files to one bucket 
Correct answer: D
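
As a hedged illustration of the pattern in answer D, the sketch below compresses an event file and uploads it under a random name prefix; randomized prefixes spread writes across the bucket's namespace, which helps sustain a high object-creation rate. The bucket and file names are hypothetical.

  # Compress the event file, generate a short random prefix, and upload to a single bucket.
  EVENT_FILE="event-12345.json"
  gzip -c "${EVENT_FILE}" > "${EVENT_FILE}.gz"
  PREFIX=$(openssl rand -hex 4)
  gsutil cp "${EVENT_FILE}.gz" "gs://my-event-archive/${PREFIX}-${HOSTNAME}-${EVENT_FILE}.gz"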



Question 4

A recent audit revealed that a new network was created in your GCP project. In this network, a GCE instance has an SSH port open to the world. You want to discover this network’s origin.  
What should you do? 
 


  1. Search for Create VM entry in the Stackdriver alerting console 
  2. Navigate to the Activity page in the Home section. Set category to Data Access and search for Create VM entry 
  3. In the Logging section of the console, specify GCE Network as the logging section. Search for the Create Insert entry 
  4. Connect to the GCE instance using project SSH keys. Identify previous logins in system logs, and match these with the project owners list  
Correct answer: C
Explanation:
Incorrect Answers:  
A: To use the Stackdriver alerting console, alerting policies must first be set up. 
B: Data access logs only contain read-only operations. 
Audit logs help you determine who did what, where, and when. 
Cloud Audit Logging returns two types of logs: 
  • Admin activity logs 
  • Data access logs: contain log entries for read-only operations that do not modify any data, such as get, list, and aggregatedList methods. 
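
As a hedged illustration of answer C, an Admin Activity audit log query along the following lines can surface the insert operations that created the network and the instance. The filter strings are a starting point only; exact methodName values vary with the Compute Engine API version.

  # Search Admin Activity audit logs for the network and instance creation events.
  gcloud logging read \
      'logName:"cloudaudit.googleapis.com%2Factivity" AND protoPayload.methodName:"networks.insert"' \
      --limit=10 --format=json
  gcloud logging read \
      'logName:"cloudaudit.googleapis.com%2Factivity" AND protoPayload.methodName:"instances.insert"' \
      --limit=10 --format=json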



Question 5

You want to make a copy of a production Linux virtual machine in the US-Central region. You want to manage and replace the copy easily if there are changes on the production virtual machine. You will deploy the copy as a new instance in a different project in the US-East region.  
What steps must you take? 
 


  1. Use the Linux dd and netcat commands to copy and stream the root disk contents to a new virtual machine instance in the US-East region. 
  2. Create a snapshot of the root disk and select the snapshot as the root disk when you create a new virtual machine instance in the US-East region. 
  3. Create an image file from the root disk with Linux dd command, create a new virtual machine instance in the US-East region 
  4. Create a snapshot of the root disk, create an image file in Google Cloud Storage from the snapshot, and create a new virtual machine instance in the US-East region using the image file as the root disk.  
Correct answer: D
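
A hedged sketch of the flow in answer D, with placeholder project, zone, and resource names. Depending on the gcloud release, the image can be created directly from the snapshot or exported to Cloud Storage first with gcloud compute images export; creating the instance from another project also requires IAM access to the image.

  # In the production project: snapshot the root disk of the source VM.
  gcloud compute disks snapshot prod-vm --zone=us-central1-a \
      --snapshot-names=prod-vm-snap --project=prod-project

  # Turn the snapshot into a reusable image.
  gcloud compute images create prod-vm-image \
      --source-snapshot=prod-vm-snap --project=prod-project

  # In the other project: create the copy in the US-East region from that image.
  gcloud compute instances create prod-vm-copy --zone=us-east1-b \
      --image=prod-vm-image --image-project=prod-project --project=other-project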



Question 6

Your company runs several databases on a single MySQL instance. They need to take backups of a specific database at regular intervals. The backup activity needs to complete as quickly as possible and cannot be allowed to impact disk performance.  
How should you configure the storage? 
 


  1. Configure a cron job to use the gcloud tool to take regular backups using persistent disk snapshots. 
  2. Mount a Local SSD volume as the backup location. After the backup is complete, use gsutil to move the backup to Google Cloud Storage. 
  3. Use gcsfuse to mount a Google Cloud Storage bucket as a volume directly on the instance and write backups to the mounted location using mysqldump.  
  4. Mount additional persistent disk volumes onto each virtual machine (VM) instance in a RAID10 array and use LVM to create snapshots to send to Cloud Storage  
Correct answer: B
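
A hedged sketch of answer B, assuming the Local SSD is already formatted and mounted at /mnt/disks/backup-ssd; the database, path, and bucket names are placeholders.

  # Dump only the target database to the Local SSD (fast, and it avoids loading the
  # persistent disk that serves MySQL), then copy the dump to Cloud Storage.
  DUMP=/mnt/disks/backup-ssd/my_database-$(date +%F).sql.gz
  mysqldump --single-transaction my_database | gzip > "${DUMP}"
  gsutil cp "${DUMP}" gs://my-backup-bucket/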



Question 7

You are helping the QA team to roll out a new load-testing tool to test the scalability of your primary cloud services that run on Google Compute Engine with Cloud Bigtable.  
Which three requirements should they include? (Choose three.) 
 


  1. Ensure that the load tests validate the performance of Cloud Bigtable 
  2. Create a separate Google Cloud project to use for the load-testing environment 
  3. Schedule the load-testing tool to regularly run against the production environment 
  4. Ensure all third-party systems your services use are capable of handling high load 
  5. Instrument the production services to record every transaction for replay by the load-testing tool 
  6. Instrument the load-testing tool and the target services with detailed logging and metrics collection  
Correct answer: ABF



Question 8

Your company places a high value on being responsive and meeting customer needs quickly. Their primary business objectives are release speed and agility. You want to reduce the chance of security errors being accidentally introduced.  
Which two actions can you take? (Choose two.) 
 


  1. Ensure every code check-in is peer reviewed by a security SME 
  2. Use source code security analyzers as part of the CI/CD pipeline  
  3. Ensure you have stubs to unit test all interfaces between components 
  4. Enable code signing and a trusted binary repository integrated with your CI/CD pipeline 
  5. Run a vulnerability security scanner as part of your continuous-integration /continuous-delivery (CI/CD) pipeline  
Correct answer: DE
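
As one hedged example of answer E, a container image built by the pipeline can be scanned for known vulnerabilities before it is deployed, for instance with On-Demand Scanning; the image path is hypothetical and the exact command surface depends on the gcloud release.

  # Scan the freshly built image and print the scan resource name for follow-up queries.
  gcloud artifacts docker images scan \
      us-docker.pkg.dev/my-project/my-repo/app:latest \
      --format='value(response.scan)'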



Question 9

You want to enable your running Google Kubernetes Engine cluster to scale as demand for your application changes.  
What should you do? 
 


  1. Add additional nodes to your Kubernetes Engine cluster using the following command: 
     gcloud container clusters resize CLUSTER_Name --size 10 
  2. Add a tag to the instances in the cluster with the following command: 
     gcloud compute instances add-tags INSTANCE --tags enable-autoscaling max-nodes-10 
  3. Update the existing Kubernetes Engine cluster with the following command: 
     gcloud alpha container clusters update mycluster --enable-autoscaling --min-nodes=1 --max-nodes=10 
  4. Create a new Kubernetes Engine cluster with the following command: 
     gcloud alpha container clusters create mycluster --enable-autoscaling --min-nodes=1 --max-nodes=10 
     and redeploy your application  
Correct answer: C
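
As a follow-up to the command in answer C: current gcloud releases expose the same setting outside the alpha track. A hedged equivalent, with placeholder cluster, node pool, and zone names:

  # Enable node autoscaling on an existing cluster's node pool (non-alpha command track).
  gcloud container clusters update mycluster \
      --enable-autoscaling --min-nodes=1 --max-nodes=10 \
      --node-pool=default-pool --zone=us-central1-a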



Question 10

Your marketing department wants to send out a promotional email campaign. The development team wants to minimize direct operational management. They project a wide range of possible customer responses, from 100 to 500,000 click-throughs per day. The link leads to a simple website that explains the promotion and collects user information and preferences. 
Which infrastructure should you recommend? (Choose two.) 
 


  1. Use Google App Engine to serve the website and Google Cloud Datastore to store user data. 
  2. Use a Google Container Engine cluster to serve the website and store data to persistent disk. 
  3. Use a managed instance group to serve the website and Google Cloud Bigtable to store user data. 
  4. Use a single Compute Engine virtual machine (VM) to host a web server, backed by Google Cloud SQL.  
Correct answer: AC
Explanation:
Reference: https://cloud.google.com/storage-options/  
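
A hedged sketch of the deployment side of answer A, assuming an App Engine standard application whose handlers write the collected user information to Cloud Datastore; the region and file name are placeholders. Both services scale automatically across the projected 100 to 500,000 click-throughs per day with no servers to manage.

  # One-time setup: create the App Engine application for the project.
  gcloud app create --region=us-central

  # Deploy the promotion site; instance scaling is handled by App Engine and
  # Datastore absorbs the variable write volume from the signup handlers.
  gcloud app deploy app.yaml --quiet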








