Practice Free Associate Cloud Engineer Exam Online Questions
You have an application that uses Cloud Spanner as a backend database. The application has a very predictable traffic pattern. You want to automatically scale up or down the number of Spanner nodes depending on traffic.
What should you do?
- A . Create a cron job that runs on a scheduled basis to review stackdriver monitoring metrics, and then resize the Spanner instance accordingly.
- B . Create a Stackdriver alerting policy to send an alert to oncall SRE emails when Cloud Spanner CPU exceeds the threshold. SREs would scale resources up or down accordingly.
- C . Create a Stackdriver alerting policy to send an alert to Google Cloud Support email when Cloud Spanner CPU exceeds your threshold. Google support would scale resources up or down accordingly.
- D . Create a Stackdriver alerting policy to send an alert to webhook when Cloud Spanner CPU is over or under your threshold. Create a Cloud Function that listens to HTTP and resizes Spanner resources accordingly.
D
Explanation:
CPU utilization is a recommended proxy for traffic when it comes to Cloud Spanner. See: Alerts for high CPU utilization. The documentation specifies recommended maximum CPU usage for both single-region and multi-region instances; these numbers ensure that your instance has enough compute capacity to continue serving traffic in the event of the loss of an entire zone (for single-region instances) or an entire region (for multi-region instances). https://cloud.google.com/spanner/docs/cpu-utilization
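A minimal sketch of the Cloud Function described in option D, assuming a Monitoring alert webhook as the trigger. The thresholds, node limits, and webhook payload shape are illustrative assumptions; a real function would call the Spanner Admin API (for example via the google-cloud-spanner admin client) where the comment indicates, rather than returning the count.

```python
# Sketch of the alert-driven resizing logic from option D. All thresholds,
# node limits, and the webhook payload shape are illustrative assumptions.

def target_node_count(current_nodes: int, cpu_utilization: float,
                      high: float = 0.65, low: float = 0.25,
                      min_nodes: int = 1, max_nodes: int = 10) -> int:
    """Decide the new Spanner node count from CPU utilization.

    0.65 matches Google's recommended maximum for single-region
    instances; the other numbers are assumptions for this sketch.
    """
    if cpu_utilization > high:
        return min(current_nodes + 1, max_nodes)
    if cpu_utilization < low:
        return max(current_nodes - 1, min_nodes)
    return current_nodes


def handle_alert(request_json: dict, current_nodes: int) -> int:
    # A Monitoring webhook delivers the incident as JSON; the exact field
    # names here are assumed. A real Cloud Function would then update the
    # instance through the Spanner Admin API instead of returning the count.
    cpu = float(request_json["incident"]["observed_value"])
    return target_node_count(current_nodes, cpu)
```

Because the traffic pattern is predictable, the same decision function could also be driven by a scheduled job, but the alert-plus-webhook route in option D keeps the scaling fully automatic.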
You are deploying a web application using Compute Engine. You created a managed instance group (MIG) to host the application. You want to follow Google-recommended practices to implement a secure and highly available solution.
What should you do?
- A . Use SSL proxy load balancing for the MIG and an A record in your DNS private zone with the load balancer’s IP address.
- B . Use SSL proxy load balancing for the MIG and a CNAME record in your DNS public zone with the load balancer’s IP address.
- C . Use HTTP(S) load balancing for the MIG and a CNAME record in your DNS private zone with the load balancer’s IP address.
- D . Use HTTP(S) load balancing for the MIG and an A record in your DNS public zone with the load balancer’s IP address.
D
Explanation:
HTTP(S) load balancing is a Google-recommended practice for distributing web traffic across multiple regions and zones, and providing high availability, scalability, and security for web applications. It supports both IPv4 and IPv6 addresses, and can handle SSL/TLS termination and encryption. It also integrates with Cloud CDN, Cloud Armor, and Cloud Identity-Aware Proxy for enhanced performance and protection. A MIG can be used as a backend service for HTTP(S) load balancing, and can automatically scale and heal the VM instances that host the web application.
To configure DNS for HTTP(S) load balancing, create an A record in your public DNS zone with the load balancer’s IP address. This maps your domain name to the load balancer’s IP address and lets users reach your web application by domain name. A CNAME record is not suitable, because a CNAME maps one domain name to another and cannot point directly to an IP address. A private zone is not suitable either, as it is only visible within your VPC network, not to the public internet.
HTTP(S) Load Balancing documentation
Setting up DNS records for HTTP(S) load balancing
Choosing a load balancer
Your company developed a mobile game that is deployed on Google Cloud. Gamers are connecting to the game with their personal phones over the Internet. The game sends UDP packets to update the servers about the gamers’ actions while they are playing in multiplayer mode. Your game backend can scale over multiple virtual machines (VMs), and you want to expose the VMs over a single IP address.
What should you do?
- A . Configure an SSL Proxy load balancer in front of the application servers.
- B . Configure an Internal UDP load balancer in front of the application servers.
- C . Configure an External HTTP(s) load balancer in front of the application servers.
- D . Configure an External Network load balancer in front of the application servers.
D
Explanation:
Cell phones are sending UDP packets, and the only load balancer that can receive that type of traffic is an external Network (TCP/UDP) load balancer. https://cloud.google.com/load-balancing/docs/network
https://cloud.google.com/load-balancing/docs/choosing-load-balancer#lb-decision-tree
You want to send and consume Cloud Pub/Sub messages from your App Engine application. The Cloud Pub/Sub API is currently disabled. You will use a service account to authenticate your application to the API. You want to make sure your application can use Cloud Pub/Sub.
What should you do?
- A . Enable the Cloud Pub/Sub API in the API Library on the GCP Console.
- B . Rely on the automatic enablement of the Cloud Pub/Sub API when the Service Account accesses it.
- C . Use Deployment Manager to deploy your application. Rely on the automatic enablement of all APIs used by the application being deployed.
- D . Grant the App Engine Default service account the role of Cloud Pub/Sub Admin. Have your application enable the API on the first connection to Cloud Pub/Sub.
A
Explanation:
Quickstart: using the Google Cloud Console
This page shows you how to perform basic tasks in Pub/Sub using the Google Cloud Console.
Note: If you are new to Pub/Sub, we recommend that you start with the interactive tutorial.
Before you begin:
- Set up a Cloud Console project: create or select a project, then enable the Pub/Sub API for that project. You can view and manage these resources at any time in the Cloud Console.
- Install and initialize the Cloud SDK. (You can also run the gcloud tool in the Cloud Console without installing the Cloud SDK by using Cloud Shell.)
https://cloud.google.com/pubsub/docs/quickstart-console
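Once the Pub/Sub API is enabled (option A), the App Engine app can publish with its service account. The sketch below only builds the fully qualified topic name and message payload; the project and topic names are illustrative assumptions, and a real app would hand these to the google-cloud-pubsub client, e.g. `publisher.publish(topic_path, data)`.

```python
# Helpers an App Engine app might use before publishing to Pub/Sub.
# Project and topic IDs are illustrative; the path format matches what
# PublisherClient.topic_path() returns in the google-cloud-pubsub client.
import json


def topic_path(project_id: str, topic_id: str) -> str:
    # Fully qualified Pub/Sub topic resource name.
    return f"projects/{project_id}/topics/{topic_id}"


def encode_message(payload: dict) -> bytes:
    # Pub/Sub message data must be bytes.
    return json.dumps(payload).encode("utf-8")
```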
You have a Bigtable instance that consists of three nodes that store personally identifiable information (PII) data. You need to log all read or write operations, including any metadata or configuration reads of this database table, in your company’s Security Information and Event Management (SIEM) system.
What should you do?
- A . Navigate to Cloud Monitoring in the Google Cloud console, and create a custom monitoring job for the Bigtable instance to track all changes. Create an alert by using webhook endpoints, with the SIEM endpoint as a receiver.
- B . Navigate to the Audit Logs page in the Google Cloud console, and enable Data Read, Data Write, and Admin Read logs for the Bigtable instance. Create a Pub/Sub topic as a Cloud Logging sink destination, and add your SIEM as a subscriber to the topic.
- C . Install the Ops Agent on the Bigtable instance during configuration. Create a service account with read permissions for the Bigtable instance. Create a custom Dataflow job with this service account to export logs to the company’s SIEM system.
- D . Navigate to the Audit Logs page in the Google Cloud console, and enable Admin Write logs for the Bigtable instance. Create a Cloud Functions instance to export logs from Cloud Logging to your SIEM.
B
Explanation:
Data Read, Data Write, and Admin Read operations are captured in Data Access audit logs, which are disabled by default and must be enabled for the instance. A Cloud Logging sink with a Pub/Sub topic as its destination then streams these log entries to the SIEM subscriber.
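A sketch of the Cloud Logging filter that the sink in option B could use to route the Bigtable audit entries to Pub/Sub. The filter follows Cloud Audit Logs naming conventions (Data Read, Data Write, and Admin Read entries all land in the `data_access` log); treat the exact string as a starting point to verify against your project's logs.

```python
# Builds a Cloud Logging sink filter for Bigtable data-access audit logs
# (option B). Log name format follows Cloud Audit Logs conventions; verify
# against your own project before relying on it.
def bigtable_audit_filter(project_id: str) -> str:
    data_access_log = (
        f'logName="projects/{project_id}/logs/'
        'cloudaudit.googleapis.com%2Fdata_access"'
    )
    return f'resource.type="bigtable_instance" AND {data_access_log}'
```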
You want to add a new auditor to a Google Cloud Platform project. The auditor should be allowed to read, but not modify, all project items.
How should you configure the auditor’s permissions?
- A . Create a custom role with view-only project permissions. Add the user’s account to the custom role.
- B . Create a custom role with view-only service permissions. Add the user’s account to the custom role.
- C . Select the built-in IAM project Viewer role. Add the user’s account to this role.
- D . Select the built-in IAM service Viewer role. Add the user’s account to this role.
C
Explanation:
Reference: https://cloud.google.com/resource-manager/docs/access-control-proj
The basic role roles/viewer provides read access to all resources in the project; its permissions are limited to get and list access. Since an out-of-the-box role exactly fits the requirement, we should use it.
It is advisable to use existing Google-provided roles rather than creating custom roles with similar permissions, as custom roles become a maintenance overhead: if Google modifies how permissions are handled or adds/removes permissions, predefined roles are updated automatically, whereas custom roles must be maintained by you.
Your organization is a financial company that needs to store audit log files for 3 years. Your organization has hundreds of Google Cloud projects. You need to implement a cost-effective approach for log file retention.
What should you do?
- A . Create an export to the sink that saves logs from Cloud Audit to BigQuery.
- B . Create an export to the sink that saves logs from Cloud Audit to a Coldline Storage bucket.
- C . Write a custom script that uses logging API to copy the logs from Stackdriver logs to BigQuery.
- D . Export these logs to Cloud Pub/Sub and write a Cloud Dataflow pipeline to store logs to Cloud SQL.
B
Explanation:
Coldline Storage is the perfect service to store audit logs from all the projects and is very cost-efficient as well. Coldline Storage is a very low-cost, highly durable storage service for storing infrequently accessed data.
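To make the 3-year retention in option B automatic, the Coldline bucket can carry a lifecycle rule that deletes objects once they age out. The rule below follows the JSON format accepted by `gsutil lifecycle set`; the bucket name and the 365-days-per-year simplification are assumptions of this sketch.

```python
# Sketch of a Cloud Storage lifecycle rule that deletes audit-log objects
# after the 3-year retention period. Format matches what
# `gsutil lifecycle set <file> gs://<bucket>` accepts; bucket name is
# whatever the Cloud Logging sink writes to.
import json

THREE_YEARS_DAYS = 3 * 365  # 1095; ignores leap days for this sketch

lifecycle_config = {
    "rule": [
        {
            "action": {"type": "Delete"},
            "condition": {"age": THREE_YEARS_DAYS},
        }
    ]
}

# Write this out and apply it, e.g.:
#   gsutil lifecycle set lifecycle.json gs://my-audit-logs-bucket
print(json.dumps(lifecycle_config))
```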
You have two Google Cloud projects: project-a with VPC vpc-a (10.0.0.0/16) and project-b with VPC vpc-b (10.8.0.0/16). Your frontend application resides in vpc-a and the backend API services are deployed in vpc-b. You need to efficiently and cost-effectively enable communication between these Google Cloud projects. You also want to follow Google-recommended practices.
What should you do?
- A . Configure a Cloud Router in vpc-a and another Cloud Router in vpc-b.
- B . Configure a Cloud Interconnect connection between vpc-a and vpc-b.
- C . Create VPC Network Peering between vpc-a and vpc-b.
- D . Create an OpenVPN connection between vpc-a and vpc-b.
