Practice Free Professional Cloud Developer Exam Online Questions
You are deploying your application to a Compute Engine virtual machine instance. Your application is configured to write its log files to disk. You want to view the logs in Stackdriver Logging without changing the application code.
What should you do?
- A . Install the Stackdriver Logging Agent and configure it to send the application logs.
- B . Use a Stackdriver Logging Library to log directly from the application to Stackdriver Logging.
- C . Provide the log file folder path in the metadata of the instance to configure it to send the application logs.
- D . Change the application to log to /var/log so that its logs are automatically sent to Stackdriver Logging.
Your company has a BigQuery data mart that provides analytics information to hundreds of employees. One user wants to run jobs without interrupting important workloads. This user isn’t concerned about how long these jobs take to run. You want to fulfill this request while minimizing cost to the company and the effort required on your part.
What should you do?
- A . Ask the user to run the jobs as batch jobs.
- B . Create a separate project for the user to run jobs.
- C . Grant the user the bigquery.jobUser role in the existing project.
- D . Allow the user to run jobs when important workloads are not running.
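As background for the options above, BigQuery lets a query job request BATCH priority, which queues the job until idle slots are available instead of competing with interactive workloads, at no extra cost. A minimal sketch of the `jobs.insert` request body (the SQL shown is a placeholder):

```python
# Sketch: the JSON body a client would send to the BigQuery jobs.insert
# REST endpoint to run a query at BATCH priority. Batch-priority jobs wait
# for idle slots, so they do not interrupt INTERACTIVE workloads.
def batch_query_job(sql: str) -> dict:
    """Build a jobs.insert request body for a batch-priority query."""
    return {
        "configuration": {
            "query": {
                "query": sql,
                "useLegacySql": False,
                # BATCH instead of the default INTERACTIVE priority
                "priority": "BATCH",
            }
        }
    }

job = batch_query_job("SELECT COUNT(*) FROM `dataset.table`")
print(job["configuration"]["query"]["priority"])  # BATCH
```

The same effect is available from the bq CLI with the `--batch` flag.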
You are developing a new public-facing application that needs to retrieve specific properties in the metadata of users’ objects in their respective Cloud Storage buckets. Due to privacy and data residency requirements, you must retrieve only the metadata and not the object data. You want to maximize the performance of the retrieval process.
How should you retrieve the metadata?
- A . Use the patch method.
- B . Use the compose method.
- C . Use the copy method.
- D . Use the fields request parameter.
D
Explanation:
https://cloud.google.com/storage/docs/json_api/v1/objects/get
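The `fields` request parameter on the JSON API `objects.get` method restricts the response to the named metadata properties, so no object data is transferred. A sketch of how such a request URL is built (bucket and object names are hypothetical):

```python
# Sketch of an objects.get call with the fields request parameter.
# Passing fields=... returns only the selected metadata properties;
# the object data itself is never downloaded.
from urllib.parse import quote, urlencode

def metadata_url(bucket: str, obj: str, fields: str) -> str:
    """Build an objects.get URL that returns only the requested metadata."""
    base = "https://storage.googleapis.com/storage/v1"
    # Object names must be URL-encoded, including slashes
    path = f"{base}/b/{quote(bucket, safe='')}/o/{quote(obj, safe='')}"
    return f"{path}?{urlencode({'fields': fields})}"

url = metadata_url("my-bucket", "logs/app.txt", "name,size,updated")
print(url)
```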
You are using Cloud Build to create a new Docker image on each source code commit to a Cloud Source Repositories repository. Your application is built on every commit to the master branch. You want to release specific commits made to the master branch using an automated method.
What should you do?
- A . Manually trigger the build for new releases.
- B . Create a build trigger on a Git tag pattern. Use a Git tag convention for new releases.
- C . Create a build trigger on a Git branch name pattern. Use a Git branch naming convention for new releases.
- D . Commit your source code to a second Cloud Source Repositories repository with a second Cloud Build trigger. Use this repository for new releases only.
B
Explanation:
Reference: https://cloud.google.com/build/docs/triggers
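A build trigger configured with a tag regular expression fires only when a matching tag is pushed, so tagging a specific commit on master releases exactly that commit. A sketch of the kind of pattern such a trigger would match (the convention shown is an assumption):

```python
# Sketch of a release-tag convention matched by a Cloud Build tag trigger.
# A trigger created with a tag regex like ^v\d+\.\d+\.\d+$ fires only for
# tags following the convention, not for ordinary branch commits.
import re

RELEASE_TAG = re.compile(r"^v\d+\.\d+\.\d+$")  # e.g. v1.4.2

def is_release_tag(tag: str) -> bool:
    return RELEASE_TAG.fullmatch(tag) is not None

print(is_release_tag("v1.4.2"))   # True
print(is_release_tag("master"))   # False
```

The same regex goes in the trigger's tag pattern field in the console or gcloud; releasing a commit is then just `git tag v1.4.2 <sha> && git push origin v1.4.2`.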
You are developing an event-driven application. You have created a topic to receive messages sent to Pub/Sub. You want those messages to be processed in real time. You need the application to be independent from any other system and only incur compute costs when new messages arrive. You want to configure the simplest and most efficient architecture.
What should you do?
- A . Deploy your code on Cloud Functions. Use a Pub/Sub trigger to invoke the Cloud Function. Use the Pub/Sub API to create a pull subscription to the Pub/Sub topic and read messages from it.
- B . Deploy your code on Cloud Functions. Use a Pub/Sub trigger to handle new messages in the topic.
- C . Deploy the application on Google Kubernetes Engine. Use the Pub/Sub API to create a pull subscription to the Pub/Sub topic and read messages from it.
- D . Deploy the application on Compute Engine. Use a Pub/Sub push subscription to process new messages in the topic.
B
Explanation:
https://cloud.google.com/functions/docs/calling/pubsub
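With a Pub/Sub trigger, the platform manages the subscription and invokes the function once per message, so no polling code runs and compute is billed only while a message is being handled. A minimal sketch of a 1st-gen Python Cloud Function entry point (the payload shown is hypothetical):

```python
# Minimal sketch of a Pub/Sub-triggered Cloud Function (1st-gen Python
# signature). The trigger delivers each message as an event dict whose
# "data" field holds the base64-encoded payload.
import base64

def handle_message(event, context=None):
    """Entry point invoked once per Pub/Sub message."""
    payload = base64.b64decode(event["data"]).decode("utf-8")
    print(f"Processing message: {payload}")
    return payload

# Simulating the event the trigger would deliver:
fake_event = {"data": base64.b64encode(b"order-created").decode("utf-8")}
handle_message(fake_event)  # prints "Processing message: order-created"
```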
Your team detected a spike of errors in an application running on Cloud Run in your production project. The application is configured to read messages from Pub/Sub topic A, process the messages, and write the messages to topic B. You want to conduct tests to identify the cause of the errors. You can use a set of mock messages for testing.
What should you do?
- A . Deploy the Pub/Sub and Cloud Run emulators on your local machine. Deploy the application locally, and change the logging level in the application to DEBUG or INFO. Write mock messages to topic A, and then analyze the logs.
- B . Use the gcloud CLI to write mock messages to topic A. Change the logging level in the application to DEBUG or INFO, and then analyze the logs.
- C . Deploy the Pub/Sub emulator on your local machine. Point the production application to your local Pub/Sub topics. Write mock messages to topic A, and then analyze the logs.
- D . Use the Google Cloud console to write mock messages to topic A. Change the logging level in the application to DEBUG or INFO, and then analyze the logs.
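Whichever tool writes the mock messages to topic A, the Cloud Run service receives each one from a push subscription as a JSON envelope with a base64-encoded `data` field. A sketch of such an envelope, useful for checking what the DEBUG/INFO logs should show (project and subscription names are hypothetical):

```python
# Hypothetical sketch of the push-request body a Cloud Run service receives
# from a Pub/Sub push subscription. Mock messages published to topic A
# (e.g. with `gcloud pubsub topics publish`) arrive in this shape.
import base64
import json

def mock_push_envelope(payload: bytes, message_id: str = "1") -> str:
    """Serialize a mock Pub/Sub push request body."""
    return json.dumps({
        "message": {
            "data": base64.b64encode(payload).decode("utf-8"),
            "messageId": message_id,  # hypothetical test ID
        },
        "subscription": "projects/my-project/subscriptions/topic-a-sub",
    })

body = mock_push_envelope(b'{"order": 42}')
decoded = json.loads(body)
print(base64.b64decode(decoded["message"]["data"]))  # b'{"order": 42}'
```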
Your application requires service accounts to be authenticated to GCP products via credentials stored on its host Compute Engine virtual machine instances. You want to distribute these credentials to the host instances as securely as possible.
What should you do?
- A . Use HTTP signed URLs to securely provide access to the required resources.
- B . Use the instance’s service account Application Default Credentials to authenticate to the required resources.
- C . Generate a P12 file from the GCP Console after the instance is deployed, and copy the credentials to the host instance before starting the application.
- D . Commit the credential JSON file into your application’s source repository, and have your CI/CD process package it with the software that is deployed to the instance.
B
Explanation:
Reference: https://cloud.google.com/compute/docs/api/how-tos/authorization
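Application Default Credentials mean no key file is ever copied to the VM: the client libraries resolve credentials at runtime, falling back to the metadata server, which serves tokens for the instance's attached service account. An illustrative sketch of that lookup order (the real resolution happens inside `google.auth.default()`; the probes below are stand-ins):

```python
# Sketch of the Application Default Credentials lookup order followed by
# Google client libraries: the GOOGLE_APPLICATION_CREDENTIALS env var is
# checked first, then a well-known gcloud credentials file, and finally
# the Compute Engine metadata server, which serves tokens for the
# instance's attached service account.
import os

ADC_SOURCES = [
    ("env var", lambda: os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")),
    ("gcloud file", lambda: None),  # ~/.config/gcloud/... when present
    ("metadata server", lambda: "http://metadata.google.internal"),
]

def resolve_adc_source() -> str:
    """Return the name of the first credential source that is available."""
    for name, probe in ADC_SOURCES:
        if probe():
            return name
    return "none"

print(resolve_adc_source())
```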
For this question, refer to the HipLocal case study.
How should HipLocal redesign their architecture to ensure that the application scales to support a large increase in users?
- A . Use Google Kubernetes Engine (GKE) to run the application as a microservice. Run the MySQL database on a dedicated GKE node.
- B . Use multiple Compute Engine instances to run MySQL to store state information. Use a Google Cloud-managed load balancer to distribute the load between instances. Use managed instance groups for scaling.
- C . Use Memorystore to store session information and Cloud SQL to store state information. Use a Google Cloud-managed load balancer to distribute the load between instances. Use managed instance groups for scaling.
- D . Use a Cloud Storage bucket to serve the application as a static website, and use another Cloud Storage bucket to store user state information.
You are developing a web application that will be accessible over both HTTP and HTTPS and will run on Compute Engine instances. On occasion, you will need to SSH from your remote laptop into one of the Compute Engine instances to conduct maintenance on the app.
How should you configure the instances while following Google-recommended best practices?
- A . Set up a backend with Compute Engine web server instances with a private IP address behind a TCP proxy load balancer.
- B . Configure the firewall rules to allow all ingress traffic to connect to the Compute Engine web servers, with each server having a unique external IP address.
- C . Configure the Cloud Identity-Aware Proxy API for SSH access. Then configure the Compute Engine servers with private IP addresses behind an HTTP(S) load balancer for the application web traffic.
- D . Set up a backend with Compute Engine web server instances with a private IP address behind an HTTP(S) load balancer. Set up a bastion host with a public IP address and open firewall ports. Connect to the web instances using the bastion host.
C
Explanation:
Reference:
https://cloud.google.com/compute/docs/instances/connecting-advanced#cloud_iap
https://cloud.google.com/solutions/connecting-securely#storing_host_keys_by_enabling_guest_attributes
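With IAP TCP forwarding enabled and the `roles/iap.tunnelResourceAccessor` role granted, SSH reaches instances that have no external IP. A sketch, assuming a VM named web-1 in zone us-central1-a:

```shell
# IAP tunnels the SSH connection, so the instance needs no public IP
# and no open SSH port to the internet.
gcloud compute ssh web-1 \
    --zone=us-central1-a \
    --tunnel-through-iap
```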