Practice Free Associate Cloud Engineer Exam Online Questions
A team of data scientists infrequently needs to use a Google Kubernetes Engine (GKE) cluster that you manage. They require GPUs for some long-running, non-restartable jobs. You want to minimize cost.
What should you do?
- A . Enable node auto-provisioning on the GKE cluster.
- B . Create a VerticalPodAutoscaler for those workloads.
- C . Create a node pool with preemptible VMs and GPUs attached to those VMs.
- D . Create a node pool of instances with GPUs, and enable autoscaling on this node pool with a minimum size of 1.
A
Explanation:
Node auto-provisioning automatically creates and deletes node pools, including GPU node pools, based on pending workload requirements, and removes them when they are no longer needed. Because the jobs are long-running and non-restartable, preemptible VMs (option C) are unsuitable, and because the cluster is used infrequently, keeping a minimum of one GPU node running (option D) wastes money. https://cloud.google.com/kubernetes-engine/docs/how-to/node-auto-provisioning
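As an illustrative sketch (cluster name, region, and resource limits below are placeholders, not values from the question), node auto-provisioning with GPU limits might be enabled like this:

```shell
# Enable node auto-provisioning with resource limits, including GPUs.
# GKE then creates and deletes node pools to match pending workloads.
gcloud container clusters update my-cluster \
    --region us-central1 \
    --enable-autoprovisioning \
    --min-cpu 1 --max-cpu 32 \
    --min-memory 4 --max-memory 128 \
    --max-accelerator type=nvidia-tesla-t4,count=4
```

Workloads that request GPUs in their pod spec will then trigger creation of a matching GPU node pool on demand.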
You need to set a budget alert for use of Compute Engine services on one of the three Google Cloud Platform projects that you manage. All three projects are linked to a single billing account.
What should you do?
- A . Verify that you are the project billing administrator. Select the associated billing account and create a budget and alert for the appropriate project.
- B . Verify that you are the project billing administrator. Select the associated billing account and create a budget and a custom alert.
- C . Verify that you are the project administrator. Select the associated billing account and create a budget for the appropriate project.
- D . Verify that you are project administrator. Select the associated billing account and create a budget and a custom alert.
A
Explanation:
https://cloud.google.com/iam/docs/understanding-roles#billing-roles
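For illustration, a budget scoped to a single project with an alert threshold could be created like this (billing account ID, project ID, and amounts are placeholders):

```shell
# Create a budget on the shared billing account, filtered to one
# project, with an alert when spend reaches 90% of the amount.
gcloud billing budgets create \
    --billing-account=0X0X0X-0X0X0X-0X0X0X \
    --display-name="compute-budget" \
    --budget-amount=500USD \
    --filter-projects=projects/my-project-id \
    --threshold-rule=percent=0.9
```

The `--filter-projects` flag is what restricts the budget to the appropriate project rather than the whole billing account.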
Your company completed the acquisition of a startup and is now merging the IT systems of both companies. The startup had a production Google Cloud project in their organization. You need to move this project into your organization and ensure that the project is billed to your organization. You want to accomplish this task with minimal effort.
What should you do?
- A . Use the projects.move method to move the project to your organization. Update the billing account of the project to that of your organization.
- B . Ensure that you have an Organization Administrator Identity and Access Management (IAM) role assigned to you in both organizations. Navigate to the Resource Manager in the startup’s Google Cloud organization, and drag the project to your company’s organization.
- C . Create a Private Catalog for the Google Cloud Marketplace, and upload the resources of the startup’s production project to the Catalog. Share the Catalog with your organization, and deploy the resources in your company’s project.
- D . Create an infrastructure-as-code template for all resources in the project by using Terraform, and deploy that template to a new project in your organization. Delete the project from the startup’s Google Cloud organization.
A
Explanation:
The projects.move method migrates a project between organizations in place, so no resources need to be recreated; afterwards, you update the project’s billing account to one owned by your organization. Recreating resources via Private Catalog or Terraform (options C and D) is far more effort, and projects cannot be dragged between organizations in the console (option B).
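A sketch of the two steps involved (project ID, organization ID, and billing account ID are placeholders; the move command is currently exposed under the beta surface):

```shell
# Move the project into the destination organization, then link it
# to that organization's billing account.
gcloud beta projects move my-startup-project --organization=123456789
gcloud billing projects link my-startup-project \
    --billing-account=0X0X0X-0X0X0X-0X0X0X
```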
A colleague handed over a Google Cloud project for you to maintain. As part of a security checkup, you want to review who has been granted the Project Owner role.
What should you do?
- A . In the Google Cloud console, validate which SSH keys have been stored as project-wide keys.
- B . Navigate to Identity-Aware Proxy and check the permissions for these resources.
- C . Enable Audit logs on the IAM & admin page for all resources, and validate the results.
- D . Use the gcloud projects get-iam-policy command to view the current role assignments.
D
Explanation:
The gcloud projects get-iam-policy command displays the IAM policy for a project, which includes the roles and members assigned to those roles. The Project Owner role grants full access to all resources and actions in the project. By using this command, you can review who has been granted this role and make any necessary changes.
Reference:
1: Associate Cloud Engineer Certification Exam Guide | Learn – Google Cloud
2: gcloud projects get-iam-policy | Cloud SDK Documentation | Google Cloud
3: Understanding roles | Cloud IAM Documentation | Google Cloud
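For illustration, the output can be flattened and filtered so that only members holding the Project Owner role are listed (project ID is a placeholder):

```shell
# List only the members granted roles/owner on the project.
gcloud projects get-iam-policy my-project-id \
    --flatten="bindings[].members" \
    --filter="bindings.role:roles/owner" \
    --format="value(bindings.members)"
```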
An employee was terminated, but their access to Google Cloud Platform (GCP) was not removed until 2 weeks later. You need to find out whether this employee accessed any sensitive customer information after their termination.
What should you do?
- A . View System Event Logs in Stackdriver. Search for the user’s email as the principal.
- B . View System Event Logs in Stackdriver. Search for the service account associated with the user.
- C . View Data Access audit logs in Stackdriver. Search for the user’s email as the principal.
- D . View the Admin Activity log in Stackdriver. Search for the service account associated with the user.
C
Explanation:
https://cloud.google.com/logging/docs/audit
Data Access audit logs contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data.
https://cloud.google.com/logging/docs/audit#data-access
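A sketch of how such a query might look from the command line (the email address, project ID, and time window are placeholders):

```shell
# Read Data Access audit log entries where the former employee's
# account is the authenticated principal.
gcloud logging read \
  'logName:"cloudaudit.googleapis.com%2Fdata_access" AND protoPayload.authenticationInfo.principalEmail="former.employee@example.com"' \
  --project=my-project-id --freshness=30d
```

Note that Data Access audit logs (except for BigQuery) are disabled by default, so they must have been enabled before the access occurred for this search to return results.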
The storage costs for your application logs have far exceeded the project budget. The logs are currently being retained indefinitely in the Cloud Storage bucket myapp-gcp-ace-logs. You have been asked to remove logs older than 90 days from your Cloud Storage bucket. You want to optimize ongoing Cloud Storage spend.
What should you do?
- A . Write a script that runs gsutil ls -l gs://myapp-gcp-ace-logs/ to find and remove items older than 90 days. Schedule the script with cron.
- B . Write a lifecycle management rule in JSON and push it to the bucket with gsutil lifecycle set config-json-file.
- C . Write a lifecycle management rule in XML and push it to the bucket with gsutil lifecycle set config-xml-file.
- D . Write a script that runs gsutil ls -lr gs://myapp-gcp-ace-logs/ to find and remove items older than 90 days. Repeat this process every morning.
B
Explanation:
Writing a lifecycle management rule in XML and pushing it to the bucket with gsutil lifecycle set config-xml-file is not right.
gsutil lifecycle set applies the lifecycle configuration from the provided configuration file to one or more buckets, but XML is not a supported format for that file.
Ref: https://cloud.google.com/storage/docs/gsutil/commands/lifecycle
Write a script that runs gsutil ls -lr gs://myapp-gcp-ace-logs/ to find and remove items older than 90 days. Repeat this process every morning. is not right.
This manual approach is error-prone, time-consuming and expensive. GCP Cloud Storage provides lifecycle management rules that let you achieve this with minimal effort.
Write a script that runs gsutil ls -l gs://myapp-gcp-ace-logs/ to find and remove items older than 90 days. Schedule the script with cron. is not right.
This manual approach is error-prone, time-consuming and expensive. GCP Cloud Storage provides lifecycle management rules that let you achieve this with minimal effort.
Write a lifecycle management rule in JSON and push it to the bucket with gsutil lifecycle set config-json-file. is the right answer.
You can assign a lifecycle management configuration to a bucket. The configuration contains a set of rules which apply to current and future objects in the bucket. When an object meets the criteria of
one of the rules, Cloud Storage automatically performs a specified action on the object. One of the supported actions is to Delete objects. You can set up a lifecycle management to delete objects older than 90 days. gsutil lifecycle set enables you to set the lifecycle configuration on the bucket based on the configuration file. JSON is the only supported type for the configuration file. The config-json-file specified on the command line should be a path to a local file containing the lifecycle configuration JSON document.
Ref: https://cloud.google.com/storage/docs/gsutil/commands/lifecycle
Ref: https://cloud.google.com/storage/docs/lifecycle
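For illustration, a minimal lifecycle configuration that deletes objects once they are older than 90 days could look like this (the local file name is arbitrary; the bucket name comes from the question):

```json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 90}
    }
  ]
}
```

Apply it with: gsutil lifecycle set lifecycle.json gs://myapp-gcp-ace-logs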
Your team is building a website that handles votes from a large user population. The incoming votes will arrive at various rates. You want to optimize the storage and processing of the votes.
What should you do?
- A . Save the incoming votes to Firestore. Use Cloud Scheduler to trigger a Cloud Functions instance to periodically process the votes.
- B . Use a dedicated instance to process the incoming votes. Send the votes directly to this instance.
- C . Save the incoming votes to a JSON file on Cloud Storage. Process the votes in a batch at the end of the day.
- D . Save the incoming votes to Pub/Sub. Use the Pub/Sub topic to trigger a Cloud Functions instance to process the votes.
D
Explanation:
Pub/Sub is a scalable and reliable messaging service that can handle large volumes of data from different sources at different rates. It allows you to decouple the producers and consumers of the data, and provides a durable and persistent storage for the messages until they are delivered. Cloud Functions is a serverless platform that can execute code in response to events, such as messages published to a Pub/Sub topic. It can scale automatically based on the load, and you only pay for the resources you use. By using Pub/Sub and Cloud Functions, you can optimize the storage and processing of the votes, as you can handle the variable rates of incoming votes, process them in real time or near real time, and avoid managing servers or VMs.
Reference: Pub/Sub documentation
Cloud Functions documentation
Choosing a messaging service for Google Cloud
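A minimal sketch of a Pub/Sub-triggered background Cloud Function for this pattern (the event shape follows the background-function format, where the Pub/Sub payload arrives base64-encoded in event["data"]; the vote payload schema and in-memory tally are illustrative assumptions, not from the question):

```python
import base64
import json
from collections import Counter

# In-memory tally for illustration only; a real function would
# persist counts to a durable store such as Firestore.
vote_counts = Counter()

def process_vote(event, context=None):
    """Triggered by a message published to the votes Pub/Sub topic.

    event["data"] carries the base64-encoded message payload,
    assumed here to be a JSON document like {"candidate": "A"}.
    """
    payload = base64.b64decode(event["data"]).decode("utf-8")
    vote = json.loads(payload)
    vote_counts[vote["candidate"]] += 1
    return vote_counts[vote["candidate"]]
```

Because Pub/Sub buffers messages durably and Cloud Functions scales with the subscription backlog, spikes in incoming votes are absorbed without losing data.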
