Practice Free Associate Cloud Engineer Exam Online Questions
You have deployed an application on a single Compute Engine instance. The application writes logs to disk. Users start reporting errors with the application. You want to diagnose the problem.
What should you do?
- A . Navigate to Cloud Logging and view the application logs.
- B . Connect to the instance’s serial console and read the application logs.
- C . Configure a Health Check on the instance and set a Low Healthy Threshold value.
- D . Install and configure the Cloud Logging Agent and view the logs from Cloud Logging.
D
Explanation:
Reference: https://cloud.google.com/error-reporting/docs/setup/compute-engine
Cloud Logging knows nothing about applications installed on the system unless an agent collects their logs. Reading application logs over the serial console is not a best practice and is impractical at scale.
From the installation documentation: "The VM images for Compute Engine and Amazon Elastic Compute Cloud (EC2) don't include the Logging agent, so you must complete these steps to install it on those instances. The agent runs under both Linux and Windows."
Source: https://cloud.google.com/logging/docs/agent/logging/installation
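A minimal installation sketch on a Debian or Ubuntu VM, following the installation page linked above (the repo script and the google-fluentd service name come from that page; adjust for your distribution):

```bash
# Sketch: install the legacy Cloud Logging agent using the repo script from the
# installation docs referenced above, then verify the agent is running.
curl -sSO https://dl.google.com/cloudagents/add-logging-agent-repo.sh
sudo bash add-logging-agent-repo.sh --also-install
sudo service google-fluentd status
```

With the agent running, the application's log files can be tailed by the agent and appear in Cloud Logging for the instance; on newer images the Ops Agent is the recommended replacement and is installed via a similar repo script.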
You have just created a new project which will be used to deploy a globally distributed application. You will use Cloud Spanner for data storage. You want to create a Cloud Spanner instance. You want to perform the first step in preparation for creating the instance.
What should you do?
- A . Grant yourself the IAM role of Cloud Spanner Admin
- B . Create a new VPC network with subnetworks in all desired regions
- C . Configure your Cloud Spanner instance to be multi-regional
- D . Enable the Cloud Spanner API
D
Explanation:
In a brand-new project the Cloud Spanner API is not enabled by default, so enabling it is the first preparation step; the instance configuration (regional or multi-regional) is chosen only when the instance is actually created.
https://cloud.google.com/spanner/docs/getting-started/set-up
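A rough sketch of the workflow (project ID, instance name, and the nam-eur-asia1 configuration are placeholders, not part of the question):

```bash
# Sketch: enable the Cloud Spanner API in the new project, then create the instance.
gcloud services enable spanner.googleapis.com --project=my-new-project
gcloud spanner instances create app-instance \
    --config=nam-eur-asia1 \
    --description="Globally distributed application" \
    --nodes=1
```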
Your company has an existing GCP organization with hundreds of projects and a billing account. Your company recently acquired another company that also has hundreds of projects and its own billing account. You would like to consolidate all GCP costs of both GCP organizations onto a single invoice. You would like to consolidate all costs as of tomorrow.
What should you do?
- A . Link the acquired company’s projects to your company’s billing account.
- B . Configure the acquired company’s billing account and your company’s billing account to export the billing data into the same BigQuery dataset.
- C . Migrate the acquired company’s projects into your company’s GCP organization. Link the migrated projects to your company’s billing account.
- D . Create a new GCP organization and a new billing account. Migrate the acquired company’s projects and your company’s projects into the new GCP organization and link the projects to the new billing account.
A
Explanation:
https://cloud.google.com/resource-manager/docs/project-migration#oauth_consent_screen
https://cloud.google.com/resource-manager/docs/project-migration
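A minimal sketch of the per-project step, assuming you hold a Billing Account Administrator or Billing Account User role on your billing account (project and billing-account IDs are placeholders; older SDK versions expose this under gcloud beta billing):

```bash
# Sketch: move one acquired project onto your existing billing account.
gcloud billing projects link acquired-project-id \
    --billing-account=0A0A0A-0B0B0B-0C0C0C
```

Repeating this for each acquired project is enough for new charges to land on the single billing account; no project migration between organizations is required.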
Your company publishes large files on an Apache web server that runs on a Compute Engine instance. The Apache web server is not the only application running in the project. You want to receive an email when the egress network costs for the server exceed 100 dollars for the current month as measured by Google Cloud Platform (GCP).
What should you do?
- A . Set up a budget alert on the project with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”
- B . Set up a budget alert on the billing account with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”
- C . Export the billing data to BigQuery. Create a Cloud Function that uses BigQuery to sum the egress network costs of the exported billing data for the Apache web server for the current month and sends an email if it is over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.
- D . Use the Stackdriver Logging Agent to export the Apache web server logs to Stackdriver Logging. Create a Cloud Function that uses BigQuery to parse the HTTP response log data in Stackdriver for the current month and sends an email if the size of all HTTP responses, multiplied by current GCP egress prices, totals over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.
C
Explanation:
https://blog.doit-intl.com/the-truth-behind-google-cloud-egress-traffic-6e8f57b5c2f8
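To illustrate the core of option C, a hedged sketch of the cost query the Cloud Function might run against the standard billing export table (the dataset and table name are placeholders, and a real function would narrow the filter further, for example by a label on the Apache VM):

```bash
# Hypothetical query run by the Cloud Function: sum this month's Compute Engine
# egress cost from the billing export (dataset/table name is a placeholder).
bq query --use_legacy_sql=false '
SELECT SUM(cost) AS egress_cost_usd
FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
WHERE service.description = "Compute Engine"
  AND LOWER(sku.description) LIKE "%egress%"
  AND invoice.month = FORMAT_DATE("%Y%m", CURRENT_DATE())'
```

Cloud Scheduler would then invoke the function hourly, and the function would send the email once the sum exceeds 100 dollars.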
You need to configure optimal data storage for files stored in Cloud Storage for minimal cost. The files are used in a mission-critical analytics pipeline that is used continually. The users are in Boston, MA (United States).
What should you do?
- A . Configure regional storage for the region closest to the users. Configure a Nearline storage class.
- B . Configure regional storage for the region closest to the users. Configure a Standard storage class.
- C . Configure dual-regional storage for the dual region closest to the users. Configure a Nearline storage class.
- D . Configure dual-regional storage for the dual region closest to the users. Configure a Standard storage class.
B
Explanation:
Keywords: used continually → Standard storage class (Nearline is for data accessed less than once a month); minimal cost with users in a single location (Boston) → regional storage in the nearest region.
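A minimal sketch, assuming us-east1 as the nearby region and a placeholder bucket name:

```bash
# Sketch: Standard-class regional bucket in an assumed nearby region.
gcloud storage buckets create gs://analytics-pipeline-files \
    --location=us-east1 \
    --default-storage-class=STANDARD
```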
You have created a new project in Google Cloud through the gcloud command line interface (CLI) and linked a billing account. You need to create a new Compute Engine instance using the CLI. You need to perform the prerequisite steps.
What should you do?
- A . Create a Cloud Monitoring Workspace.
- B . Create a VPC network in the project.
- C . Enable the compute.googleapis.com API.
- D . Grant yourself the IAM role of Compute Admin.
C
Explanation:
The Compute Engine API (compute.googleapis.com) must be enabled in the project before any Compute Engine resources can be created; enabling it typically also creates the project's default VPC network, so no separate network setup is required.
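A minimal sketch of the prerequisite step followed by the instance creation (project ID, instance name, zone, and machine type are placeholders):

```bash
# Sketch: enable the Compute Engine API, then create the instance.
gcloud services enable compute.googleapis.com --project=my-new-project
gcloud compute instances create app-instance \
    --zone=us-east1-b \
    --machine-type=e2-medium
```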
Your company requires all developers to have the same permissions, regardless of the Google Cloud project they are working on. Your company’s security policy also restricts developer permissions to Compute Engine, Cloud Functions, and Cloud SQL. You want to implement the security policy with minimal effort.
What should you do?
- A . • Create a custom role with Compute Engine, Cloud Functions, and Cloud SQL permissions in one project within the Google Cloud organization.
• Copy the role across all projects created within the organization with the gcloud iam roles copy command.
• Assign the role to developers in those projects.
- B . • Add all developers to a Google group in Google Groups for Workspace.
• Assign the predefined role of Compute Admin to the Google group at the Google Cloud organization level.
- C . • Add all developers to a Google group in Cloud Identity.
• Assign predefined roles for Compute Engine, Cloud Functions, and Cloud SQL permissions to the Google group for each project in the Google Cloud organization.
- D . • Add all developers to a Google group in Cloud Identity.
• Create a custom role with Compute Engine, Cloud Functions, and Cloud SQL permissions at the Google Cloud organization level.
• Assign the custom role to the Google group.
D
Explanation:
https://www.cloudskillsboost.google/focuses/1035?parent=catalog#:~:text=custom%20role%20at%20the%20organization%20level
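A rough sketch of option D with gcloud (the organization ID, group address, role ID, and permission list are illustrative only):

```bash
# Sketch: create one custom role at the organization level and grant it to the
# developers' group. IDs, group, and permissions below are placeholders.
gcloud iam roles create restrictedDeveloper \
    --organization=123456789012 \
    --title="Restricted Developer" \
    --permissions=compute.instances.create,compute.instances.list,cloudfunctions.functions.create,cloudsql.instances.create
gcloud organizations add-iam-policy-binding 123456789012 \
    --member="group:developers@example.com" \
    --role="organizations/123456789012/roles/restrictedDeveloper"
```

Defining the role once at the organization level and binding it to a single group means every new project automatically inherits the same developer permissions.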
Your Dataproc cluster runs in a single Virtual Private Cloud (VPC) network in a single subnet with range 172.16.20.128/25. There are no private IP addresses available in the VPC network. You want to add new VMs to communicate with your cluster using the minimum number of steps.
What should you do?
- A . Modify the existing subnet range to 172.16.20.0/24.
- B . Create a new Secondary IP Range in the VPC and configure the VMs to use that range.
- C . Create a new VPC network for the VMs. Enable VPC Peering between the VMs’ VPC network and the Dataproc cluster VPC network.
- D . Create a new VPC network for the VMs with a subnet of 172.32.0.0/16. Enable VPC network Peering between the Dataproc VPC network and the VMs VPC network. Configure a custom Route exchange.
A
Explanation:
The existing subnet 172.16.20.128/25 (netmask 255.255.255.128) spans 172.16.20.128–172.16.20.255, only 128 addresses, all of which are in use. Expanding the subnet's primary range to 172.16.20.0/24 doubles the space to 256 addresses (172.16.20.0–172.16.20.255). Google Cloud lets you expand an existing subnet's primary IP range in place, so new VMs can be added to the same subnet without creating networks, peerings, or routes, which is the minimum number of steps.
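A minimal sketch of option A, assuming a placeholder subnet name and region:

```bash
# Sketch: expand the subnet's primary range in place from /25 to /24.
gcloud compute networks subnets expand-ip-range dataproc-subnet \
    --region=us-east1 \
    --prefix-length=24
```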
