Practice Free Associate Cloud Engineer Exam Online Questions
You need to enable traffic between multiple groups of Compute Engine instances that are currently running in two different GCP projects. Each group of Compute Engine instances is running in its own VPC.
What should you do?
- A . Verify that both projects are in a GCP Organization. Create a new VPC and add all instances.
- B . Verify that both projects are in a GCP Organization. Share the VPC from one project and request that the Compute Engine instances in the other project use this shared VPC.
- C . Verify that you are the Project Administrator of both projects. Create two new VPCs and add all instances.
- D . Verify that you are the Project Administrator of both projects. Create a new VPC and add all instances.
B
Explanation:
Shared VPC allows an organization to connect resources from multiple projects to a common Virtual Private Cloud (VPC) network, so that they can communicate with each other securely and efficiently using internal IPs from that network. When you use Shared VPC, you designate a project as a host project and attach one or more other service projects to it. The VPC networks in the host project are called Shared VPC networks. Eligible resources from service projects can use subnets in the Shared VPC network.
https://cloud.google.com/vpc/docs/shared-vpc
"For example, an existing instance in a service project cannot be reconfigured to use a Shared VPC network, but a new instance can be created to use available subnets in a Shared VPC network."
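The Shared VPC setup described above can be sketched with the gcloud CLI; the project IDs below are placeholders, and the commands assume you hold the Shared VPC Admin role at the organization level:

```shell
# Hypothetical project IDs used for illustration.
HOST_PROJECT=vpc-host-project
SERVICE_PROJECT=service-project-a

# Designate the host project for Shared VPC.
gcloud compute shared-vpc enable "$HOST_PROJECT"

# Attach the service project to the host project.
gcloud compute shared-vpc associated-projects add "$SERVICE_PROJECT" \
    --host-project "$HOST_PROJECT"

# Verify the association.
gcloud compute shared-vpc get-host-project "$SERVICE_PROJECT"
```

After the association, new instances in the service project can be created on subnets of the Shared VPC network.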
Your company is modernizing its applications and refactoring them to containerized microservices. You need to deploy the infrastructure on Google Cloud so that teams can deploy their applications. The applications cannot be exposed publicly. You want to minimize management and operational overhead.
What should you do?
- A . Provision a Standard zonal Google Kubernetes Engine (GKE) cluster.
- B . Provision a fleet of Compute Engine instances and install Kubernetes.
- C . Provision a Google Kubernetes Engine (GKE) Autopilot cluster.
- D . Provision a Standard regional Google Kubernetes Engine (GKE) cluster.
C
Explanation:
GKE Autopilot is a mode of operation in GKE where Google manages the underlying infrastructure, including nodes, node pools, and their upgrades. This significantly reduces the management and operational overhead for the user, allowing teams to focus solely on deploying and managing their containerized applications. Since the applications are not exposed publicly, the zonal or regional nature of the cluster primarily impacts availability within Google Cloud, and Autopilot is available for both. Autopilot minimizes the operational burden, which is a key requirement.
Option A: A Standard zonal GKE cluster requires you to manage the nodes yourself, including sizing, scaling, and upgrades, increasing operational overhead compared to Autopilot.
Option B: Manually installing and managing Kubernetes on a fleet of Compute Engine instances involves the highest level of management overhead, which contradicts the requirement to minimize it.
Option D: A Standard regional GKE cluster provides higher availability than a zonal cluster by replicating the control plane and nodes across multiple zones within a region. However, it still requires you to manage the underlying nodes, unlike Autopilot.
Reference to Google Cloud Certified – Associate Cloud Engineer Documents:
The different modes of GKE operation, including Standard and Autopilot, and their respective management responsibilities and benefits, are clearly outlined in the Google Kubernetes Engine documentation, a core topic for the Associate Cloud Engineer certification. The emphasis on reduced operational overhead with Autopilot is a key differentiator.
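As a minimal sketch, an Autopilot cluster with private nodes (keeping workloads off the public internet) could be provisioned as follows; the cluster name and region are illustrative placeholders:

```shell
# Autopilot clusters are regional by default; Google manages the nodes.
# --enable-private-nodes gives nodes internal IP addresses only.
gcloud container clusters create-auto microservices-cluster \
    --region=us-central1 \
    --enable-private-nodes
```

Teams can then deploy their containerized applications with standard Kubernetes tooling while Google handles node provisioning, scaling, and upgrades.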
You are developing a financial trading application that will be used globally. Data is stored and queried using a relational structure, and clients from all over the world should get the exact identical state of the data. The application will be deployed in multiple regions to provide the lowest latency to end users. You need to select a storage option for the application data while minimizing latency.
What should you do?
- A . Use Cloud Bigtable for data storage.
- B . Use Cloud SQL for data storage.
- C . Use Cloud Spanner for data storage.
- D . Use Firestore for data storage.
C
Explanation:
Key requirements in the question: financial data used globally; data stored and queried with a relational structure (SQL); all clients must see the exact same state of the data (strong consistency); deployment across multiple regions with low latency to end users. Cloud Spanner is the only option that combines a relational model, strong external consistency, and multi-region deployment, which makes it the right storage choice here.
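A rough sketch of provisioning a multi-region Spanner instance for this scenario; the instance and database names are hypothetical, while `nam-eur-asia1` is a real multi-region configuration spanning North America, Europe, and Asia:

```shell
# Create a multi-region Spanner instance for globally distributed reads/writes.
gcloud spanner instances create trading-instance \
    --config=nam-eur-asia1 \
    --description="Global trading data" \
    --nodes=3

# Create the application database on that instance.
gcloud spanner databases create trading-db \
    --instance=trading-instance
```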
You are the team lead of a group of 10 developers. You provided each developer with an individual Google Cloud Project that they can use as their personal sandbox to experiment with different Google Cloud solutions. You want to be notified if any of the developers are spending above $500 per month on their sandbox environment.
What should you do?
- A . Create a single budget for all projects and configure budget alerts on this budget.
- B . Create a separate billing account per sandbox project and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per billing account.
- C . Create a budget per project and configure budget alerts on all of these budgets.
- D . Create a single billing account for all sandbox projects and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per project.
C
Explanation:
Cloud Billing budgets let you avoid surprises on your bill by tracking your actual Google Cloud spend against your planned spend. After you set a budget amount, you define budget alert threshold rules that trigger email notifications, so you stay informed about how your spend is tracking against the budget. When setting the budget scope, you select one or more projects that the budget applies to; creating one budget per sandbox project with a $500 threshold therefore alerts you about each developer individually. https://cloud.google.com/billing/docs/how-to/budgets#budget-scope
Reference: https://cloud.google.com/billing/docs/how-to/budgets
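The per-project budgets could be scripted with the gcloud CLI, for example as below; the billing account ID and project IDs are placeholders:

```shell
# Hypothetical billing account and sandbox project IDs.
BILLING_ACCOUNT=0X0X0X-0X0X0X-0X0X0X

# Create a $500/month budget per sandbox project, alerting at 100% of budget.
for PROJECT in sandbox-dev-01 sandbox-dev-02; do
  gcloud billing budgets create \
      --billing-account="$BILLING_ACCOUNT" \
      --display-name="Budget for $PROJECT" \
      --budget-amount=500USD \
      --threshold-rule=percent=1.0 \
      --filter-projects="projects/$PROJECT"
done
```

Each budget sends email notifications to Billing Account Administrators and Users when the threshold is crossed.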
You want to find out when users were added to Cloud Spanner Identity Access Management (IAM) roles on your Google Cloud Platform (GCP) project.
What should you do in the GCP Console?
- A . Open the Cloud Spanner console to review configurations.
- B . Open the IAM & admin console to review IAM policies for Cloud Spanner roles.
- C . Go to the Stackdriver Monitoring console and review information for Cloud Spanner.
- D . Go to the Stackdriver Logging console, review admin activity logs, and filter them for Cloud Spanner IAM roles.
D
Explanation:
Admin Activity audit logs record operations that modify the configuration or metadata of a resource, including IAM policy changes. Filtering these logs for Cloud Spanner in the Logging console shows exactly when users were added to Cloud Spanner IAM roles.
https://cloud.google.com/logging/docs/audit
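The same filter can be expressed on the command line as a sketch; the filter fields follow the standard audit-log schema:

```shell
# List recent Admin Activity audit-log entries for Spanner IAM policy changes.
gcloud logging read \
  'logName:"cloudaudit.googleapis.com%2Factivity"
   AND protoPayload.serviceName="spanner.googleapis.com"
   AND protoPayload.methodName:"SetIamPolicy"' \
  --limit=10 \
  --format="table(timestamp, protoPayload.authenticationInfo.principalEmail)"
```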
Your company is using Google Workspace to manage employee accounts. Anticipated growth will increase the number of personnel from 100 employees to 1,000 employees within 2 years. Most employees will need access to your company's Google Cloud account. The systems and processes will need to support 10x growth without performance degradation, unnecessary complexity, or security issues.
What should you do?
- A . Migrate the users to Active Directory. Connect the Human Resources system to Active Directory. Turn on Google Cloud Directory Sync (GCDS) for Cloud Identity. Turn on Identity Federation from Cloud Identity to Active Directory.
- B . Organize the users in Cloud Identity into groups. Enforce multi-factor authentication in Cloud Identity.
- C . Turn on identity federation between Cloud Identity and Google Workspace. Enforce multi-factor authentication for domain wide delegation.
- D . Use a third-party identity provider service through federation. Synchronize the users from Google Workspace to the third-party provider in real time.
B
Explanation:
Organizing users into groups in Cloud Identity scales cleanly from 100 to 1,000 employees: access is granted to groups rather than to individual users, so onboarding new personnel only requires group membership changes. Enforcing multi-factor authentication addresses the security requirement. The other options add unnecessary complexity by introducing Active Directory or a third-party identity provider, which the scenario does not require.
You are migrating your on-premises workload to Google Cloud. Your company is implementing its Cloud Billing configuration and requires access to a granular breakdown of its Google Cloud costs. You need to ensure that the Cloud Billing datasets are available in BigQuery so you can conduct a detailed analysis of costs.
What should you do?
- A . Enable the BigQuery API and ensure that the BigQuery User IAM role is selected. Change the BigQuery dataset to select a data location.
- B . Create a Cloud Billing account. Enable the BigQuery Data Transfer Service API to export pricing data.
- C . Enable Cloud Billing data export to BigQuery when you create a Cloud Billing account.
- D . Enable Cloud Billing on the project and link a Cloud Billing account. Then view the billing data table in the BigQuery dataset.
C
Explanation:
The most direct and recommended way to get a granular breakdown of your Google Cloud costs in
BigQuery is to enable Cloud Billing data export to BigQuery when you create or manage your Cloud Billing account. This automatically sets up a daily export of your billing data to a BigQuery dataset you specify.
Option A: Enabling the BigQuery API and managing IAM roles are necessary for interacting with BigQuery, but they don’t automatically populate it with Cloud Billing data. Selecting a data location is also important for BigQuery datasets but is a separate step from enabling billing export.
Option B: The BigQuery Data Transfer Service is used for transferring data from various sources into BigQuery, but for Cloud Billing data, the direct export feature is the standard and simpler method.
Option D: Enabling Cloud Billing and linking an account makes billing data available in the Cloud Billing console, but it doesn’t automatically export it to BigQuery for detailed analysis. You need to explicitly configure the BigQuery export.
Reference to Google Cloud Certified – Associate Cloud Engineer Documents:
The process of setting up Cloud Billing export to BigQuery is clearly documented in the Google Cloud Billing documentation, which is a fundamental area for the Associate Cloud Engineer certification. Understanding how to access and analyze billing data is crucial for cost management.
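Once the export is enabled, the exported data can be queried directly; the sketch below assumes a hypothetical project, dataset, and billing account ID (the export table follows the pattern `gcp_billing_export_v1_<BILLING_ACCOUNT_ID>`):

```shell
# Summarize last month's cost per project from the billing export table.
bq query --use_legacy_sql=false '
SELECT
  project.id AS project_id,
  SUM(cost) AS total_cost
FROM `my-project.billing_dataset.gcp_billing_export_v1_0X0X0X_0X0X0X_0X0X0X`
WHERE invoice.month = "202501"
GROUP BY project_id
ORDER BY total_cost DESC'
```

The `invoice.month` field groups charges by the invoice they appear on, which keeps the per-month breakdown consistent with what you see in the Cloud Billing console.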
