Practice Free Professional Cloud Security Engineer Exam Online Questions
What does the TOGAF ADM recommend for use in developing an Architecture Vision document?
- A . Requirements Management
- B . Architecture Principles
- C . Gap Analysis
- D . Business Scenarios
D
Explanation:
Business scenarios are a technique recommended by the TOGAF ADM for use in developing an Architecture Vision document [1][2]. Business scenarios are a means of capturing the business requirements and drivers, the processes and actors involved, and the desired outcomes and measures of success [3][4]. Business scenarios help to create a common vision and understanding among the stakeholders, and to identify and validate the architecture requirements. They also provide a basis for analyzing the impact and value of the proposed architecture.
Reference:
• The TOGAF Standard, Version 9.2 – Phase A: Architecture Vision – The Open Group
• TOGAF® Standard ― Introduction – Phase A: Architecture Vision
• The TOGAF Standard, Version 9.2 – Definitions – The Open Group
• Business Scenarios – The Open Group
• The TOGAF Standard, Version 9.2 – Architecture Requirements Specification – The Open Group
• The TOGAF Standard, Version 9.2 – Architecture Vision – The Open Group
• The TOGAF Standard, Version 9.2 – Business Transformation Readiness Assessment – The Open Group
You are setting up Cloud Identity for your company’s Google Cloud organization. User accounts will be provisioned from Microsoft Entra ID through Directory Sync, and there will be single sign-on through Entra ID. You need to secure the super administrator accounts for the organization. Your solution must follow the principle of least privilege and implement strong authentication.
What should you do?
- A . Create dedicated accounts for super administrators. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.
- B . Create dedicated accounts for super administrators. Enforce Google 2-step verification for the super administrator accounts.
- C . Create accounts that combine the organization administrator and the super administrator privileges. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.
- D . Create accounts that combine the organization administrator and the super administrator privileges. Enforce Google 2-step verification for the super administrator accounts.
B
Explanation:
The problem focuses on securing "super administrator accounts for the organization" when Cloud Identity is synced with Microsoft Entra ID and uses Entra ID for SSO. The key requirements are the principle of least privilege and strong authentication.
Principle of Least Privilege and Dedicated Accounts: Google’s best practices strongly recommend creating dedicated, non-federated accounts for super administrators that are distinct from regular user accounts. These accounts should only be used for super administrator tasks and not for daily activities. This segregation ensures that the highest-privilege accounts are isolated and adhere to the principle of least privilege by not having combined responsibilities.
Reference: "Designate Organization Administrators. We recommend keeping your super admin account separate from your Organization Administrator group," "Give super admins a separate account that requires a separate login. For example, user alice@example.com could have a super admin account alice-admin@example.com," and "Use the super admin account only when needed. Delegate administrator tasks to user accounts with limited admin roles. Use the least privilege approach." (Google Cloud documentation: "Super administrator account best practices | Resource Manager Documentation" – https://cloud.google.com/resource-manager/docs/super-admin-best-practices)
Strong Authentication (Google 2-Step Verification): Even when using a third-party identity provider like Microsoft Entra ID for most users, Google recommends enforcing Google’s own 2-Step Verification for the critical super administrator accounts. This provides a "break-glass" mechanism that is independent of the external IdP. If the Entra ID integration were to fail or become compromised, the Google-managed super administrator accounts, protected by Google’s own 2SV, would still be accessible for emergency recovery.
Reference: "Even when using the legacy SSO profile, super admins can’t sign in with SSO in these cases: Admin console. When super administrators try to sign in to an SSO-enabled domain via admin.google.com, they must enter their full Google administrator account email address and associated Google password (not their SSO username and password), and click Sign in to directly access the Admin console. Google doesn’t redirect them to the SSO sign-in page." (Google Cloud Identity Help: "Super administrator SSO" – https://support.google.com/cloudidentity/answer/6341409) This highlights that super admin accounts can bypass SSO for direct Admin console access, making Google’s 2SV crucial.
Reference: "It’s especially important for super admins to use 2SV because their accounts control access to all business and employee data in the organization. Protect your business with 2-Step Verification. Use security keys for 2-Step Verification." (Cloud Identity Help: "Security best practices for administrator accounts" – https://support.google.com/cloudidentity/answer/9011373)
Options C and D are incorrect because combining the Organization Administrator IAM role (control over Google Cloud resources) with super administrator privileges (domain-level control over Google Workspace/Cloud Identity) violates the principle of least privilege. Option A is less secure than B because relying solely on Entra ID’s 2SV for super administrators means a compromise of Entra ID, or an outage, would leave the Google Cloud organization without an independent break-glass mechanism.
Your organization recently deployed a new application on Google Kubernetes Engine. You need to deploy a solution to protect the application.
The solution has the following requirements:
Scans must run at least once per week
Must be able to detect cross-site scripting vulnerabilities
Must be able to authenticate using Google accounts
Which solution should you use?
- A . Google Cloud Armor
- B . Web Security Scanner
- C . Security Health Analytics
- D . Container Threat Detection
B
Explanation:
Web Security Scanner is designed to scan your web applications deployed on Google Cloud for common vulnerabilities, including cross-site scripting (XSS). It can authenticate using Google accounts and can be scheduled to run scans regularly.
Steps:
Enable Web Security Scanner: In the Google Cloud Console, enable Web Security Scanner for your project.
Configure Scan: Set up the scan configuration, specifying the target URLs, authentication details (Google accounts), and scan frequency (at least once per week).
Run and Monitor Scans: Run the scans and monitor the results for vulnerabilities, addressing any issues found.
Reference:
Web Security Scanner documentation
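As a rough illustration of these steps, the sketch below creates a scan configuration with a weekly schedule using the google-cloud-websecurityscanner Python client. The project ID, display name, and target URL are placeholders, and the field names are assumptions based on the v1 API; the Google-account authentication block for authenticated scanning is omitted for brevity.
```python
# Sketch only: weekly Web Security Scanner scan config (names are placeholders).
from google.cloud import websecurityscanner_v1


def create_weekly_scan(project_id: str, target_url: str) -> None:
    client = websecurityscanner_v1.WebSecurityScannerClient()

    scan_config = websecurityscanner_v1.ScanConfig(
        display_name="gke-frontend-weekly-scan",  # hypothetical name
        starting_urls=[target_url],
        # Re-run the scan every 7 days to satisfy the weekly requirement.
        schedule=websecurityscanner_v1.ScanConfig.Schedule(
            interval_duration_days=7,
        ),
        # An `authentication` block with Google account credentials would be
        # added here for authenticated scanning.
    )

    request = websecurityscanner_v1.CreateScanConfigRequest(
        parent=f"projects/{project_id}",
        scan_config=scan_config,
    )
    response = client.create_scan_config(request=request)
    print(f"Created scan config: {response.name}")
```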
You are a Security Administrator at your organization. You need to restrict service account creation capability within production environments. You want to accomplish this centrally across the organization.
What should you do?
- A . Use Identity and Access Management (IAM) to restrict access of all users and service accounts that have access to the production environment.
- B . Use organization policy constraints/iam.disableServiceAccountKeyCreation boolean to disable the creation of new service accounts.
- C . Use organization policy constraints/iam.disableServiceAccountKeyUpload boolean to disable the creation of new service accounts.
- D . Use organization policy constraints/iam.disableServiceAccountCreation boolean to disable the creation of new service accounts.
D
Explanation:
Reference: https://cloud.google.com/resource-manager/docs/organization-policy/restricting-service-accounts
You can use the iam.disableServiceAccountCreation boolean constraint to disable the creation of new service accounts. This allows you to centralize management of service accounts while not restricting the other permissions your developers have on projects. https://cloud.google.com/resource-manager/docs/organization-policy/restricting-service-accounts#disable_service_account_creation
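For illustration, a minimal sketch of setting this constraint programmatically with the google-cloud-org-policy client library is shown below; the organization ID is a placeholder, and the same result can be achieved in the console or with gcloud.
```python
# Sketch only: enforce the boolean constraint at the organization level.
from google.cloud import orgpolicy_v2


def disable_service_account_creation(org_id: str) -> None:
    client = orgpolicy_v2.OrgPolicyClient()

    policy = orgpolicy_v2.Policy(
        # The policy name embeds the constraint being configured.
        name=f"organizations/{org_id}/policies/iam.disableServiceAccountCreation",
        spec=orgpolicy_v2.PolicySpec(
            rules=[orgpolicy_v2.PolicySpec.PolicyRule(enforce=True)],
        ),
    )

    client.create_policy(
        request=orgpolicy_v2.CreatePolicyRequest(
            parent=f"organizations/{org_id}",
            policy=policy,
        )
    )
```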
A customer has 300 engineers. The company wants to grant different levels of access and efficiently manage IAM permissions between users in the development and production environment projects.
Which two steps should the company take to meet these requirements? (Choose two.)
- A . Create a project with multiple VPC networks for each environment.
- B . Create a folder for each development and production environment.
- C . Create a Google Group for the Engineering team, and assign permissions at the folder level.
- D . Create an Organizational Policy constraint for each folder environment.
- E . Create projects for each environment, and grant IAM rights to each engineering user.
BC
Explanation:
To manage IAM permissions efficiently for a large engineering team with different levels of access in development and production environments, follow these steps:
Create Separate Folders:
Create a folder for the development environment.
Create a folder for the production environment.
This allows you to organize projects and apply different policies and permissions to each environment.
Navigate to IAM & Admin in the GCP Console.
Select "Folders" from the left-hand menu.
Create a new folder named "Development".
Create a new folder named "Production".
Create Google Groups:
Create Google Groups for different teams within the engineering department (e.g., Development Team, Production Team).
This helps in managing permissions centrally.
Use the Google Admin Console to create groups.
Add relevant engineers to each group.
Assign Permissions at the Folder Level:
Assign appropriate IAM roles to the Google Groups at the folder level.
For example, grant Viewer role to the Development Team group for the development folder. Grant Editor or more restrictive roles as required for the Production Team group for the production folder.
Select the development folder.
Go to the "Permissions" tab.
Click on "Add" and enter the email address of the Development Team Google Group.
Assign the "Viewer" role.
Repeat for the production folder, assigning appropriate roles to the Production Team Google Group.
By following these steps, you create a clear separation between development and production environments and manage permissions efficiently using Google Groups and folders.
Reference:
Google Cloud IAM Documentation
Google Cloud Resource Manager Documentation
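As a rough sketch of the folder-level grant described above, the snippet below binds a role to a Google Group on a folder using the Resource Manager v3 Python client; the folder ID, group address, and role are placeholders.
```python
# Sketch only: bind a role to a Google Group on a folder (read-modify-write).
from google.cloud import resourcemanager_v3
from google.iam.v1 import iam_policy_pb2, policy_pb2


def grant_group_on_folder(folder_id: str, group_email: str, role: str) -> None:
    client = resourcemanager_v3.FoldersClient()
    resource = f"folders/{folder_id}"

    # Fetch the current policy, append the new binding, and write it back.
    policy = client.get_iam_policy(
        request=iam_policy_pb2.GetIamPolicyRequest(resource=resource)
    )
    policy.bindings.append(
        policy_pb2.Binding(role=role, members=[f"group:{group_email}"])
    )
    client.set_iam_policy(
        request=iam_policy_pb2.SetIamPolicyRequest(resource=resource, policy=policy)
    )


# e.g. grant_group_on_folder("123456789012", "dev-team@example.com", "roles/viewer")
```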
Your team needs to make sure that their backend database can only be accessed by the frontend application and no other instances on the network.
How should your team design this network?
- A . Create an ingress firewall rule to allow access only from the application to the database using firewall tags.
- B . Create a different subnet for the frontend application and database to ensure network isolation.
- C . Create two VPC networks, and connect the two networks using Cloud VPN gateways to ensure network isolation.
- D . Create two VPC networks, and connect the two networks using VPC peering to ensure network isolation.
A
Explanation:
"However, even though it is possible to uses tags for target filtering in this manner, we recommend that you use service accounts where possible. Target tags are not access-controlled and can be changed by someone with the instance Admin role while VMs are in service. Service accounts are access-controlled, meaning that a specific user must be explicitly authorized to use a service account. There can only be one service account per instance, whereas there can be multiple tags. Also, service accounts assigned to a VM can only be changed when the VM is stopped"
Your organization wants to protect all workloads that run on Compute Engine VMs to ensure that the instances weren’t compromised by boot-level or kernel-level malware. You also need to ensure that data in use on the VMs cannot be read by the underlying host system, by using a hardware-based solution.
What should you do?
- A . 1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Create a Cloud Run function to check the VM settings, generate metrics, and run the function regularly.
- B . 1. Activate Virtual Machine Threat Detection in Security Command Center (SCC) Premium. 2. Monitor the findings in SCC.
- C . 1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Activate Confidential Computing. 3. Enforce these actions by using organization policies.
- D . 1. Use secure hardened images from the Google Cloud Marketplace. 2. When deploying the images, activate the Confidential Computing option. 3. Enforce the use of the correct images and Confidential Computing by using organization policies.
C
Explanation:
Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring: Shielded VMs provide verifiable integrity of the VM by ensuring that it was not tampered with or compromised at the boot level. They use Secure Boot, vTPM, and integrity monitoring to detect and prevent malicious changes to the VM’s firmware and operating system.
Activate Confidential Computing: Confidential Computing protects sensitive data in use by ensuring it cannot be accessed by the underlying host or any other unauthorized entity. By leveraging hardware-based memory encryption (AMD SEV on Confidential VMs), it keeps data encrypted even while it is being processed.
Enforce these actions by using organization policies: Organization policy constraints such as constraints/compute.requireShieldedVm and constraints/compute.restrictNonConfidentialComputing enforce the use of Shielded VMs and Confidential Computing across your organization, so all VMs comply with these security measures without manual configuration of each VM.
Reference: Shielded VMs documentation
Confidential Computing documentation
Organization Policies documentation
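For illustration, the sketch below shows how the Shielded VM and Confidential Computing settings from option C map onto an instance created with the google-cloud-compute client library; the machine type, image, and network values are placeholders, and the organization-policy enforcement from step 3 is configured separately (see the constraints named above).
```python
# Sketch only: Shielded VM features plus Confidential Computing on a new VM.
from google.cloud import compute_v1


def create_hardened_instance(project_id: str, zone: str, name: str) -> None:
    instance = compute_v1.Instance(
        name=name,
        # Confidential VMs with AMD SEV require an N2D machine type.
        machine_type=f"zones/{zone}/machineTypes/n2d-standard-2",
        shielded_instance_config=compute_v1.ShieldedInstanceConfig(
            enable_secure_boot=True,
            enable_vtpm=True,
            enable_integrity_monitoring=True,
        ),
        confidential_instance_config=compute_v1.ConfidentialInstanceConfig(
            enable_confidential_compute=True,
        ),
        # Confidential VMs cannot live-migrate during host maintenance.
        scheduling=compute_v1.Scheduling(on_host_maintenance="TERMINATE"),
        disks=[
            compute_v1.AttachedDisk(
                boot=True,
                auto_delete=True,
                initialize_params=compute_v1.AttachedDiskInitializeParams(
                    source_image="projects/debian-cloud/global/images/family/debian-12",
                ),
            )
        ],
        network_interfaces=[
            compute_v1.NetworkInterface(network="global/networks/default")
        ],
    )

    compute_v1.InstancesClient().insert(
        project=project_id, zone=zone, instance_resource=instance
    ).result()
```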
In an effort for your company’s messaging app to comply with FIPS 140-2, a decision was made to use GCP compute and network services. The messaging app architecture includes a Managed Instance Group (MIG) that controls a cluster of Compute Engine instances. The instances use Local SSDs for data caching and UDP for instance-to-instance communications. The app development team is willing to make any changes necessary to comply with the standard.
Which options should you recommend to meet the requirements?
- A . Encrypt all cache storage and VM-to-VM communication using the BoringCrypto module.
- B . Set Disk Encryption on the Instance Template used by the MIG to customer-managed key and use BoringSSL for all data transit between instances.
- C . Change the app instance-to-instance communications from UDP to TCP and enable BoringSSL on clients’ TLS connections.
- D . Set Disk Encryption on the Instance Template used by the MIG to Google-managed Key and use BoringSSL library on all instance-to-instance communications.
B
Explanation:
To comply with FIPS 140-2 for the messaging app, you need to ensure that both data at rest and data in transit are encrypted according to the standard. Using customer-managed encryption keys (CMEK) gives you control over the keys that protect data at rest, and BoringSSL, whose core BoringCrypto module is FIPS 140-2 validated, covers encryption of data in transit between instances.
Steps:
Encrypt Local SSDs: Modify the instance template for the Managed Instance Group (MIG) to use customer-managed encryption keys (CMEK) for encrypting Local SSDs.
Enable BoringSSL: Update the application to use the BoringSSL library for all instance-to-instance communication to ensure that all data in transit is encrypted according to FIPS 140-2 standards.
Reference:
Google Cloud: Customer-managed encryption keys (CMEK)
BoringSSL documentation
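As a rough sketch of the CMEK portion of option B, the snippet below creates an instance template whose boot disk is encrypted with a customer-managed Cloud KMS key, using the google-cloud-compute client library; all resource names and the key path are placeholders, and the BoringSSL change is an application-level build and linking concern not shown here.
```python
# Sketch only: instance template with a CMEK-encrypted boot disk for the MIG.
from google.cloud import compute_v1


def create_cmek_template(project_id: str, kms_key_name: str) -> None:
    # kms_key_name looks like:
    # projects/PROJECT/locations/REGION/keyRings/RING/cryptoKeys/KEY
    template = compute_v1.InstanceTemplate(
        name="messaging-app-cmek-template",  # hypothetical name
        properties=compute_v1.InstanceProperties(
            machine_type="n2-standard-4",
            disks=[
                compute_v1.AttachedDisk(
                    boot=True,
                    auto_delete=True,
                    initialize_params=compute_v1.AttachedDiskInitializeParams(
                        source_image=(
                            "projects/debian-cloud/global/images/family/debian-12"
                        ),
                    ),
                    # Customer-managed key used to encrypt this disk.
                    disk_encryption_key=compute_v1.CustomerEncryptionKey(
                        kms_key_name=kms_key_name,
                    ),
                )
            ],
            network_interfaces=[
                compute_v1.NetworkInterface(network="global/networks/default")
            ],
        ),
    )

    compute_v1.InstanceTemplatesClient().insert(
        project=project_id, instance_template_resource=template
    ).result()
```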