Practice Free SAA-C03 Exam Online Questions
A company is building an ecommerce application and needs to store sensitive customer information. The company needs to give customers the ability to complete purchase transactions on the website. The company also needs to ensure that sensitive customer data is protected, even from database administrators.
Which solution meets these requirements?
- A . Store sensitive data in an Amazon Elastic Block Store (Amazon EBS) volume. Use EBS encryption to encrypt the data. Use an IAM instance role to restrict access.
- B . Store sensitive data in Amazon RDS for MySQL. Use AWS Key Management Service (AWS KMS) client-side encryption to encrypt the data.
- C . Store sensitive data in Amazon S3. Use AWS Key Management Service (AWS KMS) server-side encryption to encrypt the data. Use S3 bucket policies to restrict access.
- D . Store sensitive data in Amazon FSx for Windows Server. Mount the file share on application servers. Use Windows file permissions to restrict access.
B
Explanation:
This solution allows the company to store sensitive customer information in a managed AWS service while giving customers the ability to complete purchase transactions on the website. With AWS Key Management Service (AWS KMS) client-side encryption, the application encrypts the data before sending it to Amazon RDS for MySQL. This ensures that sensitive customer data is protected even from database administrators, because only the application has access to the encryption keys.
Reference: Using Encryption with Amazon RDS for MySQL
Encrypting Amazon RDS Resources
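As a rough sketch of how the application layer could implement this, the snippet below encrypts a field with AWS KMS before it is written to MySQL, so the database only ever stores ciphertext. The key alias alias/customer-data is a hypothetical name, and direct kms.encrypt calls only work for values under 4 KB; larger payloads would use envelope encryption with a generated data key.

```python
# Sketch: client-side field encryption with AWS KMS (key alias is hypothetical).
import boto3

kms = boto3.client("kms")

def encrypt_field(plaintext: str) -> bytes:
    """Encrypt a small value (< 4 KB) directly with KMS before storing it."""
    resp = kms.encrypt(KeyId="alias/customer-data", Plaintext=plaintext.encode())
    return resp["CiphertextBlob"]  # store this blob in the MySQL column

def decrypt_field(ciphertext: bytes) -> str:
    """Decrypt a value read back from the database."""
    resp = kms.decrypt(CiphertextBlob=ciphertext)
    return resp["Plaintext"].decode()
```

Because only the application can call KMS with this key, a database administrator querying the table sees only the ciphertext blob.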
An image-processing company has a web application that users use to upload images. The application uploads the images into an Amazon S3 bucket. The company has set up S3 event notifications to publish the object creation events to an Amazon Simple Queue Service (Amazon SQS) standard queue. The SQS queue serves as the event source for an AWS Lambda function that processes the images and sends the results to users through email.
Users report that they are receiving multiple email messages for every uploaded image. A solutions architect determines that SQS messages are invoking the Lambda function more than once, resulting in multiple email messages.
What should the solutions architect do to resolve this issue with the LEAST operational overhead?
- A . Set up long polling in the SQS queue by increasing the Receive Message wait time to 30 seconds.
- B . Change the SQS standard queue to an SQS FIFO queue. Use the message deduplication ID to discard duplicate messages.
- C . Increase the visibility timeout in the SQS queue to a value that is greater than the total of the function timeout and the batch window timeout.
- D . Modify the Lambda function to delete each message from the SQS queue immediately after the message is read before processing.
C
Explanation:
If the visibility timeout is shorter than the time the Lambda function needs to finish a batch, the messages become visible again mid-processing and are delivered a second time, producing duplicate emails. Setting the visibility timeout to a value greater than the function timeout plus the batch window prevents these duplicate invocations without any application changes, which is the least operational overhead. Long polling (option A) only reduces empty receives, a FIFO queue with deduplication IDs (option B) deduplicates producer sends rather than redeliveries, and deleting messages before processing (option D) risks losing messages if processing fails.
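As an illustration of the fix, the visibility timeout can be raised with a single API call; the queue URL and the timeout values below are assumptions for the sketch.

```python
# Sketch: set the visibility timeout above the Lambda timeout plus the batch
# window so in-flight messages are not redelivered (values are hypothetical).
import boto3

sqs = boto3.client("sqs")

function_timeout = 60  # seconds, hypothetical Lambda function timeout
batch_window = 30      # seconds, hypothetical maximum batching window

sqs.set_queue_attributes(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/image-events",
    Attributes={"VisibilityTimeout": str(function_timeout + batch_window + 30)},
)
```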
A gaming company hosts a browser-based application on AWS. The users of the application consume a large number of videos and images that are stored in Amazon S3. This content is the same for all users.
The application has increased in popularity, and millions of users worldwide are accessing these media files. The company wants to provide the files to the users while reducing the load on the origin.
Which solution meets these requirements MOST cost-effectively?
- A . Deploy an AWS Global Accelerator accelerator in front of the web servers.
- B . Deploy an Amazon CloudFront web distribution in front of the S3 bucket.
- C . Deploy an Amazon ElastiCache for Redis instance in front of the web servers.
- D . Deploy an Amazon ElastiCache for Memcached instance in front of the web servers.
B
Explanation:
Amazon CloudFront is a content delivery network that caches content at edge locations around the world. Deploying a CloudFront web distribution in front of the S3 bucket serves the videos and images from edge caches close to the users, which reduces latency and the load on the origin. Because the content is identical for all users, it caches well, making CloudFront the most cost-effective choice. CloudFront supports static files such as audio and images, dynamic content over HTTP and WebSocket, and on-demand media streaming over HTTP.
ElastiCache (Redis or Memcached) improves performance by retrieving data from fully managed in-memory stores instead of disk-based databases, but it sits behind the web servers and does not offload media delivery to millions of users worldwide. AWS Global Accelerator supports both UDP- and TCP-based protocols and is commonly used for non-HTTP use cases such as gaming, IoT, and voice over IP, or for HTTP use cases that need static IP addresses or fast regional failover; it does not cache content, so it would not reduce the load on the origin.
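For illustration, a distribution in front of the bucket could be created as below. The bucket name is hypothetical, and the CachePolicyId shown is AWS's managed CachingOptimized policy.

```python
# Sketch: CloudFront web distribution with the S3 bucket as origin.
import time
import boto3

cloudfront = boto3.client("cloudfront")

cloudfront.create_distribution(DistributionConfig={
    "CallerReference": str(time.time()),  # any unique string
    "Comment": "Media files for the gaming application",
    "Enabled": True,
    "Origins": {"Quantity": 1, "Items": [{
        "Id": "media-origin",
        "DomainName": "game-media.s3.amazonaws.com",  # hypothetical bucket
        "S3OriginConfig": {"OriginAccessIdentity": ""},
    }]},
    "DefaultCacheBehavior": {
        "TargetOriginId": "media-origin",
        "ViewerProtocolPolicy": "redirect-to-https",
        # Managed "CachingOptimized" cache policy
        "CachePolicyId": "658327ea-f89d-4fab-a63d-7e88639e58f6",
    },
})
```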
A company wants to run applications in containers in the AWS Cloud. These applications are stateless and can tolerate disruptions within the underlying infrastructure. The company needs a solution that minimizes cost and operational overhead.
What should a solutions architect do to meet these requirements?
- A . Use Spot Instances in an Amazon EC2 Auto Scaling group to run the application containers.
- B . Use Spot Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group.
- C . Use On-Demand Instances in an Amazon EC2 Auto Scaling group to run the application containers.
- D . Use On-Demand Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group.
B
Explanation:
Spot Instances provide the lowest EC2 cost and are a good fit for stateless workloads that can tolerate interruptions. Running them in an Amazon EKS managed node group also minimizes operational overhead, because the managed node group automates node provisioning, updates, and graceful draining during Spot interruptions.
Reference: https://aws.amazon.com/cn/blogs/compute/cost-optimization-and-resilience-eks-with-spot-instances/
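A minimal sketch of such a node group on an existing cluster follows; the cluster name, subnets, and node role ARN are assumptions.

```python
# Sketch: Spot-backed EKS managed node group (identifiers are hypothetical).
import boto3

eks = boto3.client("eks")

eks.create_nodegroup(
    clusterName="app-cluster",
    nodegroupName="spot-workers",
    capacityType="SPOT",                      # Spot pricing for the nodes
    instanceTypes=["m5.large", "m5a.large"],  # diversify types for Spot capacity
    scalingConfig={"minSize": 2, "maxSize": 10, "desiredSize": 2},
    subnets=["subnet-aaa1111", "subnet-bbb2222"],
    nodeRole="arn:aws:iam::123456789012:role/eks-node-role",
)
```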
A company needs a solution to prevent photos with unwanted content from being uploaded to the company’s web application. The solution must not involve training a machine learning (ML) model.
Which solution will meet these requirements?
- A . Create and deploy a model by using Amazon SageMaker Autopilot. Create a real-time endpoint that the web application invokes when new photos are uploaded.
- B . Create an AWS Lambda function that uses Amazon Rekognition to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.
- C . Create an Amazon CloudFront function that uses Amazon Comprehend to detect unwanted content. Associate the function with the web application.
- D . Create an AWS Lambda function that uses Amazon Rekognition Video to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.
B
Explanation:
The solution that will meet the requirements is to create an AWS Lambda function that uses Amazon Rekognition to detect unwanted content, and create a Lambda function URL that the web application invokes when new photos are uploaded. This solution does not involve training a machine learning model, as Amazon Rekognition is a fully managed service that provides pre-trained computer vision models for image and video analysis. Amazon Rekognition can detect unwanted content such as explicit or suggestive adult content, violence, weapons, drugs, and more. By using AWS Lambda, the company can create a serverless function that can be triggered by an HTTP request from the web application. The Lambda function can use the Amazon Rekognition API to analyze the uploaded photos and return a response indicating whether they contain unwanted content or not.
The other solutions are not as effective because they either involve training a machine learning model, do not support image analysis, or do not work with photos.
Creating and deploying a model by using Amazon SageMaker Autopilot involves training a machine learning model, which is not required for this scenario. Amazon SageMaker Autopilot automatically creates, trains, and tunes machine learning models for classification or regression based on the data the user provides.
Creating an Amazon CloudFront function that uses Amazon Comprehend does not support image analysis, because Amazon Comprehend is a natural language processing service that analyzes text, not images. Amazon Comprehend extracts insights and relationships from text, such as language, sentiment, entities, and topics.
Creating an AWS Lambda function that uses Amazon Rekognition Video to detect unwanted content does not work with photos, because Amazon Rekognition Video is designed for analyzing video streams, not static images. Amazon Rekognition Video can detect activities, objects, faces, celebrities, text, and more in video streams.
Reference: Amazon Rekognition
AWS Lambda
Detecting unsafe content – Amazon Rekognition
Amazon SageMaker Autopilot
Amazon Comprehend
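A minimal sketch of the Lambda handler, assuming the function URL receives a JSON body with hypothetical bucket and key fields that identify the uploaded photo:

```python
# Sketch: flag unwanted content with Rekognition's pre-trained moderation model.
import json
import boto3

rekognition = boto3.client("rekognition")

def handler(event, context):
    body = json.loads(event["body"])  # hypothetical payload: {"bucket": ..., "key": ...}
    resp = rekognition.detect_moderation_labels(
        Image={"S3Object": {"Bucket": body["bucket"], "Name": body["key"]}},
        MinConfidence=80,  # only report labels Rekognition is at least 80% sure of
    )
    labels = [label["Name"] for label in resp["ModerationLabels"]]
    return {
        "statusCode": 200,
        "body": json.dumps({"unwanted": bool(labels), "labels": labels}),
    }
```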
A company has an on-premises server that uses an Oracle database to process and store customer information. The company wants to use an AWS database service to achieve higher availability and to improve application performance. The company also wants to offload reporting from its primary database system.
Which solution will meet these requirements in the MOST operationally efficient way?
- A . Use AWS Database Migration Service (AWS DMS) to create an Amazon RDS DB instance in multiple AWS Regions. Point the reporting functions toward a separate DB instance from the primary DB instance.
- B . Use Amazon RDS in a Single-AZ deployment to create an Oracle database. Create a read replica in the same zone as the primary DB instance. Direct the reporting functions to the read replica.
- C . Use Amazon RDS deployed in a Multi-AZ cluster deployment to create an Oracle database. Direct the reporting functions to use the reader instance in the cluster deployment.
- D . Use Amazon RDS deployed in a Multi-AZ instance deployment to create an Amazon Aurora database. Direct the reporting functions to the reader instances.
D
Explanation:
Amazon Aurora is a fully managed relational database that is compatible with MySQL and PostgreSQL. It provides up to five times better performance than MySQL and up to three times better performance than PostgreSQL, and it delivers high availability and durability by replicating data across multiple Availability Zones and continuously backing up data to Amazon S3. By creating an Amazon Aurora database, the solution achieves higher availability and improved application performance.
Amazon Aurora supports read replicas, which are separate instances that share the same underlying storage as the primary instance. Read replicas can offload read-only queries, such as reporting workloads, from the primary instance and improve performance. By directing the reporting functions to the reader instances, the solution offloads reporting from the primary database system.
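For illustration, the reader endpoint that the reporting tools should connect to can be looked up from the cluster description; the cluster identifier is an assumption.

```python
# Sketch: find the Aurora reader endpoint for reporting connections.
import boto3

rds = boto3.client("rds")

cluster = rds.describe_db_clusters(DBClusterIdentifier="customer-db")["DBClusters"][0]
print("Writer endpoint:", cluster["Endpoint"])        # primary, read/write traffic
print("Reader endpoint:", cluster["ReaderEndpoint"])  # load-balances across replicas
```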
A company has an organization in AWS Organizations. The company runs Amazon EC2 instances across four AWS accounts in the root organizational unit (OU). There are three nonproduction accounts and one production account. The company wants to prohibit users from launching EC2 instances of a certain size in the nonproduction accounts. The company has created a service control policy (SCP) to deny access to launch instances that use the prohibited types.
Which solutions to deploy the SCP will meet these requirements? (Select TWO.)
- A . Attach the SCP to the root OU for the organization.
- B . Attach the SCP to the three nonproduction Organizations member accounts.
- C . Attach the SCP to the Organizations management account.
- D . Create an OU for the production account. Attach the SCP to the OU. Move the production member account into the new OU.
- E . Create an OU for the required accounts. Attach the SCP to the OU. Move the nonproduction member accounts into the new OU.
B, E
Explanation:
SCPs are a type of organization policy that you can use to manage permissions in your organization. SCPs offer central control over the maximum available permissions for all accounts in your organization and help ensure that your accounts stay within your organization's access control guidelines [1].
To apply an SCP to a specific set of accounts, you can create an OU for those accounts and attach the SCP to the OU. This way, the SCP affects only the member accounts in that OU and not the other accounts in the organization. If you attach the SCP to the root OU, it applies to all accounts in the organization, including the production account, which is not the desired outcome. If you attach the SCP to the management account, it has no effect, because SCPs do not affect users or roles in the management account [1].
Therefore, the best solutions to deploy the SCP are B and E.
Option B attaches the SCP directly to the three nonproduction accounts, while option E creates a separate OU for the nonproduction accounts and attaches the SCP to that OU. Both options restrict the EC2 instance types in the nonproduction accounts, but option E is more scalable and manageable if more accounts or policies must be applied in the future [2].
Reference: [1] Service control policies (SCPs) – AWS Organizations
[2] Best Practices for AWS Organizations Service Control Policies in a Multi-Account Environment
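A minimal sketch of creating and attaching such an SCP, with a hypothetical nonproduction OU ID and an example prohibited instance type:

```python
# Sketch: SCP that denies a prohibited instance type, attached to the
# nonproduction OU (OU ID and instance type are hypothetical).
import json
import boto3

org = boto3.client("organizations")

scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Action": "ec2:RunInstances",
        "Resource": "arn:aws:ec2:*:*:instance/*",
        "Condition": {"StringEquals": {"ec2:InstanceType": "p4d.24xlarge"}},
    }],
}

policy = org.create_policy(
    Name="DenyProhibitedInstanceTypes",
    Description="Block prohibited EC2 instance types in nonproduction accounts",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)
org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="ou-ab12-nonprod11",  # hypothetical nonproduction OU
)
```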
A company is migrating a data processing application to AWS. The application processes several short-lived batch jobs that cannot be disrupted. The process generates data after each batch job finishes running. The company accesses the data for 30 days following data generation. After 30 days, the company stores the data for 2 years.
The company wants to optimize costs for the application and data storage.
Which solution will meet these requirements?
- A . Use Amazon EC2 Spot Instances to run the application. Store the data in Amazon S3 Standard. Move the data to S3 Glacier Instant Retrieval after 30 days. Configure a bucket policy to delete the data after 2 years.
- B . Use Amazon EC2 On-Demand Instances to run the application. Store the data in Amazon S3 Glacier Instant Retrieval. Move the data to S3 Glacier Deep Archive after 30 days. Configure an S3 Lifecycle configuration to delete the data after 2 years.
- C . Use Amazon EC2 Spot Instances to run the application. Store the data in Amazon S3 Standard. Move the data to S3 Glacier Flexible Retrieval after 30 days. Configure a bucket policy to delete the data after 2 years.
- D . Use Amazon EC2 On-Demand Instances to run the application. Store the data in Amazon S3 Standard. Move the data to S3 Glacier Deep Archive after 30 days. Configure an S3 Lifecycle configuration to delete the data after 2 years.
D
Explanation:
Because the batch jobs cannot be disrupted, Spot Instances (options A and C) are unsuitable, so the application must run on On-Demand Instances. Data that is accessed for 30 days belongs in S3 Standard; afterward it is only retained, so it can move to the lowest-cost class, S3 Glacier Deep Archive. Transitions and deletion after 2 years are handled by an S3 Lifecycle configuration, not a bucket policy. Option B places frequently accessed data in Glacier Instant Retrieval, which adds retrieval charges during the 30-day access period.
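A minimal sketch of the lifecycle rule, assuming a hypothetical bucket name:

```python
# Sketch: transition to Deep Archive after 30 days, delete after 2 years.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="batch-job-output",  # hypothetical bucket
    LifecycleConfiguration={"Rules": [{
        "ID": "archive-then-expire",
        "Status": "Enabled",
        "Filter": {},  # apply to all objects in the bucket
        "Transitions": [{"Days": 30, "StorageClass": "DEEP_ARCHIVE"}],
        "Expiration": {"Days": 730},  # roughly 2 years
    }]},
)
```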
A company has an on-premises application that generates a large amount of time-sensitive data that is backed up to Amazon S3. The application has grown and there are user complaints about internet bandwidth limitations. A solutions architect needs to design a long-term solution that allows for both timely backups to Amazon S3 and with minimal impact on internet connectivity for internal users.
Which solution meets these requirements?
- A . Establish AWS VPN connections and proxy all traffic through a VPC gateway endpoint
- B . Establish a new AWS Direct Connect connection and direct backup traffic through this new connection.
- C . Order daily AWS Snowball devices Load the data onto the Snowball devices and return the devices to AWS each day.
- D . Submit a support ticket through the AWS Management Console Request the removal of S3 service limits from the account.
B
Explanation:
To address the issue of bandwidth limitations on the company’s on-premises application, and to minimize the impact on internal user connectivity, a new AWS Direct Connect connection should be established to direct backup traffic through this new connection. This solution will offer a secure, high-speed connection between the company’s data center and AWS, which will allow the company to transfer data quickly without consuming internet bandwidth.
Reference: AWS Direct Connect documentation: https://aws.amazon.com/directconnect/
A company is designing a new application that uploads files to an Amazon S3 bucket. The uploaded files are processed to extract metadata.
Processing must take less than 5 seconds. The volume and frequency of the uploads vary from a few files each hour to hundreds of concurrent uploads.
Which solution will meet these requirements MOST cost-effectively?
- A . Configure AWS CloudTrail trails to log Amazon S3 API calls. Use AWS AppSync to process the files.
- B . Configure a new object created S3 event notification within the bucket to invoke an AWS Lambda function to process the files.
- C . Configure Amazon Kinesis Data Streams to deliver the files to the S3 bucket. Invoke an AWS Lambda function to process the files.
- D . Deploy an Amazon EC2 instance. Create a script that lists all files in the S3 bucket and processes new files. Use a cron job that runs every minute to run the script.
B
Explanation:
Using S3 event notifications to invoke an AWS Lambda function is a cost-effective, serverless solution. Lambda scales automatically with upload volume, from a few files each hour to hundreds of concurrent uploads, and the sub-5-second processing fits comfortably within Lambda's execution time limit.
Option A: AWS AppSync is designed for GraphQL APIs and is not suitable for file processing.
Option C: Kinesis is overkill and more expensive for this use case.
Option D: Running an EC2 instance incurs ongoing costs and is less flexible compared to Lambda.
Reference: Amazon S3 Event Notifications
AWS Lambda Overview
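A minimal sketch of wiring the notification, assuming a hypothetical bucket and function ARN; the function's resource policy must already allow s3.amazonaws.com to invoke it.

```python
# Sketch: invoke a Lambda function for every new object in the bucket.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_notification_configuration(
    Bucket="upload-bucket",  # hypothetical bucket
    NotificationConfiguration={"LambdaFunctionConfigurations": [{
        "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:extract-metadata",
        "Events": ["s3:ObjectCreated:*"],  # fires on every new upload
    }]},
)
```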