Amazon SAA-C03 Dumps

Amazon AWS Certified Solutions Architect - Associate (SAA-C03)

Looking for Amazon SAA-C03 Practice Questions? You have reached your destination. Amazonawsdumps.com has prepared a special kind of test material that adapts to each candidate's skill set. Our smart system presents Amazon SAA-C03 Question Answers exactly as they appear in the actual exam, and we report your progress at the end of each test to ensure success.

  • PDF Demo: $49
  • Test Engine Demo: $59
  • PDF + Test Engine: $69

Here are some more features of Amazon SAA-C03 PDF:

  • 999 questions with answers
  • Updated: 28 Apr, 2025
  • Unlimited practice questions
  • Routine daily updates
  • Takes just 1 day to prepare
  • Exam passing guaranteed at first go
  • Money-back facility
  • 3 months free updates

SAA-C03 is the new exam code for the AWS Certified Solutions Architect - Associate exam, which was scheduled for release on August 30, 2022. Its predecessor, SAA-C02, was scheduled to be decommissioned by August 29, 2022, so you will want to know the differences between the two versions.

Introduction

The AWS Certified Solutions Architect - Associate certification exam is intended for IT specialists who perform a solutions architect or DevOps role and have at least a year of hands-on experience designing available, cost-efficient, fault-tolerant, and scalable distributed systems on the Amazon Web Services (AWS) platform. If you already hold the SAA certification and wish to recertify, you need to take the SAA-C02 exam by August 29, 2022, or you can wait for the new SAA-C03 version to become available.

AMAZON AWS SAA-C03 EXAM DUMPS

Amazon AWS SAA-C03 Dumps are best for accurate results and come with a 100% passing guarantee. Amazonawsdumps.com provides the most genuine and authentic question-and-answer PDF material, which prepares you fully for your actual Amazon AWS SAA-C03 Exam. Amazonawsdumps gives you full assistance, a passing guarantee, and complete privacy and security when you purchase Amazon AWS SAA-C03 Exam Dumps from its site.

Target Audience for Amazon AWS SAA-C03 Exam

Candidates who want to clear the Amazon AWS SAA-C03 Exam should have one year of working experience developing cloud solutions using AWS services.

Exam Type

The exam comprises two types of questions:

  • Multiple choice questions
  • Multiple response questions

Exam Modules for the Amazon AWS SAA-C03 Exam

  • Design Secure Architectures: 30%
  • Design Resilient Architectures: 26%
  • Design High-Performing Architectures: 24%
  • Design Cost-Optimized Architectures: 20%

Exam format for Amazon AWS SAA-C03 Exam

  • Exam Code: SAA-C03
  • Delivery Date: August 30, 2022
  • Prerequisite: None
  • Number of Questions: 65
  • Score Range: 100-1000
  • Cost: 150 USD (practice exam: 20 USD)
  • Passing Score: 720/1000
  • Time Duration: 2 hours 10 minutes (130 minutes)
  • Question Types: Scenario-based; multiple choice and multiple response
  • Delivery Method: Testing center or online proctored exam

Why Dumps are Necessary to Pass Amazon AWS SAA-C03 Exam

Latest Material Provider for the SAA-C03 Exam

Pass the Amazon SAA-C03 exam with the Amazonawsdumps SAA-C03 Dumps PDF. All Amazon SAA-C03 test questions are up to date, regularly revised, and cover the entire Amazon AWS Certified Solutions Architect - Associate SAA-C03 syllabus. The SAA-C03 questions and answers come as a printable, well-organized guide that you can download to your PC or another device to start preparing for the SAA-C03 exam.

Dumps PDF Material Made by Trained Professionals

When you are trying to find trustworthy exam dumps for the preparation of the AWS Certified Solutions Architect - Associate (SAA-C03) certification exam, you should be sure that you are picking the right source. All of the PDF dumps from Amazonawsdumps are made by recognized specialists, and you will get all the help you expect for clearing the SAA-C03 exam. You can practice with the preparation material as often as you like so you can achieve the best results. If you have no idea how to further improve your readiness for the AWS Certified Solutions Architect - Associate (SAA-C03) certification, then you should buy the Amazonawsdumps study material, and you will not regret it.

100% Passing Assurance

With the help of these exam dumps you can achieve 100% success in your AWS Certified Solutions Architect - Associate (SAA-C03) certification, and you won't face any issues while using the PDF dumps to prepare for the SAA-C03 exam. The exam dumps have a strong foundation, and customers are satisfied with the results. Round-the-clock support is available to resolve any issue related to exam preparation.

FAQs

What is passing score for Amazon AWS SAA-C03 Exam?

720 out of 1000 is the minimum score required to pass the Amazon AWS SAA-C03 Exam.

Can I pass Amazon AWS SAA-C03 Exam with just one week preparation?

Yes. With the assistance of Amazon AWS SAA-C03 Exam Dumps, you can easily pass this exam with just one week, or even three days, of preparation.

From where can I buy the best exam dumps for Amazon AWS SAA-C03 Exam?

For the best preparation for the Amazon AWS SAA-C03 Exam, check the Amazonawsdumps exam dumps; you won't regret it.

What is the passing ratio of the Amazon AWS SAA-C03 Exam?

The failure rate of the Amazon AWS SAA-C03 Exam is above 72%, which means that only about 28% of applicants who take the Amazon AWS SAA-C03 Exam manage to pass it.

Which AWS certificate is best for newcomers?

The AWS Certified Solutions Architect - Associate certificate is the first and most in-demand AWS certificate, and it is the best choice for newcomers.

Why Pass Amazon SAA-C03 Exam?

In today's world, you need validation of your skills to get past the competition. The Amazon SAA-C03 Exam is that validation. Not only is Amazon an industry leader in IT, but it also offers certification exams that prove your AWS skills. These skills show you are capable of fulfilling the Amazon job role. To get certified, you simply pass the SAA-C03 Exam. This brings us to the Amazon SAA-C03 Question Answers set. Passing this certification exam from Amazon may seem easy, but it's not. Many students fail this exam only because they didn't take it seriously. Don't make this mistake; order your Amazon SAA-C03 Braindumps right now!

Amazonawsdumps.com is the most popular and reliable website that has helped thousands of candidates excel at Amazon Exams. You could be one of those fortunate few too. Pass your exam in one attempt with Amazon SAA-C03 PDF and own the future. Buy Now!

Superlative Amazon SAA-C03 Dumps!

We know we said passing Amazon exams is hard, but that's only if you've been led astray. There are millions of Amazon SAA-C03 Practice Questions available online that promise success but fail when it comes down to it. Choose your training material carefully and get Amazon SAA-C03 Question Answers that are valid, accurate, and approved by renowned IT professionals. Our Amazon SAA-C03 Braindumps are created by experts for experts and generate first-class results in just a single attempt. Don't believe us? Try our free demo version, which contains all the features you'll get with the Amazon SAA-C03 PDF: an interactive design, easy-to-read format, understandable language, and a concise pattern. And if you still don't get the result you want and somehow fail, you get your money back in full. So, order your set of Amazon SAA-C03 Dumps now!

We promise to take full responsibility for our customers' learning, preparation, and passing of the SAA-C03 Exam without a hitch. Our aim is your satisfaction and ease. That is why we charge only a reasonable price for Amazon SAA-C03 Practice Questions. Moreover, we offer two formats: PDF and online test engine. Also, there is always a little extra with our discount coupons.

Why Buy Amazon SAA-C03 Question Answers?

The Amazonawsdumps.com team is a group of experts who succeeded with Amazon SAA-C03 Braindumps. We got what we needed to pass the exam, and we went through its challenges as well. That is why we want every Amazon candidate to succeed. Choosing among so many options for an Amazon SAA-C03 PDF is tricky; sometimes they don't turn out the way they first appeared. That is why we offer our valued customers a free demo: they can take a test run of the Amazon SAA-C03 Dumps before they buy. When it comes to buying, the procedure is simple, secure, and low-risk, because our Amazon SAA-C03 Practice Questions have a 99.8% passing rate.

Amazon SAA-C03 Sample Questions

Question # 1

A media company hosts its video processing workload on AWS. The workload uses Amazon EC2 instances in an Auto Scaling group to handle varying levels of demand. The workload stores the original videos and the processed videos in an Amazon S3 bucket. The company wants to ensure that the video processing workload is scalable. The company wants to prevent failed processing attempts because of resource constraints. The architecture must be able to handle sudden spikes in video uploads without impacting the processing capability. Which solution will meet these requirements with the LEAST overhead?

A. Migrate the workload from Amazon EC2 instances to AWS Lambda functions. Configure an Amazon S3 event notification to invoke the Lambda functions when a new video is uploaded. Configure the Lambda functions to process videos directly and to save processed videos back to the S3 bucket.
B. Migrate the workload from Amazon EC2 instances to AWS Lambda functions. Use Amazon S3 to invoke an Amazon Simple Notification Service (Amazon SNS) topic when a new video is uploaded. Subscribe the Lambda functions to the SNS topic. Configure the Lambda functions to process the videos asynchronously and to save processed videos back to the S3 bucket.
C. Configure an Amazon S3 event notification to send a message to an Amazon Simple Queue Service (Amazon SQS) queue when a new video is uploaded. Configure the existing Auto Scaling group to poll the SQS queue, process the videos, and save processed videos back to the S3 bucket.
D. Configure an Amazon S3 upload trigger to invoke an AWS Step Functions state machine when a new video is uploaded. Configure the state machine to orchestrate the video processing workflow by placing a job message in the Amazon SQS queue. Configure the job message to invoke the EC2 instances to process the videos. Save processed videos back to the S3 bucket.

ANSWER : C
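
To make the winning pattern concrete, here is a minimal boto3 sketch of the S3-to-SQS wiring that option C describes. The bucket name, queue URL, queue ARN, and key prefix are placeholder assumptions, and the SQS queue policy that allows S3 to deliver events is assumed to already be in place.

```python
import json

import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/video-jobs"  # placeholder

# Deliver an event to the queue whenever a new object lands under uploads/.
s3.put_bucket_notification_configuration(
    Bucket="example-video-bucket",  # placeholder
    NotificationConfiguration={
        "QueueConfigurations": [{
            "QueueArn": "arn:aws:sqs:us-east-1:123456789012:video-jobs",  # placeholder
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [{"Name": "prefix", "Value": "uploads/"}]}},
        }]
    },
)

# Worker loop that each instance in the Auto Scaling group would run.
while True:
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1,
                               WaitTimeSeconds=20)  # long polling
    for msg in resp.get("Messages", []):
        event = json.loads(msg["Body"])  # S3 event describing the uploaded video
        # ... download the video, process it, upload the result to S3 ...
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```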


Question # 2

A media company hosts a web application on AWS. The application gives users the ability to upload and view videos. The application stores the videos in an Amazon S3 bucket. The company wants to ensure that only authenticated users can upload videos. Authenticated users must have the ability to upload videos only within a specified time frame after authentication. Which solution will meet these requirements with the LEAST operational overhead?

A. Configure the application to generate IAM temporary security credentials for authenticated users. 
B. Create an AWS Lambda function that generates pre-signed URLs when a user authenticates. 
C. Develop a custom authentication service that integrates with Amazon Cognito to control and log direct S3 bucket access through the application. 
D. Use AWS Security Token Service (AWS STS) to assume a pre-defined IAM role that grants authenticated users temporary permissions to upload videos directly to the S3 bucket.

ANSWER : B
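
A rough sketch of the pre-signed URL approach in option B, written as an AWS Lambda function with boto3. The bucket name and the event fields used to build the object key are hypothetical; in practice the username would come from the authentication layer (for example, an API Gateway authorizer).

```python
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # The caller is assumed to be authenticated upstream; "username" and
    # "filename" are hypothetical fields supplied by that layer.
    key = f"uploads/{event['username']}/{event['filename']}"
    url = s3.generate_presigned_url(
        ClientMethod="put_object",
        Params={"Bucket": "example-video-bucket", "Key": key},  # placeholder bucket
        ExpiresIn=900,  # the upload window closes 15 minutes after authentication
    )
    return {"uploadUrl": url}
```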


Question # 3

A company uses a set of Amazon EC2 instances to host a website. The website uses an Amazon S3 bucket to store images and media files. The company wants to automate website infrastructure creation to deploy the website to multiple AWS Regions. The company also wants to provide the EC2 instances access to the S3 bucket so the instances can store and access data by using AWS Identity and Access Management (IAM). Which solution will meet these requirements MOST securely?

A. Create an AWS CloudFormation template for the web server EC2 instances. Save an IAM access key in the UserData section of the AWS::EC2::Instance entity in the CloudFormation template.
B. Create a file that contains an IAM secret access key and access key ID. Store the file in a new S3 bucket. Create an AWS CloudFormation template. In the template, create a parameter to specify the location of the S3 object that contains the access key and access key ID.
C. Create an IAM role and an IAM access policy that allows the web server EC2 instances to access the S3 bucket. Create an AWS CloudFormation template for the web server EC2 instances that contains an IAM instance profile entity that references the IAM role and the IAM access policy.
D. Create a script that retrieves an IAM secret access key and access key ID from IAM and stores them on the web server EC2 instances. Include the script in the UserData section of the AWS::EC2::Instance entity in an AWS CloudFormation template.

ANSWER : C


Question # 4

A digital image processing company wants to migrate its on-premises monolithic application to the AWS Cloud. The company processes thousands of images and generates large files as part of the processing workflow. The company needs a solution to manage the growing number of image processing jobs. The solution must also reduce the manual tasks in the image processing workflow. The company does not want to manage the underlying infrastructure of the solution. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 Spot Instances to process the images. Configure Amazon Simple Queue Service (Amazon SQS) to orchestrate the workflow. Store the processed files in Amazon Elastic File System (Amazon EFS).
B. Use AWS Batch jobs to process the images. Use AWS Step Functions to orchestrate the workflow. Store the processed files in an Amazon S3 bucket.
C. Use AWS Lambda functions and Amazon EC2 Spot Instances to process the images. Store the processed files in Amazon FSx.
D. Deploy a group of Amazon EC2 instances to process the images. Use AWS Step Functions to orchestrate the workflow. Store the processed files in an Amazon Elastic Block Store (Amazon EBS) volume.

ANSWER : B


Question # 5

A company uses Amazon S3 to store customer data that contains personally identifiable information (PII) attributes. The company needs to make the customer information available to company resources through an AWS Glue Data Catalog. The company needs to have fine-grained access control for the data so that only specific IAM roles can access the PII data. Which solution will meet these requirements?

A. Create one IAM policy that grants access to PII. Create a second IAM policy that grants access to non-PII data. Assign the PII policy to the specified IAM roles.
B. Create one IAM role that grants access to PII. Create a second IAM role that grants access to non-PII data. Assign the PII policy to the specified IAM roles.
C. Use AWS Lake Formation to provide the specified IAM roles access to the PII data.
D. Use AWS Glue to create one view for PII data. Create a second view for non-PII data. Provide the specified IAM roles access to the PII view.

ANSWER : C


Question # 6

A company hosts a database that runs on an Amazon RDS instance deployed to multiple Availability Zones. A periodic script negatively affects a critical application by querying the database. How can application performance be improved with minimal costs?

A. Add functionality to the script to identify the instance with the fewest active connections and query that instance.
B. Create a read replica of the database. Configure the script to query only the read replica.
C. Instruct the development team to manually export new entries at the end of the day.
D. Use Amazon ElastiCache to cache the common queries the script runs.

ANSWER : B


Question # 7

A company hosts its multi-tier, public web application in the AWS Cloud. The web application runs on Amazon EC2 instances, and its database runs on Amazon RDS. The company is anticipating a large increase in sales during an upcoming holiday weekend. A solutions architect needs to build a solution to analyze the performance of the web application with a granularity of no more than 2 minutes. What should the solutions architect do to meet this requirement?

A. Send Amazon CloudWatch logs to Amazon Redshift. Use Amazon QuickSight to perform further analysis.
B. Enable detailed monitoring on all EC2 instances. Use Amazon CloudWatch metrics to perform further analysis.
C. Create an AWS Lambda function to fetch EC2 logs from Amazon CloudWatch Logs. Use Amazon CloudWatch metrics to perform further analysis.
D. Send EC2 logs to Amazon S3. Use Amazon Redshift to fetch logs from the S3 bucket to process raw data for further analysis with Amazon QuickSight.

ANSWER : B


Question # 8

A company runs an application on Amazon EC2 instances. The instances need to access an Amazon RDS database by using specific credentials. The company uses AWS Secrets Manager to store the credentials that the EC2 instances must use. Which solution will meet this requirement?

A. Create an IAM role, and attach the role to each EC2 instance profile. Use an identity-based policy to grant the new IAM role access to the secret that contains the database credentials.
B. Create an IAM user, and attach the user to each EC2 instance profile. Use a resource-based policy to grant the new IAM user access to the secret that contains the database credentials.
C. Create a resource-based policy for the secret that contains the database credentials. Use EC2 Instance Connect to access the secret.
D. Create an identity-based policy for the secret that contains the database credentials. Grant direct access to the EC2 instances.

ANSWER : A
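
For illustration, a minimal sketch of what option A looks like from the application's side, assuming the instance profile and the identity-based policy are already attached. The secret name and region are placeholders.

```python
import json

import boto3

# No credentials appear in code: on an EC2 instance with an attached
# instance profile, boto3 resolves the role's temporary credentials
# automatically from the instance metadata service.
secrets = boto3.client("secretsmanager", region_name="us-east-1")

resp = secrets.get_secret_value(SecretId="prod/app/db-credentials")  # placeholder name
creds = json.loads(resp["SecretString"])
# creds["username"] and creds["password"] can now be passed to the database driver.
```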


Question # 9

A finance company uses backup software to back up its data to physical tape storage on-premises. To comply with regulations, the company needs to store the data for 7 years. The company must be able to restore archived data within one week when necessary. The company wants to migrate the backup data to AWS to reduce costs. The company does not want to change the current backup software. Which solution will meet these requirements MOST cost-effectively?

A. Use AWS Storage Gateway Tape Gateway to copy the data to virtual tapes. Use AWS DataSync to migrate the virtual tapes to the Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Change the target of the backup software to S3 Standard-IA.
B. Convert the physical tapes to virtual tapes. Use AWS DataSync to migrate the virtual tapes to Amazon S3 Glacier Flexible Retrieval. Change the target of the backup software to the S3 Glacier Flexible Retrieval. 
C. Use AWS Storage Gateway Tape Gateway to copy the data to virtual tapes. Migrate the virtual tapes to Amazon S3 Glacier Deep Archive. Change the target of the backup software to the virtual tapes. 
D. Convert the physical tapes to virtual tapes. Use AWS Snowball Edge storage-optimized devices to migrate the virtual tapes to Amazon S3 Glacier Flexible Retrieval. Change the target of the backup software to S3 Glacier Flexible Retrieval. 

ANSWER : C


Question # 10

A company uses an Amazon EC2 Auto Scaling group to host an API. The EC2 instances are in a target group that is associated with an Application Load Balancer (ALB). The company stores data in an Amazon Aurora PostgreSQL database. The API has a weekly maintenance window. The company must ensure that the API returns a static maintenance response during the weekly maintenance window. Which solution will meet this requirement with the LEAST operational overhead?

A. Create a table in Aurora PostgreSQL that has fields to contain keys and values. Create a key for a maintenance flag. Set the flag when the maintenance window starts. Configure the API to query the table for the maintenance flag and to return a maintenance response if the flag is set. Reset the flag when the maintenance window is finished. 
B. Create an Amazon Simple Queue Service (Amazon SQS) queue. Subscribe the EC2 instances to the queue. Publish a message to the queue when the maintenance window starts. Configure the API to return a maintenance message if the instances receive a maintenance start message from the queue. Publish another message to the queue when the maintenance window is finished to restore normal operation. 
C. Create a listener rule on the ALB to return a maintenance response when the path on a request matches a wildcard. Set the rule priority to one. Perform the maintenance. When the maintenance window is finished, delete the listener rule. 
D. Create an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the EC2 instances to the topic. Publish a message to the topic when the maintenance window starts. Configure the API to return a maintenance response if the instances receive the maintenance start message from the topic. Publish another message to the topic when the maintenance window finishes to restore normal operation.

ANSWER : C
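
A minimal boto3 sketch of the listener rule in option C. The listener ARN is a placeholder, and the 503 status code and message body are illustrative choices.

```python
import boto3

elbv2 = boto3.client("elbv2")

LISTENER_ARN = "arn:aws:elasticloadbalancing:us-east-1:123456789012:listener/app/example/abc/def"  # placeholder

# Highest-priority rule that matches every path and short-circuits requests
# to a static maintenance response, bypassing the target group entirely.
rule = elbv2.create_rule(
    ListenerArn=LISTENER_ARN,
    Priority=1,
    Conditions=[{"Field": "path-pattern", "Values": ["*"]}],
    Actions=[{
        "Type": "fixed-response",
        "FixedResponseConfig": {
            "StatusCode": "503",
            "ContentType": "application/json",
            "MessageBody": '{"message": "API under maintenance"}',
        },
    }],
)

# When the maintenance window ends, delete the rule to restore normal routing:
# elbv2.delete_rule(RuleArn=rule["Rules"][0]["RuleArn"])
```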


Question # 11

A company has a payroll application that runs in the AWS Cloud. The application uses an Amazon Aurora MySQL database cluster for data storage. The company’s auditing team needs to review the last 90 days of payroll data. A solutions architect needs to design a solution to provide the auditing team access to the payroll data. Which solution will meet these requirements with the MOST operational efficiency?

A. Use Aurora automated backups. Restore the database by using point-in-time recovery.
B. Create a backup plan by using AWS Backup with point-in-time recovery. Restore the database by using the backups from the backup vault.
C. Create daily manual backups of the Aurora cluster for the last 90 days. Restore the databases by using the backups. Delete the older backup files by using scripted CLI calls.
D. Create a backup plan by using AWS Backup with the daily backup option. Set the retention to 90 days. Restore the database by using the backups from the backup vault.

ANSWER : D


Question # 12

A company is building a serverless application to process large video files that users upload. The application performs multiple tasks to process each video file. Processing can take up to 30 minutes for the largest files. The company needs a scalable architecture to support the processing application. Which solution will meet these requirements?

A. Store the uploaded video files in Amazon Elastic File System (Amazon EFS). Configure a schedule in Amazon EventBridge Scheduler to invoke an AWS Lambda function periodically to check for new files. Configure the Lambda function to perform all the processing tasks.
B. Store the uploaded video files in Amazon Elastic File System (Amazon EFS). Configure an Amazon EFS event notification to start an AWS Step Functions workflow that uses AWS Fargate tasks to perform the processing tasks.
C. Store the uploaded video files in Amazon S3. Configure an Amazon S3 event notification to send an event to Amazon EventBridge when a user uploads a new video file. Configure an AWS Step Functions workflow as a target for an EventBridge rule. Use the workflow to manage AWS Fargate tasks to perform the processing tasks.
D. Store the uploaded video files in Amazon S3. Configure an Amazon S3 event notification to invoke an AWS Lambda function when a user uploads a new video file. Configure the Lambda function to perform all the processing tasks.

ANSWER : C


Question # 13

A company runs a Microsoft Windows SMB file share on-premises to support an application. The company wants to migrate the application to AWS. The company wants to share storage across multiple Amazon EC2 instances. Which solutions will meet these requirements with the LEAST operational overhead? (Select TWO.)

A. Create an Amazon Elastic File System (Amazon EFS) file system with elastic throughput.
B. Create an Amazon FSx for NetApp ONTAP file system.
C. Use Amazon Elastic Block Store (Amazon EBS) to create a self-managed Windows file share on the instances.
D. Create an Amazon FSx for Windows File Server file system.
E. Create an Amazon FSx for OpenZFS file system.

ANSWER : B,D


Question # 14

A solutions architect needs to implement a solution that can handle up to 5,000 messages per second. The solution must publish messages as events to multiple consumers. The messages are up to 500 KB in size. The message consumers need to have the ability to use multiple programming languages to consume the messages with minimal latency. The solution must retain published messages for more than 3 months. The solution must enforce strict ordering of the messages. Which solution will meet these requirements?

A. Publish messages to an Amazon Kinesis Data Streams data stream. Enable enhanced fan-out. Ensure that consumers ingest the data stream by using dedicated throughput. 
B. Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to subscribe to the topic. 
C. Publish messages to Amazon EventBridge. Allow each consumer to create rules to deliver messages to the consumer's own target. 
D. Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use Amazon Data Firehose to subscribe to the topic. 

ANSWER : A


Question # 15

A company stores 5 PB of archived data on physical tapes. The company needs to preserve the data for another 10 years. The data center that stores the tapes has a 10 Gbps Direct Connect connection to an AWS Region. The company wants to migrate the data to AWS within the next 6 months. Which solution will meet these requirements?

A. Read the data from the tapes on premises. Use local storage to stage the data. Use AWS DataSync to migrate the data to Amazon S3 Glacier Flexible Retrieval storage.
B. Use an on-premises backup application to read the data from the tapes. Use the backup application to write directly to Amazon S3 Glacier Deep Archive storage.
C. Order multiple AWS Snowball Edge devices. Copy the physical tapes to virtual tapes on the Snowball Edge devices. Ship the Snowball Edge devices to AWS. Create an S3 Lifecycle policy to move the tapes to Amazon S3 Glacier Instant Retrieval storage.
D. Configure an on-premises AWS Storage Gateway Tape Gateway. Create virtual tapes in the AWS Cloud. Use backup software to copy the physical tapes to the virtual tapes. Move the virtual tapes to Amazon S3 Glacier Deep Archive storage.

ANSWER : D


Question # 16

A company stores data in Amazon S3. According to regulations, the data must not contain personally identifiable information (PII). The company recently discovered that S3 buckets have some objects that contain PII. The company needs to automatically detect PII in S3 buckets and to notify the company's security team. Which solution will meet these requirements?

A. Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData event type from Macie findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.
B. Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.
C. Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData:S3Object/Personal event type from Macie findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.
D. Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.

ANSWER : A
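
A sketch of the EventBridge wiring in option A, using boto3. The rule name and SNS topic ARN are placeholders; the pattern uses the "Macie Finding" detail type and a prefix match on the finding type, which is one reasonable way to capture SensitiveData findings.

```python
import boto3

events = boto3.client("events")

# Match Macie findings whose type indicates sensitive data in S3 objects.
events.put_rule(
    Name="macie-pii-findings",
    EventPattern="""{
      "source": ["aws.macie"],
      "detail-type": ["Macie Finding"],
      "detail": {"type": [{"prefix": "SensitiveData"}]}
    }""",
)

# Route matching findings to the security team's SNS topic.
events.put_targets(
    Rule="macie-pii-findings",
    Targets=[{
        "Id": "security-team-sns",
        "Arn": "arn:aws:sns:us-east-1:123456789012:security-alerts",  # placeholder
    }],
)
```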


Question # 17

A company has an application that runs on an Amazon Elastic Kubernetes Service (Amazon EKS) cluster on Amazon EC2 instances. The application has a UI that uses Amazon DynamoDB and data services that use Amazon S3 as part of the application deployment. The company must ensure that the EKS Pods for the UI can access only Amazon DynamoDB and that the EKS Pods for the data services can access only Amazon S3. The company uses AWS Identity and Access Management (IAM). Which solution meets these requirements?

A. Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach both IAM policies to the EC2 instance profile. Use role-based access control (RBAC) to control access to Amazon S3 or DynamoDB for the respective EKS Pods.
B. Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach the Amazon S3 IAM policy directly to the EKS Pods for the data services and the DynamoDB policy to the EKS Pods for the UI.
C. Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Attach the AmazonS3FullAccess policy to the data services account and the AmazonDynamoDBFullAccess policy to the UI service account.
D. Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Use IAM Roles for Service Accounts (IRSA) to provide the EKS Pods for the UI with access to DynamoDB and the EKS Pods for the data services with access to Amazon S3.

ANSWER : D


Question # 18

A company deploys its applications on Amazon Elastic Kubernetes Service (Amazon EKS) behind an Application Load Balancer in an AWS Region. The application needs to store data in a PostgreSQL database engine. The company wants the data in the database to be highly available. The company also needs increased capacity for read workloads. Which solution will meet these requirements with the MOST operational efficiency?

A. Create an Amazon DynamoDB database table configured with global tables.
B. Create an Amazon RDS database with a Multi-AZ deployment.
C. Create an Amazon RDS database with Multi-AZ DB cluster deployment.
D. Create an Amazon RDS database configured with cross-Region read replicas.

ANSWER : C


Question # 19

A company needs to give a globally distributed development team secure access to the company's AWS resources in a way that complies with security policies. The company currently uses an on-premises Active Directory for internal authentication. The company uses AWS Organizations to manage multiple AWS accounts that support multiple projects. The company needs a solution to integrate with the existing infrastructure to provide centralized identity management and access control. Which solution will meet these requirements with the LEAST operational overhead?

A. Set up AWS Directory Service to create an AWS managed Microsoft Active Directory on AWS. Establish a trust relationship with the on-premises Active Directory. Use IAM roles that are assigned to Active Directory groups to access AWS resources within the company's AWS accounts.
B. Create an IAM user for each developer. Manually manage permissions for each IAM user based on each user's involvement with each project. Enforce multi-factor authentication (MFA) as an additional layer of security.
C. Use AD Connector in AWS Directory Service to connect to the on-premises Active Directory. Integrate AD Connector with AWS IAM Identity Center. Configure permission sets to give each AD group access to specific AWS accounts and resources.
D. Use Amazon Cognito to deploy an identity federation solution. Integrate the identity federation solution with the on-premises Active Directory. Use Amazon Cognito to provide access tokens for developers to access AWS accounts and resources.

ANSWER : C


Question # 20

A company is designing an application on AWS that processes sensitive data. The application stores and processes financial data for multiple customers. To meet compliance requirements, the data for each customer must be encrypted separately at rest by using a secure, centralized key management solution. The company wants to use AWS Key Management Service (AWS KMS) to implement encryption. Which solution will meet these requirements with the LEAST operational overhead?

A. Generate a unique encryption key for each customer. Store the keys in an Amazon S3 bucket. Enable server-side encryption.
B. Deploy a hardware security appliance in the AWS environment that securely stores customer-provided encryption keys. Integrate the security appliance with AWS KMS to encrypt the sensitive data in the application.
C. Create a single AWS KMS key to encrypt all sensitive data across the application.
D. Create separate AWS KMS keys for each customer's data that have granular access control and logging enabled.

ANSWER : D


Question # 21

A law firm needs to make hundreds of files readable for the general public. The law firm must prevent members of the public from modifying or deleting the files before a specified future date. Which solution will meet these requirements MOST securely?

A. Upload the files to an Amazon S3 bucket that is configured for static website hosting. Grant read-only IAM permissions to any AWS principals that access the S3 bucket until the specified date. 
B. Create a new Amazon S3 bucket. Enable S3 Versioning. Use S3 Object Lock and set a retention period based on the specified date. Create an Amazon CloudFront distribution to serve content from the bucket. Use an S3 bucket policy to restrict access to the CloudFront origin access control (OAC). 
C. Create a new Amazon S3 bucket. Enable S3 Versioning. Configure an event trigger to run an AWS Lambda function if a user modifies or deletes an object. Configure the Lambda function to replace the modified or deleted objects with the original versions of the objects from a private S3 bucket. 
D. Upload the files to an Amazon S3 bucket that is configured for static website hosting. Select the folder that contains the files. Use S3 Object Lock with a retention period based on the specified date. Grant read-only IAM permissions to any AWS principals that access the S3 bucket.

ANSWER : B
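
To illustrate the core of option B, here is a boto3 sketch of creating a bucket with S3 Object Lock and uploading a file under a COMPLIANCE-mode retention date. Names, dates, and file contents are placeholders; the CloudFront distribution and bucket policy are omitted.

```python
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")

# Object Lock must be enabled at bucket creation time; enabling it also
# turns on S3 Versioning. (Region configuration omitted for brevity.)
s3.create_bucket(
    Bucket="example-legal-files",      # placeholder
    ObjectLockEnabledForBucket=True,
)

# COMPLIANCE mode means no principal, including the root user, can delete
# the object or shorten the retention before the specified date.
s3.put_object(
    Bucket="example-legal-files",
    Key="case-123/filing.pdf",
    Body=b"placeholder file contents",
    ObjectLockMode="COMPLIANCE",
    ObjectLockRetainUntilDate=datetime(2030, 1, 1, tzinfo=timezone.utc),
)
```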


Question # 22

A company is designing a new internal web application in the AWS Cloud. The new application must securely retrieve and store multiple employee usernames and passwords from an AWS managed service. Which solution will meet these requirements with the LEAST operational overhead?

A. Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve usernames and passwords from Parameter Store.
B. Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.
C. Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Parameter Store.
D. Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.

ANSWER : D
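
For reference, a sketch of option D's retrieval step using the BatchGetSecretValue API, which boto3 exposes as batch_get_secret_value in recent versions. The secret names are placeholders.

```python
import boto3

secrets = boto3.client("secretsmanager")

# One call retrieves several secrets instead of looping over GetSecretValue.
resp = secrets.batch_get_secret_value(
    SecretIdList=[
        "app/employees/alice",  # placeholder secret names
        "app/employees/bob",
    ]
)

for secret in resp["SecretValues"]:
    # Each entry's SecretString holds the stored username/password payload.
    print(secret["Name"], "retrieved")
```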


Question # 23

A company is building a new furniture inventory application. The company has deployed the application on a fleet of Amazon EC2 instances across multiple Availability Zones. The EC2 instances run behind an Application Load Balancer (ALB) in their VPC. A solutions architect has observed that incoming traffic seems to favor one EC2 instance, resulting in latency for some requests. What should the solutions architect do to resolve this issue?

A. Disable session affinity (sticky sessions) on the ALB.
B. Replace the ALB with a Network Load Balancer.
C. Increase the number of EC2 instances in each Availability Zone.
D. Adjust the frequency of the health checks on the ALB's target group.

ANSWER : A


Question # 24

A finance company is migrating its trading platform to AWS. The trading platform processes a high volume of market data and processes stock trades. The company needs to establish a consistent, low-latency network connection from its on-premises data center to AWS. The company will host resources in a VPC. The solution must not use the public internet. Which solution will meet these requirements?

A. Use AWS Client VPN to connect the on-premises data center to AWS.
B. Use AWS Direct Connect to set up a connection from the on-premises data center to AWS
C. Use AWS PrivateLink to set up a connection from the on-premises data center to AWS.
D. Use AWS Site-to-Site VPN to connect the on-premises data center to AWS.

ANSWER : B


Question # 25

How can a law firm make files publicly readable while preventing modifications or deletions until a specific future date?

A. Upload files to an Amazon S3 bucket configured for static website hosting. Grant read-only IAM permissions to any AWS principals.
B. Create an S3 bucket. Enable S3 Versioning. Use S3 Object Lock with a retention period. Create a CloudFront distribution. Use a bucket policy to restrict access.
C. Create an S3 bucket. Enable S3 Versioning. Configure an event trigger with AWS Lambda to restore modified objects from a private S3 bucket.
D. Upload files to an S3 bucket for static website hosting. Use S3 Object Lock with a retention period. Grant read-only IAM permissions.

ANSWER : B


Question # 26

A company wants to implement a data lake in the AWS Cloud. The company must ensure that only specific teams have access to sensitive data in the data lake. The company must have row-level access control for the data lake. Which solution will meet these requirements?

A. Use Amazon RDS to store the data. Use IAM roles and permissions for data governance and access control. 
B. Use Amazon Redshift to store the data. Use IAM roles and permissions for data governance and access control. 
C. Use Amazon S3 to store the data. Use AWS Lake Formation for data governance and access control. 
D. Use AWS Glue Catalog to store the data. Use AWS Glue DataBrew for data governance and access control. 

ANSWER : C


Question # 27

A company is developing a new application that uses a relational database to store user data and application configurations. The company expects the application to have steady user growth. The company expects the database usage to be variable and read-heavy, with occasional writes. The company wants to cost-optimize the database solution. The company wants to use an AWS managed database solution that will provide the necessary performance. Which solution will meet these requirements MOST cost-effectively?

A. Deploy the database on Amazon RDS. Use Provisioned IOPS SSD storage to ensure consistent performance for read and write operations. 
B. Deploy the database on Amazon Aurora Serverless to automatically scale the database capacity based on actual usage to accommodate the workload.
C. Deploy the database on Amazon DynamoDB. Use on-demand capacity mode to automatically scale throughput to accommodate the workload. 
D. Deploy the database on Amazon RDS. Use magnetic storage and read replicas to accommodate the workload.

ANSWER : B


Question # 28

A company stores data for multiple business units in a single Amazon S3 bucket that is in the company's payer AWS account. To maintain data isolation, the business units store data in separate prefixes in the S3 bucket by using an S3 bucket policy. The company plans to add a large number of dynamic prefixes. The company does not want to rely on a single S3 bucket policy to manage data access at scale. The company wants to develop a secure access management solution in addition to the bucket policy to enforce prefix-level data isolation. Which solution will meet these requirements?

A. Configure the S3 bucket policy to deny s3:GetObject permissions for all users. Configure the bucket policy to allow s3:* access to individual business units.
B. Enable default encryption on the S3 bucket by using server-side encryption with Amazon S3 managed keys (SSE-S3).
C. Configure resource-based permissions on the S3 bucket by creating an S3 access point for each business unit.
D. Use pre-signed URLs to provide access to the S3 bucket.

ANSWER : C
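
A boto3 sketch of option C: one S3 access point per business unit, each of which can then carry its own prefix-scoped policy. The account ID, bucket name, and unit names are placeholders.

```python
import boto3

s3control = boto3.client("s3control")

ACCOUNT_ID = "123456789012"  # placeholder payer account

# One access point per business unit, so data access is no longer governed
# by a single, ever-growing bucket policy.
for unit in ["finance", "marketing", "logistics"]:
    s3control.create_access_point(
        AccountId=ACCOUNT_ID,
        Name=f"{unit}-ap",
        Bucket="example-shared-data-bucket",  # placeholder
    )
    # A prefix-scoped policy for each access point would then be attached
    # with s3control.put_access_point_policy(...).
```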


Question # 29

A company's reporting system delivers hundreds of .csv files to an Amazon S3 bucket each day. The company must convert these files to Apache Parquet format and must store the files in a transformed data bucket. Which solution will meet these requirements with the LEAST development effort?

A. Create an Amazon EMR cluster with Apache Spark installed. Write a Spark application to transform the data. Use EMR File System (EMRFS) to write files to the transformed data bucket.
B. Create an AWS Glue crawler to discover the data. Create an AWS Glue extract, transform, and load (ETL) job to transform the data. Specify the transformed data bucket in the output step.
C. Use AWS Batch to create a job definition with Bash syntax to transform the data and output the data to the transformed data bucket. Use the job definition to submit a job. Specify an array job as the job type.
D. Create an AWS Lambda function to transform the data and output the data to the transformed data bucket. Configure an event notification for the S3 bucket. Specify the Lambda function as the destination for the event notification.

ANSWER : B
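
To show what option B amounts to in practice, here is a minimal AWS Glue ETL job script. Glue jobs run Python/PySpark, so this only executes inside a Glue job environment; the catalog database and table names (created by the crawler) and the output bucket are placeholders.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the crawled .csv table from the Glue Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="reports_db",      # placeholder database created by the crawler
    table_name="daily_csv",     # placeholder table
)

# Write the same records back out as Parquet into the transformed bucket.
glue_context.write_dynamic_frame.from_options(
    frame=source,
    connection_type="s3",
    connection_options={"path": "s3://example-transformed-data-bucket/"},
    format="parquet",
)

job.commit()
```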


Question # 30

A company has developed a non-production application that is composed of multiple microservices for each of the company's business units. A single development team maintains all the microservices. The current architecture uses a static web frontend and a Java-based backend that contains the application logic. The architecture also uses a MySQL database that the company hosts on an Amazon EC2 instance. The company needs to ensure that the application is secure and available globally. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon CloudFront and AWS Amplify to host the static web frontend. Refactor the microservices to use AWS Lambda functions that the microservices access by using Amazon API Gateway. Migrate the MySQL database to an Amazon EC2 Reserved Instance.
B. Use Amazon CloudFront and Amazon S3 to host the static web frontend. Refactor the microservices to use AWS Lambda functions that the microservices access by using Amazon API Gateway. Migrate the MySQL database to Amazon RDS for MySQL.
C. Use Amazon CloudFront and Amazon S3 to host the static web frontend. Refactor the microservices to use AWS Lambda functions that are in a target group behind a Network Load Balancer. Migrate the MySQL database to Amazon RDS for MySQL.
D. Use Amazon S3 to host the static web frontend. Refactor the microservices to use AWS Lambda functions that are in a target group behind an Application Load Balancer. Migrate the MySQL database to an Amazon EC2 Reserved Instance.

ANSWER : B


Question # 31

A company is creating an application. The company stores data from tests of the application in multiple on-premises locations. The company needs to connect the on-premises locations to VPCs in an AWS Region in the AWS Cloud. The number of accounts and VPCs will increase during the next year. The network architecture must simplify the administration of new connections and must provide the ability to scale. Which solution will meet these requirements with the LEAST administrative overhead?

A. Create a peering connection between the VPCs. Create a VPN connection between the VPCs and the on-premises locations.
B. Launch an Amazon EC2 instance. On the instance, include VPN software that uses a VPN connection to connect all VPCs and on-premises locations.
C. Create a transit gateway. Create VPC attachments for the VPC connections. Create VPN attachments for the on-premises connections.
D. Create an AWS Direct Connect connection between the on-premises locations and a central VPC. Connect the central VPC to other VPCs by using peering connections.

ANSWER : C


Question # 32

A company is developing a containerized web application that needs to be highly available and scalable. The application requires access to GPU resources. Which solution will meet these requirements?

A. Package the application as an AWS Lambda function in a container image. Use Lambda to run the containerized application on a runtime with GPU access.
B. Deploy the application container to Amazon Elastic Kubernetes Service (Amazon EKS). Use AWS Fargate to manage compute resources and access to GPU resources.
C. Deploy the application container to Amazon Elastic Container Registry (Amazon ECR). Use Amazon ECR to run the containerized application with an attached GPU.
D. Run the application on Amazon EC2 instances from a GPU instance family by using Amazon Elastic Container Service (Amazon ECS) for orchestration.

ANSWER : D


Question # 33

A company hosts its core network services, including directory services and DNS, in its on-premises data center. The data center is connected to the AWS Cloud using AWS Direct Connect (DX). Additional AWS accounts are planned that will require quick, cost-effective, and consistent access to these network services. What should a solutions architect implement to meet these requirements with the LEAST amount of operational overhead?

A. Create a DX connection in each new account. Route the network traffic to the on-premises servers.
B. Configure VPC endpoints in the DX VPC for all required services. Route the network traffic to the on-premises servers. 
C. Create a VPN connection between each new account and the DX VPC. Route the network traffic to the on-premises servers. 
D. Configure AWS Transit Gateway between the accounts. Assign DX to the transit gateway and route network traffic to the on-premises servers. 

ANSWER : D


Question # 34

A company recently launched a new product that is highly available in one AWS Region. The product consists of an application that runs on Amazon Elastic Container Service (Amazon ECS), a public Application Load Balancer (ALB), and an Amazon DynamoDB table. The company wants a solution that will make the application highly available across Regions. Which combination of steps will meet these requirements? (Select THREE.)

A. In a different Region, deploy the application to a new ECS cluster that is accessible through a new ALB.
B. Create an Amazon Route 53 failover record.
C. Modify the DynamoDB table to create a DynamoDB global table.
D. In the same Region, deploy the application to an Amazon Elastic Kubernetes Service (Amazon EKS) cluster that is accessible through a new ALB.
E. Modify the DynamoDB table to create global secondary indexes (GSIs).
F. Create an AWS PrivateLink endpoint for the application.

ANSWER : A,B,C
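
A boto3 sketch of step B, the Route 53 failover record pair pointing at the two ALBs. The hosted zone ID, record name, ALB DNS names, and ALB hosted zone IDs are all placeholders.

```python
import boto3

route53 = boto3.client("route53")

HOSTED_ZONE_ID = "Z0123456789EXAMPLE"  # placeholder

def upsert_failover_record(role, alb_dns, alb_zone_id):
    """Create one half of a PRIMARY/SECONDARY failover pair for the app."""
    route53.change_resource_record_sets(
        HostedZoneId=HOSTED_ZONE_ID,
        ChangeBatch={"Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "app.example.com",
                "Type": "A",
                "SetIdentifier": f"app-{role.lower()}",
                "Failover": role,  # "PRIMARY" or "SECONDARY"
                "AliasTarget": {
                    # EvaluateTargetHealth lets Route 53 fail over based on
                    # the ALB's own health, without a separate health check.
                    "HostedZoneId": alb_zone_id,
                    "DNSName": alb_dns,
                    "EvaluateTargetHealth": True,
                },
            },
        }]},
    )

upsert_failover_record("PRIMARY", "alb-1.us-east-1.elb.amazonaws.com", "Z35SXDOTRQ7X7K")    # placeholders
upsert_failover_record("SECONDARY", "alb-2.eu-west-1.elb.amazonaws.com", "Z32O12XQLNTSW2")  # placeholders
```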


Question # 35

A company needs a cloud-based solution for backup, recovery, and archiving while retaining encryption key material control. Which combination of solutions will meet these requirements? (Select TWO)

A. Create an AWS Key Management Service (AWS KMS) key without key material. Import the company's key material into the KMS key.
B. Create an AWS KMS encryption key that contains key material generated by AWS KMS.
C. Store the data in Amazon S3 Standard-Infrequent Access (S3 Standard-IA). Use S3 Bucket Keys with AWS KMS keys.
D. Store the data in an Amazon S3 Glacier storage class. Use server-side encryption with customer-provided keys (SSE-C).
E. Store the data in AWS Snowball devices. Use server-side encryption with AWS KMS keys (SSE-KMS).

ANSWER : A,D


Question # 36

A company is redesigning a static website. The company needs a solution to host the new website in the company's AWS account. The solution must be secure and scalable. Which combination of solutions will meet these requirements? (Select THREE.)

A. Configure an Amazon CloudFront distribution. Set the Amazon S3 bucket as the origin.
B. Associate an AWS Certificate Manager (ACM) TLS certificate to the Amazon CloudFront distribution.
C. Enable static website hosting for the Amazon S3 bucket.
D. Create an Amazon S3 bucket to store the static website content.
E. Export the website's SSL/TLS certificate from AWS Certificate Manager (ACM) to the root of the Amazon S3 bucket.
F. Turn off Block Public Access for the Amazon S3 bucket.

ANSWER : A,B,D


Question # 37

A company is performing a security review of its Amazon EMR API usage. The company's developers use an integrated development environment (IDE) that is hosted on Amazon EC2 instances. The IDE is configured to authenticate users to AWS by using access keys. Traffic between the company's EC2 instances and EMR cluster uses public IP addresses. A solutions architect needs to improve the company's overall security posture. The solutions architect needs to reduce the company's use of long-term credentials and to limit the amount of communication that uses public IP addresses. Which combination of steps will MOST improve the security of the company's architecture? (Select TWO.)

A. Set up a gateway endpoint to the EMR cluster.
B. Set up interface VPC endpoints to connect to the EMR cluster.
C. Set up a private NAT gateway to connect to the EMR cluster.
D. Set up IAM roles for the developers to use to connect to the Amazon EMR API.
E. Set up AWS Systems Manager Parameter Store to store access keys for each developer.

ANSWER : B,D


Question # 38

A company runs its production workload on Amazon EC2 instances with Amazon Elastic Block Store (Amazon EBS) volumes. A solutions architect needs to analyze the current EBS volume cost and to recommend optimizations. The recommendations need to include estimated monthly saving opportunities. Which solution will meet these requirements?

A. Use Amazon Inspector reporting to generate EBS volume recommendations for optimization.
B. Use AWS Systems Manager reporting to determine EBS volume recommendations for optimization.
C. Use Amazon CloudWatch metrics reporting to determine EBS volume recommendations for optimization.
D. Use AWS Compute Optimizer to generate EBS volume recommendations for optimization.

ANSWER : D


Question # 39

A company is using microservices to build an ecommerce application on AWS. The company wants to preserve customer transaction information after customers submit orders. The company wants to store transaction data in an Amazon Aurora database. The company expects sales volumes to vary throughout each year. Which solution will meet these requirements?

A. Use an Amazon API Gateway REST API to invoke an AWS Lambda function to send transaction data to the Aurora database. Send transaction data to an Amazon Simple Queue Service (Amazon SQS) queue that has a dead-letter queue. Use a second Lambda function to read from the SQS queue and to update the Aurora database. 
B. Use an Amazon API Gateway HTTP API to send transaction data to an Application Load Balancer (ALB). Use the ALB to send the transaction data to Amazon Elastic Container Service (Amazon ECS) on Amazon EC2. Use ECS tasks to store the data in Aurora database. 
C. Use an Application Load Balancer (ALB) to route transaction data to Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon EKS to send the data to the Aurora database. 
D. Use Amazon Data Firehose to send transaction data to Amazon S3. Use AWS Database Migration Service (AWS DMS) to migrate the data from Amazon S3 to the Aurora database. 

ANSWER : A


Question # 40

A solutions architect is building an Amazon S3 data lake for a company. The company uses Amazon Kinesis Data Firehose to ingest customer personally identifiable information (PII) and transactional data in near real-time to an S3 bucket. The company needs to mask all PII data before storing the data in the data lake. Which solution will meet these requirements?

A. Create an AWS Lambda function to detect and mask PII. Invoke the function from Kinesis Data Firehose.
B. Use Amazon Macie to scan the S3 bucket. Configure Macie to detect and mask PII.
C. Enable server-side encryption (SSE) on the S3 bucket.
D. Create an AWS Lambda function that integrates with AWS CloudHSM. Configure the function to detect and mask PII.

ANSWER : A
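
A sketch of the transformation Lambda in option A, following the record contract that Kinesis Data Firehose uses for data transformation (echo the recordId, set a result, and return base64-encoded data). The masking regexes are deliberately simplistic placeholders for real PII detection.

```python
import base64
import json
import re

# Placeholder patterns; a production masker would use a proper PII library.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        payload = base64.b64decode(record["data"]).decode("utf-8")
        masked = SSN.sub("***-**-****", payload)
        masked = EMAIL.sub("<masked-email>", masked)
        output.append({
            "recordId": record["recordId"],  # must echo the incoming id
            "result": "Ok",                  # or "Dropped" / "ProcessingFailed"
            "data": base64.b64encode(masked.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}
```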


Question # 41

An ecommerce company is planning to migrate an on-premises Microsoft SQL Server database to the AWS Cloud. The company needs to migrate the database to SQL Server Always On availability groups. The cloud-based solution must be highly available. Which solution will meet these requirements?

A. Deploy three Amazon EC2 instances with SQL Server across three Availability Zones. Attach one Amazon Elastic Block Store (Amazon EBS) volume to the EC2 instances. 
B. Migrate the database to Amazon RDS for SQL Server. Configure a Multi-AZ deployment and read replicas. 
C. Deploy three Amazon EC2 instances with SQL Server across three Availability Zones. Use Amazon FSx for Windows File Server as the storage tier. 
D. Deploy three Amazon EC2 instances with SQL Server across three Availability Zones. Use Amazon S3 as the storage tier. 

ANSWER : C


Question # 42

A company has developed an API by using an Amazon API Gateway REST API and AWS Lambda functions. The API serves static content and dynamic content to users worldwide. The company wants to decrease the latency of transferring the content for API requests. Which solution will meet these requirements?

A. Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.
B. Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.
C. Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.
D. Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

ANSWER : A


Question # 43

A company is developing a social media application. The company anticipates rapid and unpredictable growth in users and data volume. The application needs to handle a continuous high volume of user requests. User requests include long-running processes that store large amounts of user-generated content and user profiles in a relational format. The processes must run in a specific order. The company requires an architecture that can scale resources to meet demand spikes without downtime or performance degradation. The company must ensure that the components of the application can evolve independently without affecting other parts of the system. Which combination of AWS services will meet these requirements?

A. Deploy the application on Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Use Amazon RDS as the database. Use Amazon Simple Queue Service (Amazon SQS) to decouple message processing between components.
B. Deploy the application on Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Use Amazon RDS as the database. Use Amazon Simple Notification Service (Amazon SNS) to decouple message processing between components.
C. Use Amazon DynamoDB as the database. Use AWS Lambda functions to implement the application. Configure Amazon DynamoDB Streams to invoke the Lambda functions. Use AWS Step Functions to manage workflows between services.
D. Use an AWS Elastic Beanstalk environment with auto scaling to deploy the application. Use Amazon RDS as the database. Use Amazon Simple Notification Service (Amazon SNS) to decouple message processing between components.

ANSWER : A


Question # 44

A company has developed an API using an Amazon API Gateway REST API and AWS Lambda functions. The API serves static and dynamic content to users worldwide. The company wants to decrease the latency of transferring content for API requests. Which solution will meet these requirements?

A. Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.
B. Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.
C. Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.
D. Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

ANSWER : A


Question # 45

A company is creating a low-latency payment processing application that supports TLS connections from IPv4 clients. The application requires outbound access to the public internet. Users must access the application from a single entry point. The company wants to use Amazon Elastic Container Service (Amazon ECS) tasks to deploy the application. The company wants to enable awsvpc network mode. Which solution will meet these requirements MOST securely?

A. Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.
B. Create a VPC that has an outbound-only internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.
C. Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer in the public subnets. Deploy the ECS tasks in the public subnets.
D. Create a VPC that has an outbound-only internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer in the public subnets. Deploy the ECS tasks in the public subnets.

ANSWER : A


Question # 46

A company needs to ingest and analyze telemetry data from vehicles at scale for machine learning and reporting. Which solution will meet these requirements?

A. Use Amazon Timestream for LiveAnalytics to store data points. Grant Amazon SageMaker permission to access the data. Use Amazon QuickSight to visualize the data.
B. Use Amazon DynamoDB to store data points. Use DynamoDB Connector to ingest data into Amazon EMR for processing. Use Amazon QuickSight to visualize the data.
C. Use Amazon Neptune to store data points. Use Amazon Kinesis Data Streams to ingest data into a Lambda function for processing. Use Amazon QuickSight to visualize the data.
D. Use Amazon Timestream for LiveAnalytics to store data points. Grant Amazon SageMaker permission to access the data. Use Amazon Athena to visualize the data.

ANSWER : A
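
For illustration, a boto3 sketch of the ingestion step behind option A, writing one telemetry sample into Timestream. The database, table, and dimension values are placeholders.

```python
import time

import boto3

tsw = boto3.client("timestream-write")

# Ingest one telemetry sample per vehicle.
tsw.write_records(
    DatabaseName="fleet_telemetry",   # placeholder
    TableName="vehicle_metrics",      # placeholder
    Records=[{
        "Dimensions": [{"Name": "vehicle_id", "Value": "VIN-0001"}],
        "MeasureName": "speed_kmh",
        "MeasureValue": "87.5",
        "MeasureValueType": "DOUBLE",
        "Time": str(int(time.time() * 1000)),  # milliseconds since epoch
    }],
)
```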


Question # 47

A media company is launching a new product platform that artists from around the world can use to upload videos and images directly to an Amazon S3 bucket. The company owns and maintains the S3 bucket. The artists must be able to upload files from personal devices without the need for AWS credentials or an AWS account. Which solution will meet these requirements MOST securely?

A. Enable cross-origin resource sharing (CORS) on the S3 bucket.
B. Turn off block public access for the S3 bucket. Share the bucket URL to the artists to enable uploads without credentials.
C. Use an IAM role that has upload permissions for the S3 bucket to generate presigned URLs for S3 prefixes that are specific to each artist. Share the URLs to the artists.
D. Create a web interface that uses an IAM role that has permission to upload and view objects in the S3 bucket. Share the web interface URL to the artists.

ANSWER : C


Question # 48

A company stores petabytes of historical medical information on premises. The company has a process to manage encryption of the data to comply with regulations. The company needs a cloud-based solution for data backup, recovery, and archiving. The company must retain control over the encryption key material. Which combination of solutions will meet these requirements? (Select TWO.)

A. Create an AWS Key Management Service (AWS KMS) key without key material. Import the company's key material into the KMS key.
B. Create an AWS Key Management Service (AWS KMS) encryption key that contains key material generated by AWS KMS. 
C. Store the data in Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage. Use S3 Bucket Keys with AWS Key Management Service (AWS KMS) keys. 
D. Store the data in an Amazon S3 Glacier storage class. Use server-side encryption with customer-provided keys (SSE-C). 
E. Store the data in AWS Snowball devices. Use server-side encryption with AWS KMS keys (SSE-KMS). 

ANSWER : A,D
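
A boto3 sketch of option A's key-import flow, which keeps the key material under the company's control. The step that wraps the on-premises key material with the returned public key happens offline and is omitted, so the EncryptedKeyMaterial value below is a placeholder.

```python
from datetime import datetime, timezone

import boto3

kms = boto3.client("kms")

# 1. Create a KMS key with no key material; AWS never generates the secret.
key = kms.create_key(Origin="EXTERNAL", Description="Customer-supplied key material")
key_id = key["KeyMetadata"]["KeyId"]

# 2. Fetch a public wrapping key and an import token from KMS.
params = kms.get_parameters_for_import(
    KeyId=key_id,
    WrappingAlgorithm="RSAES_OAEP_SHA_256",
    WrappingKeySpec="RSA_2048",
)

# 3. Wrap the on-premises key material with params["PublicKey"] offline,
#    then import the wrapped bytes. The value below is a placeholder.
kms.import_key_material(
    KeyId=key_id,
    ImportToken=params["ImportToken"],
    EncryptedKeyMaterial=b"<wrapped key material produced offline>",
    ExpirationModel="KEY_MATERIAL_EXPIRES",
    ValidTo=datetime(2035, 1, 1, tzinfo=timezone.utc),
)
```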


What our clients say about SAA-C03 Test Preparations

David Peterson     Apr 29, 2025

The Amazon SAA-C03 exam seemed easy after studying with amazonawsdumps.com’s reliable practice questions.
