Amazon SAA-C03 dumps


Amazon AWS Certified Solutions Architect - Associate (SAA-C03)

Looking for Amazon SAA-C03 Practice Questions? Rejoice, because you have reached your destination. Amazonawsdumps.com has prepared a special kind of test material that adapts to the individual candidate's skill set. Our smart system presents Amazon SAA-C03 Question Answers exactly as they appear in the actual exam. We report your progress at the end of each test to ensure 100% success.

PDF Demo: $35
Test Engine Demo: $45
PDF + Test Engine: $55

Here are some more features of Amazon SAA-C03 PDF:

  • 683 questions with answers
  • Last updated: 21 May, 2024
  • Unlimited practice questions
  • Daily updates
  • Takes just 1 day to prepare
  • Exam passing guaranteed on the first attempt
  • Money-back guarantee
  • 3 months of free updates

SAA-C03 is the new exam code for the AWS Certified Solutions Architect - Associate exam, which is scheduled to launch on August 30, 2022. Its current version, SAA-C02, is scheduled to be retired on August 29, 2022, so you should understand the differences between these two versions.

Introduction

The AWS Certified Solutions Architect - Associate certification exam is intended for IT specialists who perform a solutions architect or DevOps role and have a year or more of hands-on experience designing available, cost-efficient, fault-tolerant, and scalable distributed systems on the Amazon Web Services (AWS) platform. If you already hold the SAA certification and wish to recertify, you need to take the exam by August 29, or you can wait for the new SAA-C03 version to arrive.

Amazon AWS SAA-C03 Exam Dumps

Amazon AWS SAA-C03 Dumps are best for accurate results and a 100% passing guarantee. Amazonawsdumps.com provides the most genuine and authentic question-and-answer PDF material, which prepares you completely for your actual Amazon AWS SAA-C03 Exam. Amazonawsdumps gives you full assistance, a passing guarantee, and total privacy and security when you purchase Amazon AWS SAA-C03 Exam Dumps from its site.

Target Audience for Amazon AWS SAA-C03 Exam

A candidate who wants to clear the Amazon AWS SAA-C03 Exam should have one year of working experience developing cloud solutions using AWS services.

Exam Type

The exam comprises two types of questions:

  • Multiple choice questions
  • Multiple response questions

Exam Modules for the Amazon AWS SAA-C03 Exam

  • Design Secure Architectures 30%
  • Design Resilient Architectures 26%
  • Design High-Performing Architectures 24%
  • Design Cost-Optimized Architectures 20%

Exam format for Amazon AWS SAA-C03 Exam

  • Exam Code: SAA-C03
  • Release Date: August 30, 2022
  • Prerequisite: None
  • Number of Questions: 65
  • Score Range: 100 to 1000
  • Cost: 150 USD (Practice exam: 20 USD)
  • Passing Score: 720/1000
  • Time Duration: 2 hours 10 minutes (130 minutes)
  • Question Types: Scenario-based; multiple choice and multiple response
  • Delivery Method: Testing center or online proctored exam
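
Given the score range and passing mark listed above, the pass/fail check reduces to a simple comparison. Here is a minimal sketch (the function name and the range validation are our own additions; AWS reports only the final scaled score):

```python
# Sketch: check a scaled score against the SAA-C03 figures listed above
# (score range 100-1000, passing score 720).
PASSING_SCORE = 720
SCORE_MIN, SCORE_MAX = 100, 1000

def is_passing(scaled_score: int) -> bool:
    """Return True if a scaled score meets the SAA-C03 passing mark."""
    if not SCORE_MIN <= scaled_score <= SCORE_MAX:
        raise ValueError("scaled score must be between 100 and 1000")
    return scaled_score >= PASSING_SCORE

print(is_passing(720))  # True: 720 is exactly the passing mark
print(is_passing(719))  # False: one point short
```

Note that 720/1000 is a scaled score, not a raw percentage of the 65 questions.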

Why Dumps Are Necessary to Pass the Amazon AWS SAA-C03 Exam

The Most Up-to-Date Material for the SAA-C03 Exam

Pass the Amazon SAA-C03 exam with the Amazonawsdumps SAA-C03 Dumps PDF. All Amazon SAA-C03 exam questions are the latest, are regularly revised, and cover the entire Amazon AWS Certified Solutions Architect - Associate SAA-C03 syllabus. The SAA-C03 Questions and Answers come in a printable guide that you can download to your PC or another device to start preparing for your SAA-C03 exam.

Dumps PDF material Made by Trained Professionals

When you are trying to find trustworthy exam dumps to prepare for the AWS Certified Solutions Architect - Associate (SAA-C03) certification exam, you need to be sure that you are picking the right source. All of the PDF dumps from Amazonawsdumps are made by recognized specialists, and you will get all the help you need to clear the SAA-C03 exam. You can practice continually with the preparation material to achieve the best results. If you have no idea how to improve your preparation level for the AWS Certified Solutions Architect - Associate (SAA-C03) certificate, buy the Amazonawsdumps study material and you will never regret it.

100% Passing Assurance

With the help of these exam dumps, you can achieve 100% success in your AWS Certified Solutions Architect - Associate (SAA-C03) certification, and you won't face any issues while using the PDF dumps to prepare for the SAA-C03 exam. The exam dumps have a strong track record, and their customers are happy with the results. Round-the-clock support is available to resolve any issue related to exam preparation.

FAQs

What is the passing score for the Amazon AWS SAA-C03 Exam?

720 is the minimum score required to pass the Amazon AWS SAA-C03 Exam.

Can I pass the Amazon AWS SAA-C03 Exam with just one week of preparation?

Yes. With the assistance of Amazon AWS SAA-C03 Exam Dumps, you can easily pass this exam with just one week, or even three days, of preparation.

Where can I buy the best exam dumps for the Amazon AWS SAA-C03 Exam?

For the best preparation for the Amazon AWS SAA-C03 Exam, you can check the Amazonawsdumps exam dumps, and you won't regret it.

What is the passing ratio of the Amazon AWS SAA-C03 Exam?

The failure rate of the Amazon AWS SAA-C03 Exam is above 72%, which means only about 28% of the applicants who take the exam manage to pass it.

Which AWS certificate is best for newcomers?

The AWS Certified Solutions Architect - Associate certificate, the first and most in-demand AWS certification, is best for newcomers.

Why Pass Amazon SAA-C03 Exam?

In today's world, you need validation of your skills to get past the competition. The Amazon SAA-C03 Exam is that validation. Not only is Amazon a leader in the IT industry, but it also offers certification exams to prove your skills. These skills prove you capable of fulfilling the Amazon job role. To get certified, you simply pass the SAA-C03 Exam. This brings us to the Amazon SAA-C03 Question Answers set. Passing this certification exam from Amazon may seem easy, but it's not. Many students fail this exam only because they didn't take it seriously. Don't make this mistake, and order your Amazon SAA-C03 Braindumps right now!

Amazonawsdumps.com is the most popular and reliable website that has helped thousands of candidates excel at Amazon Exams. You could be one of those fortunate few too. Pass your exam in one attempt with Amazon SAA-C03 PDF and own the future. Buy Now!

Superlative Amazon SAA-C03 Dumps!

We know we said passing Amazon exams is hard, but that's only if you've been led astray. There are millions of Amazon SAA-C03 Practice Questions available online that promise success but fail when it comes down to it. Choose your training material carefully and get Amazon SAA-C03 Question Answers that are valid, accurate, and approved by renowned IT professionals. Our Amazon SAA-C03 Braindumps are created by experts for experts and generate first-class results in just a single attempt. Don't believe us? Try our free demo version, which contains all the features you'll get with the Amazon SAA-C03 PDF: an interactive design, an easy-to-read format, understandable language, and a concise pattern. And if you still don't get the result you want and somehow fail, you get your money back in full. So, order your set of Amazon SAA-C03 Dumps now!

We promise our customers to take full responsibility for their learning, preparation, and passing of the SAA-C03 Exam without a hitch. Our aim is your satisfaction and ease. That is why we charge only a reasonable price for Amazon SAA-C03 Practice Questions. Moreover, we offer two formats: PDF and an online test engine. Also, there is always a little extra with our discount coupons.

Why Buy Amazon SAA-C03 Question Answers?

The Amazonawsdumps.com team is a group of experts who succeeded with Amazon SAA-C03 Braindumps themselves. We got what we needed to pass the exam, and we went through its challenges as well. That is why we want every Amazon candidate to succeed. Choosing among so many options for the Amazon SAA-C03 PDF is tricky. Sometimes they don't turn out the way they first appeared. That is the reason we offer our valued customers a free demo: they can get a test run of the Amazon SAA-C03 Dumps before they buy. When it comes to buying, the procedure is simple, secure, and hardly a risk, because our Amazon SAA-C03 Practice Questions have a 99.8% passing rate.

Amazon SAA-C03 Sample Questions

Question # 1

A company hosts a database that runs on an Amazon RDS instance that is deployed to multiple Availability Zones. The company periodically runs a script against the database to report new entries that are added to the database. The script that runs against the database negatively affects the performance of a critical application. The company needs to improve application performance with minimal costs. Which solution will meet these requirements with the LEAST operational overhead?

A. Add functionality to the script to identify the instance that has the fewest active connections. Configure the script to read from that instance to report the total new entries.
B. Create a read replica of the database. Configure the script to query only the read replica to report the total new entries.
C. Instruct the development team to manually export the new entries for the day in the database at the end of each day.
D. Use Amazon ElastiCache to cache the common queries that the script runs against the database.

ANSWER : B
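
Answer B can be sketched as the shape of an RDS read-replica request. This is an illustration only: the identifiers below are placeholders, no AWS call is made, and the keys simply mirror the parameters of the boto3 `create_db_instance_read_replica` operation.

```python
# Hypothetical request shape for creating an RDS read replica (answer B).
# Identifiers are made up; the reporting script would then point its
# connection string at the replica endpoint instead of the primary.
replica_request = {
    "DBInstanceIdentifier": "reporting-replica",    # new replica name (placeholder)
    "SourceDBInstanceIdentifier": "prod-database",  # existing primary (placeholder)
}

# The critical application keeps using the primary; only the script moves.
assert replica_request["DBInstanceIdentifier"] != replica_request["SourceDBInstanceIdentifier"]
print(replica_request)
```

The design point: read traffic from the reporting script is offloaded to the replica, so it no longer competes with the critical application for the primary instance's resources.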


Question # 2

A company uses an organization in AWS Organizations to manage AWS accounts that contain applications. The company sets up a dedicated monitoring member account in the organization. The company wants to query and visualize observability data across the accounts by using Amazon CloudWatch. Which solution will meet these requirements?

A. Enable CloudWatch cross-account observability for the monitoring account. Deploy an AWS CloudFormation template provided by the monitoring account in each AWS account to share the data with the monitoring account.
B. Set up service control policies (SCPs) to provide access to CloudWatch in the monitoring account under the Organizations root organizational unit (OU).
C. Configure a new IAM user in the monitoring account. In each AWS account, configure an IAM policy to have access to query and visualize the CloudWatch data in the account. Attach the new IAM policy to the new IAM user.
D. Create a new IAM user in the monitoring account. Create cross-account IAM policies in each AWS account. Attach the IAM policies to the new IAM user.

ANSWER : A


Question # 3

A company's developers want a secure way to gain SSH access to the company's Amazon EC2 instances that run the latest version of Amazon Linux. The developers work remotely and in the corporate office. The company wants to use AWS services as a part of the solution. The EC2 instances are hosted in a VPC private subnet and access the internet through a NAT gateway that is deployed in a public subnet. What should a solutions architect do to meet these requirements MOST cost-effectively?

A. Create a bastion host in the same subnet as the EC2 instances. Grant the ec2:CreateVpnConnection IAM permission to the developers. Install EC2 Instance Connect so that the developers can connect to the EC2 instances.
B. Create an AWS Site-to-Site VPN connection between the corporate network and the VPC. Instruct the developers to use the Site-to-Site VPN connection to access the EC2 instances when the developers are on the corporate network. Instruct the developers to set up another VPN connection for access when they work remotely.
C. Create a bastion host in the public subnet of the VPC. Configure the security groups and SSH keys of the bastion host to only allow connections and SSH authentication from the developers' corporate and remote networks. Instruct the developers to connect through the bastion host by using SSH to reach the EC2 instances.
D. Attach the AmazonSSMManagedInstanceCore IAM policy to an IAM role that is associated with the EC2 instances. Instruct the developers to use AWS Systems Manager Session Manager to access the EC2 instances.

ANSWER : D


Question # 4

A company has a web application that includes an embedded NoSQL database. The application runs on Amazon EC2 instances behind an Application Load Balancer (ALB). The instances run in an Amazon EC2 Auto Scaling group in a single Availability Zone. A recent increase in traffic requires the application to be highly available and for the database to be eventually consistent. Which solution will meet these requirements with the LEAST operational overhead?

A. Replace the ALB with a Network Load Balancer. Maintain the embedded NoSQL database with its replication service on the EC2 instances.
B. Replace the ALB with a Network Load Balancer. Migrate the embedded NoSQL database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS).
C. Modify the Auto Scaling group to use EC2 instances across three Availability Zones. Maintain the embedded NoSQL database with its replication service on the EC2 instances.
D. Modify the Auto Scaling group to use EC2 instances across three Availability Zones. Migrate the embedded NoSQL database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS).

ANSWER : D


Question # 5

A company has data collection sensors at different locations. The data collection sensors stream a high volume of data to the company. The company wants to design a platform on AWS to ingest and process high-volume streaming data. The solution must be scalable and support data collection in near real time. The company must store the data in Amazon S3 for future reporting. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon Kinesis Data Firehose to deliver streaming data to Amazon S3.
B. Use AWS Glue to deliver streaming data to Amazon S3.
C. Use AWS Lambda to deliver streaming data and store the data to Amazon S3.
D. Use AWS Database Migration Service (AWS DMS) to deliver streaming data to Amazon S3.

ANSWER : A


Question # 6

A company has applications that run on Amazon EC2 instances. The EC2 instances connect to Amazon RDS databases by using an IAM role that has associated policies. The company wants to use AWS Systems Manager to patch the EC2 instances without disrupting the running applications. Which solution will meet these requirements?

A. Create a new IAM role. Attach the AmazonSSMManagedInstanceCore policy to the new IAM role. Attach the new IAM role to the EC2 instances and the existing IAM role.
B. Create an IAM user. Attach the AmazonSSMManagedInstanceCore policy to the IAM user. Configure Systems Manager to use the IAM user to manage the EC2 instances.
C. Enable Default Host Management Configuration in Systems Manager to manage the EC2 instances.
D. Remove the existing policies from the existing IAM role. Add the AmazonSSMManagedInstanceCore policy to the existing IAM role.

ANSWER : C


Question # 7

A company is running a legacy system on an Amazon EC2 instance. The application code cannot be modified, and the system cannot run on more than one instance. A solutions architect must design a resilient solution that can improve the recovery time for the system. What should the solutions architect recommend to meet these requirements?

A. Enable termination protection for the EC2 instance.
B. Configure the EC2 instance for Multi-AZ deployment.
C. Create an Amazon CloudWatch alarm to recover the EC2 instance in case of failure.
D. Launch the EC2 instance with two Amazon Elastic Block Store (Amazon EBS) volumes that use RAID configurations for storage redundancy.

ANSWER : C


Question # 8

A company that uses AWS needs a solution to predict the resources needed for manufacturing processes each month. The solution must use historical values that are currently stored in an Amazon S3 bucket. The company has no machine learning (ML) experience and wants to use a managed service for the training and predictions. Which combination of steps will meet these requirements? (Select TWO.)

A. Deploy an Amazon SageMaker model. Create a SageMaker endpoint for inference.
B. Use Amazon SageMaker to train a model by using the historical data in the S3 bucket.
C. Configure an AWS Lambda function with a function URL that uses Amazon SageMaker endpoints to create predictions based on the inputs.
D. Configure an AWS Lambda function with a function URL that uses an Amazon Forecast predictor to create a prediction based on the inputs.
E. Train an Amazon Forecast predictor by using the historical data in the S3 bucket.

ANSWER : B,E


Question # 9

A solutions architect wants to use the following JSON text as an identity-based policy to grant specific permissions: Which IAM principals can the solutions architect attach this policy to? (Select TWO.)

A. Role
B. Group
C. Organization
D. Amazon Elastic Container Service (Amazon ECS) resource
E. Amazon EC2 resource

ANSWER : A,B
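
The JSON policy itself is missing from this copy of the question, but the distinguishing feature behind answers A and B can still be illustrated: an identity-based policy (attachable to an IAM role, group, or user) has Action and Resource elements but no Principal element, because the principal is whatever identity the policy is attached to. The action and bucket below are invented for illustration, not taken from the original question.

```python
import json

# Hedged illustration: the general shape of an identity-based IAM policy.
# The specific action and bucket name are made-up examples.
identity_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-bucket/*",
        }
    ],
}

# Identity-based policies omit Principal; resource-based policies require it.
assert "Principal" not in identity_policy["Statement"][0]
print(json.dumps(identity_policy, indent=2))
```

That absence of a Principal element is why such a policy attaches to IAM identities (roles, groups, users) and not to resources such as ECS services or EC2 instances.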


Question # 10

A company has an on-premises MySQL database that handles transactional data. The company is migrating the database to the AWS Cloud. The migrated database must maintain compatibility with the company's applications that use the database. The migrated database also must scale automatically during periods of increased demand. Which migration solution will meet these requirements?

A. Use native MySQL tools to migrate the database to Amazon RDS for MySQL. Configure elastic storage scaling.
B. Migrate the database to Amazon Redshift by using the mysqldump utility. Turn on Auto Scaling for the Amazon Redshift cluster.
C. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon Aurora. Turn on Aurora Auto Scaling.
D. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon DynamoDB. Configure an Auto Scaling policy.

ANSWER : C


Question # 11

A company has 150 TB of archived image data stored on-premises that needs to be moved to the AWS Cloud within the next month. The company's current network connection allows up to 100 Mbps uploads for this purpose during the night only. What is the MOST cost-effective mechanism to move this data and meet the migration deadline?

A. Use AWS Snowmobile to ship the data to AWS.
B. Order multiple AWS Snowball devices to ship the data to AWS.
C. Enable Amazon S3 Transfer Acceleration and securely upload the data.
D. Create an Amazon S3 VPC endpoint and establish a VPN to upload the data.

ANSWER : B
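
A back-of-the-envelope calculation shows why answer B wins: the online link cannot come close to the one-month deadline. The assumptions below are ours: 1 TB = 10**12 bytes, 8 usable upload hours per night, and roughly 80 TB of usable capacity per Snowball Edge Storage Optimized device.

```python
# Why 150 TB over a 100 Mbps nightly link misses a one-month deadline.
DATA_TB = 150
LINK_MBPS = 100
HOURS_PER_NIGHT = 8      # assumed usable upload window
SNOWBALL_TB = 80         # assumed usable capacity per device

bits_total = DATA_TB * 10**12 * 8
seconds_online = bits_total / (LINK_MBPS * 10**6)
nights_needed = seconds_online / (HOURS_PER_NIGHT * 3600)
devices_needed = -(-DATA_TB // SNOWBALL_TB)  # ceiling division

print(f"~{seconds_online / 86400:.0f} days of continuous transfer")
print(f"~{nights_needed:.0f} nights at {HOURS_PER_NIGHT} h/night")
print(f"{devices_needed} Snowball Edge devices")
```

The online transfer needs on the order of 139 days of continuous uploading (hundreds of nights in an 8-hour window), while a couple of Snowball Edge devices cover the full 150 TB well within the month.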


Question # 12

A company has an application that delivers on-demand training videos to students around the world. The application also allows authorized content developers to upload videos. The data is stored in an Amazon S3 bucket in the us-east-2 Region. The company has created an S3 bucket in the eu-west-2 Region and an S3 bucket in the ap-southeast-1 Region. The company wants to replicate the data to the new S3 buckets. The company needs to minimize latency for developers who upload videos and students who stream videos near eu-west-2 and ap-southeast-1. Which combination of steps will meet these requirements with the FEWEST changes to the application? (Select TWO.)

A. Configure one-way replication from the us-east-2 S3 bucket to the eu-west-2 S3 bucket. Configure one-way replication from the us-east-2 S3 bucket to the ap-southeast-1 S3 bucket.
B. Configure one-way replication from the us-east-2 S3 bucket to the eu-west-2 S3 bucket. Configure one-way replication from the eu-west-2 S3 bucket to the ap-southeast-1 S3 bucket.
C. Configure two-way (bidirectional) replication among the S3 buckets that are in all three Regions.
D. Create an S3 Multi-Region Access Point. Modify the application to use the Amazon Resource Name (ARN) of the Multi-Region Access Point for video streaming. Do not modify the application for video uploads.
E. Create an S3 Multi-Region Access Point. Modify the application to use the Amazon Resource Name (ARN) of the Multi-Region Access Point for video streaming and uploads.

ANSWER : A,E


Question # 13

A company runs its applications on Amazon EC2 instances that are backed by Amazon Elastic Block Store (Amazon EBS). The EC2 instances run the most recent Amazon Linux release. The applications are experiencing availability issues when the company's employees store and retrieve files that are 25 GB or larger. The company needs a solution that does not require the company to transfer files between EC2 instances. The files must be available across many EC2 instances and across multiple Availability Zones. Which solution will meet these requirements?

A. Migrate all the files to an Amazon S3 bucket. Instruct the employees to access the files from the S3 bucket.
B. Take a snapshot of the existing EBS volume. Mount the snapshot as an EBS volume across the EC2 instances. Instruct the employees to access the files from the EC2 instances.
C. Mount an Amazon Elastic File System (Amazon EFS) file system across all the EC2 instances. Instruct the employees to access the files from the EC2 instances.
D. Create an Amazon Machine Image (AMI) from the EC2 instances. Configure new EC2 instances from the AMI that use an instance store volume. Instruct the employees to access the files from the EC2 instances.

ANSWER : C


Question # 14

A company is migrating its multi-tier on-premises application to AWS. The application consists of a single-node MySQL database and a multi-node web tier. The company must minimize changes to the application during the migration. The company wants to improve application resiliency after the migration. Which combination of steps will meet these requirements? (Select TWO.)

A. Migrate the web tier to Amazon EC2 instances in an Auto Scaling group behind an Application Load Balancer.
B. Migrate the database to Amazon EC2 instances in an Auto Scaling group behind a Network Load Balancer.
C. Migrate the database to an Amazon RDS Multi-AZ deployment.
D. Migrate the web tier to an AWS Lambda function.
E. Migrate the database to an Amazon DynamoDB table.

ANSWER : A,C


Question # 15

A recent analysis of a company's IT expenses highlights the need to reduce backup costs. The company's chief information officer wants to simplify the on-premises backup infrastructure and reduce costs by eliminating the use of physical backup tapes. The company must preserve the existing investment in the on-premises backup applications and workflows. What should a solutions architect recommend?

A. Set up AWS Storage Gateway to connect with the backup applications using the NFS interface.
B. Set up an Amazon EFS file system that connects with the backup applications using the NFS interface.
C. Set up an Amazon EFS file system that connects with the backup applications using the iSCSI interface.
D. Set up AWS Storage Gateway to connect with the backup applications using the iSCSI virtual tape library (VTL) interface.

ANSWER : D


Question # 16

A company uses AWS Organizations. The company wants to operate some of its AWS accounts with different budgets. The company wants to receive alerts and automatically prevent provisioning of additional resources on AWS accounts when the allocated budget threshold is met during a specific period. Which combination of solutions will meet these requirements? (Select THREE.)

A. Use AWS Budgets to create a budget. Set the budget amount under the Cost and Usage Reports section of the required AWS accounts.
B. Use AWS Budgets to create a budget. Set the budget amount under the Billing dashboards of the required AWS accounts.
C. Create an IAM user for AWS Budgets to run budget actions with the required permissions.
D. Create an IAM role for AWS Budgets to run budget actions with the required permissions.
E. Add an alert to notify the company when each account meets its budget threshold. Add a budget action that selects the IAM identity created with the appropriate config rule to prevent provisioning of additional resources.
F. Add an alert to notify the company when each account meets its budget threshold. Add a budget action that selects the IAM identity created with the appropriate service control policy (SCP) to prevent provisioning of additional resources.

ANSWER : B,D,F
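
The alerting half of this answer is just a threshold comparison on spend versus budget. Here is a minimal sketch; the function, the default 80% threshold, and the dollar amounts are all invented for illustration, not taken from AWS Budgets itself.

```python
# Sketch of budget-alert logic: fire once actual spend reaches a
# percentage threshold of the allocated budget. Numbers are invented.
def budget_alert(actual: float, limit: float, threshold_pct: float = 80.0) -> bool:
    """Return True once spend reaches threshold_pct of the budget limit."""
    return actual >= limit * threshold_pct / 100

print(budget_alert(850.0, 1000.0))  # True: 85% of a 1000 USD budget
print(budget_alert(500.0, 1000.0))  # False: only 50% spent
```

In the actual service, crossing this threshold would trigger the notification and the attached budget action (such as applying a restrictive SCP), rather than just returning a boolean.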


Question # 17

A company runs an application on AWS. The application receives inconsistent amounts of usage. The application uses AWS Direct Connect to connect to an on-premises MySQL-compatible database. The on-premises database consistently uses a minimum of 2 GiB of memory. The company wants to migrate the on-premises database to a managed AWS service. The company wants to use auto scaling capabilities to manage unexpected workload increases. Which solution will meet these requirements with the LEAST administrative overhead?

A. Provision an Amazon DynamoDB database with default read and write capacity settings.
B. Provision an Amazon Aurora database with a minimum capacity of 1 Aurora capacity unit (ACU).
C. Provision an Amazon Aurora Serverless v2 database with a minimum capacity of 1 Aurora capacity unit (ACU).
D. Provision an Amazon RDS for MySQL database with 2 GiB of memory.

ANSWER : C


Question # 18

A company plans to migrate to AWS and use Amazon EC2 On-Demand Instances for its application. During the migration testing phase, a technical team observes that the application takes a long time to launch and load memory to become fully productive. Which solution will reduce the launch time of the application during the next testing phase?

A. Launch two or more EC2 On-Demand Instances. Turn on auto scaling features and make the EC2 On-Demand Instances available during the next testing phase.
B. Launch EC2 Spot Instances to support the application and to scale the application so it is available during the next testing phase.
C. Launch the EC2 On-Demand Instances with hibernation turned on. Configure EC2 Auto Scaling warm pools during the next testing phase.
D. Launch EC2 On-Demand Instances with Capacity Reservations. Start additional EC2 instances during the next testing phase.

ANSWER : C


Question # 19

An IoT company is releasing a mattress that has sensors to collect data about a user's sleep. The sensors will send data to an Amazon S3 bucket. The sensors collect approximately 2 MB of data every night for each mattress. The company must process and summarize the data for each mattress. The results need to be available as soon as possible. Data processing will require 1 GB of memory and will finish within 30 seconds. Which solution will meet these requirements MOST cost-effectively?

A. Use AWS Glue with a Scala job.
B. Use Amazon EMR with an Apache Spark script.
C. Use AWS Lambda with a Python script.
D. Use AWS Glue with a PySpark job.

ANSWER : C
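
The workload numbers in the question (1 GB of memory, 30 seconds) make the Lambda cost easy to estimate. The per-GB-second rate below is the approximate us-east-1 x86 Lambda compute price at the time of writing; treat it as an assumption, not a quote, and note it ignores the request charge and the free tier.

```python
# Rough per-run cost estimate for answer C (AWS Lambda), using the
# figures from the question and an assumed compute rate.
GB_SECOND_RATE = 0.0000166667  # USD per GB-second (assumed approximation)
memory_gb = 1.0                # from the question
duration_s = 30                # from the question

cost_per_run = memory_gb * duration_s * GB_SECOND_RATE
print(f"~${cost_per_run:.6f} per mattress per night")
```

At a small fraction of a cent per nightly 2 MB batch, Lambda undercuts keeping a Glue or EMR cluster warm for a 30-second job, which is the cost argument behind answer C.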


Question # 20

A financial services company wants to shut down two data centers and migrate more than 100 TB of data to AWS. The data has an intricate directory structure with millions of small files stored in deep hierarchies of subfolders. Most of the data is unstructured, and the company's file storage consists of SMB-based storage types from multiple vendors. The company does not want to change its applications to access the data after migration. What should a solutions architect do to meet these requirements with the LEAST operational overhead?

A. Use AWS Direct Connect to migrate the data to Amazon S3.
B. Use AWS DataSync to migrate the data to Amazon FSx for Lustre.
C. Use AWS DataSync to migrate the data to Amazon FSx for Windows File Server.
D. Use AWS Direct Connect to migrate the data from on-premises file storage to an AWS Storage Gateway volume gateway.

ANSWER : C


Question # 21

A company's application runs on Amazon EC2 instances that are in multiple Availability Zones. The application needs to ingest real-time data from third-party applications. The company needs a data ingestion solution that places the ingested raw data in an Amazon S3 bucket. Which solution will meet these requirements?

A. Create Amazon Kinesis data streams for data ingestion. Create Amazon Kinesis Data Firehose delivery streams to consume the Kinesis data streams. Specify the S3 bucket as the destination of the delivery streams.
B. Create database migration tasks in AWS Database Migration Service (AWS DMS). Specify replication instances of the EC2 instances as the source endpoints. Specify the S3 bucket as the target endpoint. Set the migration type to migrate existing data and replicate ongoing changes.
C. Create and configure AWS DataSync agents on the EC2 instances. Configure DataSync tasks to transfer data from the EC2 instances to the S3 bucket.
D. Create an AWS Direct Connect connection to the application for data ingestion. Create Amazon Kinesis Data Firehose delivery streams to consume direct PUT operations from the application. Specify the S3 bucket as the destination of the delivery streams.

ANSWER : A


Question # 22

A gaming company wants to launch a new internet-facing application in multiple AWS Regions. The application will use the TCP and UDP protocols for communication. The company needs to provide high availability and minimum latency for global users. Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

A. Create internal Network Load Balancers in front of the application in each Region.
B. Create external Application Load Balancers in front of the application in each Region.
C. Create an AWS Global Accelerator accelerator to route traffic to the load balancers in each Region.
D. Configure Amazon Route 53 to use a geolocation routing policy to distribute the traffic.
E. Configure Amazon CloudFront to handle the traffic and route requests to the application in each Region.

ANSWER : B,C


Question # 23

A company copies 200 TB of data from a recent ocean survey onto AWS Snowball Edge Storage Optimized devices. The company has a high performance computing (HPC) cluster that is hosted on AWS to look for oil and gas deposits. A solutions architect must provide the cluster with consistent sub-millisecond latency and high-throughput access to the data on the Snowball Edge Storage Optimized devices. The company is sending the devices back to AWS. Which solution will meet these requirements?

A. Create an Amazon S3 bucket. Import the data into the S3 bucket. Configure an AWS Storage Gateway file gateway to use the S3 bucket. Access the file gateway from the HPC cluster instances.
B. Create an Amazon S3 bucket. Import the data into the S3 bucket. Configure an Amazon FSx for Lustre file system, and integrate it with the S3 bucket. Access the FSx for Lustre file system from the HPC cluster instances.
C. Create an Amazon S3 bucket and an Amazon Elastic File System (Amazon EFS) file system. Import the data into the S3 bucket. Copy the data from the S3 bucket to the EFS file system. Access the EFS file system from the HPC cluster instances.
D. Create an Amazon FSx for Lustre file system. Import the data directly into the FSx for Lustre file system. Access the FSx for Lustre file system from the HPC cluster instances.

ANSWER : B


Question # 24

A company is using an Application Load Balancer (ALB) to present its application to the internet. The company finds abnormal traffic access patterns across the application. A solutions architect needs to improve visibility into the infrastructure to help the company understand these abnormalities better. What is the MOST operationally efficient solution that meets these requirements?

A. Create a table in Amazon Athena for AWS CloudTrail logs. Create a query for the relevant information.
B. Enable ALB access logging to Amazon S3. Create a table in Amazon Athena, and query the logs.
C. Enable ALB access logging to Amazon S3. Open each file in a text editor, and search each line for the relevant information.
D. Use Amazon EMR on a dedicated Amazon EC2 instance to directly query the ALB to acquire traffic access log information.

ANSWER : B
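A minimal sketch of the kind of Athena query option B enables once ALB access logs are delivered to S3. The table name `alb_access_logs` is an assumed name for a table created over the log bucket; `client_ip` follows the documented ALB access-log field naming.

```python
# Build an Athena SQL string that surfaces the noisiest clients, a typical
# first step when investigating abnormal traffic patterns.
query = (
    "SELECT client_ip, count(*) AS request_count "
    "FROM alb_access_logs "            # assumed table over the log bucket
    "GROUP BY client_ip "
    "ORDER BY request_count DESC "
    "LIMIT 20"
)
```

In practice the string would be submitted with the Athena `StartQueryExecution` API or run from the Athena console.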


Question # 25

A company hosts a data lake on Amazon S3. The data lake ingests data in Apache Parquet format from various data sources. The company uses multiple transformation steps to prepare the ingested data. The steps include filtering of anomalies, normalizing of data to standard date and time values, and generation of aggregates for analyses. The company must store the transformed data in S3 buckets that data analysts access. The company needs a prebuilt solution for data transformation that does not require code. The solution must provide data lineage and data profiling. The company needs to share the data transformation steps with employees throughout the company. Which solution will meet these requirements?

A. Configure an AWS Glue Studio visual canvas to transform the data. Share the transformation steps with employees by using AWS Glue jobs.
B. Configure Amazon EMR Serverless to transform the data. Share the transformation steps with employees by using EMR Serverless jobs.
C. Configure AWS Glue DataBrew to transform the data. Share the transformation steps with employees by using DataBrew recipes.
D. Create Amazon Athena tables for the data. Write Athena SQL queries to transform the data. Share the Athena SQL queries with employees.

ANSWER : C


Question # 26

A solutions architect needs to ensure that API calls to Amazon DynamoDB from Amazon EC2 instances in a VPC do not travel across the internet. Which combination of steps should the solutions architect take to meet this requirement? (Choose two.)

A. Create a route table entry for the endpoint.
B. Create a gateway endpoint for DynamoDB.
C. Create an interface endpoint for Amazon EC2.
D. Create an elastic network interface for the endpoint in each of the subnets of the VPC.
E. Create a security group entry in the endpoint's security group to provide access.

ANSWER : A,B
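The gateway-endpoint step can be sketched as the parameters a boto3 `ec2.create_vpc_endpoint` call accepts for DynamoDB; the VPC ID, route table ID, and the Region embedded in the service name are hypothetical. Passing `RouteTableIds` is what creates the route table entries that keep DynamoDB traffic on the AWS network.

```python
# Sketch only: IDs and Region are placeholders.
endpoint_params = {
    "VpcEndpointType": "Gateway",   # DynamoDB uses gateway, not interface, endpoints
    "VpcId": "vpc-0example",
    "ServiceName": "com.amazonaws.us-east-1.dynamodb",
    # Routes to DynamoDB are added to these route tables automatically.
    "RouteTableIds": ["rtb-0example"],
}
```

A real call would be `boto3.client("ec2").create_vpc_endpoint(**endpoint_params)`.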


Question # 27

A company hosts multiple applications on AWS for different product lines. The applications use different compute resources, including Amazon EC2 instances and Application Load Balancers. The applications run in different AWS accounts under the same organization in AWS Organizations across multiple AWS Regions. Teams for each product line have tagged each compute resource in the individual accounts. The company wants more details about the cost for each product line from the consolidated billing feature in Organizations. Which combination of steps will meet these requirements? (Select TWO.)

A. Select a specific AWS generated tag in the AWS Billing console.
B. Select a specific user-defined tag in the AWS Billing console.
C. Select a specific user-defined tag in the AWS Resource Groups console.
D. Activate the selected tag from each AWS account.
E. Activate the selected tag from the Organizations management account.

ANSWER : B,E


Question # 28

A company stores critical data in Amazon DynamoDB tables in the company's AWS account. An IT administrator accidentally deleted a DynamoDB table. The deletion caused a significant loss of data and disrupted the company's operations. The company wants to prevent this type of disruption in the future. Which solution will meet this requirement with the LEAST operational overhead?

A. Configure a trail in AWS CloudTrail. Create an Amazon EventBridge rule for delete actions. Create an AWS Lambda function to automatically restore deleted DynamoDB tables.
B. Create a backup and restore plan for the DynamoDB tables. Recover the DynamoDB tables manually.
C. Configure deletion protection on the DynamoDB tables.
D. Enable point-in-time recovery on the DynamoDB tables.

ANSWER : C
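Option C maps to a single table setting. The sketch below shows the boto3 `dynamodb.update_table` parameters involved, with a placeholder table name.

```python
# Sketch only: the table name is a placeholder.
update_table_params = {
    "TableName": "critical-data-table",
    # With deletion protection enabled, DeleteTable calls fail until the
    # flag is explicitly turned off, blocking accidental deletions.
    "DeletionProtectionEnabled": True,
}
```

A real call would be `boto3.client("dynamodb").update_table(**update_table_params)`.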


Question # 29

A company has an application that uses Docker containers in its local data center. The application runs on a container host that stores persistent data in a volume on the host. The container instances use the stored persistent data. The company wants to move the application to a fully managed service because the company does not want to manage any servers or storage infrastructure. Which solution will meet these requirements?

A. Use Amazon Elastic Kubernetes Service (Amazon EKS) with self-managed nodes. Create an Amazon Elastic Block Store (Amazon EBS) volume attached to an Amazon EC2 instance. Use the EBS volume as a persistent volume mounted in the containers.
B. Use Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Create an Amazon Elastic File System (Amazon EFS) volume. Add the EFS volume as a persistent storage volume mounted in the containers.
C. Use Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Create an Amazon S3 bucket. Map the S3 bucket as a persistent storage volume mounted in the containers.
D. Use Amazon Elastic Container Service (Amazon ECS) with the Amazon EC2 launch type. Create an Amazon Elastic File System (Amazon EFS) volume. Add the EFS volume as a persistent storage volume mounted in the containers.

ANSWER : B
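A hedged sketch of the task-definition pieces behind option B: an EFS volume declared at the task level and mounted into a Fargate container. The file system ID, container name, and mount path are hypothetical.

```python
# Sketch only: identifiers are placeholders, not a complete task definition.
task_definition_fragment = {
    "requiresCompatibilities": ["FARGATE"],
    "volumes": [
        {
            "name": "persistent-data",
            "efsVolumeConfiguration": {
                "fileSystemId": "fs-0example",
                "transitEncryption": "ENABLED",
            },
        }
    ],
    "containerDefinitions": [
        {
            "name": "app",
            # The container mounts the task-level EFS volume by name.
            "mountPoints": [
                {"sourceVolume": "persistent-data", "containerPath": "/data"}
            ],
        }
    ],
}
```

Because the volume lives in EFS rather than on a host, no server or storage infrastructure needs to be managed.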


Question # 30

A company has NFS servers in an on-premises data center that need to periodically back up small amounts of data to Amazon S3. Which solution meets these requirements and is MOST cost-effective?

A. Set up AWS Glue to copy the data from the on-premises servers to Amazon S3.
B. Set up an AWS DataSync agent on the on-premises servers, and sync the data to Amazon S3.
C. Set up an SFTP sync using AWS Transfer for SFTP to sync data from on premises to Amazon S3.
D. Set up an AWS Direct Connect connection between the on-premises data center and a VPC, and copy the data to Amazon S3.

ANSWER : B


Question # 31

A company is deploying a new application to Amazon Elastic Kubernetes Service (Amazon EKS) with an AWS Fargate cluster. The application needs a storage solution for data persistence. The solution must be highly available and fault tolerant. The solution also must be shared between multiple application containers. Which solution will meet these requirements with the LEAST operational overhead?

A. Create Amazon Elastic Block Store (Amazon EBS) volumes in the same Availability Zones where EKS worker nodes are placed. Register the volumes in a StorageClass object on an EKS cluster. Use EBS Multi-Attach to share the data between containers.
B. Create an Amazon Elastic File System (Amazon EFS) file system. Register the file system in a StorageClass object on an EKS cluster. Use the same file system for all containers.
C. Create an Amazon Elastic Block Store (Amazon EBS) volume. Register the volume in a StorageClass object on an EKS cluster. Use the same volume for all containers.
D. Create Amazon Elastic File System (Amazon EFS) file systems in the same Availability Zones where EKS worker nodes are placed. Register the file systems in a StorageClass object on an EKS cluster. Create an AWS Lambda function to synchronize the data between file systems.

ANSWER : B
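Option B's "register the file system in a StorageClass object" step can be sketched as a Kubernetes manifest, expressed here as a Python dict. The file system ID and class name are placeholders; the provisioner string is the EFS CSI driver's documented name.

```python
# Sketch only: fileSystemId and the class name are placeholders.
storage_class = {
    "apiVersion": "storage.k8s.io/v1",
    "kind": "StorageClass",
    "metadata": {"name": "efs-sc"},
    # The EFS CSI driver provisions access points into the shared file system.
    "provisioner": "efs.csi.aws.com",
    "parameters": {
        "provisioningMode": "efs-ap",
        "fileSystemId": "fs-0example",
    },
}
```

Pods on Fargate can then claim storage from this class, and every container sees the same Regional, multi-AZ file system.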


Question # 32

A company uses Amazon EC2 instances and Amazon Elastic Block Store (Amazon EBS) volumes to run an application. The company creates one snapshot of each EBS volume every day to meet compliance requirements. The company wants to implement an architecture that prevents the accidental deletion of EBS volume snapshots. The solution must not change the administrative rights of the storage administrator user. Which solution will meet these requirements with the LEAST administrative effort?

A. Create an IAM role that has permission to delete snapshots. Attach the role to a new EC2 instance. Use the AWS CLI from the new EC2 instance to delete snapshots.
B. Create an IAM policy that denies snapshot deletion. Attach the policy to the storage administrator user.
C. Add tags to the snapshots. Create retention rules in Recycle Bin for EBS snapshots that have the tags.
D. Lock the EBS snapshots to prevent deletion.

ANSWER : D
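Option D corresponds to the EBS snapshot lock feature. The dict below sketches boto3 `ec2.lock_snapshot` parameters with a hypothetical snapshot ID; governance mode blocks deletion while the lock is active without altering the storage administrator's IAM permissions.

```python
# Sketch only: the snapshot ID is a placeholder.
lock_params = {
    "SnapshotId": "snap-0example",
    # "governance" prevents deletion during the lock period but can be
    # managed by authorized users; "compliance" is stricter.
    "LockMode": "governance",
    "LockDuration": 30,  # days the snapshot stays locked
}
```

A real call would be `boto3.client("ec2").lock_snapshot(**lock_params)`.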


Question # 33

A manufacturing company runs its report generation application on AWS. The application generates each report in about 20 minutes. The application is built as a monolith that runs on a single Amazon EC2 instance. The application requires frequent updates to its tightly coupled modules. The application becomes complex to maintain as the company adds new features. Each time the company patches a software module, the application experiences downtime. Report generation must restart from the beginning after any interruptions. The company wants to redesign the application so that the application can be flexible, scalable, and gradually improved. The company wants to minimize application downtime. Which solution will meet these requirements?

A. Run the application on AWS Lambda as a single function with maximum provisioned concurrency.
B. Run the application on Amazon EC2 Spot Instances as microservices with a Spot Fleet default allocation strategy.
C. Run the application on Amazon Elastic Container Service (Amazon ECS) as microservices with service auto scaling.
D. Run the application on AWS Elastic Beanstalk as a single application environment with an all-at-once deployment strategy.

ANSWER : C


Question # 34

A security audit reveals that Amazon EC2 instances are not being patched regularly. A solutions architect needs to provide a solution that will run regular security scans across a large fleet of EC2 instances. The solution should also patch the EC2 instances on a regular schedule and provide a report of each instance's patch status. Which solution will meet these requirements?

A. Set up Amazon Macie to scan the EC2 instances for software vulnerabilities. Set up a cron job on each EC2 instance to patch the instance on a regular schedule.
B. Turn on Amazon GuardDuty in the account. Configure GuardDuty to scan the EC2 instances for software vulnerabilities. Set up AWS Systems Manager Session Manager to patch the EC2 instances on a regular schedule.
C. Set up Amazon Detective to scan the EC2 instances for software vulnerabilities. Set up an Amazon EventBridge scheduled rule to patch the EC2 instances on a regular schedule.
D. Turn on Amazon Inspector in the account. Configure Amazon Inspector to scan the EC2 instances for software vulnerabilities. Set up AWS Systems Manager Patch Manager to patch the EC2 instances on a regular schedule.

ANSWER : D

