Amazon SAP-C02 dumps

Amazon SAP-C02 Dumps

Amazon AWS Certified Solutions Architect - Professional

Looking for Amazon SAP-C02 Practice Questions? Rejoice, because you have reached your destination. Amazonawsdumps.com has prepared a special kind of test material that adapts to the individual candidate's skillset. Our smart system presents Amazon SAP-C02 Question Answers exactly as they appear in the actual exam, and we report your progress at the end of each test to ensure 100% success.

PDF Demo $35 Add to cart
Test Engine Demo $45 Add to cart
PDF + Test Engine $55 Add to cart

Here are some more features of Amazon SAP-C02 PDF:

  •  435 questions with answers
  •  Update date: 26 Jul, 2024
  •  Unlimited practice questions
  •  Routine daily updates
  •  Takes just 1 day to prepare
  •  Exam passing guaranteed on the first go
  •  Money-back facility
  •  3 months of free updates

With this credential, certified professionals can demonstrate advanced knowledge and skill in automating manual operations; optimizing security, cost, and performance; and designing solutions to complex challenges. Through this certification, firms can identify and cultivate talent with the vital abilities needed to put cloud projects into action.

Amazonawsdumps.com: the finest way to get your SAP-C02 certificate

At Amazonawsdumps.com, we work day and night to give our clients the best and most accurate AWS SAP-C02 exam material. We have a 100% success rate, countless positive reviews, and a solid reputation as a trustworthy exam dumps website. You can gain in-depth knowledge of the SAP-C02 exam as well as practical experience. All you have to do to ace the exam and get the best grades is get our Amazon AWS SAP-C02 PDF guide.

Most accurate and up-to-date SAP-C02 real exam question answers

You can get free updates for the SAP-C02 Exam if you purchase braindumps from Amazonawsdumps.com. These updates are available to you for three months. You can easily use our AWS Certified Solutions Architect - Professional Certification dumps on a desktop, laptop, tablet, or smartphone. Customer service is available for inquiries and problems with the SAP-C02 Dumps PDF 24 hours a day, 7 days a week. We offer a money-back guarantee and a 100% success rate. Obtain your SAP-C02 PDF test questions right now.

Who is qualified to sit for the SAP-C02 exam?

The ideal applicant holds an AWS certification and has at least two years of experience designing and deploying cloud-based systems. This candidate can evaluate the requirements of cloud applications and make architectural recommendations for deploying workloads on AWS. The candidate can also provide expert guidance on architectural design across multiple applications and services within a complex organization.

Complete framework for the AWS SAP-C02 exam

  • 1:  Design Solutions for Organizational Complexity 26%
  • 2:  Design for New Solutions 29%
  • 3:  Continuous Improvement for Existing Solutions 25%
  • 4:  Accelerate Workload Migration and Modernization 20%

What kinds of questions are included in the SAP-C02 certification exam?

  •  Multiple choice: has one correct response and three incorrect responses (distractors).
  •  Multiple response: has two or more correct responses out of five or more options.

Answer as many questions as possible on this exam: an unanswered question is scored as incorrect, and incorrect responses are not otherwise penalized, so there is no penalty for guessing.

Complete Refund or 100% Success Guaranteed

You can be sure that the SAP-C02 Dumps PDF from Amazonawsdumps.com will help you pass the test. However, we will give you a complete refund if you use our products and don't pass the SAP-C02 exam on your first try. Just provide us with your SAP-C02 score report and any relevant documentation. Once your information has been validated, our staff will promptly refund the full amount.

Why Pass Amazon SAP-C02 Exam?

In today’s world, you need validation of your skills to get past the competition. The Amazon SAP-C02 Exam is that validation. Not only is Amazon a leader in the IT industry, it also offers certification exams to prove your AWS skills. These skills show you are capable of fulfilling the Amazon job role. To get certified, you simply pass the SAP-C02 Exam. This brings us to the Amazon SAP-C02 Question Answers set. Passing this certification exam from Amazon may seem easy, but it’s not. Many students fail this exam only because they didn’t take it seriously. Don’t make this mistake and order your Amazon SAP-C02 Braindumps right now!

Amazonawsdumps.com is the most popular and reliable website that has helped thousands of candidates excel at Amazon exams. You could be one of them too. Pass your exam in one attempt with the Amazon SAP-C02 PDF and own the future. Buy Now!

Superlative Amazon SAP-C02 Dumps!

We know we said passing Amazon exams is hard, but that’s only if you’ve been led astray. There are millions of Amazon SAP-C02 Practice Questions available online that promise success but fail when it comes down to it. Choose your training material carefully and get Amazon SAP-C02 Question Answers that are valid, accurate, and approved by renowned IT professionals. Our Amazon SAP-C02 Braindumps are created by experts for experts and generate first-class results in just a single attempt. Don’t believe us? Try our free demo version, which contains all the features you’ll get with the Amazon SAP-C02 PDF: an interactive design, easy-to-read format, understandable language, and concise pattern. And if you still don’t get the result you want and somehow fail, you get your money back in full. So, order your set of Amazon SAP-C02 Dumps now!

We promise our customers to take full responsibility for their learning, preparation, and passing of the SAP-C02 Exam without a hitch. Our aim is your satisfaction and ease. That is why we charge only a reasonable price for Amazon SAP-C02 Practice Questions. Moreover, we offer two formats: PDF and online test engine. Also, there is always a little extra with our discount coupons.

Why Buy Amazon SAP-C02 Question Answers?

The Amazonawsdumps.com team is a group of experts who succeeded with Amazon SAP-C02 Braindumps. We got what we needed to pass the exam, and we went through its challenges as well. That is why we want every Amazon candidate to succeed. Choosing among so many options for an Amazon SAP-C02 PDF is tricky; sometimes they don’t turn out to be what they first appeared. That is why we offer our valued customers a free demo: they can take a test run of the Amazon SAP-C02 Dumps before they buy. When it comes to buying, the procedure is simple, secure, and low-risk, because our Amazon SAP-C02 Practice Questions have a 99.8% passing rate.

Amazon SAP-C02 Sample Questions

Question # 1

A company has a solution that analyzes weather data from thousands of weather stations. The weather stations send the data over an Amazon API Gateway REST API that has an AWS Lambda function integration. The Lambda function calls a third-party service for data pre-processing. The third-party service gets overloaded and fails the pre-processing, causing a loss of data. A solutions architect must improve the resiliency of the solution. The solutions architect must ensure that no data is lost and that data can be processed later if failures occur. What should the solutions architect do to meet these requirements?

A. Create an Amazon Simple Queue Service (Amazon SQS) queue. Configure the queue as the dead-letter queue for the API.
B. Create two Amazon Simple Queue Service (Amazon SQS) queues: a primary queue and a secondary queue. Configure the secondary queue as the dead-letter queue for the primary queue. Update the API to use a new integration to the primary queue. Configure the Lambda function as the invocation target for the primary queue.
C. Create two Amazon EventBridge event buses: a primary event bus and a secondary event bus. Update the API to use a new integration to the primary event bus. Configure an EventBridge rule to react to all events on the primary event bus. Specify the Lambda function as the target of the rule. Configure the secondary event bus as the failure destination for the Lambda function.
D. Create a custom Amazon EventBridge event bus. Configure the event bus as the failure destination for the Lambda function.

ANSWER : C
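
To make the key step of the answer concrete, here is a minimal boto3 sketch of configuring a secondary event bus as the Lambda function's on-failure destination. Failure destinations apply to asynchronous invocations, which is how EventBridge rules invoke Lambda. The function name, account ID, and bus name are placeholder assumptions, not values from the question.

import boto3

lambda_client = boto3.client("lambda")

# Events the function cannot process (for example, because the third-party
# pre-processing service is overloaded) are routed to the secondary bus,
# where they are retained and can be replayed later.
lambda_client.put_function_event_invoke_config(
    FunctionName="preprocess-weather-data",  # placeholder function name
    MaximumRetryAttempts=2,
    DestinationConfig={
        "OnFailure": {
            # placeholder account ID and bus name
            "Destination": "arn:aws:events:us-east-1:111122223333:event-bus/secondary-bus"
        }
    },
)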


Question # 2

A financial company needs to create a separate AWS account for a new digital wallet application. The company uses AWS Organizations to manage its accounts. A solutions architect uses the IAM user Support1 from the management account to create a new member account with finance1@example.com as the email address. What should the solutions architect do to create IAM users in the new member account?

A. Sign in to the AWS Management Console with AWS account root user credentials by using the 64-character password from the initial AWS Organizations email sent to finance1@example.com. Set up the IAM users as required.
B. From the management account, switch roles to assume the OrganizationAccountAccessRole role with the account ID of the new member account. Set up the IAM users as required.
C. Go to the AWS Management Console sign-in page. Choose "Sign in using root account credentials." Sign in by using the email address finance1@example.com and the management account's root password. Set up the IAM users as required.
D. Go to the AWS Management Console sign-in page. Sign in by using the account ID of the new member account and the Support1 IAM credentials. Set up the IAM users as required.

ANSWER : B
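
When AWS Organizations creates a member account, it automatically creates the OrganizationAccountAccessRole in that account, and the documented workflow is to assume it from the management account. Here is a minimal boto3 sketch of that flow; the account ID, session name, and user name are placeholder assumptions.

import boto3

# From the management account, assume the role that AWS Organizations
# creates automatically in every new member account.
sts = boto3.client("sts")
member_account_id = "111122223333"  # placeholder member account ID

creds = sts.assume_role(
    RoleArn=f"arn:aws:iam::{member_account_id}:role/OrganizationAccountAccessRole",
    RoleSessionName="bootstrap-iam-users",
)["Credentials"]

# A client built from the temporary credentials operates inside the member
# account, so IAM users can be created there.
iam = boto3.client(
    "iam",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
iam.create_user(UserName="wallet-admin")  # hypothetical user name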


Question # 3

A company is currently in the design phase of an application that will need an RPO of less than 5 minutes and an RTO of less than 10 minutes. The solutions architecture team is forecasting that the database will store approximately 10 TB of data. As part of the design, they are looking for a database solution that will provide the company with the ability to fail over to a secondary Region. Which solution will meet these business requirements at the LOWEST cost?

A. Deploy an Amazon Aurora DB cluster and take snapshots of the cluster every 5 minutes. Once a snapshot is complete, copy the snapshot to a secondary Region to serve as a backup in the event of a failure.
B. Deploy an Amazon RDS instance with a cross-Region read replica in a secondary Region. In the event of a failure, promote the read replica to become the primary.
C. Deploy an Amazon Aurora DB cluster in the primary Region and another in a secondary Region. Use AWS DMS to keep the secondary Region in sync.
D. Deploy an Amazon RDS instance with a read replica in the same Region. In the event of a failure, promote the read replica to become the primary.

ANSWER : B
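
For candidates who want to see the mechanics of the correct option, here is a minimal boto3 sketch of creating a cross-Region read replica and promoting it during failover. The Regions, identifiers, and ARN are placeholder assumptions.

import boto3

# Run in the secondary Region: create a cross-Region read replica of the
# primary DB instance, then promote it if the primary Region fails.
rds_secondary = boto3.client("rds", region_name="us-west-2")  # assumed Region

rds_secondary.create_db_instance_read_replica(
    DBInstanceIdentifier="appdb-replica",
    # cross-Region sources must be referenced by ARN (placeholder values)
    SourceDBInstanceIdentifier="arn:aws:rds:us-east-1:111122223333:db:appdb",
    SourceRegion="us-east-1",
)

# Disaster recovery step: detach the replica and make it a standalone primary.
rds_secondary.promote_read_replica(DBInstanceIdentifier="appdb-replica")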


Question # 4

A financial services company runs a complex, multi-tier application on Amazon EC2 instances and AWS Lambda functions. The application stores temporary data in Amazon S3. The S3 objects are valid for only 45 minutes and are deleted after 24 hours. The company deploys each version of the application by launching an AWS CloudFormation stack. The stack creates all resources that are required to run the application. When the company deploys and validates a new application version, the company deletes the CloudFormation stack of the old version. The company recently tried to delete the CloudFormation stack of an old application version, but the operation failed. An analysis shows that CloudFormation failed to delete an existing S3 bucket. A solutions architect needs to resolve this issue without making major changes to the application's architecture. Which solution meets these requirements?

A. Implement a Lambda function that deletes all files from a given S3 bucket. Integrate this Lambda function as a custom resource into the CloudFormation stack. Ensure that the custom resource has a DependsOn attribute that points to the S3 bucket's resource.
B. Modify the CloudFormation template to provision an Amazon Elastic File System (Amazon EFS) file system to store the temporary files there instead of in Amazon S3. Configure the Lambda functions to run in the same VPC as the file system. Mount the file system to the EC2 instances and Lambda functions.
C. Modify the CloudFormation stack to create an S3 Lifecycle rule that expires all objects 45 minutes after creation. Add a DependsOn attribute that points to the S3 bucket's resource.
D. Modify the CloudFormation stack to attach a DeletionPolicy attribute with a value of Delete to the S3 bucket.

ANSWER : A
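
Note that Delete is already the default deletion policy for an S3 bucket, and CloudFormation still cannot delete a non-empty bucket, which is why the custom resource in option A is needed. Here is a minimal sketch of such a custom resource handler, assuming the cfnresponse helper module that AWS makes available to Lambda code defined inline (ZipFile) in a template; the property name BucketName is an illustrative assumption.

# Handler for a Lambda-backed CloudFormation custom resource that empties the
# bucket during stack deletion so the bucket itself can then be deleted.
import boto3
import cfnresponse  # available to inline (ZipFile) Lambda code in a template

s3 = boto3.resource("s3")

def handler(event, context):
    try:
        if event["RequestType"] == "Delete":
            bucket_name = event["ResourceProperties"]["BucketName"]
            # Delete every object (and version, if versioning was enabled).
            s3.Bucket(bucket_name).object_versions.delete()
        cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
    except Exception:
        cfnresponse.send(event, context, cfnresponse.FAILED, {})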


Question # 5

A company needs to monitor a growing number of Amazon S3 buckets across two AWS Regions. The company also needs to track the percentage of objects that are encrypted in Amazon S3. The company needs a dashboard to display this information for internal compliance teams. Which solution will meet these requirements with the LEAST operational overhead?

A. Create a new S3 Storage Lens dashboard in each Region to track bucket and encryption metrics. Aggregate data from both Region dashboards into a single dashboard in Amazon QuickSight for the compliance teams.
B. Deploy an AWS Lambda function in each Region to list the number of buckets and the encryption status of objects. Store this data in Amazon S3. Use Amazon Athena queries to display the data on a custom dashboard in Amazon QuickSight for the compliance teams.
C. Use the S3 Storage Lens default dashboard to track bucket and encryption metrics. Give the compliance teams access to the dashboard directly in the S3 console.
D. Create an Amazon EventBridge rule to detect AWS CloudTrail events for S3 object creation. Configure the rule to invoke an AWS Lambda function to record encryption metrics in Amazon DynamoDB. Use Amazon QuickSight to display the metrics in a dashboard for the compliance teams.

ANSWER : C


Question # 6

A North American company with headquarters on the East Coast is deploying a new web application running on Amazon EC2 in the us-east-1 Region. The application should dynamically scale to meet user demand and maintain resiliency. Additionally, the application must have disaster recovery capabilities in an active-passive configuration with the us-west-1 Region. Which steps should a solutions architect take after creating a VPC in the us-east-1 Region?

A. Create a VPC in the us-west-1 Region. Use inter-Region VPC peering to connect both VPCs. Deploy an Application Load Balancer (ALB) spanning multiple Availability Zones (AZs) to the VPC in the us-east-1 Region. Deploy EC2 instances across multiple AZs in each Region as part of an Auto Scaling group spanning both VPCs and served by the ALB.
B. Deploy an Application Load Balancer (ALB) spanning multiple Availability Zones (AZs) to the VPC in the us-east-1 Region. Deploy EC2 instances across multiple AZs as part of an Auto Scaling group served by the ALB. Deploy the same solution to the us-west-1 Region. Create an Amazon Route 53 record set with a failover routing policy and health checks enabled to provide high availability across both Regions.
C. Create a VPC in the us-west-1 Region. Use inter-Region VPC peering to connect both VPCs. Deploy an Application Load Balancer (ALB) that spans both VPCs. Deploy EC2 instances across multiple Availability Zones as part of an Auto Scaling group in each VPC served by the ALB. Create an Amazon Route 53 record that points to the ALB.
D. Deploy an Application Load Balancer (ALB) spanning multiple Availability Zones (AZs) to the VPC in the us-east-1 Region. Deploy EC2 instances across multiple AZs as part of an Auto Scaling group served by the ALB. Deploy the same solution to the us-west-1 Region. Create separate Amazon Route 53 records in each Region that point to the ALB in the Region. Use Route 53 health checks to provide high availability across both Regions.

ANSWER : B
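
The failover routing piece of the answer looks like the following boto3 sketch: one PRIMARY and one SECONDARY alias record on the same name, each tied to a health check. All hosted zone IDs, DNS names, and health check IDs below are placeholder assumptions.

import boto3

route53 = boto3.client("route53")

def upsert_failover_record(set_id, failover, alb_dns, alb_zone_id, health_check_id):
    # Route 53 serves the PRIMARY record while its health check passes and
    # fails over to the SECONDARY record when it does not.
    route53.change_resource_record_sets(
        HostedZoneId="Z0EXAMPLE",  # placeholder hosted zone ID
        ChangeBatch={"Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "app.example.com",
                "Type": "A",
                "SetIdentifier": set_id,
                "Failover": failover,
                "HealthCheckId": health_check_id,
                "AliasTarget": {
                    "HostedZoneId": alb_zone_id,  # the ALB's canonical zone
                    "DNSName": alb_dns,
                    "EvaluateTargetHealth": True,
                },
            },
        }]},
    )

# Placeholder DNS names, zone IDs, and health check IDs.
upsert_failover_record("east", "PRIMARY", "east-alb.elb.amazonaws.com", "ZALBEAST", "hc-east")
upsert_failover_record("west", "SECONDARY", "west-alb.elb.amazonaws.com", "ZALBWEST", "hc-west")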


Question # 7

A solutions architect is creating an application that stores objects in an Amazon S3 bucket. The solutions architect must deploy the application in two AWS Regions that will be used simultaneously. The objects in the two S3 buckets must remain synchronized with each other. Which combination of steps will meet these requirements with the LEAST operational overhead? (Select THREE.)

A. Create an S3 Multi-Region Access Point. Change the application to refer to the Multi-Region Access Point.
B. Configure two-way S3 Cross-Region Replication (CRR) between the two S3 buckets.
C. Modify the application to store objects in each S3 bucket.
D. Create an S3 Lifecycle rule for each S3 bucket to copy objects from one S3 bucket to the other S3 bucket.
E. Enable S3 Versioning for each S3 bucket.
F. Configure an event notification for each S3 bucket to invoke an AWS Lambda function to copy objects from one S3 bucket to the other S3 bucket.

ANSWER : A,B,E
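
Two of the selected steps can be sketched with boto3 as below: enabling versioning (choice E, a prerequisite for replication) and one direction of the two-way replication (choice B). Bucket names and the replication role ARN are placeholder assumptions.

import boto3

s3 = boto3.client("s3")

# S3 Replication requires versioning on both buckets.
for bucket in ("app-bucket-east", "app-bucket-west"):  # placeholder names
    s3.put_bucket_versioning(
        Bucket=bucket, VersioningConfiguration={"Status": "Enabled"}
    )

# One direction of the two-way replication; run again with the bucket
# names swapped for the reverse direction.
s3.put_bucket_replication(
    Bucket="app-bucket-east",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::111122223333:role/s3-replication-role",  # placeholder
        "Rules": [{
            "ID": "east-to-west",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},
            "DeleteMarkerReplication": {"Status": "Enabled"},
            "Destination": {"Bucket": "arn:aws:s3:::app-bucket-west"},
        }],
    },
)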


Question # 8

A solutions architect is creating an application that stores objects in an Amazon S3 bucket. The solutions architect must deploy the application in two AWS Regions that will be used simultaneously. The objects in the two S3 buckets must remain synchronized with each other. Which combination of steps will meet these requirements with the LEAST operational overhead? (Select THREE.)

A. Use AWS Lambda functions to connect to the IoT devices.
B. Configure the IoT devices to publish to AWS IoT Core.
C. Write the metadata to a self-managed MongoDB database on an Amazon EC2 instance.
D. Write the metadata to Amazon DocumentDB (with MongoDB compatibility).
E. Use AWS Step Functions state machines with AWS Lambda tasks to prepare the reports and to write the reports to Amazon S3. Use Amazon CloudFront with an S3 origin to serve the reports.
F. Use an Amazon Elastic Kubernetes Service (Amazon EKS) cluster with Amazon EC2 instances to prepare the reports. Use an ingress controller in the EKS cluster to serve the reports.

ANSWER : B,D,E


Question # 9

A company is using Amazon API Gateway to deploy a private REST API that will provide access to sensitive data. The API must be accessible only from an application that is deployed in a VPC. The company deploys the API successfully. However, the API is not accessible from an Amazon EC2 instance that is deployed in the VPC. Which solution will provide connectivity between the EC2 instance and the API?

A. Create an interface VPC endpoint for API Gateway. Attach an endpoint policy that allows apigateway:* actions. Disable private DNS naming for the VPC endpoint. Configure an API resource policy that allows access from the VPC. Use the VPC endpoint's DNS name to access the API.
B. Create an interface VPC endpoint for API Gateway. Attach an endpoint policy that allows the execute-api:Invoke action. Enable private DNS naming for the VPC endpoint. Configure an API resource policy that allows access from the VPC endpoint. Use the API endpoint's DNS names to access the API.
C. Create a Network Load Balancer (NLB) and a VPC link. Configure private integration between API Gateway and the NLB. Use the API endpoint's DNS names to access the API.
D. Create an Application Load Balancer (ALB) and a VPC Link. Configure private integration between API Gateway and the ALB. Use the ALB endpoint's DNS name to access the API.

ANSWER : B
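
Here is a minimal boto3 sketch of the endpoint creation from option B: an interface endpoint for the execute-api service with private DNS enabled and an endpoint policy that allows execute-api:Invoke. The Region, VPC, subnet, and security group IDs are placeholder assumptions.

import json
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed Region

# Interface endpoint for API Gateway with private DNS enabled, so the API's
# normal execute-api DNS names resolve to the endpoint from inside the VPC.
ec2.create_vpc_endpoint(
    VpcId="vpc-0abc1234def567890",            # placeholder VPC ID
    VpcEndpointType="Interface",
    ServiceName="com.amazonaws.us-east-1.execute-api",
    SubnetIds=["subnet-0abc1234def567890"],   # placeholder subnet ID
    SecurityGroupIds=["sg-0abc1234def567890"],
    PrivateDnsEnabled=True,
    # Endpoint policy that allows invoking the API, as the answer requires.
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": "*",
            "Action": "execute-api:Invoke",
            "Resource": "*",
        }],
    }),
)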


Question # 10

A company migrated an application to the AWS Cloud. The application runs on two Amazon EC2 instances behind an Application Load Balancer (ALB). Application data is stored in a MySQL database that runs on an additional EC2 instance. The application's use of the database is read-heavy. The application loads static content from Amazon Elastic Block Store (Amazon EBS) volumes that are attached to each EC2 instance. The static content is updated frequently and must be copied to each EBS volume. The load on the application changes throughout the day. During peak hours, the application cannot handle all the incoming requests. Trace data shows that the database cannot handle the read load during peak hours. Which solution will improve the reliability of the application?

A. Migrate the application to a set of AWS Lambda functions. Set the Lambda functions as targets for the ALB. Create a new single EBS volume for the static content. Configure the Lambda functions to read from the new EBS volume. Migrate the database to an Amazon RDS for MySQL Multi-AZ DB cluster.
B. Migrate the application to a set of AWS Step Functions state machines. Set the state machines as targets for the ALB. Create an Amazon Elastic File System (Amazon EFS) file system for the static content. Configure the state machines to read from the EFS file system. Migrate the database to Amazon Aurora MySQL Serverless v2 with a reader DB instance.
C. Containerize the application. Migrate the application to an Amazon Elastic Container Service (Amazon ECS) cluster. Use the AWS Fargate launch type for the tasks that host the application. Create a new single EBS volume for the static content. Mount the new EBS volume on the ECS cluster. Configure AWS Application Auto Scaling on the ECS cluster. Set the ECS service as a target for the ALB. Migrate the database to an Amazon RDS for MySQL Multi-AZ DB cluster.
D. Containerize the application. Migrate the application to an Amazon Elastic Container Service (Amazon ECS) cluster. Use the AWS Fargate launch type for the tasks that host the application. Create an Amazon Elastic File System (Amazon EFS) file system for the static content. Mount the EFS file system to each container. Configure AWS Application Auto Scaling on the ECS cluster. Set the ECS service as a target for the ALB. Migrate the database to Amazon Aurora MySQL Serverless v2 with a reader DB instance.

ANSWER : D


Question # 11

A company runs an unauthenticated static website (www.example.com) that includes a registration form for users. The website uses Amazon S3 for hosting and uses Amazon CloudFront as the content delivery network with AWS WAF configured. When the registration form is submitted, the website calls an Amazon API Gateway API endpoint that invokes an AWS Lambda function to process the payload and forward the payload to an external API call. During testing, a solutions architect encounters a cross-origin resource sharing (CORS) error. The solutions architect confirms that the CloudFront distribution origin has the Access-Control-Allow-Origin header set to www.example.com. What should the solutions architect do to resolve the error?

A. Change the CORS configuration on the S3 bucket. Add rules for CORS to the AllowedOrigin element for www.example.com.
B. Enable the CORS setting in AWS WAF. Create a web ACL rule in which the Access-Control-Allow-Origin header is set to www.example.com.
C. Enable the CORS setting on the API Gateway API endpoint. Ensure that the API endpoint is configured to return all responses that have the Access-Control-Allow-Origin header set to www.example.com.
D. Enable the CORS setting on the Lambda function. Ensure that the return code of the function has the Access-Control-Allow-Origin header set to www.example.com.

ANSWER : C
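
With a Lambda proxy integration, the function's responses must carry the CORS header themselves, alongside the CORS settings on the API. A minimal sketch of such a handler follows; the process_registration helper is a hypothetical stand-in for the real payload processing and external API call.

import json

ALLOWED_ORIGIN = "https://www.example.com"

def process_registration(event):
    # Placeholder for the real payload processing and external API call.
    return {"ok": True}

def handler(event, context):
    # Every response, success or error, must carry the CORS header so the
    # browser accepts the cross-origin call from the static website.
    body = process_registration(event)
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": ALLOWED_ORIGIN},
        "body": json.dumps(body),
    }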


Question # 12

A company wants to send data from its on-premises systems to Amazon S3 buckets. The company created the S3 buckets in three different accounts. The company must send the data privately without the data traveling across the internet. The company has no existing dedicated connectivity to AWS. Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

A. Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a private VIF between the on-premises environment and the private VPC.
B. Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a public VIF between the on-premises environment and the private VPC.
C. Create an Amazon S3 interface endpoint in the networking account.
D. Create an Amazon S3 gateway endpoint in the networking account.
E. Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Peer VPCs from the accounts that host the S3 buckets with the VPC in the networking account.

ANSWER : A,C


Question # 13

A company uses AWS Organizations to manage a multi-account structure. The company has hundreds of AWS accounts and expects the number of accounts to increase. The company is building a new application that uses Docker images. The company will push the Docker images to Amazon Elastic Container Registry (Amazon ECR). Only accounts that are within the company's organization should have access to the images. The company has a CI/CD process that runs frequently. The company wants to retain all the tagged images. However, the company wants to retain only the five most recent untagged images. Which solution will meet these requirements with the LEAST operational overhead?

A. Create a private repository in Amazon ECR. Create a permissions policy for the repository that allows only required ECR operations. Include a condition to allow the ECR operations if the value of the aws:PrincipalOrgID condition key is equal to the ID of the company's organization. Add a lifecycle rule to the ECR repository that deletes all untagged images over the count of five.
B. Create a public repository in Amazon ECR. Create an IAM role in the ECR account. Set permissions so that any account can assume the role if the value of the aws:PrincipalOrgID condition key is equal to the ID of the company's organization. Add a lifecycle rule to the ECR repository that deletes all untagged images over the count of five.
C. Create a private repository in Amazon ECR. Create a permissions policy for the repository that includes only required ECR operations. Include a condition to allow the ECR operations for all account IDs in the organization. Schedule a daily Amazon EventBridge rule to invoke an AWS Lambda function that deletes all untagged images over the count of five.
D. Create a public repository in Amazon ECR. Configure Amazon ECR to use an interface VPC endpoint with an endpoint policy that includes the required permissions for images that the company needs to pull. Include a condition to allow the ECR operations for all account IDs in the company's organization. Schedule a daily Amazon EventBridge rule to invoke an AWS Lambda function that deletes all untagged images over the count of five.

ANSWER : A
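
The two halves of the answer, an organization-scoped repository policy and a lifecycle rule for untagged images, can be sketched with boto3 as follows. The repository name, organization ID, and the exact set of allowed pull actions are illustrative assumptions.

import json
import boto3

ecr = boto3.client("ecr")
repo = "wallet-images"  # placeholder repository name

ecr.create_repository(repositoryName=repo)

# Repository policy: allow pulls only from principals inside the organization.
ecr.set_repository_policy(
    repositoryName=repo,
    policyText=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "OrgOnlyPull",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "ecr:GetDownloadUrlForLayer",
                "ecr:BatchGetImage",
                "ecr:BatchCheckLayerAvailability",
            ],
            "Condition": {"StringEquals": {"aws:PrincipalOrgID": "o-exampleid"}},
        }],
    }),
)

# Lifecycle policy: keep all tagged images, expire untagged beyond the 5 newest.
ecr.put_lifecycle_policy(
    repositoryName=repo,
    lifecyclePolicyText=json.dumps({
        "rules": [{
            "rulePriority": 1,
            "description": "retain only the 5 most recent untagged images",
            "selection": {
                "tagStatus": "untagged",
                "countType": "imageCountMoreThan",
                "countNumber": 5,
            },
            "action": {"type": "expire"},
        }],
    }),
)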


Question # 14

A company is migrating to the cloud. It wants to evaluate the configurations of virtual machines in its existing data center environment to ensure that it can size new Amazon EC2 instances accurately. The company wants to collect metrics, such as CPU, memory, and disk utilization, and it needs an inventory of what processes are running on each instance. The company would also like to monitor network connections to map communications between servers. Which would enable the collection of this data MOST cost-effectively?

A. Use AWS Application Discovery Service and deploy the data collection agent to each virtual machine in the data center.
B. Configure the Amazon CloudWatch agent on all servers within the local environment and publish metrics to Amazon CloudWatch Logs.
C. Use AWS Application Discovery Service and enable agentless discovery in the existing virtualization environment.
D. Enable AWS Application Discovery Service in the AWS Management Console and configure the corporate firewall to allow scans over a VPN.

ANSWER : A


Question # 15

A company is developing a web application that runs on Amazon EC2 instances in an Auto Scaling group behind a public-facing Application Load Balancer (ALB). Only users from a specific country are allowed to access the application. The company needs the ability to log the access requests that have been blocked. The solution should require the least possible maintenance. Which solution meets these requirements?

A. Create an IPSet containing a list of IP ranges that belong to the specified country. Create an AWS WAF web ACL. Configure a rule to block any requests that do not originate from an IP range in the IPSet. Associate the rule with the web ACL. Associate the web ACL with the ALB.
B. Create an AWS WAF web ACL. Configure a rule to block any requests that do not originate from the specified country. Associate the rule with the web ACL. Associate the web ACL with the ALB.
C. Configure AWS Shield to block any requests that do not originate from the specified country. Associate AWS Shield with the ALB.
D. Create a security group rule that allows ports 80 and 443 from IP ranges that belong to the specified country. Associate the security group with the ALB.

ANSWER : B
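
A minimal boto3 sketch of the geo-blocking web ACL from the answer follows. The ACL name, country code, Region, and ALB ARN are placeholder assumptions; blocked requests can then be logged through the web ACL's logging configuration.

import boto3

wafv2 = boto3.client("wafv2", region_name="us-east-1")  # ALB's assumed Region

# Block any request that does NOT come from the allowed country.
acl = wafv2.create_web_acl(
    Name="allow-single-country",
    Scope="REGIONAL",  # REGIONAL scope is required to associate with an ALB
    DefaultAction={"Allow": {}},
    Rules=[{
        "Name": "block-other-countries",
        "Priority": 0,
        "Statement": {"NotStatement": {"Statement": {
            "GeoMatchStatement": {"CountryCodes": ["US"]}  # assumed country
        }}},
        "Action": {"Block": {}},
        "VisibilityConfig": {
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": "BlockedGeo",
        },
    }],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "AllowSingleCountry",
    },
)["Summary"]

wafv2.associate_web_acl(
    WebACLArn=acl["ARN"],
    # placeholder ALB ARN
    ResourceArn="arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/app/my-alb/abc123",
)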


Question # 16

A company has an organization in AWS Organizations that includes a separate AWS account for each of the company's departments. Application teams from different departments develop and deploy solutions independently. The company wants to reduce compute costs and manage costs appropriately across departments. The company also wants to improve visibility into billing for individual departments. The company does not want to lose operational flexibility when the company selects compute resources. Which solution will meet these requirements?

A. Use AWS Budgets for each department. Use Tag Editor to apply tags to appropriate resources. Purchase EC2 Instance Savings Plans.
B. Configure AWS Organizations to use consolidated billing. Implement a tagging strategy that identifies departments. Use SCPs to apply tags to appropriate resources. Purchase EC2 Instance Savings Plans.
C. Configure AWS Organizations to use consolidated billing. Implement a tagging strategy that identifies departments. Use Tag Editor to apply tags to appropriate resources. Purchase Compute Savings Plans.
D. Use AWS Budgets for each department. Use SCPs to apply tags to appropriate resources. Purchase Compute Savings Plans.

ANSWER : C


Question # 17

A company is running multiple workloads in the AWS Cloud. The company has separate units for software development. The company uses AWS Organizations and federation with SAML to give permissions to developers to manage resources in their AWS accounts. The development units each deploy their production workloads into a common production account. Recently, an incident occurred in the production account in which members of a development unit terminated an EC2 instance that belonged to a different development unit. A solutions architect must create a solution that prevents a similar incident from happening in the future. The solution also must allow developers the possibility to manage the instances used for their workloads. Which strategy will meet these requirements?

A. Create separate OUs in AWS Organizations for each development unit. Assign the created OUs to the company AWS accounts. Create separate SCPs with a deny action and a StringNotEquals condition for the DevelopmentUnit resource tag that matches the development unit name. Assign the SCP to the corresponding OU.
B. Pass an attribute for DevelopmentUnit as an AWS Security Token Service (AWS STS) session tag during SAML federation. Update the IAM policy for the developers' assumed IAM role with a deny action and a StringNotEquals condition for the DevelopmentUnit resource tag and aws:PrincipalTag/DevelopmentUnit.
C. Pass an attribute for DevelopmentUnit as an AWS Security Token Service (AWS STS) session tag during SAML federation. Create an SCP with an allow action and a StringEquals condition for the DevelopmentUnit resource tag and aws:PrincipalTag/DevelopmentUnit. Assign the SCP to the root OU.
D. Create separate IAM policies for each development unit. For every IAM policy, add an allow action and a StringEquals condition for the DevelopmentUnit resource tag and the development unit name. During SAML federation, use AWS Security Token Service (AWS STS) to assign the IAM policy and match the development unit name to the assumed IAM role.

ANSWER : B
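
The deny statement described in the answer is a standard attribute-based access control (ABAC) pattern. Here is a sketch of what that policy document could look like; the exact list of denied EC2 actions is an illustrative assumption.

import json

# Deny statement for the developers' assumed role: EC2 instance actions are
# refused unless the instance's DevelopmentUnit tag matches the
# DevelopmentUnit session tag that was passed during SAML federation.
deny_cross_unit_access = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Action": [
            "ec2:TerminateInstances",
            "ec2:StopInstances",
            "ec2:RebootInstances",
        ],
        "Resource": "*",
        "Condition": {"StringNotEquals": {
            "ec2:ResourceTag/DevelopmentUnit": "${aws:PrincipalTag/DevelopmentUnit}"
        }},
    }],
}
print(json.dumps(deny_cross_unit_access, indent=2))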


Question # 18

A company is running a serverless application that consists of several AWS Lambda functions and Amazon DynamoDB tables. The company has created new functionality that requires the Lambda functions to access an Amazon Neptune DB cluster. The Neptune DB cluster is located in three subnets in a VPC. Which of the possible solutions will allow the Lambda functions to access the Neptune DB cluster and DynamoDB tables? (Select TWO.)

A. Create three public subnets in the Neptune VPC, and route traffic through an internet gateway. Host the Lambda functions in the three new public subnets.
B. Create three private subnets in the Neptune VPC, and route internet traffic through a NAT gateway. Host the Lambda functions in the three new private subnets.
C. Host the Lambda functions outside the VPC. Update the Neptune security group to allow access from the IP ranges of the Lambda functions.
D. Host the Lambda functions outside the VPC. Create a VPC endpoint for the Neptune database, and have the Lambda functions access Neptune over the VPC endpoint.
E. Create three private subnets in the Neptune VPC. Host the Lambda functions in the three new isolated subnets. Create a VPC endpoint for DynamoDB, and route DynamoDB traffic to the VPC endpoint.

ANSWER : B,E


Question # 19

A company uses an organization in AWS Organizations to manage the company's AWS accounts. The company uses AWS CloudFormation to deploy all infrastructure. A finance team wants to build a chargeback model. The finance team asked each business unit to tag resources by using a predefined list of project values. When the finance team used the AWS Cost and Usage Report in AWS Cost Explorer and filtered based on project, the team noticed noncompliant project values. The company wants to enforce the use of project tags for new resources. Which solution will meet these requirements with the LEAST effort?

A. Create a tag policy that contains the allowed project tag values in the organization's management account. Create an SCP that denies the cloudformation:CreateStack API operation unless a project tag is added. Attach the SCP to each OU.
B. Create a tag policy that contains the allowed project tag values in each OU. Create an SCP that denies the cloudformation:CreateStack API operation unless a project tag is added. Attach the SCP to each OU.
C. Create a tag policy that contains the allowed project tag values in the AWS management account. Create an IAM policy that denies the cloudformation:CreateStack API operation unless a project tag is added. Assign the policy to each user.
D. Use AWS Service Catalog to manage the CloudFormation stacks as products. Use a TagOptions library to control project tag values. Share the portfolio with all OUs that are in the organization.

ANSWER : A
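
One plausible shape for the SCP half of the answer is sketched below, assuming the tag key is literally "project" and relying on the aws:RequestTag condition key that CloudFormation supports for stack-level tags supplied at creation time.

import json

# SCP sketch: deny stack creation when no "project" tag accompanies the
# request. The tag policy (created in the management account) separately
# constrains which project values are compliant.
deny_untagged_stacks = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyCreateStackWithoutProjectTag",
        "Effect": "Deny",
        "Action": "cloudformation:CreateStack",
        "Resource": "*",
        "Condition": {"Null": {"aws:RequestTag/project": "true"}},
    }],
}
print(json.dumps(deny_untagged_stacks, indent=2))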


Question # 20

An online survey company runs its application in the AWS Cloud. The application is distributed and consists of microservices that run in an automatically scaled Amazon Elastic Container Service (Amazon ECS) cluster. The ECS cluster is a target for an Application Load Balancer (ALB). The ALB is a custom origin for an Amazon CloudFront distribution. The company has a survey that contains sensitive data. The sensitive data must be encrypted when it moves through the application. The application's data-handling microservice is the only microservice that should be able to decrypt the data. Which solution will meet these requirements?

A. Create a symmetric AWS Key Management Service (AWS KMS) key that is dedicated to the data-handling microservice. Create a field-level encryption profile and a configuration. Associate the KMS key and the configuration with the CloudFront cache behavior.
B. Create an RSA key pair that is dedicated to the data-handling microservice. Upload the public key to the CloudFront distribution. Create a field-level encryption profile and a configuration. Add the configuration to the CloudFront cache behavior.
C. Create a symmetric AWS Key Management Service (AWS KMS) key that is dedicated to the data-handling microservice. Create a Lambda@Edge function. Program the function to use the KMS key to encrypt the sensitive data.
D. Create an RSA key pair that is dedicated to the data-handling microservice. Create a Lambda@Edge function. Program the function to use the private key of the RSA key pair to encrypt the sensitive data.

ANSWER : B


Question # 21

A data analytics company has an Amazon Redshift cluster that consists of several reserved nodes. The cluster is experiencing unexpected bursts of usage because a team of employees is compiling a deep audit analysis report. The queries to generate the report are complex read queries and are CPU intensive. Business requirements dictate that the cluster must be able to service read and write queries at all times. A solutions architect must devise a solution that accommodates the bursts of usage. Which solution meets these requirements MOST cost-effectively?

A. Provision an Amazon EMR cluster. Offload the complex data processing tasks.
B. Deploy an AWS Lambda function to add capacity to the Amazon Redshift cluster by using a classic resize operation when the cluster's CPU metrics in Amazon CloudWatch reach 80%.
C. Deploy an AWS Lambda function to add capacity to the Amazon Redshift cluster by using an elastic resize operation when the cluster's CPU metrics in Amazon CloudWatch reach 80%.
D. Turn on the Concurrency Scaling feature for the Amazon Redshift cluster.

ANSWER : D
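
Concurrency Scaling fits here because Redshift adds transient capacity only during bursts and bills it per second, whereas scripted resizes change the paid node count. One related knob can be sketched with boto3 as below; the parameter group name and cap value are placeholder assumptions.

import boto3

redshift = boto3.client("redshift")

# Concurrency Scaling is turned on per WLM queue (concurrency_scaling = auto
# in the wlm_json_configuration parameter); this companion parameter caps how
# many transient clusters Redshift may add to absorb read bursts.
redshift.modify_cluster_parameter_group(
    ParameterGroupName="custom-wlm-params",  # placeholder parameter group
    Parameters=[{
        "ParameterName": "max_concurrency_scaling_clusters",
        "ParameterValue": "4",
        "ApplyType": "dynamic",
    }],
)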


Question # 22

A solutions architect has an operational workload deployed on Amazon EC2 instances in an Auto Scaling group. The VPC architecture spans two Availability Zones (AZs) with a subnet in each that the Auto Scaling group is targeting. The VPC is connected to an on-premises environment, and connectivity cannot be interrupted. The maximum size of the Auto Scaling group is 20 instances in service. The VPC IPv4 addressing is as follows: VPC CIDR: 10.0.0.0/23; AZ1 subnet CIDR: 10.0.0.0/24; AZ2 subnet CIDR: 10.0.1.0/24. Since deployment, a third AZ has become available in the Region. The solutions architect wants to adopt the new AZ without adding additional IPv4 address space and without service downtime. Which solution will meet these requirements?

A. Update the Auto Scaling group to use the AZ2 subnet only. Delete and re-create the AZ1 subnet using half the previous address space. Adjust the Auto Scaling group to also use the new AZ1 subnet. When the instances are healthy, adjust the Auto Scaling group to use the AZ1 subnet only. Remove the current AZ2 subnet. Create a new AZ2 subnet using the second half of the address space from the original AZ1 subnet. Create a new AZ3 subnet using half the original AZ2 subnet address space, then update the Auto Scaling group to target all three new subnets.
B. Terminate the EC2 instances in the AZ1 subnet. Delete and re-create the AZ1 subnet using half the address space. Update the Auto Scaling group to use this new subnet. Repeat this for the second AZ. Define a new subnet in AZ3; then update the Auto Scaling group to target all three new subnets.
C. Create a new VPC with the same IPv4 address space and define three subnets, with one for each AZ. Update the existing Auto Scaling group to target the new subnets in the new VPC.
D. Update the Auto Scaling group to use the AZ2 subnet only. Update the AZ1 subnet to have half the previous address space. Adjust the Auto Scaling group to also use the AZ1 subnet again. When the instances are healthy, adjust the Auto Scaling group to use the AZ1 subnet only. Update the current AZ2 subnet and assign the second half of the address space from the original AZ1 subnet. Create a new AZ3 subnet using half the original AZ2 subnet address space, then update the Auto Scaling group to target all three new subnets.

ANSWER : A


Testimonial

Have a look at what our customers think

Thank you for choosing Amazonawsdumps.com to pass your Amazon certification.