Amazon SAP-C02 dumps

Amazon SAP-C02 Dumps

Amazon AWS Certified Solutions Architect - Professional

Looking for Amazon SAP-C02 Practice Questions? Rejoice, because you have reached your destination. Amazonawsdumps.com has prepared a special kind of test material that adapts to the individual candidate's skillset. Our smart system presents Amazon SAP-C02 Question Answers exactly as they appear in the actual exam. We report your progress at the end of each test to ensure 100% success.

PDF Demo: $35
Test Engine Demo: $45
PDF + Test Engine: $55

Here are some more features of Amazon SAP-C02 PDF:

  •  483 questions with answers
  •  Update date: 19 Dec, 2024
  •  Unlimited practice questions
  •  Routine daily updates
  •  Takes just 1 day to prepare
  •  Exam passing guaranteed on the first attempt
  •  Money-back facility
  •  3 months of free updates

This credential lets certified professionals demonstrate advanced knowledge and skill in automating manual processes, optimizing security, cost, and performance, and providing solutions to complex problems. Through this certification, organizations can also identify and cultivate talent with the vital abilities needed to put cloud projects into action.

Amazonawsdumps.com: the finest way to get your SAP-C02 certificate

At Amazonawsdumps.com, we work day and night to give our clients the best and most accurate AWS SAP-C02 exam material. We have a 100% success rate, millions of positive reviews, and a solid reputation as a trustworthy exam dumps website. You can gain in-depth knowledge of the SAP-C02 exam as well as practical experience. All you have to do to ace the exam and get the best grades is get our Amazon AWS SAP-C02 PDF guide.

Most accurate and up-to-date SAP-C02 real exam question answers

You can get free updates for the SAP-C02 Exam if you purchase braindumps from Amazonawsdumps.com. These updates are available to you for three months. You can easily use our AWS Certified Solutions Architect - Professional Certification dumps on a desktop, laptop, tablet, or smartphone. Customer service is available 24 hours a day, 7 days a week for inquiries and problems with the SAP-C02 Dumps PDF. We offer a money-back guarantee and a 100% success rate. Obtain your SAP-C02 PDF test questions right now.

Who is qualified to sit for the SAP-C02 exam?

The ideal candidate is certified by AWS and has at least two years of experience designing and implementing cloud-based systems. This candidate should be able to evaluate the requirements of cloud applications and make architectural recommendations for deploying workloads on AWS. The candidate should also be able to provide expert guidance on architectural design that spans multiple applications and services within a complex organization.

Complete framework for the AWS SAP-C02 exam

  •  Domain 1: Design Solutions for Organizational Complexity (26%)
  •  Domain 2: Design for New Solutions (29%)
  •  Domain 3: Continuous Improvement for Existing Solutions (25%)
  •  Domain 4: Accelerate Workload Migration and Modernization (20%)

What kinds of questions are included in the SAP-C02 certification exam?

  •  Multiple choice: has one correct response and three incorrect responses (distractors).
  •  Multiple response: has two or more correct responses out of five or more options.

Answer as many questions as possible on this exam: an unanswered question is scored as incorrect, but there is no penalty for wrong answers.

Complete Refund or 100% Success Guaranteed

You can be sure that the SAP-C02 Dumps PDF from Amazonawsdumps.com will help you pass the test. However, if you use our products and don't pass the SAP-C02 exam on your first try, we will arrange a complete refund. Just provide us with your SAP-C02 score report and any relevant documentation. Once your information has been validated, our staff will promptly refund the full amount.

Why Pass Amazon SAP-C02 Exam?

In today's world, you need validation of your skills to get past the competition, and the Amazon SAP-C02 Exam is that validation. Amazon is not only a leading name in IT, it also offers certification exams that prove your Amazon skills and show you are capable of fulfilling an Amazon-related job role. To get certified, you simply pass the SAP-C02 Exam, which brings us to our Amazon SAP-C02 Question Answers set. Passing this certification exam from Amazon may seem easy, but it's not. Many students fail this exam only because they didn't take it seriously. Don't make this mistake; order your Amazon SAP-C02 Braindumps right now!

Amazonawsdumps.com is the most popular and reliable website that has helped thousands of candidates excel at Amazon Exams. You could be one of those fortunate few too. Pass your exam in one attempt with Amazon SAP-C02 PDF and own the future. Buy Now!

Superlative Amazon SAP-C02 Dumps!

We know we said passing Amazon exams is hard, but that's only if you've been led astray. There are millions of Amazon SAP-C02 Practice Questions available online that promise success but fail to deliver when it comes down to it. Choose your training material carefully and get Amazon SAP-C02 Question Answers that are valid, accurate, and approved by well-known IT professionals. Our Amazon SAP-C02 Braindumps are created by experts for experts and generate first-class results in just a single attempt. Don't believe us? Try our free demo version, which contains all the features you'll get with the Amazon SAP-C02 PDF: an interactive design, an easy-to-read format, understandable language, and a concise pattern. And if you still don't get the result you want and fail somehow, you get your money back in full. So, order your set of Amazon SAP-C02 Dumps now!

We promise our customers to take full responsibility for their learning, preparation, and passing of the SAP-C02 Exam without a hitch. Our aim is your satisfaction and ease. That is why we charge only a reasonable price for Amazon SAP-C02 Practice Questions. Moreover, we offer two formats: PDF and online test engine. There is also always a little extra with our discount coupons.

Why Buy Amazon SAP-C02 Question Answers?

The Amazonawsdumps.com team is a group of experts who succeeded with Amazon SAP-C02 Braindumps themselves: we got what we needed to pass the exam, and we went through its challenges as well. That is why we want every Amazon candidate to succeed. Choosing among so many options for an Amazon SAP-C02 PDF is tricky, and sometimes they don't turn out the way they first appeared. That is the reason we offer our valued customers a free demo, so they can take a test run of the Amazon SAP-C02 Dumps before they buy. When it comes to buying, the procedure is simple, secure, and low-risk, because our Amazon SAP-C02 Practice Questions have a 99.8% passing rate.

Amazon SAP-C02 Sample Questions

Question # 1

A company is developing an application that will display financial reports. The company needs a solution that can store financial information that comes from multiple systems. The solution must provide the reports through a web interface and must serve the data with less than 500 milliseconds of latency to end users. The solution also must be highly available and must have an RTO of 30 seconds. Which solution will meet these requirements?

A. Use an Amazon Redshift cluster to store the data. Use a static website that is hosted on Amazon S3 with backend APIs that are served by an Amazon Elastic Kubernetes Service (Amazon EKS) cluster to provide the reports to the application.
B. Use Amazon S3 to store the data. Use Amazon Athena to provide the reports to the application. Use AWS App Runner to serve the application to view the reports.
C. Use Amazon DynamoDB to store the data. Use an embedded Amazon QuickSight dashboard with Direct Query datasets to provide the reports to the application.
D. Use Amazon Keyspaces (for Apache Cassandra) to store the data. Use AWS Elastic Beanstalk to provide the reports to the application.

ANSWER : C
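
For reference, a minimal Python (boto3) sketch of the embedding step from option C: generating an embed URL for a registered QuickSight user so the dashboard can be shown in the reporting web interface. The account ID, user ARN, and dashboard ID below are illustrative placeholders, not values from the question.

import boto3

quicksight = boto3.client("quicksight", region_name="us-east-1")

# Generate a short-lived URL that the web application can load in an iframe.
response = quicksight.generate_embed_url_for_registered_user(
    AwsAccountId="111122223333",                  # placeholder account ID
    SessionLifetimeInMinutes=60,
    UserArn="arn:aws:quicksight:us-east-1:111122223333:user/default/report-viewer",
    ExperienceConfiguration={
        "Dashboard": {"InitialDashboardId": "financial-reports-dashboard"}  # placeholder ID
    },
)
print(response["EmbedUrl"])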


Question # 2

A company has a Windows-based desktop application that is packaged and deployed to the users' Windows machines. The company recently acquired another company that has employees who primarily use machines with a Linux operating system. The acquiring company has decided to migrate and rehost the Windows-based desktop application to AWS. All employees must be authenticated before they use the application. The acquiring company uses Active Directory on premises but wants a simplified way to manage access to the application on AWS for all the employees. Which solution will rehost the application on AWS with the LEAST development effort?

A. Set up and provision an Amazon WorkSpaces virtual desktop for every employee. Implement authentication by using Amazon Cognito identity pools. Instruct employees to run the application from their provisioned WorkSpaces virtual desktops.
B. Create an Auto Scaling group of Windows-based Amazon EC2 instances. Join each EC2 instance to the company's Active Directory domain. Implement authentication by using the Active Directory that is running on premises. Instruct employees to run the application by using a Windows remote desktop.
C. Use an Amazon AppStream 2.0 image builder to create an image that includes the application and the required configurations. Provision an AppStream 2.0 On-Demand fleet with the dynamic Fleet Auto Scaling process for running the image. Implement authentication by using AppStream 2.0 user pools. Instruct the employees to access the application by starting browser-based AppStream 2.0 streaming sessions.
D. Refactor and containerize the application to run as a web-based application. Run the application in Amazon Elastic Container Service (Amazon ECS) on AWS Fargate with step scaling policies. Implement authentication by using Amazon Cognito user pools. Instruct the employees to run the application from their browsers.

ANSWER : C


Question # 3

A company has Linux-based Amazon EC2 instances. Users must access the instances by using SSH with EC2 SSH key pairs. Each machine requires a unique EC2 key pair. The company wants to implement a key rotation policy that will, upon request, automatically rotate all the EC2 key pairs and keep the keys in a securely encrypted place. The company will accept less than 1 minute of downtime during key rotation. Which solution will meet these requirements?

A. Store all the keys in AWS Secrets Manager. Define a Secrets Manager rotation schedule to invoke an AWS Lambda function to generate new key pairs. Replace public keys on EC2 instances. Update the private keys in Secrets Manager.
B. Store all the keys in Parameter Store, a capability of AWS Systems Manager, as a string. Define a Systems Manager maintenance window to invoke an AWS Lambda function to generate new key pairs. Replace public keys on EC2 instances. Update the private keys in Parameter Store.
C. Import the EC2 key pairs into AWS Key Management Service (AWS KMS). Configure automatic key rotation for these key pairs. Create an Amazon EventBridge scheduled rule to invoke an AWS Lambda function to initiate the key rotation in AWS KMS.
D. Add all the EC2 instances to Fleet Manager, a capability of AWS Systems Manager. Define a Systems Manager maintenance window to issue a Systems Manager Run Command document to generate new key pairs and to rotate public keys to all the instances in Fleet Manager.

ANSWER : A
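
A minimal boto3 sketch of the wiring described in option A: one secret per instance key pair, with a rotation Lambda attached through Secrets Manager. The secret name and Lambda ARN are placeholders, and the rotation function itself (generate a new key pair, push the public key to the instance, store the private key) is assumed to exist.

import boto3

secrets = boto3.client("secretsmanager")

# Store the current private key as one secret per instance key pair.
secrets.create_secret(
    Name="ec2/ssh/i-0abc123example",           # placeholder secret name
    SecretString="<PEM-encoded private key>",  # placeholder value
)

# Attach the rotation Lambda; Secrets Manager invokes it on the schedule,
# and rotate_secret can also be called again on demand.
secrets.rotate_secret(
    SecretId="ec2/ssh/i-0abc123example",
    RotationLambdaARN="arn:aws:lambda:us-east-1:111122223333:function:rotate-ec2-keypair",
    RotationRules={"AutomaticallyAfterDays": 30},
)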


Question # 4

A company needs to gather data from an experiment in a remote location that does not have internet connectivity. During the experiment, sensors that are connected to a local network will generate 6 TB of data in a proprietary format over the course of 1 week. The sensors can be configured to upload their data files to an FTP server periodically, but the sensors do not have their own FTP server. The sensors also do not support other protocols. The company needs to collect the data centrally and move the data to object storage in the AWS Cloud as soon as possible after the experiment. Which solution will meet these requirements?

A. Order an AWS Snowball Edge Compute Optimized device. Connect the device to the local network. Configure AWS DataSync with a target bucket name, and unload the data over NFS to the device. After the experiment, return the device to AWS so that the data can be loaded into Amazon S3.
B. Order an AWS Snowcone device, including an Amazon Linux 2 AMI. Connect the device to the local network. Launch an Amazon EC2 instance on the device. Create a shell script that periodically downloads data from each sensor. After the experiment, return the device to AWS so that the data can be loaded as an Amazon Elastic Block Store (Amazon EBS) volume.
C. Order an AWS Snowcone device, including an Amazon Linux 2 AMI. Connect the device to the local network. Launch an Amazon EC2 instance on the device. Install and configure an FTP server on the EC2 instance. Configure the sensors to upload data to the EC2 instance. After the experiment, return the device to AWS so that the data can be loaded into Amazon S3.
D. Order an AWS Snowcone device. Connect the device to the local network. Configure the device to use Amazon FSx. Configure the sensors to upload data to the device. Configure AWS DataSync on the device to synchronize the uploaded data with an Amazon S3 bucket. Return the device to AWS so that the data can be loaded as an Amazon Elastic Block Store (Amazon EBS) volume.

ANSWER : C


Question # 5

A company is using an organization in AWS Organizations to manage AWS accounts. For each new project, the company creates a new linked account. After the creation of a new account, the root user signs in to the new account and creates a service request to increase the service quota for Amazon EC2 instances. A solutions architect needs to automate this process. Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Amazon EventBridge rule to detect creation of a new account. Send the event to an Amazon Simple Notification Service (Amazon SNS) topic that invokes an AWS Lambda function. Configure the Lambda function to run the request-service-quota-increase command to request a service quota increase for EC2 instances.
B. Create a Service Quotas request template in the management account. Configure the desired service quota increases for EC2 instances.
C. Create an AWS Config rule in the management account to set the service quota for EC2 instances.
D. Create an Amazon EventBridge rule to detect creation of a new account. Send the event to an Amazon Simple Notification Service (Amazon SNS) topic that invokes an AWS Lambda function. Configure the Lambda function to run the create-case command to request a service quota increase for EC2 instances.

ANSWER : A
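
A minimal sketch of the Lambda function from option A, assuming it is invoked (via SNS) for an account-creation event and that a cross-account role such as OrganizationAccountAccessRole exists in the new account. The account-ID extraction, role name, quota code, and desired value are illustrative assumptions.

import boto3

def lambda_handler(event, context):
    # How the new account ID is extracted depends on how the EventBridge/SNS
    # trigger is wired; treat this line as a placeholder.
    account_id = event["account_id"]

    # Assume a role in the newly created member account.
    creds = boto3.client("sts").assume_role(
        RoleArn=f"arn:aws:iam::{account_id}:role/OrganizationAccountAccessRole",
        RoleSessionName="quota-bootstrap",
    )["Credentials"]

    quotas = boto3.client(
        "service-quotas",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

    # Equivalent to the request-service-quota-increase CLI command.
    quotas.request_service_quota_increase(
        ServiceCode="ec2",
        QuotaCode="L-1216C47A",   # example quota code; verify for your use case
        DesiredValue=256.0,
    )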


Question # 6

A company is currently in the design phase of an application that will need an RPO of less than 5 minutes and an RTO of less than 10 minutes. The solutions architecture team is forecasting that the database will store approximately 10 TB of data. As part of the design, the team is looking for a database solution that will provide the company with the ability to fail over to a secondary Region. Which solution will meet these business requirements at the LOWEST cost?

A. Deploy an Amazon Aurora DB cluster and take snapshots of the cluster every 5 minutes. Once a snapshot is complete, copy the snapshot to a secondary Region to serve as a backup in the event of a failure.
B. Deploy an Amazon RDS instance with a cross-Region read replica in a secondary Region. In the event of a failure, promote the read replica to become the primary.
C. Deploy an Amazon Aurora DB cluster in the primary Region and another in a secondary Region. Use AWS DMS to keep the secondary Region in sync.
D. Deploy an Amazon RDS instance with a read replica in the same Region. In the event of a failure, promote the read replica to become the primary.

ANSWER : B
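
A minimal boto3 sketch of option B: create a cross-Region read replica, then promote it during a failover. The Regions, instance identifiers, and account ID are placeholders.

import boto3

# Create the cross-Region read replica in the DR Region.
rds_dr = boto3.client("rds", region_name="us-west-2")
rds_dr.create_db_instance_read_replica(
    DBInstanceIdentifier="app-db-replica",
    SourceDBInstanceIdentifier="arn:aws:rds:us-east-1:111122223333:db:app-db",
    SourceRegion="us-east-1",   # lets boto3 presign the cross-Region copy request
)

# During a failover, promote the replica to a standalone primary.
rds_dr.promote_read_replica(DBInstanceIdentifier="app-db-replica")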


Question # 7

A company needs to improve the reliability of its ticketing application. The application runs on an Amazon Elastic Container Service (Amazon ECS) cluster. The company uses Amazon CloudFront to serve the application. A single ECS service of the ECS cluster is the CloudFront distribution's origin. The application allows only a specific number of active users to enter a ticket purchasing flow. These users are identified by an encrypted attribute in their JSON Web Token (JWT). All other users are redirected to a waiting room module until there is available capacity for purchasing. The application is experiencing high loads. The waiting room module is working as designed, but load on the waiting room is disrupting the application's availability. This disruption is negatively affecting the application's ticket sale transactions. Which solution will provide the MOST reliability for ticket sale transactions during periods of high load?

A. Create a separate service in the ECS cluster for the waiting room. Use a separate scaling configuration. Ensure that the ticketing service uses the JWT information and appropriately forwards requests to the waiting room service.
B. Move the application to an Amazon Elastic Kubernetes Service (Amazon EKS) cluster. Split the waiting room module into a pod that is separate from the ticketing pod. Make the ticketing pod part of a StatefulSet. Ensure that the ticketing pod uses the JWT information and appropriately forwards requests to the waiting room pod.
C. Create a separate service in the ECS cluster for the waiting room. Use a separate scaling configuration. Create a CloudFront function that inspects the JWT information and appropriately forwards requests to the ticketing service or the waiting room service.
D. Move the application to an Amazon Elastic Kubernetes Service (Amazon EKS) cluster. Split the waiting room module into a pod that is separate from the ticketing pod. Use AWS App Mesh by provisioning the App Mesh controller for Kubernetes. Enable mTLS authentication and service-to-service authentication for communication between the ticketing pod and the waiting room pod. Ensure that the ticketing pod uses the JWT information and appropriately forwards requests to the waiting room pod.

ANSWER : C


Question # 8

A software development company has multiple engineers who are working remotely. The company is running Active Directory Domain Services (AD DS) on an Amazon EC2 instance. The company's security policy states that all internal, nonpublic services that are deployed in a VPC must be accessible through a VPN. Multi-factor authentication (MFA) must be used for access to the VPN. What should a solutions architect do to meet these requirements?

A. Create an AWS Site-to-Site VPN connection. Configure integration between the VPN and AD DS. Use an Amazon WorkSpaces client with MFA support enabled to establish a VPN connection.
B. Create an AWS Client VPN endpoint. Create an AD Connector directory for integration with AD DS. Enable MFA for AD Connector. Use AWS Client VPN to establish a VPN connection.
C. Create multiple AWS Site-to-Site VPN connections by using AWS VPN CloudHub. Configure integration between AWS VPN CloudHub and AD DS. Use AWS Copilot to establish a VPN connection.
D. Create an Amazon WorkLink endpoint. Configure integration between Amazon WorkLink and AD DS. Enable MFA in Amazon WorkLink. Use AWS Client VPN to establish a VPN connection.

ANSWER : B
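
A minimal boto3 sketch of option B's Client VPN endpoint, authenticated against an AD Connector directory (MFA is enabled on the AD Connector itself, outside this call). The client CIDR, certificate ARN, and directory ID are placeholders.

import boto3

ec2 = boto3.client("ec2")

ec2.create_client_vpn_endpoint(
    ClientCidrBlock="10.100.0.0/22",
    ServerCertificateArn="arn:aws:acm:us-east-1:111122223333:certificate/example",
    AuthenticationOptions=[
        {
            "Type": "directory-service-authentication",
            "ActiveDirectory": {"DirectoryId": "d-1234567890"},  # AD Connector directory
        }
    ],
    ConnectionLogOptions={"Enabled": False},
    Description="Remote-engineer VPN backed by AD Connector with MFA",
)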


Question # 9

A company wants to use Amazon WorkSpaces in combination with thin client devices to replace aging desktops. Employees use the desktops to access applications that work with clinical trial data. Corporate security policy states that access to the applications must be restricted to only company branch office locations. The company is considering adding an additional branch office in the next 6 months. Which solution meets these requirements with the MOST operational efficiency?

A. Create an IP access control group rule with the list of public addresses from the branch offices. Associate the IP access control group with the WorkSpaces directory.
B. Use AWS Firewall Manager to create a web ACL rule with an IPSet with the list of public addresses from the branch office locations. Associate the web ACL with the WorkSpaces directory.
C. Use AWS Certificate Manager (ACM) to issue trusted device certificates to the machines deployed in the branch office locations. Enable restricted access on the WorkSpaces directory.
D. Create a custom WorkSpaces image with Windows Firewall configured to restrict access to the public addresses of the branch offices. Use the image to deploy the WorkSpaces.

ANSWER : A
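
A minimal boto3 sketch of option A: create a WorkSpaces IP access control group with the branch-office public CIDRs and associate it with the directory. The CIDRs and directory ID are placeholders. When the new branch office opens, its CIDR can be added to the same group with update_rules_of_ip_group, without touching the WorkSpaces themselves.

import boto3

workspaces = boto3.client("workspaces")

group = workspaces.create_ip_group(
    GroupName="branch-office-access",
    GroupDesc="Allow WorkSpaces access only from branch offices",
    UserRules=[
        {"ipRule": "203.0.113.0/24", "ruleDesc": "Branch office 1"},
        {"ipRule": "198.51.100.0/24", "ruleDesc": "Branch office 2"},
    ],
)

workspaces.associate_ip_groups(
    DirectoryId="d-1234567890",          # placeholder directory ID
    GroupIds=[group["GroupId"]],
)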


Question # 10

A company needs to implement disaster recovery for a critical application that runs in a single AWS Region. The application's users interact with a web frontend that is hosted on Amazon EC2 instances behind an Application Load Balancer (ALB). The application writes to an Amazon RDS for MySQL DB instance. The application also outputs processed documents that are stored in an Amazon S3 bucket. The company's finance team directly queries the database to run reports. During busy periods, these queries consume resources and negatively affect application performance. A solutions architect must design a solution that will provide resiliency during a disaster. The solution must minimize data loss and must resolve the performance problems that result from the finance team's queries. Which solution will meet these requirements?

A. Migrate the database to Amazon DynamoDB and use DynamoDB global tables. Instruct the finance team to query a global table in a separate Region. Create an AWS Lambda function to periodically synchronize the contents of the original S3 bucket to a new S3 bucket in the separate Region. Launch EC2 instances and create an ALB in the separate Region. Configure the application to point to the new S3 bucket.
B. Launch additional EC2 instances that host the application in a separate Region. Add the additional instances to the existing ALB. In the separate Region, create a read replica of the RDS DB instance. Instruct the finance team to run queries against the read replica. Use S3 Cross-Region Replication (CRR) from the original S3 bucket to a new S3 bucket in the separate Region. During a disaster, promote the read replica to a standalone DB instance. Configure the application to point to the new S3 bucket and to the newly promoted read replica.
C. Create a read replica of the RDS DB instance in a separate Region. Instruct the finance team to run queries against the read replica. Create AMIs of the EC2 instances that host the application frontend. Copy the AMIs to the separate Region. Use S3 Cross-Region Replication (CRR) from the original S3 bucket to a new S3 bucket in the separate Region. During a disaster, promote the read replica to a standalone DB instance. Launch EC2 instances from the AMIs and create an ALB to present the application to end users. Configure the application to point to the new S3 bucket.
D. Create hourly snapshots of the RDS DB instance. Copy the snapshots to a separate Region. Add an Amazon ElastiCache cluster in front of the existing RDS database. Create AMIs of the EC2 instances that host the application frontend. Copy the AMIs to the separate Region. Use S3 Cross-Region Replication (CRR) from the original S3 bucket to a new S3 bucket in the separate Region. During a disaster, restore the database from the latest RDS snapshot. Launch EC2 instances from the AMIs and create an ALB to present the application to end users. Configure the application to point to the new S3 bucket.

ANSWER : C


Question # 11

A public retail web application uses an Application Load Balancer (ALB) in front of Amazon EC2 instances running across multiple Availability Zones (AZs) in a Region, backed by an Amazon RDS MySQL Multi-AZ deployment. Target group health checks are configured to use HTTP and are pointed at the product catalog page. Auto Scaling is configured to maintain the web fleet size based on the ALB health check. Recently, the application experienced an outage. Auto Scaling continuously replaced the instances during the outage. A subsequent investigation determined that the web server metrics were within the normal range, but the database tier was experiencing high load, resulting in severely elevated query response times. Which of the following changes together would remediate these issues while improving monitoring capabilities for the availability and functionality of the entire application stack for future growth? (Select TWO.)

A. Configure read replicas for Amazon RDS MySQL and use the single reader endpoint in the web application to reduce the load on the backend database tier.
B. Configure the target group health check to point at a simple HTML page instead of a product catalog page, and the Amazon Route 53 health check against the product page to evaluate full application functionality. Configure Amazon CloudWatch alarms to notify administrators when the site fails.
C. Configure the target group health check to use a TCP check of the Amazon EC2 web server, and the Amazon Route 53 health check against the product page to evaluate full application functionality. Configure Amazon CloudWatch alarms to notify administrators when the site fails.
D. Configure an Amazon CloudWatch alarm for Amazon RDS with an action to recover a high-load, impaired RDS instance in the database tier.
E. Configure an Amazon ElastiCache cluster and place it between the web application and RDS MySQL instances to reduce the load on the backend database tier.

ANSWER : A,E


Question # 12

A company hosts an intranet web application on Amazon EC2 instances behind an Application Load Balancer (ALB). Currently, users authenticate to the application against an internal user database. The company needs to authenticate users to the application by using an existing AWS Directory Service for Microsoft Active Directory directory. All users with accounts in the directory must have access to the application. Which solution will meet these requirements?

A. Create a new app client in the directory. Create a listener rule for the ALB. Specify the authenticate-oidc action for the listener rule. Configure the listener rule with the appropriate issuer, client ID and secret, and endpoint details for the Active Directory service. Configure the new app client with the callback URL that the ALB provides.
B. Configure an Amazon Cognito user pool. Configure the user pool with a federated identity provider (IdP) that has metadata from the directory. Create an app client. Associate the app client with the user pool. Create a listener rule for the ALB. Specify the authenticate-cognito action for the listener rule. Configure the listener rule to use the user pool and app client.
C. Add the directory as a new IAM identity provider (IdP). Create a new IAM role that has an entity type of SAML 2.0 federation. Configure a role policy that allows access to the ALB. Configure the new role as the default authenticated user role for the IdP. Create a listener rule for the ALB. Specify the authenticate-oidc action for the listener rule.
D. Enable AWS IAM Identity Center (AWS Single Sign-On). Configure the directory as an external identity provider (IdP) that uses SAML. Use the automatic provisioning method. Create a new IAM role that has an entity type of SAML 2.0 federation. Configure a role policy that allows access to the ALB. Attach the new role to all groups. Create a listener rule for the ALB. Specify the authenticate-cognito action for the listener rule.

ANSWER : A
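
A minimal boto3 sketch of the listener rule from option A, chaining an authenticate-oidc action in front of the forward action. The ARNs, OIDC endpoints, and client credentials are placeholders supplied by the identity provider that fronts the directory.

import boto3

elbv2 = boto3.client("elbv2")

elbv2.create_rule(
    ListenerArn="arn:aws:elasticloadbalancing:us-east-1:111122223333:listener/app/intranet/abc/def",
    Priority=10,
    Conditions=[{"Field": "path-pattern", "Values": ["/*"]}],
    Actions=[
        {
            "Type": "authenticate-oidc",
            "Order": 1,
            "AuthenticateOidcConfig": {
                "Issuer": "https://idp.example.com",
                "AuthorizationEndpoint": "https://idp.example.com/authorize",
                "TokenEndpoint": "https://idp.example.com/token",
                "UserInfoEndpoint": "https://idp.example.com/userinfo",
                "ClientId": "alb-app-client",          # placeholder
                "ClientSecret": "example-secret",      # placeholder
                "OnUnauthenticatedRequest": "authenticate",
            },
        },
        {
            "Type": "forward",
            "Order": 2,
            "TargetGroupArn": "arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/intranet/123",
        },
    ],
)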


Question # 13

A company wants to establish a dedicated connection between its on-premises infrastructure and AWS. The company is setting up a 1 Gbps AWS Direct Connect connection to its account VPC. The architecture includes a transit gateway and a Direct Connect gateway to connect multiple VPCs and the on-premises infrastructure. The company must connect to VPC resources over a transit VIF by using the Direct Connect connection. Which combination of steps will meet these requirements? (Select TWO.)

A. Update the 1 Gbps Direct Connect connection to 10 Gbps.
B. Advertise the on-premises network prefixes over the transit VIF.
C. Advertise the VPC prefixes from the Direct Connect gateway to the on-premises network over the transit VIF.
D. Update the Direct Connect connection's MACsec encryption mode attribute to must_encrypt.
E. Associate a MACsec Connection Key Name/Connectivity Association Key (CKN/CAK) pair with the Direct Connect connection.

ANSWER : B,C


Question # 14

A company has many services running in its on-premises data center. The data center is connected to AWS using AWS Direct Connect (DX) and an IPsec VPN. The service data is sensitive, and connectivity cannot traverse the internet. The company wants to expand into a new market segment and begin offering its services to other companies that are using AWS. Which solution will meet these requirements?

A. Create a VPC Endpoint Service that accepts TCP traffic, host it behind a Network Load Balancer, and make the service available over DX.
B. Create a VPC Endpoint Service that accepts HTTP or HTTPS traffic, host it behind an Application Load Balancer, and make the service available over DX.
C. Attach an internet gateway to the VPC, and ensure that network access control and security group rules allow the relevant inbound and outbound traffic.
D. Attach a NAT gateway to the VPC, and ensure that network access control and security group rules allow the relevant inbound and outbound traffic.

ANSWER : B


Question # 15

A flood monitoring agency has deployed more than 10,000 water-level monitoring sensors. Sensors send continuous data updates, and each update is less than 1 MB in size. The agency has a fleet of on-premises application servers. These servers receive updates from the sensors, convert the raw data into a human-readable format, and write the results to an on-premises relational database server. Data analysts then use simple SQL queries to monitor the data. The agency wants to increase overall application availability and reduce the effort that is required to perform maintenance tasks. These maintenance tasks, which include updates and patches to the application servers, cause downtime. While an application server is down, data is lost from sensors because the remaining servers cannot handle the entire workload. The agency wants a solution that optimizes operational overhead and costs. A solutions architect recommends the use of AWS IoT Core to collect the sensor data. What else should the solutions architect recommend to meet these requirements?

A. Send the sensor data to Amazon Kinesis Data Firehose. Use an AWS Lambda function to read the Kinesis Data Firehose data, convert it to .csv format, and insert it into an Amazon Aurora MySQL DB instance. Instruct the data analysts to query the data directly from the DB instance.
B. Send the sensor data to Amazon Kinesis Data Firehose. Use an AWS Lambda function to read the Kinesis Data Firehose data, convert it to Apache Parquet format, and save it to an Amazon S3 bucket. Instruct the data analysts to query the data by using Amazon Athena.
C. Send the sensor data to an Amazon Managed Service for Apache Flink (previously known as Amazon Kinesis Data Analytics) application to convert the data to .csv format and store it in an Amazon S3 bucket. Import the data into an Amazon Aurora MySQL DB instance. Instruct the data analysts to query the data directly from the DB instance.
D. Send the sensor data to an Amazon Managed Service for Apache Flink (previously known as Amazon Kinesis Data Analytics) application to convert the data to Apache Parquet format and store it in an Amazon S3 bucket. Instruct the data analysts to query the data by using Amazon Athena.

ANSWER : B


Question # 16

A company that is developing a mobile game is making game assets available in two AWS Regions. Game assets are served from a set of Amazon EC2 instances behind an Application Load Balancer (ALB) in each Region. The company requires game assets to be fetched from the closest Region. If game assets become unavailable in the closest Region, they should be fetched from the other Region. What should a solutions architect do to meet these requirements?

A. Create an Amazon CloudFront distribution. Create an origin group with one origin for each ALB. Set one of the origins as primary.
B. Create an Amazon Route 53 health check for each ALB. Create a Route 53 failover routing record pointing to the two ALBs. Set the Evaluate Target Health value to Yes.
C. Create two Amazon CloudFront distributions, each with one ALB as the origin. Create an Amazon Route 53 failover routing record pointing to the two CloudFront distributions. Set the Evaluate Target Health value to Yes.
D. Create an Amazon Route 53 health check for each ALB. Create a Route 53 latency alias record pointing to the two ALBs. Set the Evaluate Target Health value to Yes.

ANSWER : A
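
For reference, the shape of the CloudFront origin group from option A, expressed as the Python dictionary fragment that would sit inside a DistributionConfig passed to create_distribution. The origin IDs are placeholders and must match origins defined elsewhere in the same configuration.

# Origin group with failover between the two Regional ALB origins.
origin_groups = {
    "Quantity": 1,
    "Items": [
        {
            "Id": "game-assets-failover",
            "FailoverCriteria": {
                "StatusCodes": {"Quantity": 4, "Items": [500, 502, 503, 504]}
            },
            "Members": {
                "Quantity": 2,
                "Items": [
                    {"OriginId": "alb-us-east-1"},   # primary origin (placeholder ID)
                    {"OriginId": "alb-eu-west-1"},   # secondary origin (placeholder ID)
                ],
            },
        }
    ],
}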


Question # 17

An e-commerce company is revamping its IT infrastructure and is planning to use AWS services. The company's CIO has asked a solutions architect to design a simple, highly available, and loosely coupled order processing application. The application is responsible for receiving and processing orders before storing them in an Amazon DynamoDB table. The application has a sporadic traffic pattern and should be able to scale during marketing campaigns to process the orders with minimal delays. Which of the following is the MOST reliable approach to meet the requirements?

A. Receive the orders in an Amazon EC2-hosted database and use EC2 instances to process them.
B. Receive the orders in an Amazon SQS queue and invoke an AWS Lambda function to process them.
C. Receive the orders using the AWS Step Functions program and launch an Amazon ECS container to process them.
D. Receive the orders in Amazon Kinesis Data Streams and use Amazon EC2 instances to process them.

ANSWER : B
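
A minimal sketch of the Lambda consumer from option B, assuming an SQS event source mapping delivers batches of order messages and that an Orders DynamoDB table exists (both names are placeholders).

import json
import boto3

dynamodb = boto3.resource("dynamodb")
orders_table = dynamodb.Table("Orders")   # placeholder table name

def lambda_handler(event, context):
    # Invoked by the SQS event source mapping with a batch of order messages.
    for record in event["Records"]:
        order = json.loads(record["body"])
        orders_table.put_item(Item=order)
    return {"processed": len(event["Records"])}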


Question # 18

A company deploys workloads in multiple AWS accounts. Each account has a VPC with VPC flow logs published in text log format to a centralized Amazon S3 bucket. Each log file is compressed with gzip compression. The company must retain the log files indefinitely. A security engineer occasionally analyzes the logs by using Amazon Athena to query the VPC flow logs. The query performance is degrading over time as the number of ingested logs is growing. A solutions architect must improve the performance of the log analysis and reduce the storage space that the VPC flow logs use. Which solution will meet these requirements with the LARGEST performance improvement?

A. Create an AWS Lambda function to decompress the gzip files and to compress the files with bzip2 compression. Subscribe the Lambda function to an s3:ObjectCreated:Put S3 event notification for the S3 bucket.
B. Enable S3 Transfer Acceleration for the S3 bucket. Create an S3 Lifecycle configuration to move files to the S3 Intelligent-Tiering storage class as soon as the files are uploaded.
C. Update the VPC flow log configuration to store the files in Apache Parquet format. Specify hourly partitions for the log files.
D. Create a new Athena workgroup without data usage control limits. Use Athena engine version 2.

ANSWER : C
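
A minimal boto3 sketch of option C, creating a flow log that writes Parquet files with Hive-compatible, per-hour partitions to S3. The VPC ID and bucket ARN are placeholders; replacing the existing text-format flow logs is a separate step.

import boto3

ec2 = boto3.client("ec2")

ec2.create_flow_logs(
    ResourceType="VPC",
    ResourceIds=["vpc-0abc123example"],               # placeholder VPC ID
    TrafficType="ALL",
    LogDestinationType="s3",
    LogDestination="arn:aws:s3:::central-flow-logs-bucket",   # placeholder bucket
    DestinationOptions={
        "FileFormat": "parquet",
        "HiveCompatiblePartitions": True,
        "PerHourPartition": True,
    },
)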


Question # 19

A company is designing an AWS environment for a manufacturing application. The application has been successful with customers, and the application's user base has increased. The company has connected the AWS environment to the company's on-premises data center through a 1 Gbps AWS Direct Connect connection. The company has configured BGP for the connection. The company must update the existing network connectivity solution to ensure that the solution is highly available, fault tolerant, and secure. Which solution will meet these requirements MOST cost-effectively?

A. Add a dynamic private IP AWS Site-to-Site VPN as a secondary path to secure data in transit and provide resilience for the Direct Connect connection. Configure MACsec to encrypt traffic inside the Direct Connect connection.
B. Provision another Direct Connect connection between the company's on-premises data center and AWS to increase the transfer speed and provide resilience. Configure MACsec to encrypt traffic inside the Direct Connect connection.
C. Configure multiple private VIFs. Load balance data across the VIFs between the on-premises data center and AWS to provide resilience.
D. Add a static AWS Site-to-Site VPN as a secondary path to secure data in transit and to provide resilience for the Direct Connect connection.

ANSWER : A


Question # 20

A company has an IoT platform that runs in an on-premises environment. The platform consists of a server that connects to IoT devices by using the MQTT protocol. The platform collects telemetry data from the devices at least once every 5 minutes. The platform also stores device metadata in a MongoDB cluster. An application that is installed on an on-premises machine runs periodic jobs to aggregate and transform the telemetry and device metadata. The application creates reports that users view by using another web application that runs on the same on-premises machine. The periodic jobs take 120-600 seconds to run. However, the web application is always running. The company is moving the platform to AWS and must reduce the operational overhead of the stack. Which combination of steps will meet these requirements with the LEAST operational overhead? (Select THREE.)

A. Use AWS Lambda functions to connect to the IoT devices.
B. Configure the IoT devices to publish to AWS IoT Core.
C. Write the metadata to a self-managed MongoDB database on an Amazon EC2 instance.
D. Write the metadata to Amazon DocumentDB (with MongoDB compatibility).
E. Use AWS Step Functions state machines with AWS Lambda tasks to prepare the reports and to write the reports to Amazon S3. Use Amazon CloudFront with an S3 origin to serve the reports.
F. Use an Amazon Elastic Kubernetes Service (Amazon EKS) cluster with Amazon EC2 instances to prepare the reports. Use an ingress controller in the EKS cluster to serve the reports.

ANSWER : B,D,E


Question # 21

A solutions architect is preparing to deploy a new security tool into several previously unused AWS Regions. The solutions architect will deploy the tool by using an AWS CloudFormation stack set. The stack set's template contains an IAM role that has a custom name. Upon creation of the stack set, no stack instances are created successfully. What should the solutions architect do to deploy the stacks successfully?

A. Enable the new Regions in all relevant accounts. Specify the CAPABILITY_NAMED_IAM capability during the creation of the stack set.
B. Use the Service Quotas console to request a quota increase for the number of CloudFormation stacks in each new Region in all relevant accounts. Specify the CAPABILITY_IAM capability during the creation of the stack set.
C. Specify the CAPABILITY_NAMED_IAM capability and the SELF_MANAGED permissions model during the creation of the stack set.
D. Specify an administration role ARN and the CAPABILITY_IAM capability during the creation of the stack set.

ANSWER : A
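
A minimal boto3 sketch showing the CAPABILITY_NAMED_IAM acknowledgement from option A at stack set creation, followed by stack instance creation in the newly enabled Regions. The template URL, account IDs, and Regions are placeholders.

import boto3

cfn = boto3.client("cloudformation")

cfn.create_stack_set(
    StackSetName="security-tool",
    TemplateURL="https://s3.amazonaws.com/example-bucket/security-tool.yaml",   # placeholder
    Capabilities=["CAPABILITY_NAMED_IAM"],   # required because the template names an IAM role
)

cfn.create_stack_instances(
    StackSetName="security-tool",
    Accounts=["111122223333", "444455556666"],       # placeholder account IDs
    Regions=["eu-south-1", "ap-southeast-3"],        # example newly enabled Regions
)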


Question # 22

A company is planning a migration from an on-premises data center to the AWS Cloud. The company plans to use multiple AWS accounts that are managed in an organization in AWS Organizations. The company will create a small number of accounts initially and will add accounts as needed. A solutions architect must design a solution that turns on AWS CloudTrail in all AWS accounts. What is the MOST operationally efficient solution that meets these requirements?

A. Create an AWS Lambda function that creates a new CloudTrail trail in all AWS accounts in the organization. Invoke the Lambda function daily by using a scheduled action in Amazon EventBridge.
B. Create a new CloudTrail trail in the organization's management account. Configure the trail to log all events for all AWS accounts in the organization.
C. Create a new CloudTrail trail in all AWS accounts in the organization. Create new trails whenever a new account is created.
D. Create an AWS Systems Manager Automation runbook that creates a CloudTrail trail in all AWS accounts in the organization. Invoke the automation by using Systems Manager State Manager.

ANSWER : B
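
A minimal boto3 sketch of option B, run in the organization's management account: a single multi-Region organization trail covers every current and future member account. The trail and bucket names are placeholders, and the bucket policy must already allow CloudTrail to write organization logs.

import boto3

cloudtrail = boto3.client("cloudtrail")

cloudtrail.create_trail(
    Name="org-trail",
    S3BucketName="org-cloudtrail-logs-example",   # placeholder bucket name
    IsOrganizationTrail=True,    # applies to all current and future member accounts
    IsMultiRegionTrail=True,
)
cloudtrail.start_logging(Name="org-trail")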


Question # 23

A company wants to migrate an Amazon Aurora MySQL DB cluster from an existing AWS account to a new AWS account in the same AWS Region. Both accounts are members of the same organization in AWS Organizations. The company must minimize database service interruption before the company performs DNS cutover to the new database. Which migration strategy will meet this requirement?

A. Take a snapshot of the existing Aurora database. Share the snapshot with the new AWS account. Create an Aurora DB cluster in the new account from the snapshot.
B. Create an Aurora DB cluster in the new AWS account. Use AWS Database Migration Service (AWS DMS) to migrate data between the two Aurora DB clusters.
C. Use AWS Backup to share an Aurora database backup from the existing AWS account to the new AWS account. Create an Aurora DB cluster in the new AWS account from the snapshot.
D. Create an Aurora DB cluster in the new AWS account. Use AWS Application Migration Service to migrate data between the two Aurora DB clusters.

ANSWER : B


Question # 24

A company has a web application that uses Amazon API Gateway, AWS Lambda, and Amazon DynamoDB. A recent marketing campaign has increased demand. Monitoring software reports that many requests have significantly longer response times than before the marketing campaign. A solutions architect enabled Amazon CloudWatch Logs for API Gateway and noticed that errors are occurring on 20% of the requests. In CloudWatch, the Lambda function's Throttles metric represents 1% of the requests and the Errors metric represents 10% of the requests. Application logs indicate that, when errors occur, there is a call to DynamoDB. What change should the solutions architect make to improve the current response times as the web application becomes more popular?

A. Increase the concurrency limit of the Lambda function
B. Implement DynamoDB auto scaling on the table
C. Increase the API Gateway throttle limit
D. Re-create the DynamoDB table with a better-partitioned primary index.

ANSWER : B
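
A minimal boto3 sketch of option B, using Application Auto Scaling to add target-tracking write-capacity scaling to the table. The table name and capacity bounds are placeholders; a similar pair of calls handles read capacity.

import boto3

autoscaling = boto3.client("application-autoscaling")

autoscaling.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId="table/Orders",                            # placeholder table name
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    MinCapacity=5,
    MaxCapacity=1000,
)

autoscaling.put_scaling_policy(
    PolicyName="orders-write-scaling",
    ServiceNamespace="dynamodb",
    ResourceId="table/Orders",
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,   # keep consumed capacity near 70% of provisioned
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBWriteCapacityUtilization"
        },
    },
)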

