
SAP-C02 AWS Certified Solutions Architect - Professional practice material is now stable and up to date | Test Your Knowledge for Free

Exams4sure Dumps

SAP-C02 Practice Questions

AWS Certified Solutions Architect - Professional

Last Update: 17 hours ago
Total Questions: 645

Dive into our fully updated and stable SAP-C02 practice test platform, featuring all the latest AWS Certified Professional exam questions added this week. Our preparation tool is more than just an Amazon Web Services study aid; it's a strategic advantage.

Our free AWS Certified Professional practice questions are crafted to reflect the domains and difficulty of the actual exam. The detailed rationales explain the 'why' behind each answer, reinforcing key SAP-C02 concepts. Use this test to pinpoint the areas where you need to focus your study.

SAP-C02 PDF

SAP-C02 PDF (Printable)
$43.75
$124.99

SAP-C02 Testing Engine

SAP-C02 Testing Engine
$50.75
$144.99

SAP-C02 PDF + Testing Engine

SAP-C02 PDF + Testing Engine
$63.70
$181.99
Question # 61

Question:

How should a company efficiently process infrequently uploaded S3 data using a long-running (up to 25 minutes) custom application?

Options:

A.  

ECS on Fargate triggered by EventBridge

B.  

Lambda in Step Functions with 30-min timeout

C.  

ECS with EC2 and Glue crawler

D.  

Lambda triggered by fan-out HTTP EventBridge logic
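For study reference, the pattern in option A can be wired up roughly as follows. This is a minimal boto3 sketch, assuming hypothetical names and ARNs (bucket, cluster, task definition, roles) and that the S3 bucket has EventBridge notifications enabled; it is not a complete deployment.

import boto3

events = boto3.client("events")

rule_name = "s3-upload-to-fargate"  # hypothetical rule name

# Match S3 "Object Created" events for the upload bucket (placeholder name).
events.put_rule(
    Name=rule_name,
    EventPattern="""{
        "source": ["aws.s3"],
        "detail-type": ["Object Created"],
        "detail": {"bucket": {"name": ["example-upload-bucket"]}}
    }""",
    State="ENABLED",
)

# Target the ECS cluster; EventBridge launches the task definition on Fargate,
# so the job can run well past Lambda's 15-minute limit.
events.put_targets(
    Rule=rule_name,
    Targets=[{
        "Id": "run-processing-task",
        "Arn": "arn:aws:ecs:us-east-1:111122223333:cluster/processing-cluster",
        "RoleArn": "arn:aws:iam::111122223333:role/eventbridge-ecs-role",
        "EcsParameters": {
            "TaskDefinitionArn": "arn:aws:ecs:us-east-1:111122223333:task-definition/processor:1",
            "LaunchType": "FARGATE",
            "NetworkConfiguration": {
                "awsvpcConfiguration": {
                    "Subnets": ["subnet-0123456789abcdef0"],
                    "AssignPublicIp": "DISABLED",
                }
            },
        },
    }],
)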

Question # 62

A company has an organization in AWS Organizations. In a linked AWS account, the company has Amazon EC2 instances that are running for different projects. The company uses cost allocation tags for the EC2 instances and the Amazon EBS volumes to map the resources to the different projects. After a project is finished, the company terminates the EC2 instances and the EBS volumes.

Occasionally, EC2 instances and EBS volumes are not tagged. In these cases, the company struggles to map the resources to the correct project. A solutions architect must create an SCP to prevent the creation of EC2 instances and EBS volumes without the project tags.

Which SCP will meet these requirements?

Options:

A.  

Create an SCP that denies ec2:RunInstances when the aws:RequestTag/Project condition key is null for EC2 instance and EBS volume resources.

B.  

Create an SCP that denies ec2:StartInstances when the aws:RequestTag/Project condition key is null for EC2 instance and EBS volume resources.

C.  

Create an SCP that denies ec2:RunInstances when the aws:CostAllocationTag/Project condition key is null for EC2 instance and EBS volume resources.

D.  

Create an SCP that denies ec2:StartInstances when the aws:CostAllocationTag/Project condition key is null for EC2 instance and EBS volume resources.
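For reference, the SCP described in option A would look roughly like the following, written here as a Python dict for readability. The statement ID, the resource ARNs, and the inclusion of ec2:CreateVolume for standalone volumes are illustrative assumptions.

# The Null condition evaluates to true when the request carries no Project tag,
# so untagged instance and volume creation requests are denied.
scp_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyUntaggedEc2AndEbs",
            "Effect": "Deny",
            "Action": ["ec2:RunInstances", "ec2:CreateVolume"],
            "Resource": [
                "arn:aws:ec2:*:*:instance/*",
                "arn:aws:ec2:*:*:volume/*",
            ],
            "Condition": {"Null": {"aws:RequestTag/Project": "true"}},
        }
    ],
}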

Question # 63

A software development company has multiple engineers who are working remotely. The company is running Active Directory Domain Services (AD DS) on an Amazon EC2 instance. The company's security policy states that all internal, nonpublic services that are deployed in a VPC must be accessible through a VPN. Multi-factor authentication (MFA) must be used for access to a VPN.

What should a solutions architect do to meet these requirements?

Options:

A.  

Create an AWS Site-to-Site VPN connection. Configure integration between the VPN and AD DS. Use an Amazon WorkSpaces client with MFA support enabled to establish a VPN connection.

B.  

Create an AWS Client VPN endpoint. Create an AD Connector directory for integration with AD DS. Enable MFA for AD Connector. Use AWS Client VPN to establish a VPN connection.

C.  

Create multiple AWS Site-to-Site VPN connections by using AWS VPN CloudHub. Configure integration between AWS VPN CloudHub and AD DS. Use AWS Copilot to establish a VPN connection.

D.  

Create an Amazon WorkLink endpoint. Configure integration between Amazon WorkLink and AD DS. Enable MFA in Amazon WorkLink. Use AWS Client VPN to establish a VPN connection.
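A minimal boto3 sketch of the Client VPN piece of option B, assuming placeholder certificate, directory, CIDR, and subnet values. Enabling MFA on the AD Connector (via RADIUS in AWS Directory Service) is a separate step not shown here.

import boto3

ec2 = boto3.client("ec2")

# Create a Client VPN endpoint that authenticates users against an AD Connector
# directory; the directory ID and certificate ARN are placeholders.
response = ec2.create_client_vpn_endpoint(
    ClientCidrBlock="10.100.0.0/22",
    ServerCertificateArn="arn:aws:acm:us-east-1:111122223333:certificate/example",
    AuthenticationOptions=[{
        "Type": "directory-service-authentication",
        "ActiveDirectory": {"DirectoryId": "d-1234567890"},
    }],
    ConnectionLogOptions={"Enabled": False},
    Description="VPN for remote engineers",
)

# Associate the endpoint with a VPC subnet so internal services become reachable.
ec2.associate_client_vpn_target_network(
    ClientVpnEndpointId=response["ClientVpnEndpointId"],
    SubnetId="subnet-0123456789abcdef0",
)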

Question # 64

A company has created an OU in AWS Organizations for each of its engineering teams. Each OU owns multiple AWS accounts. The organization has hundreds of AWS accounts. A solutions architect must design a solution so that each OU can view a breakdown of usage costs across its AWS accounts. Which solution meets these requirements?

Options:

A.  

Create an AWS Cost and Usage Report (CUR) for each OU by using AWS Resource Access Manager. Allow each team to visualize the CUR through an Amazon QuickSight dashboard.

B.  

Create an AWS Cost and Usage Report (CUR) from the AWS Organizations management account. Allow each team to visualize the CUR through an Amazon QuickSight dashboard.

C.  

Create an AWS Cost and Usage Report (CUR) in each AWS Organizations member account. Allow each team to visualize the CUR through an Amazon QuickSight dashboard.

D.  

Create an AWS Cost and Usage Report (CUR) by using AWS Systems Manager. Allow each team to visualize the CUR through Systems Manager OpsCenter dashboards.

Question # 65

A financial services company loaded millions of historical stock trades into an Amazon DynamoDB table. The table uses on-demand capacity mode. Once each day at midnight, a few million new records are loaded into the table. Application read activity against the table happens in bursts throughout the day, and a limited set of keys are repeatedly looked up. The company needs to reduce costs associated with DynamoDB.

Which strategy should a solutions architect recommend to meet this requirement?

Options:

A.  

Deploy an Amazon ElastiCache cluster in front of the DynamoDB table.

B.  

Deploy DynamoDB Accelerator (DAX). Configure DynamoDB auto scaling. Purchase Savings Plans in Cost Explorer.

C.  

Use provisioned capacity mode. Purchase Savings Plans in Cost Explorer.

D.  

Deploy DynamoDB Accelerator (DAX). Use provisioned capacity mode. Configure DynamoDB auto scaling.
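As a study aid, the auto scaling half of the DAX-plus-provisioned-capacity approach in option D can be expressed with Application Auto Scaling. This is a minimal boto3 sketch; the table name, capacity limits, and target utilization are placeholder assumptions.

import boto3

autoscaling = boto3.client("application-autoscaling")

# After switching the table to provisioned capacity, register its read capacity
# with Application Auto Scaling so bursty reads scale capacity automatically
# instead of being billed at on-demand rates.
autoscaling.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId="table/StockTrades",
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    MinCapacity=100,
    MaxCapacity=4000,
)

# Track a target utilization of 70% on the table's consumed read capacity.
autoscaling.put_scaling_policy(
    PolicyName="stock-trades-read-tracking",
    ServiceNamespace="dynamodb",
    ResourceId="table/StockTrades",
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBReadCapacityUtilization"
        },
    },
)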

Question # 66

A solutions architect is creating an AWS CloudFormation template from an existing manually created non-production AWS environment. The CloudFormation template can be destroyed and recreated as needed. The environment contains an Amazon EC2 instance. The EC2 instance has an instance profile that the EC2 instance uses to assume a role in a parent account.

The solutions architect recreates the role in a CloudFormation template and uses the same role name. When the CloudFormation template is launched in the child account, the EC2 instance can no longer assume the role in the parent account because of insufficient permissions.

What should the solutions architect do to resolve this issue?

Options:

A.  

In the parent account, edit the trust policy for the role that the EC2 instance needs to assume. Ensure that the target role ARN in the existing statement that allows the sts:AssumeRole action is correct. Save the trust policy.

B.  

In the parent account, edit the trust policy for the role that the EC2 instance needs to assume. Add a statement that allows the sts:AssumeRole action for the root principal of the child account. Save the trust policy.

C.  

Update the CloudFormation stack again. Specify only the CAPABILITY_NAMED_IAM capability.

D.  

Update the CloudFormation stack again. Specify the CAPABILITY_IAM capability and the CAPABILITY_NAMED_IAM capability.
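For context on what options A and B are editing, a trust policy on the parent-account role that lets principals in the child account call sts:AssumeRole looks roughly like the following sketch; the account ID is a placeholder.

# Trust policy (assume-role policy document) attached to the parent-account role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::222233334444:root"},  # child account
            "Action": "sts:AssumeRole",
        }
    ],
}

A statement that names a specific role ARN instead of the account root stops working once that role is deleted and recreated, because IAM resolved the ARN to the old role's unique ID; re-saving the trust policy refreshes the reference.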

Question # 67

A solutions architect needs to implement a client-side encryption mechanism for objects that will be stored in a new Amazon S3 bucket. The solutions architect created a CMK that is stored in AWS Key Management Service (AWS KMS) for this purpose.

The solutions architect created the following IAM policy and attached it to an IAM role:

During tests, the solutions architect was able to successfully get existing test objects in the S3 bucket. However, attempts to upload a new object resulted in an error message. The error message stated that the action was forbidden.

Which action must the solutions architect add to the IAM policy to meet all the requirements?

Options:

A.  

kms:GenerateDataKey

B.  

kms:GetKeyPolicy

C.  

kms:GetPublicKey

D.  

kms:Sign
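As a reminder of why uploads fail without that permission, client-side encryption requests a fresh data key from KMS before each PUT, while reads only need kms:Decrypt. A simplified boto3 sketch with a placeholder key ARN (the S3 Encryption Client performs this step internally):

import boto3

kms = boto3.client("kms")

# Each new upload calls GenerateDataKey, encrypts the object locally with the
# plaintext key, and stores the encrypted key alongside the object.
data_key = kms.generate_data_key(
    KeyId="arn:aws:kms:us-east-1:111122223333:key/example-key-id",
    KeySpec="AES_256",
)

plaintext_key = data_key["Plaintext"]        # used locally to encrypt the object
encrypted_key = data_key["CiphertextBlob"]   # uploaded alongside the ciphertext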

Question # 68

A company manufactures smart vehicles. The company uses a custom application to collect vehicle data. The vehicles use the MQTT protocol to connect to the application.

The company processes the data in 5-minute intervals. The company then copies vehicle telematics data to on-premises storage. Custom applications analyze this data to detect anomalies.

The number of vehicles that send data grows constantly. Newer vehicles generate high volumes of data. The on-premises storage solution is not able to scale for peak traffic, which results in data loss. The company must modernize the solution and migrate the solution to AWS to resolve the scaling challenges.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use AWS IoT Greengrass to send the vehicle data to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Create an Apache Kafka application to store the data in Amazon S3. Use a pretrained model in Amazon SageMaker to detect anomalies.

B.  

Use AWS IoT Core to receive the vehicle data. Configure rules to route data to an Amazon Kinesis Data Firehose delivery stream that stores the data in Amazon S3. Create an Amazon Kinesis Data Analytics application that reads from the delivery stream to detect anomalies.

C.  

Use AWS IoT FleetWise to collect the vehicle data. Send the data to an Amazon Kinesis data stream. Use an Amazon Kinesis Data Firehose delivery stream to store the data in Amazon S3. Use the built-in machine learning transforms in AWS Glue to detect anomalies.

D.  

Use Amazon MQ for RabbitMQ to collect the vehicle data. Send the data to an Amazon Kinesis Data Firehose delivery stream to store the data in Amazon S3. Use Amazon Lookout for Metrics to detect anomalies.
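A minimal boto3 sketch of the AWS IoT Core piece of option B: a topic rule that forwards MQTT telemetry to a Kinesis Data Firehose delivery stream. The topic filter, role ARN, and stream name are placeholder assumptions.

import boto3

iot = boto3.client("iot")

# Vehicles publish over MQTT to IoT Core; this rule forwards every telemetry
# message to a Firehose delivery stream that buffers the data into Amazon S3.
iot.create_topic_rule(
    ruleName="vehicle_telemetry_to_firehose",
    topicRulePayload={
        "sql": "SELECT * FROM 'vehicles/+/telemetry'",
        "awsIotSqlVersion": "2016-03-23",
        "actions": [{
            "firehose": {
                "roleArn": "arn:aws:iam::111122223333:role/iot-firehose-role",
                "deliveryStreamName": "vehicle-telemetry-stream",
                "separator": "\n",
            }
        }],
        "ruleDisabled": False,
    },
)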

Question # 69

A company uses AWS Organizations for a multi-account setup in the AWS Cloud. The company's finance team has a data processing application that uses AWS Lambda and Amazon DynamoDB.

The company's marketing team wants to access the data that is stored in the DynamoDB table.

The DynamoDB table contains confidential data. The marketing team can have access to only specific attributes of data in the DynamoDB table. The finance team and the marketing team have separate AWS accounts.

What should a solutions architect do to provide the marketing team with the appropriate access to the DynamoDB table?

Options:

A.  

Create an SCP to grant the marketing team's AWS account access to the specific attributes of the DynamoDB table. Attach the SCP to the OU of the finance team.

B.  

Create an IAM role in the finance team's account by using IAM policy conditions for specific DynamoDB attributes (fine-grained access control). Establish trust with the marketing team's account. In the marketing team's account, create an IAM role that has permissions to assume the IAM role in the finance team's account.

C.  

Create a resource-based IAM policy that includes conditions for specific DynamoDB attributes (fine-grained access control). Attach the policy to the DynamoDB table. In the marketing team's account, create an IAM role that has permissions to access the DynamoDB table in the finance team's account.

D.  

Create an IAM role in the finance team's account to access the DynamoDB table. Use an IAM permissions boundary to limit the access to the specific attributes. In the marketing team's account, create an IAM role that has permissions to assume the IAM role in the finance team's account.
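For reference, the fine-grained access control conditions mentioned in option B look roughly like the following identity-based policy, written here as a Python dict. The table ARN and attribute names are placeholders.

fine_grained_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/FinanceData",
            "Condition": {
                # Only these non-confidential attributes may be read or returned.
                "ForAllValues:StringEquals": {
                    "dynamodb:Attributes": ["TransactionId", "Region", "Amount"]
                },
                # Require requests to name specific attributes rather than select all.
                "StringEqualsIfExists": {"dynamodb:Select": "SPECIFIC_ATTRIBUTES"},
            },
        }
    ],
}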

Question # 70

A company is migrating its on-premises file transfer solution to AWS Transfer Family. The on-premises host includes an SFTP server to receive files, an application that performs a transformation of the files, and a messaging server. The transformations run every 5 minutes. When a transformation is complete, the application sends a message to a queue on the messaging server. The company needs to simplify the solution and reduce the management of the components. What should the company do to meet these requirements with the LEAST operational overhead?

Options:

A.  

Configure Transfer Family to use Amazon EFS storage. Use a cron job on Amazon EFS to perform the transformations. Configure the cron job to publish a message to an Amazon SNS topic when a file has been transformed.

B.  

Configure Transfer Family to use Amazon S3 storage. Use Amazon EMR to perform the transformations. Configure Amazon EMR to send a message to an Amazon SNS topic when a file has been transformed.

C.  

Configure Transfer Family to use Amazon S3 storage. Use AWS Glue to perform the transformations after S3 event notifications. Configure AWS Glue to send a message to an Amazon SQS queue when a file has been transformed.

D.  

Configure Transfer Family to use Amazon EFS storage. Create an AWS Glue time-based job to run every 5 minutes to initiate an AWS Glue transformation. Configure AWS Glue to send a message to an Amazon SQS queue when a file has been transformed.

Question # 71

A company is using multiple AWS accounts. The DNS records are stored in a private hosted zone for Amazon Route 53 in Account A. The company's applications and databases are running in Account B.

A solutions architect will deploy a two-tier application in a new VPC. To simplify the configuration, the db.example.com CNAME record set for the Amazon RDS endpoint was created in a private hosted zone for Amazon Route 53.

During deployment, the application failed to start. Troubleshooting revealed that db.example.com is not resolvable on the Amazon EC2 instance. The solutions architect confirmed that the record set was created correctly in Route 53.

Which combination of steps should the solutions architect take to resolve this issue? (Select TWO.)

Options:

A.  

Deploy the database on a separate EC2 instance in the new VPC. Create a record set for the instance's private IP in the private hosted zone.

B.  

Use SSH to connect to the application tier EC2 instance. Add an RDS endpoint IP address to the /etc/resolv.conf file.

C.  

Create an authorization to associate the private hosted zone in Account A with the new VPC in Account B.

D.  

Create a private hosted zone for the example.com domain in Account B. Configure Route 53 replication between AWS accounts.

E.  

Associate a new VPC in Account B with a hosted zone in Account A. Delete the association authorization in Account A.
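A minimal boto3 sketch of the cross-account association flow that options C and E describe, with placeholder hosted zone and VPC IDs: the authorization is created in Account A, and the association is completed from Account B.

import boto3

# Run with Account A credentials: authorize Account B's VPC to associate with
# the private hosted zone that holds db.example.com.
route53_account_a = boto3.client("route53")
route53_account_a.create_vpc_association_authorization(
    HostedZoneId="Z0123456789EXAMPLE",
    VPC={"VPCRegion": "us-east-1", "VPCId": "vpc-0abc1234def567890"},
)

# Run with Account B credentials: complete the association so db.example.com
# resolves from EC2 instances in the new VPC.
route53_account_b = boto3.client("route53")
route53_account_b.associate_vpc_with_hosted_zone(
    HostedZoneId="Z0123456789EXAMPLE",
    VPC={"VPCRegion": "us-east-1", "VPCId": "vpc-0abc1234def567890"},
)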

Question # 72

A company manages hundreds of AWS accounts centrally in an organization in AWS Organizations. The company recently started to allow product teams to create and manage their own S3 access points in their accounts. The S3 access points can be accessed only within VPCs, not on the internet.

What is the MOST operationally efficient way to enforce this requirement?

Options:

A.  

Set the S3 access point resource policy to deny the s3:CreateAccessPoint action unless the s3:AccessPointNetworkOrigin condition key evaluates to VPC.

B.  

Create an SCP at the root level in the organization to deny the s3:CreateAccessPoint action unless the s3:AccessPointNetworkOrigin condition key evaluates to VPC.

C.  

Use AWS CloudFormation StackSets to create a new IAM policy in each AWS account that allows the s3:CreateAccessPoint action only if the s3:AccessPointNetworkOrigin condition key evaluates to VPC.

D.  

Set the S3 bucket policy to deny the s3:CreateAccessPoint action unless the s3:AccessPointNetworkOrigin condition key evaluates to VPC.
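For reference, an organization-level SCP of the kind described in option B could look like the following sketch, written as a Python dict; the statement ID is a placeholder.

# Applied at the organization root, this denies creating an S3 access point
# unless the access point's network origin is a VPC, so product teams cannot
# create internet-facing access points in any member account.
scp_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyNonVpcAccessPoints",
            "Effect": "Deny",
            "Action": "s3:CreateAccessPoint",
            "Resource": "*",
            "Condition": {
                "StringNotEquals": {"s3:AccessPointNetworkOrigin": "VPC"}
            },
        }
    ],
}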

Question # 73

Question:

A company needs to copy backups of 40 RDS for MySQL databases from a production account to a central backup account within AWS Organizations. The databases use default AWS-managed KMS encryption keys. The backups must be stored in a WORM (Write Once Read Many) backup account.

What is the correct approach to enable cross-account backup?

Options:

A.  

Restore the databases with customer-managed KMS keys and use AWS Backup with cross-account vault sharing.

B.  

Share the default KMS keys with the central account and create backup vaults in the central account.

C.  

Use a Lambda function to decrypt and copy the snapshots to the central account.

D.  

Use a Lambda function to share and re-encrypt snapshots across accounts using the default KMS key.

Question # 74

A company has a payment gateway that processes millions of daily transactions on AWS. The solution uses Amazon ECS with a single Amazon EC2 instance that is not configured for auto scaling and an Amazon Aurora PostgreSQL database. All the solution's resources are deployed in the same Availability Zone. The company uses Amazon Route 53 to manage its domain name resolution.

The company needs to implement a new strategy to make the application more highly available.

Which solution will meet this requirement with the LEAST operational overhead?

Options:

A.  

Set up an Amazon RDS Proxy in front of the Aurora database. Modify the Aurora database to a Multi-AZ DB cluster by adding a read replica in a second Availability Zone.

B.  

Configure Amazon ECS services to distribute tasks across multiple Availability Zones. Create a cross-Region read replica of the Aurora database in a second AWS Region. Create a script to perform a manual failover process.

C.  

Configure Amazon ECS services on AWS Fargate to distribute tasks across multiple Availability Zones. Modify the Aurora database to a Multi-AZ DB cluster by adding a read replica in a second Availability Zone.

D.  

Deploy the gateway application into a second AWS Region. Migrate the Aurora database to an Aurora global database. Configure Route 53 for active-active gateway request routing.

Question # 75

A company needs to migrate a 2 TB MySQL database from an on-premises data center to an Amazon Aurora cluster. The database receives hundreds of updates every minute. The on-premises database server is not accessible through the internet.

The migration solution must ensure that no data is lost between the start of migration and cutover. The migration must begin as soon as possible and must minimize downtime.

Which solution will meet these requirements?

Options:

A.  

Create an AWS Site-to-Site VPN connection between the on-premises data center and the VPC that hosts the Aurora cluster. Create a dump of the on-premises database by using mysqldump. Upload the dump to Amazon S3 by using multipart upload. Use an Amazon EC2 instance with appropriate permissions to import the dump to the Aurora cluster.

B.  

Create an AWS Site-to-Site VPN connection between the on-premises data center and the VPC that hosts the Aurora cluster. Specify the on-premises database as the source endpoint in AWS DMS. Specify the Aurora cluster as the target endpoint. Configure a DMS task with ongoing replication.

C.  

Set up an AWS Direct Connect connection between the on-premises data center and the VPC that hosts the Aurora cluster. Create a dump of the on-premises database by using mysqldump. Upload the dump to Amazon S3 by using multipart upload. Use an Amazon EC2 instance with appropriate permissions to import the dump to the Aurora cluster. Set up replication between the data center and the Aurora cluster.

D.  

Set up an AWS Direct Connect connection between the on-premises data center and the VPC that hosts the Aurora cluster. Specify the on-premises database as the source endpoint in AWS DMS. Specify the Aurora cluster as the target endpoint. Configure a DMS task with ongoing replication.
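A minimal boto3 sketch of the AWS DMS task that options B and D rely on: a full load followed by ongoing replication (CDC), so the hundreds of updates per minute that arrive during the migration are replicated up to cutover. The endpoint and replication instance ARNs and the table-mapping rule are placeholder assumptions; the endpoints and replication instance would be created beforehand.

import boto3

dms = boto3.client("dms")

# MigrationType "full-load-and-cdc" copies the existing 2 TB of data and then
# keeps applying changes from the source until the application is cut over.
dms.create_replication_task(
    ReplicationTaskIdentifier="mysql-to-aurora-migration",
    SourceEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:onprem-mysql",
    TargetEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:aurora-cluster",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:111122223333:rep:replication-instance",
    MigrationType="full-load-and-cdc",
    TableMappings='{"rules": [{"rule-type": "selection", "rule-id": "1", '
                  '"rule-name": "1", "object-locator": {"schema-name": "%", '
                  '"table-name": "%"}, "rule-action": "include"}]}',
)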

Get SAP-C02 dumps and pass your exam in 24 hours!

Free Exams Sample Questions