
SAA-C03 Exam Dumps - AWS Certified Solutions Architect - Associate (SAA-C03)

Searching for a dependable way to ace the Amazon Web Services SAA-C03 exam? You're in the right place! ExamCert has realistic, trusted, and authentic exam prep tools to help you achieve your desired credential. ExamCert's SAA-C03 PDF Study Guide, Testing Engine, and Exam Dumps follow a reliable exam preparation strategy, providing you with the most relevant and up-to-date study material in an easy-to-learn question-and-answer format. ExamCert's study tools simplify the exam's complex and confusing concepts and introduce you to realistic exam scenarios that you can practice with its testing engine and real exam dumps.

Question # 65

An ecommerce company runs an application that uses an Amazon DynamoDB table in a single AWS Region. The company wants to deploy the application to a second Region. The company needs to support multi-active replication with low-latency reads and writes to the existing DynamoDB table in both Regions.

Which solution will meet these requirements in the MOST operationally efficient way?

A.

Create a DynamoDB global secondary index (GSI) for the existing table. Create a new table in the second Region. Convert the existing DynamoDB table to a global table. Specify the new table as the secondary table.

B.

Enable Amazon DynamoDB Streams for the existing table. Create a new table in the second Region. Create a new application that uses the DynamoDB Streams Kinesis Adapter and the Amazon Kinesis Client Library (KCL). Configure the new application to read data from the DynamoDB table in the first Region and to write the data to the new table in the second Region.

C.

Convert the existing DynamoDB table to a global table. Choose the appropriate second Region to achieve active-active write capabilities in both Regions.

D.

Enable Amazon DynamoDB Streams for the existing table. Create a new table in the second Region. Create an AWS Lambda function in the first Region that reads data from the table in the first Region and writes the data to the new table in the second Region. Set a DynamoDB stream as the input trigger for the Lambda function.
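
For reference, the global table conversion described in option C is a single API call once DynamoDB Streams is enabled. A minimal boto3 sketch, assuming a hypothetical table named example-table replicated from us-east-1 to us-west-2:

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Global tables (version 2019.11.21) require DynamoDB Streams with
# NEW_AND_OLD_IMAGES enabled on the source table.
dynamodb.update_table(
    TableName="example-table",
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)
dynamodb.get_waiter("table_exists").wait(TableName="example-table")

# Adding a replica Region converts the table to a global table; DynamoDB
# then manages multi-active replication between the Regions automatically.
dynamodb.update_table(
    TableName="example-table",
    ReplicaUpdates=[{"Create": {"RegionName": "us-west-2"}}],
)
```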

Question # 66

A company is migrating mobile banking applications to run on Amazon EC2 instances in a VPC. Backend service applications run in an on-premises data center. The data center has an AWS Direct Connect connection into AWS. The applications that run in the VPC need to resolve DNS requests to an on-premises Active Directory domain that runs in the data center.

Which solution will meet these requirements with the LEAST administrative overhead?

A.

Provision a set of EC2 instances across two Availability Zones in the VPC as caching DNS servers to resolve DNS queries from the application servers within the VPC.

B.

Provision an Amazon Route 53 private hosted zone. Configure NS records that point to on-premises DNS servers.

C.

Create DNS endpoints by using Amazon Route 53 Resolver. Add conditional forwarding rules to resolve DNS namespaces between the on-premises data center and the VPC.

D.

Provision a new Active Directory domain controller in the VPC with a bidirectional trust between this new domain and the on-premises Active Directory domain.
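
For context, the Route 53 Resolver approach in option C consists of an outbound endpoint plus a conditional forwarding rule. A minimal boto3 sketch, where the subnet, security group, domain, and on-premises DNS server values are all hypothetical:

```python
import boto3

resolver = boto3.client("route53resolver", region_name="us-east-1")

# Outbound endpoint: elastic network interfaces in two subnets that
# forward DNS queries from the VPC toward on-premises resolvers.
endpoint = resolver.create_resolver_endpoint(
    CreatorRequestId="outbound-endpoint-1",
    Name="to-on-premises",
    Direction="OUTBOUND",
    SecurityGroupIds=["sg-0123456789abcdef0"],     # hypothetical
    IpAddresses=[
        {"SubnetId": "subnet-0aaa1111bbbb2222c"},  # hypothetical, AZ 1
        {"SubnetId": "subnet-0ddd3333eeee4444f"},  # hypothetical, AZ 2
    ],
)["ResolverEndpoint"]

# Conditional forwarding rule: send queries for the AD domain to the
# on-premises DNS servers over the Direct Connect connection.
rule = resolver.create_resolver_rule(
    CreatorRequestId="forward-corp-domain-1",
    RuleType="FORWARD",
    DomainName="corp.example.com",                # hypothetical AD domain
    ResolverEndpointId=endpoint["Id"],
    TargetIps=[{"Ip": "10.10.0.2", "Port": 53}],  # hypothetical on-prem DNS
)["ResolverRule"]

resolver.associate_resolver_rule(
    ResolverRuleId=rule["Id"], VPCId="vpc-0123456789abcdef0"
)
```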

Question # 67

A mining company is using Amazon S3 as its data lake. The company wants to analyze the data collected by the sensors in its mines. A data pipeline is being built to capture data from the sensors, ingest the data into an S3 bucket, and convert the data to Apache Parquet format. The data must be processed in near real time. The data will be used for on-demand queries with Amazon Athena.

Which solution will meet these requirements?

A.

Use Amazon Data Firehose to invoke an AWS Lambda function that converts the data to Parquet format and stores the data in Amazon S3.

B.

Use Amazon Kinesis Data Streams to invoke an AWS Lambda function that converts the data to Parquet format and stores the data in Amazon S3.

C.

Use AWS DataSync to invoke an AWS Lambda function that converts the data to Parquet format and stores the data in Amazon S3.

D.

Use Amazon Simple Queue Service (Amazon SQS) to stream data directly to an AWS Glue job that converts the data to Parquet format and stores the data in Amazon S3.
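
For context, options A and B differ mainly in the ingestion service; the Lambda transformation hook itself is configured on the delivery stream. A minimal boto3 sketch of a Firehose stream with a Lambda processor, where every ARN is hypothetical:

```python
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

# Delivery stream that buffers sensor records, runs them through a
# transformation Lambda function, and delivers the output to S3.
firehose.create_delivery_stream(
    DeliveryStreamName="sensor-ingest",  # hypothetical
    DeliveryStreamType="DirectPut",
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery",  # hypothetical
        "BucketARN": "arn:aws:s3:::example-data-lake",                  # hypothetical
        "ProcessingConfiguration": {
            "Enabled": True,
            "Processors": [
                {
                    "Type": "Lambda",
                    "Parameters": [
                        {
                            "ParameterName": "LambdaArn",
                            "ParameterValue": "arn:aws:lambda:us-east-1:123456789012:function:to-parquet",  # hypothetical
                        }
                    ],
                }
            ],
        },
    },
)
```

Note that Firehose can also convert records to Parquet natively through its record format conversion feature (backed by an AWS Glue schema), which avoids custom conversion code entirely.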

Question # 68

A company has a single AWS account. The company runs workloads on Amazon EC2 instances in multiple VPCs in one AWS Region. The company also runs workloads in an on-premises data center that connects to the company's AWS account by using AWS Direct Connect.

The company needs all EC2 instances in the VPCs to resolve DNS queries for the internal.example.com domain to the authoritative DNS server that is located in the on-premises data center. The solution must use private communication between the VPCs and the on-premises network. All route tables, network ACLs, and security groups are configured correctly between AWS and the on-premises data center.

Which combination of actions will meet these requirements? (Select THREE.)

A.

Create an Amazon Route 53 inbound endpoint in all the workload VPCs.

B.

Create an Amazon Route 53 outbound endpoint in one of the workload VPCs.

C.

Create an Amazon Route 53 Resolver rule with the Forward type configured to forward queries for internal.example.com to the on-premises DNS server.

D.

Create an Amazon Route 53 Resolver rule with the System type configured to forward queries for internal.example.com to the on-premises DNS server.

E.

Associate the Amazon Route 53 Resolver rule with all the workload VPCs.

F.

Associate the Amazon Route 53 Resolver rule with the workload VPC with the new Route 53 endpoint.
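
For context, the endpoint, rule, and association options above map to separate Resolver API calls. A minimal boto3 sketch of the rule-and-association steps, assuming an outbound endpoint like the one sketched under Question # 66 already exists (all identifiers are hypothetical):

```python
import boto3

resolver = boto3.client("route53resolver", region_name="us-east-1")

# Forward queries for internal.example.com to the on-premises
# authoritative DNS server through an existing outbound endpoint.
rule = resolver.create_resolver_rule(
    CreatorRequestId="forward-internal-1",
    RuleType="FORWARD",
    DomainName="internal.example.com",
    ResolverEndpointId="rslvr-out-0123456789abcdef0",  # hypothetical
    TargetIps=[{"Ip": "192.168.10.5", "Port": 53}],    # hypothetical on-prem DNS
)["ResolverRule"]

# A single rule can be associated with every workload VPC; only the VPC
# that hosts the outbound endpoint needs endpoint network interfaces.
for vpc_id in ["vpc-0aaa1111", "vpc-0bbb2222", "vpc-0ccc3333"]:  # hypothetical
    resolver.associate_resolver_rule(ResolverRuleId=rule["Id"], VPCId=vpc_id)
```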

Question # 69

A company is developing a content sharing platform that currently handles 500 GB of user-generated media files. The company expects the amount of content to grow significantly in the future. The company needs a storage solution that can automatically scale, provide high durability, and allow direct user uploads from web browsers.

Which solution will meet these requirements?

A.

Store the data in an Amazon Elastic Block Store (Amazon EBS) volume with Multi-Attach enabled.

B.

Store the data in an Amazon Elastic File System (Amazon EFS) Standard file system.

C.

Store the data in an Amazon S3 Standard bucket.

D.

Store the data in an Amazon S3 Express One Zone bucket.
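
For context, direct browser uploads to S3 are typically done with presigned URLs or presigned POST policies generated server-side. A minimal boto3 sketch, with hypothetical bucket and key names:

```python
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# A presigned POST lets the browser upload directly to S3 without the
# file passing through the backend. This policy expires after one hour
# and caps the upload at 100 MB.
post = s3.generate_presigned_post(
    Bucket="example-media-bucket",    # hypothetical bucket
    Key="uploads/example-photo.jpg",  # hypothetical object key
    Conditions=[["content-length-range", 1, 100 * 1024 * 1024]],
    ExpiresIn=3600,
)

# The browser submits a multipart form to post["url"] containing
# post["fields"] plus the file itself.
print(post["url"], post["fields"])
```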

Question # 70

A company is launching a new application that requires a structured database to store user profiles, application settings, and transactional data. The database must scale with application traffic and must offer backups.

Which solution will meet these requirements MOST cost-effectively?

A.

Deploy a self-managed database on Amazon EC2 instances by using open-source software. Use Spot Instances for cost optimization. Configure automated backups to Amazon S3.

B.

Use Amazon RDS. Use on-demand capacity mode for the database with General Purpose SSD storage. Configure automatic backups with a retention period of 7 days.

C.

Use Amazon Aurora Serverless for the database. Use serverless capacity scaling. Configure automated backups to Amazon S3.

D.

Deploy a self-managed NoSQL database on Amazon EC2 instances. Use Reserved Instances for cost optimization. Configure automated backups directly to Amazon S3 Glacier Flexible Retrieval.
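
For context, the serverless capacity scaling and backups in option C are both set at cluster creation time. A minimal boto3 sketch for an Aurora Serverless v2 cluster, where the identifiers and capacity values are hypothetical:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Hypothetical identifiers. The chosen engine version must support
# Aurora Serverless v2; capacity then scales in ACUs with load.
rds.create_db_cluster(
    DBClusterIdentifier="example-app-cluster",
    Engine="aurora-postgresql",
    MasterUsername="appadmin",
    ManageMasterUserPassword=True,  # master password kept in Secrets Manager
    ServerlessV2ScalingConfiguration={"MinCapacity": 0.5, "MaxCapacity": 8},
    BackupRetentionPeriod=7,        # automated backups with 7-day retention
)

# A Serverless v2 cluster still needs at least one instance of the
# special "db.serverless" instance class.
rds.create_db_instance(
    DBInstanceIdentifier="example-app-instance-1",
    DBClusterIdentifier="example-app-cluster",
    DBInstanceClass="db.serverless",
    Engine="aurora-postgresql",
)
```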

Question # 71

A company runs an application on a group of Amazon EC2 instances behind an Application Load Balancer (ALB). The company wants to protect the application against layer 7 DDoS attacks.

Which solution will meet this requirement?

A.

Associate AWS Shield Standard with the ALB.

B.

Create an AWS WAF web ACL and add a custom rule. Associate the web ACL with the ALB.

C.

Create an AWS WAF web ACL and add an AWS managed rule. Associate the web ACL with the ALB.

D.

Create an Amazon CloudFront distribution and set the ALB as the origin. Configure the application DNS record to point to the CloudFront distribution instead of the ALB.
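
For context, a regional web ACL with an AWS managed rule group, associated with the ALB, could be created roughly as follows (the ALB ARN is hypothetical):

```python
import boto3

wafv2 = boto3.client("wafv2", region_name="us-east-1")

# Regional web ACL with the AWS managed common rule set, which covers
# common layer 7 attack patterns.
acl = wafv2.create_web_acl(
    Name="app-protection",
    Scope="REGIONAL",  # ALBs use REGIONAL scope; CLOUDFRONT is for distributions
    DefaultAction={"Allow": {}},
    Rules=[
        {
            "Name": "aws-common-rules",
            "Priority": 0,
            "Statement": {
                "ManagedRuleGroupStatement": {
                    "VendorName": "AWS",
                    "Name": "AWSManagedRulesCommonRuleSet",
                }
            },
            "OverrideAction": {"None": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,
                "MetricName": "aws-common-rules",
            },
        }
    ],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "app-protection",
    },
)["Summary"]

wafv2.associate_web_acl(
    WebACLArn=acl["ARN"],
    ResourceArn="arn:aws:elasticloadbalancing:us-east-1:123456789012:"
                "loadbalancer/app/example-alb/0123456789abcdef",  # hypothetical
)
```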

Question # 72

A solutions architect is building an Amazon S3 data lake for a company. The company uses Amazon Kinesis Data Firehose to ingest customer personally identifiable information (PII) and transactional data in near real time to an S3 bucket. The company needs to mask all PII data before storing the data in the data lake.

Which solution will meet these requirements?

A.

Create an AWS Lambda function to detect and mask PII. Invoke the function from Kinesis Data Firehose.

B.

Use Amazon Macie to scan the S3 bucket. Configure Macie to detect and mask PII.

C.

Enable server-side encryption (SSE) on the S3 bucket.

D.

Create an AWS Lambda function that integrates with AWS CloudHSM. Configure the function to detect and mask PII.
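
For context, a Firehose transformation Lambda function receives base64-encoded records and returns them with a result status. A minimal sketch, using a naive email regex as a stand-in for real PII detection logic:

```python
import base64
import re

# Hypothetical masking rule for illustration only; production PII
# detection would be far more thorough than a single email regex.
EMAIL = re.compile(rb"[\w.+-]+@[\w-]+\.[\w.]+")

def handler(event, context):
    output = []
    for record in event["records"]:
        payload = base64.b64decode(record["data"])
        masked = EMAIL.sub(b"***MASKED***", payload)
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",  # "ProcessingFailed" routes the record to the error output
            "data": base64.b64encode(masked).decode("utf-8"),
        })
    return {"records": output}
```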
