- 368 Actual Exam Questions
- Compatible with all Devices
- Printable Format
- No Download Limits
- 90 Days Free Updates
Get All AWS Certified Developer - Associate Exam Questions with Validated Answers
| Vendor | Amazon |
|---|---|
| Exam Code | DVA-C02 |
| Exam Name | AWS Certified Developer - Associate |
| Exam Questions | 368 |
| Last Updated | October 5, 2025 |
| Related Certifications | Amazon Associate, AWS Certified Developer Associate |
| Exam Tags | Professional AWS Developers |
Looking for a hassle-free way to pass the Amazon AWS Certified Developer - Associate exam? DumpsProvider provides reliable exam questions and answers, designed by Amazon-certified experts to help you succeed in record time. Available in both PDF and Online Practice Test formats, our study materials cover every major exam topic, making it possible to pass in as little as one day!
DumpsProvider is a leading provider of high-quality exam dumps, trusted by professionals worldwide. Our Amazon DVA-C02 exam questions give you the knowledge and confidence needed to succeed on the first attempt.
Train with our Amazon DVA-C02 exam practice tests, which simulate the actual exam environment. This real-test experience helps you get familiar with the format and timing of the exam, ensuring you're 100% prepared for exam day.
Your success is our commitment! That's why DumpsProvider offers a 100% money-back guarantee. If you don't pass the Amazon DVA-C02 exam, we'll refund your payment within 24 hours, no questions asked.
Don’t waste time with unreliable exam prep resources. Get started with DumpsProvider’s Amazon DVA-C02 exam dumps today and achieve your certification effortlessly!
A developer needs to set up an API to provide access to an application and its resources. The developer has a TLS certificate. The developer must have the ability to change the default base URL of the API to a custom domain name. The API users are distributed globally. The solution must minimize API latency.
Comprehensive and Detailed Step-by-Step Explanation:
Option C: Edge-Optimized API Gateway with Custom Domain Name:
Edge-Optimized API Gateway: This endpoint type automatically leverages the Amazon CloudFront global distribution network, minimizing latency for API users distributed globally.
Custom Domain Name: API Gateway supports custom domain names for APIs. Importing the TLS certificate into AWS Certificate Manager (ACM) and associating it with the custom domain name ensures secure connections.
Disabling the Default Endpoint: Prevents direct access via the default API Gateway URL, enforcing the use of the custom domain name.
Why Other Options Are Incorrect:
Option A: While CloudFront can distribute API requests globally, API Gateway with edge-optimized endpoints already provides this functionality natively without requiring Lambda@Edge.
Option B: Private endpoint types are used for internal access via VPC, which does not meet the global distribution and low-latency requirement.
Option D: CloudFront Functions are not needed because API Gateway's edge-optimized endpoints handle global distribution efficiently.
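As a sketch of how the edge-optimized custom domain described above might be wired up with boto3, the request parameters are shown below; the domain name and ACM certificate ARN are placeholders, and note that edge-optimized domains require the certificate to be in us-east-1:

```python
# Sketch: request parameters for an edge-optimized API Gateway custom domain.
# The domain name and certificate ARN are placeholders; for edge-optimized
# domains the ACM certificate must live in us-east-1.
create_domain_params = {
    "domainName": "api.example.com",
    "certificateArn": "arn:aws:acm:us-east-1:123456789012:certificate/example",
    "endpointConfiguration": {"types": ["EDGE"]},
}

# With AWS credentials configured, the actual call would be:
# import boto3
# apigw = boto3.client("apigateway", region_name="us-east-1")
# apigw.create_domain_name(**create_domain_params)
```

After creating the domain, a base path mapping would attach the API stage to it, and DNS would point the custom domain at the CloudFront distribution that API Gateway provisions.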
A developer has built an application that inserts data into an Amazon DynamoDB table. The table is configured to use provisioned capacity. The application is deployed on a burstable nano Amazon EC2 instance. The application logs show that the application has been failing because of a ProvisionedThroughputExceededException error.
Which actions should the developer take to resolve this issue? (Select TWO.)
Requirement Summary:
DynamoDB Provisioned Mode
ProvisionedThroughputExceededException error occurring
App hosted on a small burstable EC2 instance
Option A: Move to a larger EC2 instance
May improve local performance, but does not solve DynamoDB provisioned throughput limits.
Option B: Increase DynamoDB RCUs
Valid: This directly increases the read throughput capacity of the table.
Helps handle more traffic and reduce throughput exceptions.
Option C: Reduce frequency via exponential backoff
Best practice: Using exponential backoff and jitter reduces request pressure and spreads retries out to avoid spiking.
Option D: Increase frequency of retries by reducing delay
Opposite of best practice. Increases load, worsening the issue.
Option E: Change to on-demand mode
Also valid in general, but the question is scoped to provisioned capacity.
Since minimizing cost or changing the capacity model isn't mentioned, and scaling the existing model is the focus, B and C are the best answers.
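The exponential backoff recommended in option C can be sketched in plain Python. The retry loop below is illustrative: `operation` stands in for the real DynamoDB SDK call, and the full-jitter strategy (random delay up to an exponentially growing cap) is the pattern AWS SDKs use to spread out retries:

```python
import random
import time

def backoff_delay(attempt, base=0.1, cap=20.0):
    """Full-jitter exponential backoff: pick a random delay in
    [0, min(cap, base * 2**attempt)] so retries spread out instead
    of hammering the table at the same instant."""
    return random.uniform(0.0, min(cap, base * (2 ** attempt)))

def call_with_retries(operation, max_attempts=5):
    """Retry `operation` (a stand-in for a DynamoDB request wrapped
    in a function) when it raises, sleeping with jittered backoff
    between attempts and re-raising after the final failure."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(backoff_delay(attempt))
```

In practice the AWS SDKs already retry throttled requests with backoff by default, so tuning the SDK's retry configuration is often enough.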
A developer is managing an application that uploads user files to an Amazon S3 bucket named companybucket. The company wants to maintain copies of all the files uploaded by users for compliance purposes, while ensuring users still have access to the data through the application. Which IAM permissions should be applied to users to ensure they can create but not remove files from the bucket?
A.

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "statement1",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": ["arn:aws:s3:::companybucket"]
        }
    ]
}
```

B.

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "statement1",
            "Effect": "Allow",
            "Action": ["s3:CreateBucket", "s3:GetBucketLocation"],
            "Resource": "arn:aws:s3:::companybucket"
        }
    ]
}
```

C.

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "statement1",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject", "s3:PutObjectRetention"],
            "Resource": "arn:aws:s3:::companybucket"
        }
    ]
}
```

D.

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "statement1",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": ["arn:aws:s3:::companybucket"]
        }
    ]
}
```
To meet the requirement:
Users must be able to upload (PutObject) and read (GetObject) files but not delete them.
Option D ensures users cannot delete files by omitting the s3:DeleteObject action while allowing s3:GetObject and s3:PutObject.
Option A: Includes s3:DeleteObject, which allows users to delete files and does not meet the requirement.
Option B: Contains unrelated actions like CreateBucket, which is not relevant here.
Option C: Adds s3:PutObjectRetention, which is unnecessary and does not restrict DeleteObject.
Reference: AWS S3 Permissions Documentation
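The effect of option D can be sanity-checked with a toy allow-list evaluator. This is an illustration only, not the real IAM policy engine (which also evaluates resources, conditions, and explicit denies), but it captures the key point that IAM denies by default:

```python
# Toy check: with option D's action list, uploads and reads are allowed
# while deletes are implicitly denied (no matching Allow statement).
# Illustration only; real IAM evaluation also considers resources,
# conditions, and explicit Deny statements.
ALLOWED_ACTIONS = {"s3:GetObject", "s3:PutObject"}  # option D's actions

def is_allowed(action):
    """IAM denies by default; an action is permitted only if some
    Allow statement covers it."""
    return action in ALLOWED_ACTIONS

print(is_allowed("s3:PutObject"))     # users can upload files
print(is_allowed("s3:DeleteObject"))  # implicit deny: no delete
```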
A developer is investigating an issue in part of a company's application. In the application, messages are sent to an Amazon Simple Queue Service (Amazon SQS) queue. An AWS Lambda function polls messages from the SQS queue and sends email messages by using Amazon Simple Email Service (Amazon SES). Users have been receiving duplicate email messages during periods of high traffic.
Which reasons could explain the duplicate email messages? (Select TWO.)
SQS Delivery Behavior: Standard SQS queues guarantee at-least-once delivery, meaning messages may be processed more than once. This can lead to duplicate emails in this scenario.
Visibility Timeout: If the visibility timeout on the SQS queue is too short, a message might become visible to another consumer before the first Lambda function finishes processing it. This can also lead to duplicates.
Reference: Amazon SQS message delivery semantics documentation
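Because standard queues deliver at least once, a common mitigation (alongside a long-enough visibility timeout) is to make the consumer idempotent. A minimal sketch follows; the in-memory set stands in for a durable store, such as a DynamoDB table with a conditional write, that real Lambda invocations would share:

```python
# Idempotent consumer sketch: skip messages that were already processed.
# The in-memory set is a stand-in for a durable store (e.g. a DynamoDB
# table with a conditional put) shared across real Lambda invocations.
processed_ids = set()

def handle_message(message_id, send_email):
    """Process a message at most once per message_id, even if SQS
    delivers the same message again under at-least-once semantics."""
    if message_id in processed_ids:
        return False  # duplicate delivery: do not send another email
    processed_ids.add(message_id)
    send_email()
    return True
```

With this guard, a redelivered message is recognized by its ID and the email is sent only once.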
In a move toward using microservices, a company's management team has asked all development teams to build their services so that API requests depend only on that service's data store. One team is building a Payments service which has its own database; the service needs data that originates in the Accounts database. Both are using Amazon DynamoDB.
What approach will result in the simplest, most decoupled, and most reliable method to get near-real-time updates from the Accounts database?
Security & Privacy
Satisfied Customers
Committed Service
Money Back Guaranteed