Download Amazon.DVA-C02.VCEplus.2025-01-26.111q.tqb


File Info

Exam: AWS Certified Developer - Associate
Number: DVA-C02
File Name: Amazon.DVA-C02.VCEplus.2025-01-26.111q.tqb
Size: 1003 KB
Posted: Jan 26, 2025

How to open VCEX & EXAM Files?

Files with VCEX & EXAM extensions can be opened by ProfExam Simulator.

Demo Questions

Question 1

A developer needs to store configuration variables for an application. The developer needs to set an expiration date and time for the configuration. The developer wants to receive notifications before the configuration expires. Which solution will meet these requirements with the LEAST operational overhead?


  1. Create a standard parameter in AWS Systems Manager Parameter Store. Set Expiration and ExpirationNotification policy types.
  2. Create a standard parameter in AWS Systems Manager Parameter Store. Create an AWS Lambda function to expire the configuration and to send Amazon Simple Notification Service (Amazon SNS) notifications.
  3. Create an advanced parameter in AWS Systems Manager Parameter Store. Set Expiration and ExpirationNotification policy types.
  4. Create an advanced parameter in AWS Systems Manager Parameter Store. Create an Amazon EC2 instance with a cron job to expire the configuration and to send notifications.
Correct answer: C
Explanation:
Creating an advanced parameter in AWS Systems Manager Parameter Store meets the requirements with the least operational overhead. Advanced parameters support parameter policies: the Expiration policy deletes the parameter at a specified date and time, and the ExpirationNotification policy emits an event before the expiration time so the developer can be notified. Option A is not correct because standard parameters do not support parameter policies. Option B is not optimal because writing and operating a Lambda function to expire the configuration and send Amazon SNS notifications adds operational overhead that parameter policies avoid. Option D is not optimal because running an Amazon EC2 instance with a cron job adds even more cost and maintenance effort.
Reference: AWS Systems Manager Parameter Store, Parameter policies
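As a rough sketch of what this looks like in practice, the following boto3 call creates an Advanced-tier parameter with both policy types attached; the parameter name, value, and policy attributes are hypothetical placeholders.

    import json
    import boto3

    ssm = boto3.client("ssm")

    # Expiration deletes the parameter at the given time; ExpirationNotification
    # emits an EventBridge event before that time so the developer is notified.
    policies = [
        {"Type": "Expiration", "Version": "1.0",
         "Attributes": {"Timestamp": "2025-12-31T23:59:59.000Z"}},
        {"Type": "ExpirationNotification", "Version": "1.0",
         "Attributes": {"Before": "15", "Unit": "Days"}},
    ]

    ssm.put_parameter(
        Name="/myapp/config",          # hypothetical parameter name
        Value='{"featureX": true}',    # hypothetical configuration value
        Type="String",
        Tier="Advanced",               # parameter policies require the Advanced tier
        Policies=json.dumps(policies),
        Overwrite=True,
    )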



Question 2

When using the AWS Encryption SDK, how does the developer keep track of the data encryption keys used to encrypt data?


  1. The developer must manually keep track of the data encryption keys used for each data object.
  2. The SDK encrypts the data encryption key and stores it (encrypted) as part of the returned ciphertext.
  3. The SDK stores the data encryption keys automatically in Amazon S3.
  4. The data encryption key is stored in the user data for the EC2 instance.
Correct answer: B
Explanation:
This solution meets the requirements by using the AWS Encryption SDK, a client-side encryption library that enables developers to encrypt and decrypt data using data encryption keys that are protected by AWS Key Management Service (AWS KMS). The SDK encrypts the data encryption key with a customer master key (CMK) that is managed by AWS KMS and stores it (encrypted) as part of the returned ciphertext. The developer does not need to keep track of the data encryption keys used to encrypt data: they are stored with the encrypted data and can be retrieved and decrypted by using AWS KMS when needed. Option A is not optimal because manual tracking of the data encryption keys used for each data object is error-prone and inefficient. Option C is not optimal because the SDK does not store data encryption keys in Amazon S3, which is not designed for storing encryption keys. Option D is not optimal because storing the data encryption key in the user data for an EC2 instance is unnecessary and insecure, as user data is not encrypted by default.
Reference: AWS Encryption SDK, AWS Key Management Service
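A minimal sketch of this behavior with the AWS Encryption SDK for Python (aws-encryption-sdk); the KMS key ARN is a placeholder. The ciphertext returned by encrypt already embeds the encrypted data key, so decrypt needs only the ciphertext and access to the KMS key.

    import aws_encryption_sdk
    from aws_encryption_sdk import CommitmentPolicy

    client = aws_encryption_sdk.EncryptionSDKClient(
        commitment_policy=CommitmentPolicy.REQUIRE_ENCRYPT_REQUIRE_DECRYPT
    )
    provider = aws_encryption_sdk.StrictAwsKmsMasterKeyProvider(
        key_ids=["arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID"]  # placeholder ARN
    )

    # The returned ciphertext message embeds the encrypted data key.
    ciphertext, _ = client.encrypt(source=b"sensitive data", key_provider=provider)

    # No separate key tracking: decrypt reads the encrypted data key from the
    # ciphertext and asks AWS KMS to unwrap it.
    plaintext, _ = client.decrypt(source=ciphertext, key_provider=provider)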



Question 3

A company stores customer credit reports in an Amazon S3 bucket. An analytics service uses standard Amazon S3 GET requests to access the reports. A developer must implement a solution to redact personally identifiable information (PII) from the reports before the reports reach the analytics service.
Which solution will meet these requirements with the LEAST operational overhead?


  1. Load the S3 objects into Amazon Redshift by using a COPY command. Implement dynamic data masking. Refactor the analytics service to read from Amazon Redshift.
  2. Set up an S3 Object Lambda function. Attach the function to an S3 Object Lambda Access Point. Program the function to call a PII redaction API.
  3. Use AWS Key Management Service (AWS KMS) to implement encryption in the S3 bucket. Re-upload all the existing S3 objects. Grant the AWS KMS key permissions to the analytics service.
  4. Create an Amazon Simple Notification Service (Amazon SNS) topic. Implement message data protection. Refactor the analytics service to publish data access requests to the SNS topic.
Correct answer: B
Explanation:
To redact PII from S3 objects before they are accessed by the analytics service, the most efficient solution is to use S3 Object Lambda. S3 Object Lambda allows you to add your own code (Lambda function) to process and transform data when it is retrieved from Amazon S3. You can attach a Lambda function to an S3 Object Lambda Access Point, which in this case would run a redaction API to remove PII from the reports.
Operational Efficiency: S3 Object Lambda handles data processing on the fly, without requiring the data to be permanently transformed or moved to another service (like Amazon Redshift).
Alternatives:
Option A: Loading the data into Amazon Redshift would require refactoring the analytics service and maintaining an additional data pipeline, increasing complexity.
Option C: Using AWS KMS for encryption protects data at rest and in transit, but it does not address PII redaction.
Option D: SNS is a messaging service and does not support direct data transformation.
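A minimal sketch of an S3 Object Lambda handler, assuming the function is attached to an S3 Object Lambda Access Point; redact_pii is a hypothetical stand-in for a real PII-redaction API call (for example, Amazon Comprehend).

    import re
    import urllib.request
    import boto3

    s3 = boto3.client("s3")

    def redact_pii(text):
        # Hypothetical placeholder: mask SSN-like patterns. A production
        # solution would call a dedicated PII-redaction API instead.
        return re.sub(r"\d{3}-\d{2}-\d{4}", "***-**-****", text)

    def handler(event, context):
        ctx = event["getObjectContext"]
        # Presigned URL that returns the original, untransformed object
        original = urllib.request.urlopen(ctx["inputS3Url"]).read().decode("utf-8")
        # Return the redacted copy to the caller of the GET request
        s3.write_get_object_response(
            RequestRoute=ctx["outputRoute"],
            RequestToken=ctx["outputToken"],
            Body=redact_pii(original),
        )
        return {"statusCode": 200}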



Question 4

A company is using the AWS Serverless Application Model (AWS SAM) to develop a social media application. A developer needs a quick way to test AWS Lambda functions locally by using test event payloads. The developer needs the structure of these test event payloads to match the actual events that AWS services create.
Which solution will meet these requirements with the LEAST operational overhead?


  1. Create shareable test Lambda events. Use these test Lambda events for local testing. 
  2. Store manually created test event payloads locally. Use the sam local invoke command with the file path to the payloads.
  3. Store manually created test event payloads in an Amazon S3 bucket. Use the sam local invoke command with the S3 path to the payloads.
  4. Use the sam local generate-event command to create test payloads for local testing.
Correct answer: D
Explanation:
The AWS Serverless Application Model (SAM) includes features for local testing and debugging of AWS Lambda functions. One of the most efficient ways to generate test payloads that match actual AWS event structures is by using the sam local generate-event command.
sam local generate-event: This command allows developers to create pre-configured test event payloads for various AWS services (e.g., S3, API Gateway, SNS). These generated events accurately reflect the format that the service would use in a live environment, reducing the manual work required to create these events from scratch.
Operational Overhead: This approach reduces overhead since the developer does not need to manually create or maintain test events. It ensures that the structure is correct and up-to-date with the latest AWS standards.
Alternatives:
Option A suggests using shareable test events, but manually creating or sharing these events introduces more overhead.
Option B and C both involve manually storing and maintaining test events, which adds unnecessary complexity compared to using sam local generate-event.
Reference: AWS SAM CLI documentation
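For example, the following commands generate a payload matching a real S3 "put" event and feed it to a function for local testing (the function logical ID is hypothetical):

    sam local generate-event s3 put > s3-put-event.json
    sam local invoke ReportProcessor --event s3-put-event.json

Because the payloads are generated from templates maintained in the SAM CLI itself, they track the event structures that AWS services actually emit.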



Question 5

An application that runs on AWS Lambda requires access to specific highly confidential objects in an Amazon S3 bucket. In accordance with the principle of least privilege, a company grants access to the S3 bucket by using only temporary credentials.
How can a developer configure access to the S3 bucket in the MOST secure way?


  1. Hardcode the credentials that are required to access the S3 objects in the application code. Use the credentials to access the required S3 objects.
  2. Create a secret access key and access key ID with permission to access the S3 bucket. Store the key and key ID in AWS Secrets Manager. Configure the application to retrieve the Secrets Manager secret and use the credentials to access the S3 objects.
  3. Create a Lambda function execution role. Attach a policy to the role that grants access to specific objects in the S3 bucket.
  4. Create a secret access key and access key ID with permission to access the S3 bucket. Store the key and key ID as environment variables in Lambda. Use the environment variables to access the required S3 objects.
Correct answer: C
Explanation:
This solution meets the requirements by creating a Lambda function execution role, an IAM role that grants the function permission to access AWS resources such as Amazon S3 objects. The developer can attach a policy to the role that grants access to only the specific objects in the S3 bucket that the application requires, following the principle of least privilege; Lambda assumes the role and obtains temporary credentials automatically. Option A is not optimal because hardcoding credentials in application code is insecure and difficult to maintain. Option B is not optimal because creating a long-lived secret access key and access key ID introduces additional security risks and management complexity, even when the credentials are stored in AWS Secrets Manager. Option D is not optimal because storing long-lived keys as Lambda environment variables is likewise insecure and difficult to maintain.
Reference: AWS Lambda Execution Role, Using AWS Lambda with Amazon S3
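A minimal sketch of attaching such a least-privilege policy to the execution role with boto3; the role, bucket, and prefix names are hypothetical.

    import json
    import boto3

    iam = boto3.client("iam")

    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "s3:GetObject",
            # Hypothetical bucket and prefix: scope to only the objects the app needs
            "Resource": "arn:aws:s3:::confidential-bucket/app-data/*",
        }],
    }

    iam.put_role_policy(
        RoleName="my-lambda-execution-role",   # hypothetical role name
        PolicyName="AllowSpecificS3Objects",
        PolicyDocument=json.dumps(policy),
    )

Because Lambda exposes the role's temporary credentials to the function at invocation time, no access keys are stored anywhere.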



Question 6

A developer has code that is stored in an Amazon S3 bucket. The code must be deployed as an AWS Lambda function across multiple accounts in the same AWS Region as the S3 bucket. An AWS CloudFormation template that runs for each account will deploy the Lambda function.
What is the MOST secure way to allow CloudFormation to access the Lambda code in the S3 bucket?


  1. Grant the CloudFormation service role the S3 ListBucket and GetObject permissions. Add a bucket policy to Amazon S3 with the principal of "AWS": [account numbers].
  2. Grant the CloudFormation service role the S3 GetObject permission. Add a bucket policy to Amazon S3 with the principal of "*".
  3. Use a service-based link to grant the Lambda function the S3 ListBucket and GetObject permissions by explicitly adding the S3 bucket's account number in the resource.
  4. Use a service-based link to grant the Lambda function the S3 GetObject permission. Add a resource of "*" to allow access to the S3 bucket.
Correct answer: B
Explanation:
This solution allows the CloudFormation service role to access the S3 bucket from any account, as long as it has the S3 GetObject permission. The bucket policy grants access to any principal with the GetObject permission, which is the least privilege needed to deploy the Lambda code. This is more secure than granting ListBucket permission, which is not required for deploying Lambda code, or using a service-based link, which is not supported for Lambda functions.
Reference: AWS CloudFormation Service Role, Using AWS Lambda with Amazon S3
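A sketch of the bucket policy the answer describes, applied with boto3; the bucket name is hypothetical. In practice, a condition such as aws:PrincipalOrgID is often added (shown here as a hypothetical hardening step) so the open principal is still limited to the company's own accounts.

    import json
    import boto3

    s3 = boto3.client("s3")

    bucket_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::lambda-code-bucket/*",  # hypothetical bucket
            # Hypothetical hardening: restrict to principals in the company org
            "Condition": {"StringEquals": {"aws:PrincipalOrgID": "o-exampleorgid"}},
        }],
    }

    s3.put_bucket_policy(Bucket="lambda-code-bucket", Policy=json.dumps(bucket_policy))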



Question 7

A developer wants to add request validation to a production environment Amazon API Gateway API. The developer needs to test the changes before the API is deployed to the production environment. For the test, the developer will send test requests to the API through a testing tool.
Which solution will meet these requirements with the LEAST operational overhead?


  1. Export the existing API to an OpenAPI file. Create a new API. Import the OpenAPI file. Modify the new API to add request validation. Perform the tests. Modify the existing API to add request validation. Deploy the existing API to production.
  2. Modify the existing API to add request validation. Deploy the updated API to a new API Gateway stage. Perform the tests. Deploy the updated API to the API Gateway production stage.
  3. Create a new API. Add the necessary resources and methods, including new request validation. Perform the tests. Modify the existing API to add request validation. Deploy the existing API to production.
  4. Clone the existing API. Modify the new API to add request validation. Perform the tests. Modify the existing API to add request validation. Deploy the existing API to production.
Correct answer: D
Explanation:
This solution allows the developer to test the changes without affecting the production environment. Cloning an API creates a copy of the API definition that can be modified independently. The developer can then add request validation to the new API and test it using a testing tool. After verifying that the changes work as expected, the developer can apply the same changes to the existing API and deploy it to production.
Reference: Clone an API, Enable Request Validation for an API in API Gateway
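A sketch of enabling request validation on a REST API (the clone, then later the existing API) with boto3; the API, resource, and stage identifiers are hypothetical.

    import boto3

    apigw = boto3.client("apigateway")

    # Create a validator that checks both the request body and parameters
    validator = apigw.create_request_validator(
        restApiId="a1b2c3d4e5",          # hypothetical API ID (e.g., the clone)
        name="validate-body-and-params",
        validateRequestBody=True,
        validateRequestParameters=True,
    )

    # Attach the validator to a method on the API
    apigw.update_method(
        restApiId="a1b2c3d4e5",
        resourceId="abc123",             # hypothetical resource ID
        httpMethod="POST",
        patchOperations=[{
            "op": "replace",
            "path": "/requestValidatorId",
            "value": validator["id"],
        }],
    )

    # Deploy so the testing tool can exercise the change
    apigw.create_deployment(restApiId="a1b2c3d4e5", stageName="test")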



Question 8

A company is running Amazon EC2 instances in multiple AWS accounts. A developer needs to implement an application that collects all the lifecycle events of the EC2 instances. The application needs to store the lifecycle events in a single Amazon Simple Queue Service (Amazon SQS) queue in the company's main AWS account for further processing.
Which solution will meet these requirements?


  1. Configure Amazon EC2 to deliver the EC2 instance lifecycle events from all accounts to the Amazon EventBridge event bus of the main account. Add an EventBridge rule to the event bus of the main account that matches all EC2 instance lifecycle events. Add the SQS queue as a target of the rule.
  2. Use the resource policies of the SQS queue in the main account to give each account permissions to write to that SQS queue. Add to the Amazon EventBridge event bus of each account an EventBridge rule that matches all EC2 instance lifecycle events. Add the SQS queue in the main account as a target of the rule.
  3. Write an AWS Lambda function that scans through all EC2 instances in the company accounts to detect EC2 instance lifecycle changes. Configure the Lambda function to write a notification message to the SQS queue in the main account if the function detects an EC2 instance lifecycle change. Add an Amazon EventBridge scheduled rule that invokes the Lambda function every minute.
  4. Configure the permissions on the main account event bus to receive events from all accounts. Create an Amazon EventBridge rule in each account to send all the EC2 instance lifecycle events to the main account event bus. Add an EventBridge rule to the main account event bus that matches all EC2 instance lifecycle events. Set the SQS queue as a target for the rule.
Correct answer: D
Explanation:
Amazon EC2 instances can send state-change notification events to Amazon EventBridge.
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/monitoring-instance-state-changes.html
Amazon EventBridge can send and receive events between event buses in AWS accounts.
https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-cross-account.html
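A sketch of the member-account side of this setup, assuming the main account's default event bus already has a resource policy permitting events from the member accounts; all account IDs and ARNs are placeholders.

    import json
    import boto3

    events = boto3.client("events")

    # Match EC2 lifecycle (state-change) events in this member account
    events.put_rule(
        Name="forward-ec2-lifecycle",
        EventPattern=json.dumps({
            "source": ["aws.ec2"],
            "detail-type": ["EC2 Instance State-change Notification"],
        }),
        State="ENABLED",
    )

    # Forward matching events to the main account's event bus; the role
    # must allow events:PutEvents on that bus
    events.put_targets(
        Rule="forward-ec2-lifecycle",
        Targets=[{
            "Id": "main-account-bus",
            "Arn": "arn:aws:events:us-east-1:111111111111:event-bus/default",  # main account
            "RoleArn": "arn:aws:iam::222222222222:role/forward-events-role",   # member account
        }],
    )

In the main account, a second rule with the same event pattern then targets the SQS queue.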



Question 9

An application is using Amazon Cognito user pools and identity pools for secure access. A developer wants to integrate the user-specific file upload and download features in the application with Amazon S3. The developer must ensure that the files are saved and retrieved in a secure manner and that users can access only their own files. The file sizes range from 3 KB to 300 MB.
Which option will meet these requirements with the HIGHEST level of security?


  1. Use S3 Event Notifications to validate the file upload and download requests and update the user interface (UI).
  2. Save the details of the uploaded files in a separate Amazon DynamoDB table. Filter the list of files in the user interface (UI) by comparing the current user ID with the user ID associated with the file in the table.
  3. Use Amazon API Gateway and an AWS Lambda function to upload and download files. Validate each request in the Lambda function before performing the requested operation.
  4. Use an IAM policy within the Amazon Cognito identity prefix to restrict users to use their own folders in Amazon S3.
Correct answer: D
Explanation:
Attaching an IAM policy to the Amazon Cognito identity pool's authenticated role that uses the caller's identity ID as an S3 key prefix restricts each user to their own folder, enforcing isolation at the IAM level rather than in application code.
Reference: https://docs.aws.amazon.com/cognito/latest/developerguide/amazon-cognito-integrating-userpools-with-identity-pools.html
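A sketch of such a policy document as a Python dict; the bucket name is hypothetical. The ${cognito-identity.amazonaws.com:sub} policy variable resolves to the caller's identity ID at request time, so every user is confined to their own prefix.

    # IAM policy for the identity pool's authenticated role
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": [
                # Hypothetical bucket; the variable is substituted per caller
                "arn:aws:s3:::user-files-bucket/${cognito-identity.amazonaws.com:sub}/*"
            ],
        }],
    }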



Question 10

A company is building a scalable data management solution by using AWS services to improve the speed and agility of development. The solution will ingest large volumes of data from various sources and will process this data through multiple business rules and transformations.
The solution requires business rules to run in sequence and to handle reprocessing of data if errors occur when the business rules run. The company needs the solution to be scalable and to require the least possible maintenance.
Which AWS service should the company use to manage and automate the orchestration of the data flows to meet these requirements?


  1. AWS Batch
  2. AWS Step Functions
  3. AWS Glue
  4. AWS Lambda
Correct answer: B
Explanation:
AWS Step Functions provides serverless workflow orchestration: states run the business rules in sequence, and built-in Retry and Catch fields handle reprocessing when errors occur, with no infrastructure for the company to maintain.
Reference:
https://docs.aws.amazon.com/step-functions/latest/dg/welcome.html
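A sketch of what such an orchestration could look like: a state machine that runs two business rules in sequence, with Retry handling reprocessing on error. All names and ARNs are hypothetical.

    import json
    import boto3

    # Two Task states run in sequence; Retry re-runs a rule on failure
    definition = {
        "StartAt": "RuleOne",
        "States": {
            "RuleOne": {
                "Type": "Task",
                "Resource": "arn:aws:lambda:us-east-1:111111111111:function:rule-one",
                "Retry": [{"ErrorEquals": ["States.ALL"], "MaxAttempts": 3,
                           "IntervalSeconds": 5, "BackoffRate": 2.0}],
                "Next": "RuleTwo",
            },
            "RuleTwo": {
                "Type": "Task",
                "Resource": "arn:aws:lambda:us-east-1:111111111111:function:rule-two",
                "Retry": [{"ErrorEquals": ["States.ALL"], "MaxAttempts": 3,
                           "IntervalSeconds": 5, "BackoffRate": 2.0}],
                "End": True,
            },
        },
    }

    sfn = boto3.client("stepfunctions")
    sfn.create_state_machine(
        name="data-rules-pipeline",
        definition=json.dumps(definition),
        roleArn="arn:aws:iam::111111111111:role/sfn-execution-role",
    )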








