Amazon SCS-C01 Dumps - AWS Certified Security - Specialty PDF Sample Questions

Exam Code:
SCS-C01
Exam Name:
AWS Certified Security - Specialty
532 Questions
Last Update Date: 20 March, 2023
PDF + Test Engine
$89 $115.7
Test Engine Only Demo
$79 $102.7
PDF Only Demo
$59 $76.7


Best Amazon SCS-C01 Dumps - Pass Your Exam in First Attempt

Our SCS-C01 dumps are better than all other cheap SCS-C01 study materials.

The best way to pass your Amazon SCS-C01 exam is to prepare with reliable exam study material. We assure you that realexamdumps is one of the most authentic websites for Amazon AWS Certified Security - Specialty exam questions and answers. Pass your SCS-C01 AWS Certified Security - Specialty exam with full confidence. You can get a free AWS Certified Security - Specialty demo from realexamdumps. We ensure 100% success in the SCS-C01 exam with the help of our Amazon dumps, and you will feel proud to become part of the realexamdumps family.

Our success rate over the past 5 years is very impressive, and our customers have been able to build their careers in the IT field.

Search 45000+ exams, buy your desired exam, download it, and pass your exam...

Related Exam

Realexamdumps provides the most updated AWS Certified Specialty question answers. Here are a few related exams:


Sample Questions

Realexamdumps provides the most updated AWS Certified Specialty question answers. Here are a few sample questions:

Amazon SCS-C01 Sample Question 1

A company became aware that one of its access keys was exposed on a code sharing website 11 days ago. A Security Engineer must review all use of the exposed access key to determine the extent of the exposure. The company enabled AWS CloudTrail in all Regions when it opened the account.

Which of the following will allow the Security Engineer to complete the task?


Options:

A. Filter the event history on the exposed access key in the CloudTrail console. Examine the data from the past 11 days.
B. Use the AWS CLI to generate an IAM credential report. Extract all the data from the past 11 days.
C. Use Amazon Athena to query the CloudTrail logs from Amazon S3. Retrieve the rows for the exposed access key for the past 11 days.
D. Use the Access Advisor tab in the IAM console to view all of the access key activity for the past 11 days.

Answer: D
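
For reference, access key activity can be reviewed either through the 90-day CloudTrail event history (option A) or, for logs already delivered to S3, with an Amazon Athena query (option C). The following is a minimal sketch; the key ID is the documented AWS example key, and the Athena table name, dates, and results bucket are assumptions:

    # Option A style: look up events for a specific access key in the CloudTrail event history
    aws cloudtrail lookup-events \
      --lookup-attributes AttributeKey=AccessKeyId,AttributeValue=AKIAIOSFODNN7EXAMPLE \
      --start-time 2023-03-09T00:00:00Z --end-time 2023-03-20T00:00:00Z

    # Option C style: query CloudTrail logs delivered to S3 with Athena (table name is an assumption)
    aws athena start-query-execution \
      --query-string "SELECT eventtime, eventname, sourceipaddress FROM cloudtrail_logs WHERE useridentity.accesskeyid = 'AKIAIOSFODNN7EXAMPLE' AND eventtime > '2023-03-09'" \
      --result-configuration OutputLocation=s3://my-athena-results-bucket/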

Amazon SCS-C01 Sample Question 2

Your team is designing a web application. The users of this web application need to sign in via an external identity provider such as Facebook or Google. Which of the following AWS services would you use for authentication?

Please select:


Options:

A. Amazon Cognito
B. AWS SAML
C. AWS IAM
D. AWS Config

Answer: A

Explanation: The AWS documentation mentions the following: Amazon Cognito provides authentication, authorization, and user management for your web and mobile apps. Your users can sign in directly with a user name and password, or through a third party such as Facebook, Amazon, or Google.

Option B is incorrect since SAML is used for identity federation. Option C is incorrect since IAM is purely for identity and access management. Option D is incorrect since AWS Config is a configuration service.

For more information on Amazon Cognito, please refer to the link below: https://docs.aws.amazon.com/cognito/latest/developerguide/what-is-amazon-cognito.html

The correct answer is: Amazon Cognito. Submit your Feedback/Queries to our Experts.

Amazon SCS-C01 Sample Question 3

There is a set of EC2 instances in a private subnet. The application hosted on these EC2 instances needs to access a DynamoDB table, and it must be ensured that traffic does not flow out to the internet. How can this be achieved?

Please select:


Options:

A. Use a VPC endpoint to the DynamoDB table
B. Use a VPN connection from the VPC
C. Use a VPC gateway from the VPC
D. Use a VPC Peering connection to the DynamoDB table

Answer: A

Explanation: The AWS documentation shows how you can access the DynamoDB service from within a VPC without going out to the internet. This can be done with the help of a VPC endpoint.

Option B is invalid because a VPN is used for connectivity between an on-premises network and AWS. Option C is invalid because there is no such option. Option D is invalid because VPC peering is used to connect two VPCs.

For more information on VPC endpoints for DynamoDB, please refer to the AWS documentation.

The correct answer is: Use a VPC endpoint to the DynamoDB table. Submit your Feedback/Queries to our Experts.
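
As a minimal sketch of the correct option, a gateway VPC endpoint for DynamoDB can be created and attached to the private subnet's route table (all IDs and the Region below are placeholders):

    aws ec2 create-vpc-endpoint \
      --vpc-id vpc-0123456789abcdef0 \
      --vpc-endpoint-type Gateway \
      --service-name com.amazonaws.us-east-1.dynamodb \
      --route-table-ids rtb-0123456789abcdef0

Traffic to DynamoDB then stays on the AWS network instead of traversing an internet gateway or NAT device.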

Amazon SCS-C01 Sample Question 4

Developers in an organization have moved from a standard application deployment to containers. The Security Engineer is tasked with ensuring that the containers are secure. Which strategies will reduce the attack surface and enhance the security of the containers? (Select TWO.)


Options:

A. Use the containers to automate security deployments.
B. Limit resource consumption (CPU, memory), networking connections, ports, and unnecessary container libraries.
C. Segregate containers by host, function, and data classification.
D. Use Docker Notary framework to sign task definitions.
E. Enable container breakout at the host kernel.

Answer: A, D

Amazon SCS-C01 Sample Question 5

An IT department currently has a Java web application deployed on Apache Tomcat running on Amazon EC2 instances. All traffic to the EC2 instances is sent through an internet-facing Application Load Balancer (ALB). The Security team has noticed, during the past two days, thousands of unusual read requests coming from hundreds of IP addresses. This is causing the Tomcat server to run out of threads and reject new connections.

Which is the SIMPLEST change that would address this server issue?


Options:

A. Create an Amazon CloudFront distribution and configure the ALB as the origin
B. Block the malicious IPs with a network access list (NACL).
C. Create an AWS Web Application Firewall (WAF) web ACL, and attach it to the ALB.
D. Map the application domain name to use Route 53

Answer: B
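
If the offending source addresses are known, the network ACL approach in option B can be implemented as an explicit deny entry that is evaluated before the subnet's allow rules. A hedged sketch with a placeholder ACL ID and a documentation-range IP address:

    aws ec2 create-network-acl-entry \
      --network-acl-id acl-0123456789abcdef0 \
      --ingress \
      --rule-number 90 \
      --protocol tcp \
      --port-range From=80,To=80 \
      --cidr-block 203.0.113.5/32 \
      --rule-action deny

Note that network ACLs support only a limited number of rules, which is why CloudFront and AWS WAF are often preferred when the attacking addresses number in the hundreds.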

Amazon SCS-C01 Sample Question 6

A user has created a VPC with public and private subnets using the VPC wizard. The VPC has CIDR 20.0.0.0/16. The public subnet uses CIDR 20.0.1.0/24. The user is planning to host a web server in the public subnet on port 80 and a database server in the private subnet on port 3306. The user is configuring a security group for the public subnet (WebSecGrp) and the private subnet (DBSecGrp). Which of the below-mentioned entries is required in the private subnet database security group DBSecGrp?

Please select:


Options:

A. Allow Inbound on port 3306 for Source Web Server Security Group WebSecGrp.
B. Allow Inbound on port 3306 from source 20.0.0.0/16
C. Allow Outbound on port 3306 for Destination Web Server Security Group WebSecGrp.
D. Allow Outbound on port 80 for Destination NAT Instance IP

Answer: A

Explanation: Since the web server needs to talk to the database server on port 3306, the database server's security group should allow incoming traffic on port 3306. The AWS documentation shows how the security groups should be set up for this scenario.

Option B is invalid because you should allow incoming access for the database server from the WebSecGrp security group, not from the entire VPC CIDR. Options C and D are invalid because you need to allow inbound traffic, not outbound traffic.

For more information on security groups, please visit the link below: https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/VPC_Scenario2.html

The correct answer is: Allow Inbound on port 3306 for Source Web Server Security Group WebSecGrp. Submit your Feedback/Queries to our Experts.
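
A minimal sketch of the correct rule, using placeholder security group IDs, adds an ingress rule on the database security group that references the web security group as the source:

    # DBSecGrp: allow MySQL (3306) only from members of WebSecGrp (both IDs are placeholders)
    aws ec2 authorize-security-group-ingress \
      --group-id sg-0123456789abcdef0 \
      --protocol tcp \
      --port 3306 \
      --source-group sg-0fedcba9876543210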

Amazon SCS-C01 Sample Question 7

An organization has a multi-petabyte workload that it is moving to Amazon S3, but the CISO is concerned about cryptographic wear-out and the blast radius if a key is compromised. How can the CISO be assured that AWS KMS and Amazon S3 are addressing the concerns? (Select TWO.)


Options:

A. There is no API operation to retrieve an S3 object in its encrypted form.
B. Encryption of S3 objects is performed within the secure boundary of the KMS service.
C. S3 uses KMS to generate a unique data key for each individual object.
D. Using a single master key to encrypt all data includes having a single place to perform audits and usage validation.
E. The KMS encryption envelope digitally signs the master key during encryption to prevent cryptographic wear-out

Answer: C, F

Amazon SCS-C01 Sample Question 8

You need a cloud security device that can generate encryption keys that meet FIPS 140-2 Level 3. Which of the following can be used for this purpose?

Please select:


Options:

A. AWS KMS
B. AWS Customer Keys
C. AWS managed keys
D. AWS CloudHSM

Answer: A, D

Explanation: AWS Key Management Service (KMS) now uses FIPS 140-2 validated hardware security modules (HSMs) and supports FIPS 140-2 validated endpoints, which provide independent assurances about the confidentiality and integrity of your keys. All master keys in AWS KMS, regardless of their creation date or origin, are automatically protected using FIPS 140-2 validated HSMs.

FIPS 140-2 defines four levels of security, simply named "Level 1" to "Level 4". It does not specify in detail what level of security is required by any particular application.

• FIPS 140-2 Level 1, the lowest, imposes very limited requirements; loosely, all components must be "production-grade" and various egregious kinds of insecurity must be absent.
• FIPS 140-2 Level 2 adds requirements for physical tamper-evidence and role-based authentication.
• FIPS 140-2 Level 3 adds requirements for physical tamper-resistance (making it difficult for attackers to gain access to sensitive information contained in the module) and identity-based authentication, and for a physical or logical separation between the interfaces by which "critical security parameters" enter and leave the module and its other interfaces.
• FIPS 140-2 Level 4 makes the physical security requirements more stringent and requires robustness against environmental attacks.

AWS CloudHSM provides you with a FIPS 140-2 Level 3 validated, single-tenant HSM cluster in your Amazon Virtual Private Cloud (VPC) to store and use your keys. You have exclusive control over how your keys are used via an authentication mechanism independent from AWS. You interact with keys in your AWS CloudHSM cluster in a similar way to how you interact with your applications running in Amazon EC2.

AWS KMS allows you to create and control the encryption keys used by your applications and supported AWS services in multiple Regions around the world from a single console. The service uses FIPS 140-2 validated HSMs to protect the security of your keys. Centralized management of all your keys in AWS KMS lets you enforce who can use your keys under which conditions, when they get rotated, and who can manage them. AWS KMS HSMs are validated at Level 2 overall and at Level 3 in the following areas:

• Cryptographic Module Specification
• Roles, Services, and Authentication
• Physical Security
• Design Assurance

So we can have two answers for this question: both A and D.

• https://aws.amazon.com/blogs/security/aws-key-management-service-now-offers-fips-140-2-validated-cryptographic-m
• https://aws.amazon.com/cloudhsm/faqs/
• https://aws.amazon.com/kms/faqs/
• https://en.wikipedia.org/wiki/FIPS_140-2

The AWS documentation mentions the following: AWS CloudHSM is a cloud-based hardware security module (HSM) that enables you to easily generate and use your own encryption keys on the AWS Cloud. With CloudHSM, you can manage your own encryption keys using FIPS 140-2 Level 3 validated HSMs. CloudHSM offers you the flexibility to integrate with your applications using industry-standard APIs, such as PKCS#11, Java Cryptography Extensions (JCE), and Microsoft CryptoNG (CNG) libraries. CloudHSM is also standards-compliant and enables you to export all of your keys to most other commercially available HSMs. It is a fully managed service that automates time-consuming administrative tasks for you, such as hardware provisioning, software patching, high availability, and backups. CloudHSM also enables you to scale quickly by adding and removing HSM capacity on demand, with no up-front costs.

All other options are invalid since AWS CloudHSM is the prime service that offers FIPS 140-2 Level 3 compliance. For more information on CloudHSM, please visit the following URL: https://aws.amazon.com/cloudhsm/

The correct answers are: AWS KMS, AWS CloudHSM. Submit your Feedback/Queries to our Experts.
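
As a hedged illustration (the description, HSM type, and subnet ID below are placeholders), a KMS key and a CloudHSM cluster can both be provisioned from the AWS CLI:

    # AWS KMS: create a symmetric customer managed key (protected by FIPS 140-2 validated HSMs)
    aws kms create-key --description "CMK for regulated workload"

    # AWS CloudHSM: create a FIPS 140-2 Level 3 validated, single-tenant HSM cluster in your VPC
    aws cloudhsmv2 create-cluster \
      --hsm-type hsm1.medium \
      --subnet-ids subnet-0123456789abcdef0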

Amazon SCS-C01 Sample Question 9

A company has two teams, and each team needs to access its respective Amazon S3 buckets. The company anticipates adding more teams that also will have their own S3 buckets. When the company adds these teams, team members will need the ability to be assigned to multiple teams. Team members also will need the ability to change teams. Additional S3 buckets can be created or deleted.

An IAM administrator must design a solution to accomplish these goals. The solution also must be scalable and must require the least possible operational overhead.

Which solution meets these requirements?


Options:

A. Add users to groups that represent the teams. Create a policy for each team that allows the team to access its respective S3 buckets only. Attach the policy to the corresponding group.
B. Create an IAM role for each team. Create a policy for each team that allows the team to access its respective S3 buckets only. Attach the policy to the corresponding role.
C. Create IAM roles that are labeled with an access tag value of a team. Create one policy that allows dynamic access to S3 buckets with the same tag. Attach the policy to the IAM roles. Tag the S3 buckets accordingly.
D. Implement a role-based access control (RBAC) authorization model. Create the corresponding policies, and attach them to the IAM users.

Answer: B
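
A minimal sketch of the per-team role in option B, with placeholder names; each team's role gets an inline policy scoped to that team's bucket only:

    aws iam create-role --role-name team-a-s3-access \
      --assume-role-policy-document file://team-a-trust-policy.json

    aws iam put-role-policy --role-name team-a-s3-access \
      --policy-name team-a-bucket-only \
      --policy-document '{
        "Version": "2012-10-17",
        "Statement": [{
          "Effect": "Allow",
          "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject"],
          "Resource": ["arn:aws:s3:::team-a-bucket", "arn:aws:s3:::team-a-bucket/*"]
        }]
      }'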

Amazon SCS-C01 Sample Question 10

A company has recently recovered from a security incident that required the restoration of Amazon EC2 instances from snapshots.

After performing a gap analysis of its disaster recovery procedures and backup strategies, the company is concerned that, next time, it will not be able to recover the EC2 instances if the AWS account is compromised and the Amazon EBS snapshots are deleted.

All EBS snapshots are encrypted using an AWS KMS CMK.

Which solution would solve this problem?


Options:

A. Create a new Amazon S3 bucket. Use EBS lifecycle policies to move EBS snapshots to the new S3 bucket. Move snapshots to Amazon S3 Glacier using lifecycle policies, and apply Glacier Vault Lock policies to prevent deletion.
B. Use AWS Systems Manager to distribute a configuration that performs local backups of all attached disks to Amazon S3.
C. Create a new AWS account with limited privileges. Allow the new account to access the AWS KMS key used to encrypt the EBS snapshots, and copy the encrypted snapshots to the new account on a recurring basis.
D. Use AWS Backup to copy EBS snapshots to Amazon S3.

Answer: B

Amazon SCS-C01 Sample Question 11

A company hosts data in S3. There is a requirement to control access to the S3 buckets. Which are the 2 ways in which this can be achieved?

Please select:


Options:

A. Use Bucket policies
B. Use the Secure Token service
C. Use IAM user policies
D. Use IAM Access Keys

Answer: A, C

Explanation: The AWS documentation mentions the following: Amazon S3 offers access policy options broadly categorized as resource-based policies and user policies. Access policies you attach to your resources (buckets and objects) are referred to as resource-based policies; for example, bucket policies and access control lists (ACLs) are resource-based policies. You can also attach access policies to users in your account. These are called user policies. You may choose to use resource-based policies, user policies, or some combination of these to manage permissions to your Amazon S3 resources.

Options B and D are invalid because they cannot be used to control access to S3 buckets.

For more information on S3 access control, please refer to the link below: https://docs.aws.amazon.com/AmazonS3/latest/dev/s3-access-control.html

The correct answers are: Use Bucket policies, Use IAM user policies. Submit your Feedback/Queries to our Experts.
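
A minimal sketch of the bucket policy approach, with a placeholder account ID, role, and bucket name, grants read access on the bucket to a single IAM principal:

    aws s3api put-bucket-policy --bucket example-team-bucket --policy '{
      "Version": "2012-10-17",
      "Statement": [{
        "Sid": "AllowAppRoleReadOnly",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:role/app-role"},
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-team-bucket/*"
      }]
    }'

The equivalent user policy would attach the same Action/Resource statement (without the Principal element) to the IAM users or groups that need access.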

Amazon SCS-C01 Sample Question 12

A company hosts its public website on Amazon EC2 instances behind an Application Load Balancer (ALB). The instances are in an EC2 Auto Scaling group across multiple Availability Zones. The website is under a DDoS attack by a specific IoT device brand that is visible in the user agent. A security engineer needs to mitigate the attack without impacting the availability of the public website.

What should the security engineer do to accomplish this?


Options:

A. Configure a web ACL rule for AWS WAF to block requests with a string match condition for the user agent of the IoT device. Associate the web ACL with the ALB.
B. Configure an Amazon CloudFront distribution to use the ALB as an origin. Configure a web ACL rule for AWS WAF to block requests with a string match condition for the user agent of the IoT device. Associate the web ACL with the ALB. Change the public DNS entry of the website to point to the CloudFront distribution.
C. Configure an Amazon CloudFront distribution to use a new ALB as an origin. Configure a web ACL rule for AWS WAF to block requests with a string match condition for the user agent of the IoT device. Change the ALB security group to allow access from CloudFront IP address ranges only. Change the public DNS entry of the website to point to the CloudFront distribution.
D. Activate AWS Shield Advanced to enable DDoS protection. Apply an AWS WAF ACL to the ALB, and configure a listener rule on the ALB to block IoT devices based on the user agent.

Answer: C

Amazon SCS-C01 Sample Question 13

A Security Engineer is troubleshooting a connectivity issue between a web server and a logging server in another VPC; the web server writes log files to the logging server. The Engineer has confirmed that a peering relationship exists between the two VPCs. VPC Flow Logs show that requests sent from the web server are accepted by the logging server, but the web server never receives a reply.

Which of the following actions could fix this issue?


Options:

A. Add an inbound rule to the security group associated with the logging server that allows requests from the web server
B. Add an outbound rule to the security group associated with the web server that allows requests to the logging server.
C. Add a route to the route table associated with the subnet that hosts the logging server that targets the peering connection
D. Add a route to the route table associated with the subnet that hosts the web server that targets the peering connection

Answer: D
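
Whichever subnet's route table is missing the path across the peering connection, the route can be added as follows (a sketch with placeholder IDs and a placeholder peer CIDR):

    aws ec2 create-route \
      --route-table-id rtb-0123456789abcdef0 \
      --destination-cidr-block 10.1.0.0/16 \
      --vpc-peering-connection-id pcx-0123456789abcdef0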

Amazon SCS-C01 Sample Question 14

A company wants to encrypt the private network between its on-premises environment and AWS. The company also wants a consistent network experience for its employees.

What should the company do to meet these requirements?


Options:

A. Establish an IAM Direct Connect connection with IAM and set up a Direct Connect gateway. In the Direct Connect gateway configuration, enable IPsec and BGP, and then leverage native IAM network encryption between Availability Zones and Regions,
B. Establish an IAM Direct Connect connection with IAM and set up a Direct Connect gateway. Using the Direct Connect gateway, create a private virtual interface and advertise the customer gateway private IP addresses. Create a VPN connection using the customer gateway and the virtual private gateway
C. Establish a VPN connection with the IAM virtual private cloud over the internet
D. Establish an IAM Direct Connect connection with IAM and establish a public virtual interface. For prefixes that need to be advertised, enter the customer gateway public IP addresses. Create a VPN connection over Direct Connect using the customer gateway and the virtual private gateway.

Answer: D

Amazon SCS-C01 Sample Question 15

A company had one of its Amazon EC2 key pairs compromised. A Security Engineer must identify which current Linux EC2 instances were deployed and used the compromised key pair.

How can this task be accomplished?


Options:

A. Obtain the list of instances by directly querying Amazon EC2 using: aws ec2 describe-instances --filters "Name=key-name,Values=KEYNAMEHERE".
B. Obtain the fingerprint for the key pair from the AWS Management Console, then search for the fingerprint in the Amazon Inspector logs.
C. Obtain the output from the EC2 instance metadata using: curl http://169.254.169.254/latest/meta-data/public-keys/0/.
D. Obtain the fingerprint for the key pair from the AWS Management Console, then search for the fingerprint in Amazon CloudWatch Logs using: aws logs filter-log-events.

Answer: B
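
For reference, the describe-instances filter quoted in option A lists the instances launched with a given key pair; a hedged example with a placeholder key name:

    aws ec2 describe-instances \
      --filters "Name=key-name,Values=compromised-key" "Name=instance-state-name,Values=running" \
      --query "Reservations[].Instances[].[InstanceId,KeyName]" \
      --output table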

Amazon SCS-C01 Sample Question 16

A company has an encrypted Amazon S3 bucket. An Application Developer has an IAM policy that allows access to the S3 bucket, but the Application Developer is unable to access objects within the bucket.

What is a possible cause of the issue?


Options:

A. The S3 ACL for the S3 bucket fails to explicitly grant access to the Application Developer
B. The IAM KMS key for the S3 bucket fails to list the Application Developer as an administrator
C. The S3 bucket policy fails to explicitly grant access to the Application Developer
D. The S3 bucket policy explicitly denies access to the Application Developer

Answer: D

Amazon SCS-C01 Sample Question 17

A security engineer needs to ensure that the company's use of AWS meets AWS security best practices. As part of this, the AWS account root user must not be used for daily work. The root user must be monitored for use, and the Security team must be alerted as quickly as possible if the root user is used.

Which solution meets these requirements?


Options:

A. Set up an Amazon CloudWatch Events rule that triggers an Amazon SNS notification.
B. Set up an Amazon CloudWatch Events rule that triggers an Amazon SNS notification logs from S3 and generate notifications using Amazon SNS.
C. Set up a rule in AWS Config to trigger on root user events. Trigger an AWS Lambda function and generate notifications using Amazon SNS.
D. Use Amazon Inspector to monitor the usage of the root user and generate notifications using Amazon SNS

Answer: B
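
A common implementation of the CloudWatch Events (now EventBridge) approach is a rule whose event pattern matches root-user activity recorded by CloudTrail, with an SNS topic as the target. A sketch, assuming the SNS topic already exists; the rule name and ARN are placeholders:

    aws events put-rule --name root-user-activity \
      --event-pattern '{
        "detail-type": ["AWS API Call via CloudTrail", "AWS Console Sign In via CloudTrail"],
        "detail": {"userIdentity": {"type": ["Root"]}}
      }'

    aws events put-targets --rule root-user-activity \
      --targets "Id=1,Arn=arn:aws:sns:us-east-1:111122223333:security-alerts"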

Amazon SCS-C01 Sample Question 18

A company has implemented centralized logging and monitoring of AWS CloudTrail logs from all Regions in an Amazon S3 bucket. The log files are encrypted using AWS KMS. A Security Engineer is attempting to review the log files using a third-party tool hosted on an Amazon EC2 instance. The Security Engineer is unable to access the logs in the S3 bucket and receives an access denied error message.

What should the Security Engineer do to fix this issue?


Options:

A. Check that the role the Security Engineer uses grants permission to decrypt objects using the KMS CMK.
B. Check that the role the Security Engineer uses grants permission to decrypt objects using the KMS CMK and gives access to the S3 bucket and objects
C. Check that the role the EC2 instance profile uses grants permission to decrypt objects using the KMS CMK and gives access to the S3 bucket and objects.
D. Check that the role the EC2 instance profile uses grants permission to decrypt objects using the KMS CMK

Answer: D

Amazon SCS-C01 Sample Question 19

A security engineer is designing an incident response plan to address the risk of a compromised Amazon EC2 instance. The plan must recommend a solution to meet the following requirements:

• A trusted forensic environment must be provisioned

• Automated response processes must be orchestrated

Which IAM services should be included in the plan? {Select TWO)


Options:

A. IAM CloudFormation
B. Amazon GuardDuty
C. Amazon Inspector
D. Amazon Macie
E. IAM Step Functions

Answer: A, E

Amazon SCS-C01 Sample Question 20

A website currently runs on Amazon EC2 with mostly static content on the site. Recently, the site was subjected to a DDoS attack, and a Security Engineer was tasked with redesigning the edge security to help mitigate this risk in the future

What are some ways the Engineer could achieve this? (Select THREE )


Options:

A. Use AWS X-Ray to inspect the traffic going to the EC2 instances.
B. Move the static content to Amazon S3 and front it with an Amazon CloudFront distribution.
C. Change the security group configuration to block the source of the attack traffic.
D. Use AWS WAF security rules to inspect the inbound traffic.
E. Use Amazon Inspector assessment templates to inspect the inbound traffic.
F. Use Amazon Route 53 to distribute traffic.

Answer: B, D, F

Amazon SCS-C01 Sample Question 21

A company recently performed an annual security assessment of its IAM environment. The assessment showed that audit logs are not available beyond 90 days and that unauthorized changes to IAM policies are made without detection.

How should a security engineer resolve these issues?


Options:

A. Create an Amazon S3 lifecycle policy that archives AWS CloudTrail trail logs to Amazon S3 Glacier after 90 days. Configure Amazon Inspector to provide a notification when a policy change is made to resources.
B. Configure AWS Artifact to archive AWS CloudTrail logs. Configure AWS Trusted Advisor to provide a notification when a policy change is made to resources.
C. Configure Amazon CloudWatch to export log groups to Amazon S3. Configure AWS CloudTrail to provide a notification when a policy change is made to resources.
D. Create an AWS CloudTrail trail that stores audit logs in Amazon S3. Configure an AWS Config rule to provide a notification when a policy change is made to resources.

Answer: D

Explanation: See https://docs.aws.amazon.com/awscloudtrail/latest/userguide/best-practices-security.html: "For an ongoing record of events in your AWS account, you must create a trail. Although CloudTrail provides 90 days of event history information for management events in the CloudTrail console without creating a trail, it is not a permanent record, and it does not provide information about all possible types of events. For an ongoing record, and for a record that contains all the event types you specify, you must create a trail, which delivers log files to an Amazon S3 bucket that you specify." See also https://aws.amazon.com/blogs/security/how-to-record-and-govern-your-iam-resource-configurations-using-aws-config/
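
A minimal sketch of the trail half of the answer (the trail and bucket names are placeholders, and the bucket must already have a policy that allows CloudTrail to write to it):

    aws cloudtrail create-trail \
      --name org-audit-trail \
      --s3-bucket-name central-audit-logs-bucket \
      --is-multi-region-trail \
      --enable-log-file-validation

    aws cloudtrail start-logging --name org-audit-trail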

Amazon SCS-C01 Sample Question 22

A company has decided to migrate sensitive documents from on-premises data centers to Amazon S3. Currently, the hard drives are encrypted to meet a compliance requirement regarding data encryption. The CISO wants to improve security by encrypting each file using a different key instead of a single key. Using a different key would limit the security impact of a single exposed key.

Which of the following requires the LEAST amount of configuration when implementing this approach?


Options:

A. Place each file into a different S3 bucket. Set the default encryption of each bucket to use a different IAM KMS customer managed key.
B. Put all the files in the same S3 bucket. Using S3 events as a trigger, write an IAM Lambda function to encrypt each file as it is added using different IAM KMS data keys.
C. Use the S3 encryption client to encrypt each file individually using S3-generated data keys
D. Place all the files in the same S3 bucket. Use server-side encryption with IAM KMS-managed keys (SSE-KMS) to encrypt the data

Answer: D

Explanation: References: https://docs.aws.amazon.com/AmazonS3/latest/dev/serv-side-encryption.html

Server-Side Encryption with Amazon S3-Managed Keys (SSE-S3): when you use SSE-S3, each object is encrypted with a unique key. Server-Side Encryption with Customer Master Keys (CMKs) stored in AWS Key Management Service (SSE-KMS) is similar to SSE-S3, but with some additional benefits and charges for using this service. When you use SSE-KMS to protect your data without an S3 Bucket Key, Amazon S3 uses an individual AWS KMS data key for every object. It makes a call to AWS KMS every time a request is made against a KMS-encrypted object.

https://docs.aws.amazon.com/AmazonS3/latest/dev/bucket-key.html
https://docs.aws.amazon.com/kms/latest/developerguide/symmetric-asymmetric.html
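
A sketch of the least-configuration approach: set SSE-KMS as the bucket's default encryption (the bucket name and key ARN are placeholders); with the S3 Bucket Key disabled, S3 requests a unique data key from KMS for every object:

    aws s3api put-bucket-encryption --bucket sensitive-docs-bucket \
      --server-side-encryption-configuration '{
        "Rules": [{
          "ApplyServerSideEncryptionByDefault": {
            "SSEAlgorithm": "aws:kms",
            "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab"
          },
          "BucketKeyEnabled": false
        }]
      }'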

Amazon SCS-C01 Sample Question 23

An organization policy states that all encryption keys must be automatically rotated every 12 months.

Which IAM Key Management Service (KMS) key type should be used to meet this requirement?


Options:

A. AWS managed Customer Master Key (CMK)
B. Customer managed CMK with AWS generated key material
C. Customer managed CMK with imported key material
D. AWS managed data key

Answer: C
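
For reference, automatic annual rotation is switched on per key; it is supported for customer managed CMKs whose key material was generated by AWS KMS (the key ID below is a placeholder):

    aws kms enable-key-rotation --key-id 1234abcd-12ab-34cd-56ef-1234567890ab
    aws kms get-key-rotation-status --key-id 1234abcd-12ab-34cd-56ef-1234567890ab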

Amazon SCS-C01 Sample Question 24

A security engineer has created an Amazon Cognito user pool. The engineer needs to manually verify the ID and access token sent by the application for troubleshooting purposes

What is the MOST secure way to accomplish this?


Options:

A. Extract the subject (sub), audience (aud), and cognito:username from the ID token payload. Manually check the subject and audience for the user name in the user pool.
B. Search for the public key with a key ID that matches the key ID in the header of the token. Then use a JSON Web Token (JWT) library to validate the signature of the token and extract values, such as the expiry date.
C. Verify that the token is not expired. Then use the token_use claim function in Amazon Cognito to validate the key IDs.
D. Copy the JSON Web Token (JWT) as a JSON document. Obtain the public JSON Web Key (JWK) and convert it to a .pem file. Then use the file to validate the original JWT.

Answer: B
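
The signing keys used for the validation in option B are published by the user pool as a JWKS document; a sketch of retrieving it (the Region and pool ID are placeholders):

    curl https://cognito-idp.us-east-1.amazonaws.com/us-east-1_EXAMPLE/.well-known/jwks.json

The kid in the token header is matched against the keys in this document before the signature, expiry, and audience claims are checked with a JWT library.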

Amazon SCS-C01 Sample Question 25

A Security Engineer launches two Amazon EC2 instances in the same Amazon VPC but in separate Availability Zones. Each instance has a public IP address and is able to connect to external hosts on the internet. The two instances are able to communicate with each other by using their private IP addresses, but they are not able to communicate with each other when using their public IP addresses.

Which action should the Security Engineer take to allow communication over the public IP addresses?


Options:

A. Associate the instances to the same security groups.
B. Add 0.0.0.0/0 to the egress rules of the instance security groups.
C. Add the instance IDs to the ingress rules of the instance security groups.
D. Add the public IP addresses to the ingress rules of the instance security groups.

Answer: D

Explanation: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/security-group-rules-reference.html#sg-rules-other-instances

Amazon SCS-C01 Sample Question 26

A company has several critical applications running on a large fleet of Amazon EC2 instances. As part of a security operations review, the company needs to apply a critical operating system patch to EC2 instances within 24 hours of the patch becoming available from the operating system vendor. The company does not have a patching solution deployed on AWS but does have AWS Systems Manager configured. The solution must also minimize administrative overhead.

What should a security engineer recommend to meet these requirements?


Options:

A. Create an IAM Config rule defining the patch as a required configuration for EC2 instances.
B. Use the IAM Systems Manager Run Command to patch affected instances.
C. Use an IAM Systems Manager Patch Manager predefined baseline to patch affected instances.
D. Use IAM Systems Manager Session Manager to log in to each affected instance and apply the patch.

Answer: C
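
Once a predefined patch baseline is in place, patching can be driven through Systems Manager with the AWS-RunPatchBaseline document; a hedged sketch that targets instances by a hypothetical PatchGroup tag:

    aws ssm send-command \
      --document-name "AWS-RunPatchBaseline" \
      --targets "Key=tag:PatchGroup,Values=critical-servers" \
      --parameters "Operation=Install"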

Amazon SCS-C01 Sample Question 27

An AWS Lambda function was misused to alter data, and a Security Engineer must identify who invoked the function and what output was produced. The Engineer cannot find any logs created by the Lambda function in Amazon CloudWatch Logs.

Which of the following explains why the logs are not available?


Options:

A. The execution role for the Lambda function did not grant permissions to write log data to CloudWatch Logs.
B. The Lambda function was executed by using Amazon API Gateway, so the logs are not stored in CloudWatch Logs.
C. The execution role for the Lambda function did not grant permissions to write to the Amazon S3 bucket where CloudWatch Logs stores the logs.
D. The version of the Lambda function that was executed was not current.

Answer: B
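
For reference, option A refers to the execution role's CloudWatch Logs permissions (logs:CreateLogGroup, logs:CreateLogStream, logs:PutLogEvents); a sketch of granting them via the managed basic execution policy, with a placeholder role name:

    aws iam attach-role-policy \
      --role-name my-lambda-execution-role \
      --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole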

Amazon SCS-C01 Sample Question 28

The Security Engineer has discovered that a new application that deals with highly sensitive data is storing Amazon S3 objects with the following key pattern, which itself contains highly sensitive data.

Pattern:

"randomID_datestamp_PII.csv"

Example:

"1234567_12302017_000-00-0000 csv"

The bucket where these objects are being stored is using server-side encryption (SSE).

Which solution is the most secure and cost-effective option to protect the sensitive data?


Options:

A. Remove the sensitive data from the object name, and store the sensitive data using S3 user-defined metadata.
B. Add an S3 bucket policy that denies the action s3:GetObject
C. Use a random and unique S3 object key, and create an S3 metadata index in Amazon DynamoDB using client-side encrypted attributes.
D. Store all sensitive objects in Binary Large Objects (BLOBS) in an encrypted Amazon RDS instance.

Answer: C

Explanation: https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingMetadata.html and https://aws.amazon.com/blogs/database/best-practices-for-securing-sensitive-data-in-aws-data-stores/

Amazon SCS-C01 Sample Question 29

You are hosting a website via static website hosting on an S3 bucket - http://demo.s3-website-us-east-1.amazonaws.com. Some of your web pages use JavaScript to access resources in another bucket that also has website hosting enabled, but when users access those web pages, they get a blocked JavaScript error. How can you rectify this?

Please select:


Options:

A. Enable CORS for the bucket
B. Enable versioning for the bucket
C. Enable MFA for the bucket
D. Enable CRR for the bucket

Answer: A

Explanation: Such a scenario is given in the AWS documentation on Cross-Origin Resource Sharing (CORS). Example use-case scenarios for CORS:

• Scenario 1: Suppose that you are hosting a website in an Amazon S3 bucket named website as described in Hosting a Static Website on Amazon S3. Your users load the website endpoint http://website.s3-website-us-east-1.amazonaws.com. Now you want to use JavaScript on the webpages that are stored in this bucket to be able to make authenticated GET and PUT requests against the same bucket by using the Amazon S3 API endpoint for the bucket, website.s3.amazonaws.com. A browser would normally block JavaScript from allowing those requests, but with CORS you can configure your bucket to explicitly enable cross-origin requests from website.s3-website-us-east-1.amazonaws.com.
• Scenario 2: Suppose that you want to host a web font from your S3 bucket. Again, browsers require a CORS check (also called a preflight check) for loading web fonts. You would configure the bucket that is hosting the web font to allow any origin to make these requests.

Option B is invalid because versioning only creates multiple versions of an object and can help with accidental deletion of objects. Option C is invalid because MFA is used as an extra measure of caution for deletion of objects. Option D is invalid because CRR is used for cross-Region replication of objects.

For more information on Cross-Origin Resource Sharing, please visit the following URL: https://docs.aws.amazon.com/AmazonS3/latest/dev/cors.html

The correct answer is: Enable CORS for the bucket. Submit your Feedback/Queries to our Experts.
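
A minimal sketch of applying a CORS configuration to the bucket that hosts the shared resources (the bucket name is a placeholder; the allowed origin matches the scenario's website endpoint):

    aws s3api put-bucket-cors --bucket shared-assets-bucket --cors-configuration '{
      "CORSRules": [{
        "AllowedOrigins": ["http://demo.s3-website-us-east-1.amazonaws.com"],
        "AllowedMethods": ["GET", "PUT"],
        "AllowedHeaders": ["*"],
        "MaxAgeSeconds": 3000
      }]
    }'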

Amazon SCS-C01 Sample Question 30

An organization operates a web application that serves users globally. The application runs on Amazon EC2 instances behind an Application Load Balancer. There is an Amazon CloudFront distribution in front of the load balancer, and the organization uses AWS WAF. The application is currently experiencing a volumetric attack whereby the attacker is exploiting a bug in a popular mobile game.

The application is being flooded with HTTP requests from all over the world with the User-Agent set to the following string: Mozilla/5.0 (compatible; ExampleCorp; ExampleGame/1.22; Mobile/1.0)

What mitigation can be applied to block attacks resulting from this bug while continuing to service legitimate requests?


Options:

A. Create a rule in AWS WAF with conditions that block requests based on the presence of ExampleGame/1.22 in the User-Agent header.
B. Create a geographic restriction on the CloudFront distribution to prevent access to the application from most geographic regions.
C. Create a rate-based rule in AWS WAF to limit the total number of requests that the web application services.
D. Create an IP-based blacklist in AWS WAF to block the IP addresses that are originating requests that contain ExampleGame/1.22 in the User-Agent header.

Answer: A

Explanation: Since all of the attack traffic has the HTTP User-Agent header set to the string Mozilla/5.0 (compatible; ExampleCorp; ExampleGame/1.22; Mobile/1.0), it is much easier to block the attack by simply denying traffic whose User-Agent header matches ExampleGame/1.22.
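
The question predates the current AWS WAF API, but as a hedged sketch using today's wafv2 CLI (the web ACL name, scope, and metric names are placeholders), a string match rule on the User-Agent header would look roughly like this:

    aws wafv2 create-web-acl \
      --name block-examplegame-bot \
      --scope REGIONAL \
      --default-action Allow={} \
      --visibility-config SampledRequestsEnabled=true,CloudWatchMetricsEnabled=true,MetricName=blockExampleGame \
      --rules '[{
        "Name": "BlockExampleGameUserAgent",
        "Priority": 0,
        "Action": {"Block": {}},
        "VisibilityConfig": {"SampledRequestsEnabled": true, "CloudWatchMetricsEnabled": true, "MetricName": "BlockExampleGameUserAgent"},
        "Statement": {
          "ByteMatchStatement": {
            "SearchString": "ExampleGame/1.22",
            "FieldToMatch": {"SingleHeader": {"Name": "user-agent"}},
            "PositionalConstraint": "CONTAINS",
            "TextTransformations": [{"Priority": 0, "Type": "NONE"}]
          }
        }
      }]'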

Amazon SCS-C01 Sample Question 31

Which of the following is used as a secure way to log into an EC2 Linux Instance?

Please select:


Options:

A. IAM User name and password
B. Key pairs
C. IAM Access keys
D. IAM SDK keys

Answer: B

Explanation: The AWS documentation mentions the following: key pairs consist of a public key and a private key. You use the private key to create a digital signature, and then AWS uses the corresponding public key to validate the signature. Key pairs are used only for Amazon EC2 and Amazon CloudFront.

Options A, C, and D are all wrong because they are not used to log in to EC2 Linux instances.

For more information on AWS security credentials, please visit the URL below: https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html

The correct answer is: Key pairs. Submit your Feedback/Queries to our Experts.

Amazon SCS-C01 Sample Question 32

Your company has defined privileged users for their AWS account. These users are administrators for key resources defined in the company. There is now a mandate to enhance the security authentication for these users. How can this be accomplished?

Please select:


Options:

A. Enable MFA for these user accounts
B. Enable versioning for these user accounts
C. Enable accidental deletion for these user accounts
D. Disable root access for the users

Answer: A

Explanation: The AWS documentation mentions the following as a best practice for IAM users: for extra security, enable multi-factor authentication (MFA) for privileged IAM users (users who are allowed access to sensitive resources or APIs). With MFA, users have a device that generates a unique authentication code (a one-time password, or OTP). Users must provide both their normal credentials (like their user name and password) and the OTP. The MFA device can either be a special piece of hardware, or it can be a virtual device (for example, it can run in an app on a smartphone).

Options B, C, and D are invalid because no such security options are available in AWS.

For more information on IAM best practices, please visit the URL below: https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html

The correct answer is: Enable MFA for these user accounts. Submit your Feedback/Queries to our Experts.

Amazon SCS-C01 Sample Question 33

A company has five IAM accounts and wants to use IAM CloudTrail to log API calls. The log files must be stored in an Amazon S3 bucket that resides in a new account specifically built for centralized services with a unique top-level prefix for each trail. The configuration must also enable detection of any modification to the logs.

Which of the following steps will implement these requirements? (Choose three.)


Options:

A. Create a new S3 bucket in a separate IAM account for centralized storage of CloudTrail logs, and enable “Log File Validation” on all trails.
B. Use an existing S3 bucket in one of the accounts, apply a bucket policy to the new centralized S3 bucket that permits the CloudTrail service to use the "s3:PutObject" action and the "s3:GetBucketAcl" action, and specify the appropriate resource ARNs for the CloudTrail trails.
C. Apply a bucket policy to the new centralized S3 bucket that permits the CloudTrail service to use the "s3:PutObject" action and the "s3:GetBucketAcl" action, and specify the appropriate resource ARNs for the CloudTrail trails.
D. Use unique log file prefixes for trails in each IAM account.
E. Configure CloudTrail in the centralized account to log all accounts to the new centralized S3 bucket.
F. Enable encryption of the log files by using IAM Key Management Service

Answer: A, C, E

Explanation: https://docs.aws.amazon.com/awscloudtrail/latest/userguide/best-practices-security.html

If you have created an organization in AWS Organizations, you can create a trail that will log all events for all AWS accounts in that organization. This is sometimes referred to as an organization trail. You can also choose to edit an existing trail in the master account and apply it to an organization, making it an organization trail. Organization trails log events for the master account and all member accounts in the organization. For more information about AWS Organizations, see Organizations Terminology and Concepts.

Reference: https://docs.aws.amazon.com/awscloudtrail/latest/userguide/creating-trail-organization.html

You must be logged in with the master account for the organization in order to create an organization trail. You must also have sufficient permissions for the IAM user or role in the master account in order to successfully create an organization trail. If you do not have sufficient permissions, you will not see the option to apply a trail to an organization.

Amazon SCS-C01 Sample Question 34

An organization has three applications running on IAM, each accessing the same data on Amazon S3. The data on Amazon S3 is server-side encrypted by using an IAM KMS Customer Master Key (CMK).

What is the recommended method to ensure that each application has its own programmatic access control permissions on the KMS CMK?


Options:

A. Change the key policy permissions associated with the KMS CMK for each application when it must access the data in Amazon S3.
B. Have each application assume an IAM role that provides permissions to use the IAM Certificate Manager CMK.
C. Have each application use a grant on the KMS CMK to add or remove specific access controls on the KMS CMK.
D. Have each application use an IAM policy in a user context to have specific access permissions on the KMS CMK.

Answer: D

Amazon SCS-C01 Sample Question 35

A Security Engineer is working with a Product team building a web application on IAM. The application uses Amazon S3 to host the static content, Amazon API Gateway to provide RESTful services; and Amazon DynamoDB as the backend data store. The users already exist in a directory that is exposed through a SAML identity provider.

Which combination of the following actions should the Engineer take to enable users to be authenticated into the web application and call APIs? (Choose three.)


Options:

A. Create a custom authorization service using IAM Lambda.
B. Configure a SAML identity provider in Amazon Cognito to map attributes to the Amazon Cognito user pool attributes.
C. Configure the SAML identity provider to add the Amazon Cognito user pool as a relying party.
D. Configure an Amazon Cognito identity pool to integrate with social login providers.
E. Update DynamoDB to store the user email addresses and passwords.
F. Update API Gateway to use a COGNITO_USER_POOLS authorizer.

Answer: B, D, F

Amazon SCS-C01 Sample Question 36

Example.com hosts its internal document repository on Amazon EC2 instances. The application runs on EC2 instances and previously stored the documents on encrypted Amazon EBS volumes. To optimize the application for scale, example.com has moved the files to Amazon S3. The security team has mandated that all the files are securely deleted from the EBS volume, and it must certify that the data is unreadable before releasing the underlying disks.

Which of the following methods will ensure that the data is unreadable by anyone else?


Options:

A. Change the volume encryption on the EBS volume to use a different encryption mechanism. Then, release the EBS volumes back to IAM.
B. Release the volumes back to IAM. IAM immediately wipes the disk after it is deprovisioned.
C. Delete the encryption key used to encrypt the EBS volume. Then, release the EBS volumes back to IAM.
D. Delete the data by using the operating system delete commands. Run Quick Format on the drive and then release the EBS volumes back to IAM.

Answer: D

Explanation: Amazon EBS volumes are presented to you as raw unformatted block devices that have been wiped prior to being made available for use. Wiping occurs immediately before reuse so that you can be assured that the wipe process completed. If you have procedures requiring that all data be wiped via a specific method, such as those detailed in NIST 800-88 ("Guidelines for Media Sanitization"), you have the ability to do so on Amazon EBS. You should conduct a specialized wipe procedure prior to deleting the volume for compliance with your established requirements. https://d0.awsstatic.com/whitepapers/aws-security-whitepaper.pdf

Amazon SCS-C01 Sample Question 37

Compliance requirements state that all communications between company on-premises hosts and EC2 instances be encrypted in transit. Hosts use custom proprietary protocols for their communication, and EC2 instances need to be fronted by a load balancer for increased availability.

Which of the following solutions will meet these requirements?


Options:

A. Offload SSL termination onto an SSL listener on a Classic Load Balancer, and use a TCP connection between the load balancer and the EC2 instances.
B. Route all traffic through a TCP listener on a Classic Load Balancer, and terminate the TLS connection on the EC2 instances.
C. Create an HTTPS listener using an Application Load Balancer, and route all of the communication through that load balancer.
D. Offload SSL termination onto an SSL listener using an Application Load Balancer, and re-spawn an SSL connection between the load balancer and the EC2 instances.

Answer: B

Explanation: https://aws.amazon.com/blogs/compute/maintaining-transport-layer-security-all-the-way-to-your-container-using-the-network-load-balancer-with-amazon-ecs/

Amazon SCS-C01 Sample Question 38

An application has a requirement to be resilient across not only Availability Zones within the application’s primary region but also be available within another region altogether.

Which of the following supports this requirement for IAM resources that are encrypted by IAM KMS?


Options:

A. Copy the application’s IAM KMS CMK from the source region to the target region so that it can be used to decrypt the resource after it is copied to the target region.
B. Configure IAM KMS to automatically synchronize the CMK between regions so that it can be used to decrypt the resource in the target region.
C. Use IAM services that replicate data across regions, and re-wrap the data encryption key created in the source region by using the CMK in the target region so that the target region’s CMK can decrypt the database encryption key.
D. Configure the target region’s IAM service to communicate with the source region’s IAM KMS so that it can decrypt the resource in the target region.

Answer: D

Amazon SCS-C01 Sample Question 39

Which approach will generate automated security alerts should too many unauthorized IAM API requests be identified?


Options:

A. Create an Amazon CloudWatch metric filter that looks for API call error codes and then implement an alarm based on that metric’s rate.
B. Configure IAM CloudTrail to stream event data to Amazon Kinesis. Configure an IAM Lambda function on the stream to alarm when the threshold has been exceeded.
C. Run an Amazon Athena SQL query against CloudTrail log files. Use Amazon QuickSight to create an operational dashboard.
D. Use the Amazon Personal Health Dashboard to monitor the account’s use of IAM services, and raise an alert if service error rates increase.

Answer: A

Explanation: https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudwatch-alarms-for-cloudtrail.html#cloudwatch-alarms-for-cloudtrail-authorization-failures

Open the CloudWatch console at https://console.aws.amazon.com/cloudwatch/. In the navigation pane, choose Logs. In the list of log groups, select the check box next to the log group that you created for CloudTrail log events. Choose Create Metric Filter. On the Define Logs Metric Filter screen, choose Filter Pattern and then type the following: { ($.errorCode = "*UnauthorizedOperation") || ($.errorCode = "AccessDenied*") } Choose Assign Metric. For Filter Name, type AuthorizationFailures. For Metric Namespace, type CloudTrailMetrics. For Metric Name, type AuthorizationFailureCount.
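
The same metric filter, plus an alarm on the resulting metric, can be created from the CLI; a sketch assuming the CloudTrail log group and SNS topic below already exist (both names and the ARN are placeholders):

    aws logs put-metric-filter \
      --log-group-name CloudTrail/DefaultLogGroup \
      --filter-name AuthorizationFailures \
      --filter-pattern '{ ($.errorCode = "*UnauthorizedOperation") || ($.errorCode = "AccessDenied*") }' \
      --metric-transformations metricName=AuthorizationFailureCount,metricNamespace=CloudTrailMetrics,metricValue=1

    aws cloudwatch put-metric-alarm \
      --alarm-name UnauthorizedAPICalls \
      --namespace CloudTrailMetrics \
      --metric-name AuthorizationFailureCount \
      --statistic Sum --period 300 --evaluation-periods 1 \
      --threshold 10 --comparison-operator GreaterThanOrEqualToThreshold \
      --alarm-actions arn:aws:sns:us-east-1:111122223333:security-alerts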

Amazon SCS-C01 Sample Question 40

An organization receives an alert that indicates that an EC2 instance behind an ELB Classic Load Balancer has been compromised.

What techniques will limit lateral movement and allow evidence gathering?


Options:

A. Remove the instance from the load balancer and terminate it.
B. Remove the instance from the load balancer, and shut down access to the instance by tightening the security group.
C. Reboot the instance and check for any Amazon CloudWatch alarms.
D. Stop the instance and make a snapshot of the root EBS volume.

Answer: B

Explanation: https://d1.awsstatic.com/whitepapers/aws_security_incident_response.pdf

Amazon SCS-C01 Sample Question 41

Your company has a set of resources defined in the IAM Cloud. Their IT audit department has requested to get a list of resources that have been defined across the account. How can this be achieved in the easiest manner?

Please select:


Options:

A. Create a powershell script using the IAM CLI. Query for all resources with the tag of production.
B. Create a bash shell script with the IAM CLI. Query for all resources in all regions. Store the results in an S3 bucket.
C. Use Cloud Trail to get the list of all resources
D. Use IAM Config to get the list of all resources

Answer: D

Explanation: The most feasible option is to use AWS Config. When you turn on AWS Config, you get a list of the resources defined in your AWS account, visible in the AWS Config resources dashboard.

Option A is incorrect because this would give a list of production-tagged resources only, not all resources. Option B is partially correct but would just add more maintenance overhead. Option C is incorrect because CloudTrail can be used to log API activities but does not give an inventory of all resources.

For more information on AWS Config, please visit the URL below: https://docs.aws.amazon.com/config/latest/developerguide/how-does-config-work.html

The correct answer is: Use AWS Config to get the list of all resources. Submit your Feedback/Queries to our Experts.

Amazon SCS-C01 Sample Question 42

A company has complex connectivity rules governing ingress, egress, and communications between Amazon EC2 instances. The rules are so complex that they cannot be implemented within the limits of the maximum number of security groups and network access control lists (network ACLs).

What mechanism will allow the company to implement all required network rules without incurring additional cost?


Options:

A. Configure IAM WAF rules to implement the required rules.
B. Use the operating system built-in, host-based firewall to implement the required rules.
C. Use a NAT gateway to control ingress and egress according to the requirements.
D. Launch an EC2-based firewall product from the IAM Marketplace, and implement the required rules in that product.

Answer: C

Amazon SCS-C01 Sample Question 43

A Security Engineer must implement mutually authenticated TLS connections between containers that communicate inside a VPC.

Which solution would be MOST secure and easy to maintain?


Options:

A. Use IAM Certificate Manager to generate certificates from a public certificate authority and deploy them to all the containers.
B. Create a self-signed certificate in one container and use IAM Secrets Manager to distribute the certificate to the other containers to establish trust.
C. Use IAM Certificate Manager Private Certificate Authority (ACM PCA) to create a subordinate certificate authority, then create the private keys in the containers and sign them using the ACM PCA API.
D. Use IAM Certificate Manager Private Certificate Authority (ACM PCA) to create a subordinate certificate authority, then use IAM Certificate Manager to generate the private certificates and deploy them to all the containers.

Answer: D

Amazon SCS-C01 Sample Question 44

The Security Engineer is given the following requirements for an application that is running on Amazon EC2 and managed by using IAM CloudFormation templates with EC2 Auto Scaling groups:

-Have the EC2 instances bootstrapped to connect to a backend database.

-Ensure that the database credentials are handled securely.

-Ensure that retrievals of database credentials are logged.

Which of the following is the MOST efficient way to meet these requirements?


Options:

A. Pass databases credentials to EC2 by using CloudFormation stack parameters with the property set to true. Ensure that the instance is configured to log to Amazon CloudWatch Logs.
B. Store database passwords in IAM Systems Manager Parameter Store by using SecureString parameters. Set the IAM role for the EC2 instance profile to allow access to the parameters.
C. Create an IAM Lambda that ingests the database password and persists it to Amazon S3 with server-side encryption. Have the EC2 instances retrieve the S3 object on startup, and log all script invocations to syslog.
D. Write a script that is passed in as UserData so that it is executed upon launch of the EC2 instance. Ensure that the instance is configured to log to Amazon CloudWatch Logs.

Answer: C

Amazon SCS-C01 Sample Question 45

A company is using CloudTrail to log all IAM API activity for all regions in all of its accounts. The CISO has asked that additional steps be taken to protect the integrity of the log files.

What combination of steps will protect the log files from intentional or unintentional alteration? Choose 2 answers from the options given below

Please select:


Options:

A. Create an S3 bucket in a dedicated log account and grant the other accounts write only access. Deliver all log files from every account to this S3 bucket.
B. Write a Lambda function that queries the Trusted Advisor Cloud Trail checks. Run the function every 10 minutes.
C. Enable CloudTrail log file integrity validation
D. Use Systems Manager Configuration Compliance to continually monitor the access policies of S3 buckets containing Cloud Trail logs.
E. Create a Security Group that blocks all traffic except calls from the CloudTrail service. Associate the security group with all the CloudTrail destination S3 buckets.

Answer: A, C

Explanation: The AWS documentation mentions the following: to determine whether a log file was modified, deleted, or unchanged after CloudTrail delivered it, you can use CloudTrail log file integrity validation. This feature is built using industry-standard algorithms: SHA-256 for hashing and SHA-256 with RSA for digital signing. This makes it computationally infeasible to modify, delete, or forge CloudTrail log files without detection.

Option B is invalid because there is no such thing as Trusted Advisor CloudTrail checks. Option D is invalid because Systems Manager cannot be used for this purpose. Option E is invalid because Security Groups cannot be used to block calls from other services.

For more information on CloudTrail log file validation, please visit the URL below: https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-log-file-validation-intro.html

For more information on delivering CloudTrail logs from multiple accounts, please visit the URL below: https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-receive-logs-from-multiple-accounts.html

The correct answers are: Create an S3 bucket in a dedicated log account and grant the other accounts write only access. Deliver all log files from every account to this S3 bucket; Enable CloudTrail log file integrity validation. Submit your Feedback/Queries to our Experts.
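
Log file integrity validation is a per-trail setting, and the delivered digest files can later be checked from the CLI; a sketch with placeholder trail names, ARN, and start time:

    aws cloudtrail update-trail --name org-audit-trail --enable-log-file-validation

    aws cloudtrail validate-logs \
      --trail-arn arn:aws:cloudtrail:us-east-1:111122223333:trail/org-audit-trail \
      --start-time 2023-03-01T00:00:00Z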

Amazon SCS-C01 Sample Question 46

Amazon CloudWatch Logs agent is successfully delivering logs to the CloudWatch Logs service. However, logs stop being delivered after the associated log stream has been active for a specific number of hours.

What steps are necessary to identify the cause of this phenomenon? (Choose two.)


Options:

A. Ensure that file permissions for monitored files that allow the CloudWatch Logs agent to read the file have not been modified.
B. Verify that the OS Log rotation rules are compatible with the configuration requirements for agent streaming.
C. Configure an Amazon Kinesis producer to first put the logs into Amazon Kinesis Streams.
D. Create a CloudWatch Logs metric to isolate a value that changes at least once during the period before logging stops.
E. Use IAM CloudFormation to dynamically create and maintain the configuration file for the CloudWatch Logs agent.

Answer: A, B Explanation: Explanation: https://acloud.guru/ forums/IAM-certified-security-specialty/discussion/-Lm5A3w6_NybQPhh6tRP/Cloudwatch%20Log%20questioo


and so much more...