Last Update: 20201015
- https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_credentials_profiles.html
- https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-profiles.html
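The tools below read named profiles from the standard AWS credentials file; a minimal ~/.aws/credentials entry looks like this (profile name and key values are placeholders):
[<profile_name>]
aws_access_key_id = <access_key_id>
aws_secret_access_key = <secret_access_key>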
- In practice, it is easier to run these tools from a Linux VM: clone them from GitHub and install the requirements following their installation instructions.
- pip3 install -r requirements.txt
- pipenv install awscli ipython boto3 python-dateutil netaddr botocore policyuniverse sqlitedict cherrypy cherrypy-cors coloredlogs asyncio-throttle kube kube-hunter
- Update PS1 to show the date and time (useful for timestamping command output)
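One possible Bash prompt, set in ~/.bashrc, that prefixes each command with the date and time:
export PS1='[\D{%Y-%m-%d} \t] \u@\h:\w\$ '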
- http://lockboxx.blogspot.com/2015/02/aws-api-security-auditing-cheat-sheet.html
- https://cloudsecops.com/aws-reconnaissance-tools/
- https://github.com/dagrz/aws_pwn
- https://github.com/toniblyx/my-arsenal-of-aws-security-tools
- https://rhinosecuritylabs.com/aws/pacu-open-source-aws-exploitation-framework/
- https://github.com/nccgroup/ScoutSuite
- https://github.com/RhinoSecurityLabs/pacu
- S3 Buckets -
GitHub: https://github.com/nccgroup/ScoutSuite
ScoutSuite is a cloud configuration auditing tool. It takes account credentials and uses them to automatically review many configuration settings that would be difficult to check manually. The output is a report summarizing each issue, with specific instance information and a severity rating. It does not provide remediation details.
As with any vulnerability scan, the findings should be manually reviewed to demonstrate each vulnerability. ScoutSuite provides details about the flagged issue, but does not provide the output of successful commands. It also does not take the actual application deployment or compensating controls into consideration.
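A typical run from a GitHub checkout against a named credentials profile looks like the following (check the project README for the current flags):
python scout.py aws --profile <aws_creds_profile>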
GitHub: https://github.com/RhinoSecurityLabs/pacu
Preparation notes: make sure your terminal has unlimited scrollback. Pacu stores some of its data in a SQL database instead of creating a file in the session data storage area, and the 'data' command is needed to output all of it. There is no way to select individual pieces of information, so everything is displayed to the screen at once; no summary information about the data is available either.
Running in Docker works, but it does not provide easy access to the downloaded data. It also takes over the terminal, so scrollback is no longer unlimited and the display does not match the default configuration of the terminal it is run in; when service data is displayed it will be truncated.
The commands listed in the interface are not ordered in a way that helps with information gathering. The following are ordered steps for gathering details about the accounts and preparing for further testing as an individual or a team.
- Set the tool to use specific user credentials for each command
import_keys <profile_name>
- Setting the regions can reduce noise, focus information gathering, and save time.
set_regions us-east-1 us-east-2 us-west-1
- AWS account: This is important because companies may have multiple accounts.
run aws__enum_account
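- A quick cross-check outside Pacu: the AWS CLI can show the account ID and caller ARN
aws sts get-caller-identity --profile <aws_creds_profile>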
- User accounts and roles. This is important because it shows all of the account and role configurations in JSON format. Output can be used in recommendations to specifically identify changes that will help reduce risk.
run iam__enum_users_roles_policies_groups
- EC2 Instances. This is important because it shows all of the EC2 configurations in JSON format. Output can be used in recommendations to specifically identify changes that will help reduce risk.
run ec2__enum
- AWS Systems Manager Parameter Store. This is important because this data contains secrets for the configuration of services and applications throughout the AWS infrastructure. It can be used to understand the deployment and to access other services. Exploitation of each secret depends on the functionality of the service it belongs to.
run enum__secrets
run systemsmanager__download_parameters
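- For spot checks of individual parameters outside Pacu, the CLI equivalents are (parameter name is a placeholder)
aws ssm describe-parameters --profile <aws_creds_profile> --region <region>
aws ssm get-parameter --name <parameter_name> --with-decryption --profile <aws_creds_profile> --region <region>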
- Gather user data from EC2 instances. This downloads the user data attached to each instance, which often contains bootstrap scripts and embedded credentials.
run ec2__download_userdata
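- The same data can be pulled manually for a single instance; user data comes back base64-encoded
aws ec2 describe-instance-attribute --instance-id <instance_id> --attribute userData --query UserData.Value --output text --profile <aws_creds_profile> --region <region> | base64 -d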
- A list of commands can be placed in a resource file and loaded to automatically run.
load_commands_file <file>
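- Assuming the file format is one Pacu command per line (as at the interactive prompt), the enumeration steps above could be saved as:
import_keys <profile_name>
set_regions us-east-1 us-east-2
run aws__enum_account
run iam__enum_users_roles_policies_groups
run ec2__enum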
Commands for Information Gathering using the AWS CLI
- Current user account
aws iam get-user --profile <aws_creds_profile>
- All users
aws iam list-users --profile <aws_creds_profile>
- All user AccessKeyId (does not include secret key)
for i in $(aws iam list-users --profile <aws_creds_profile> | grep UserName | cut -d: -f2 | cut -d\" -f2); do echo === $i ===; aws iam list-access-keys --user-name $i --profile <aws_creds_profile>; done
- All Groups / Roles for a user
for i in $(aws iam list-users --profile <aws_creds_profile> | grep UserName | cut -d: -f2 | cut -d\" -f2); do echo === $i ===; aws iam list-groups-for-user --user-name $i --profile <aws_creds_profile>; done
- All Groups / Roles for a user, names only
for i in $(aws iam list-users --profile <aws_creds_profile> | grep UserName | cut -d: -f2 | cut -d\" -f2); do echo === $i ===; aws iam list-groups-for-user --user-name $i --profile <aws_creds_profile> | grep GroupName | cut -d: -f2 | cut -d\" -f2; done
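- A less fragile alternative to the grep/cut parsing above is the CLI's built-in JMESPath --query option, e.g. to list user names only
aws iam list-users --profile <aws_creds_profile> --query 'Users[].UserName' --output text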
- List S3 Buckets
aws s3 ls --profile <aws_creds_profile>
- List S3 Bucket Policies
for i in $(aws s3 ls --profile <aws_creds_profile> | cut -d' ' -f3); do echo === $i ===; aws s3api get-bucket-policy --bucket $i --query Policy --profile <aws_creds_profile>; done
- List S3 Bucket ACLs
for i in $(aws s3 ls --profile <aws_creds_profile> | cut -d' ' -f3); do echo === $i ===; aws s3api get-bucket-acl --bucket $i --profile <aws_creds_profile>; done
- Test S3 Bucket Access Permissions
for i in $(aws s3 ls --profile <aws_creds_profile> | cut -d' ' -f3); do echo === $i ===; aws s3api get-public-access-block --bucket $i --profile <aws_creds_profile>; done
- Test S3 Bucket Public Status
for i in $(aws s3 ls --profile <aws_creds_profile> | cut -d' ' -f3); do echo === $i ===; aws s3api get-bucket-policy-status --bucket $i --profile <aws_creds_profile>; done
- List S3 Bucket with no credentials (PRE means directory)
aws s3 ls s3://<bucket.address>
- Recursively list S3 Bucket with no credentials
aws s3 ls s3://<bucket.address> --recursive
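- If no default credentials are configured, the CLI will still try to sign the request; --no-sign-request forces anonymous access
aws s3 ls s3://<bucket.address> --recursive --no-sign-request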
- Download a file from S3 Bucket
aws s3 cp s3://<bucket.address>/<filename> ./data/<bucket.address>_<filename>
- Upload a file to S3 Bucket
aws s3 cp ./data/<bucket.address>_<filename2> s3://<bucket.address>/<filename2>
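- Mirror an entire bucket for offline review (aws s3 sync copies recursively)
aws s3 sync s3://<bucket.address> ./data/<bucket.address>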
- List CloudFormation Stacks (all regions)
for i in $(aws ec2 describe-regions --output text --profile <aws_creds_profile> --region us-west-2 | cut -f4); do echo ==== $i =====; aws cloudformation list-stacks --profile <aws_creds_profile> --region $i; done
- Describe CloudFormation Stacks (selected regions)
for i in us-east-1 us-east-2; do echo ==== $i =====; aws cloudformation describe-stacks --profile <aws_creds_profile> --region $i; done
- List Public IP Addresses by Region
for i in $(aws ec2 describe-regions --output text --profile <aws_creds_profile> --region us-west-2 | cut -f4); do echo ==== $i =====; aws ec2 describe-addresses --profile <aws_creds_profile> --region $i | grep PublicIp\" | cut -d: -f2 | cut -d\" -f2; done
TBD