Hi Guys, Welcome to InfoSecSecure.

In this blog, we are going to solve the flaws.cloud LEVEL3 challenge. Before attempting it, you need to solve the LEVEL1 & LEVEL2 challenges. We will not only solve the challenge but also cover the impact and mitigation/solution for this vulnerability. First of all, you need to understand why we should solve flaws.cloud challenges.

The simple answer: if you want to become a cloud pentester or grow your career in Cloud Security, you need to practice on challenges like flaws.cloud. If you are reading this blog, that means you are already working on your cloud penetration testing journey.

Click here for LEVEL1 Challenge
Click here for LEVEL2 Challenge

There are no SQL injection, XSS, buffer overflows, or many of the other vulnerabilities you might have seen before. These challenges teach us about basic misconfigurations that we often don't even think of as vulnerabilities. The flaws.cloud LEVEL3 challenge is built around an AWS configuration vulnerability, which we are going to discover:


Before solving this challenge, you should be aware of S3 buckets and Regions. If you don't know about them, don't worry. Here is a small intro to S3 buckets and Regions.

S3 Bucket: An S3 bucket is like a hard disk or pen drive, where we create folders and store data.

Region: A Region is like a location where our data is stored. For example, in India, Mumbai is one AWS Region, and we can store our data in that Region.


Let’s solve this challenge:

Challenge: The victim created an S3 bucket and stored some sensitive files in it, including AWS user credentials. There is one more interesting thing: by mistake, he granted access to Everyone, which means any AWS account user can access this S3 bucket.

Solution: We have to take advantage of this vulnerability, access the S3 bucket, and download all of its files. After downloading the files, we have to find the AWS credentials, which will help us solve the next challenge.


We will follow some steps to solve this challenge:

  • Open LEVEL3 challenge on your browser.

Before attempting this challenge, solve the LEVEL2 challenge first. In the LEVEL2 challenge, you will learn how to list and download files from an S3 bucket.

  • In this challenge, first of all we will download all the files from the LEVEL3 S3 bucket. Open a terminal and run the " aws s3 sync s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/ . --no-sign-request --region us-west-2 " command. This command will download all files from this S3 bucket.
    • Here’s a breakdown of the entire command:
      • aws s3 sync: This is the base command for synchronizing files with Amazon S3.
      • s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/: This is the source S3 bucket path. It represents the bucket named “level3-9afd3927f195e10225021a578e6f78df.flaws.cloud”.
      • .: This is the destination path on your local machine. In this case, it’s set to the current directory (.), meaning files from the S3 bucket will be synchronized to the current directory on your local system.
      • --no-sign-request: This option sends the requests unsigned (anonymous), so the synchronization does not require AWS signature-based authentication or any credentials.
      • --region us-west-2: Specifies the AWS region to use. In this example, it’s set to the US West (Oregon) region.
  • Once all the files have been downloaded, we can list them with the " ls " command. Run " ls " in the terminal and observe the listed files.
  • Here is one more interesting point: hidden files and folders are not shown by the plain " ls " command. To list hidden folders & files, we need to run the " ls -a " command in the terminal.
    • Here’s a breakdown of the entire command:
      • ls: Lists the contents of the current directory.
      • -a: Used for listing hidden files and folders as well (entries whose names begin with a dot).
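The " ls " vs. " ls -a " difference can be reproduced with a tiny, self-contained demo (the directory and file names below are made up for illustration):

```shell
demo_dir=$(mktemp -d)          # throwaway directory for the demo
cd "$demo_dir"
mkdir .git                     # a hidden directory, like the one in this challenge
touch index.html               # an ordinary, visible file
ls                             # shows only index.html -- .git stays hidden
ls -a                          # shows ".", "..", ".git", and "index.html"
cd ..
rm -rf "$demo_dir"             # clean up
```

This is exactly why the challenge's .git directory is easy to miss: a plain " ls " after the sync looks perfectly ordinary.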
  • Observe the highlighted point in the above screenshot: there is a .git directory, which may contain sensitive information. To check for sensitive information, we will run the " git log " command in the directory where we downloaded all the S3 bucket files and folders.
    • Here’s a breakdown of the entire command:
      • git log: This Git command displays a log of the commits in the repository. It shows a list of commit messages, along with the author, date, and a unique commit hash for each commit.

The screenshot above shows the output of the " git log " command in the downloaded Git repository. Let’s break down what each part of the output means:

  • Commit Hash:
    • Commit b64c8dcfa8a39af06521cf4cb7cdce5f0ca9e526: This is a unique identifier for a specific commit. It’s a long string of characters (in this case, “b64c8dcfa8a39af06521cf4cb7cdce5f0ca9e526”) that serves as a reference to that particular state of the code.
  • Branch Information:
    • (HEAD -> master): This indicates that the HEAD (the currently checked-out commit) is on the “master” branch. It helps you understand which branch you are currently on.
  • Commit Date:
    • Date: Sun Sep 17 09:10:43 2017 -0600: Displays the date and time when the commit was made, along with the time zone offset.
  • Commit Message:
    • Oops, accidentally added something I shouldn’t have: The message the author provided when making the commit. It provides context or a description of the changes made in that commit.

We got two hash values: the first one is " f52ec03b227ea6094b04e43f475fb0126edb5a61 " and the second one is " b64c8dcfa8a39af06521cf4cb7cdce5f0ca9e526 ". Now we will check the difference between these two commits.

  • Open the terminal and run the ” git diff f52ec03b227ea6094b04e43f475fb0126edb5a61 b64c8dcfa8a39af06521cf4cb7cdce5f0ca9e526 ” command.
    • Here’s a breakdown of the command components:
      • git diff: Displays the differences between the files in these two commits, including added lines, removed lines, and modified lines. This is a powerful way to see what has changed between two points in Git history.
      • f52ec03b227ea6094b04e43f475fb0126edb5a61: The identifier for the first commit.
      • b64c8dcfa8a39af06521cf4cb7cdce5f0ca9e526: The identifier for the second commit.
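You can reproduce this whole recovery technique locally with a throwaway repository (the file name and AWS key below are fake placeholders for illustration, not the real flaws.cloud values):

```shell
set -e
repo=$(mktemp -d)              # throwaway repo for the demo
cd "$repo"
git init -q

# Commit 1: a file containing a (fake) AWS key is checked in by mistake.
echo "access_key = AKIAFAKEFAKEFAKEFAKE" > access_keys.txt
git add access_keys.txt
git -c user.name=Demo -c user.email=demo@example.com \
    commit -q -m "first commit"

# Commit 2: the file is deleted -- but it still lives on in Git history.
git rm -q access_keys.txt
git -c user.name=Demo -c user.email=demo@example.com \
    commit -q -m "Oops, accidentally added something I shouldn't have"

git log --oneline              # lists both commits, newest first

# Diff the oldest commit against the latest one: the deleted key
# shows up as a removed ("-") line, just like in the challenge.
oldest=$(git rev-list --max-parents=0 HEAD)
git diff "$oldest" HEAD
```

The key takeaway: deleting a secret in a later commit does not remove it from the repository's history, which is why committing a .git directory to a public bucket is so dangerous.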

Finally, we got the access_key and secret_access_key. That almost solves the LEVEL3 challenge, but we are not done yet: we need to verify whether these credentials actually work. Using this information, we will configure the credentials in the CLI terminal.

  • Run the " aws configure --profile flaws " command to configure the credentials.
    • Here’s a breakdown of the command components:
      • aws configure: This is the base command to configure the AWS CLI.
      • --profile flaws: This specifies that you want to create a profile named “flaws.” The profile name is arbitrary and can be replaced with any name you prefer. A profile is a way to manage multiple sets of AWS credentials and configurations.

When you run the above command, you’ll be prompted to enter the following information:

  • AWS Access Key ID: This is your unique identifier for accessing AWS services.
  • AWS Secret Access Key: This is a secret key associated with the access key.
  • Default region name: This is the AWS region code (e.g., us-east-1) where your resources will be created by default.
  • Default output format: This is the format in which AWS CLI commands should display their output (e.g., JSON, text, table).

After entering these details, your configuration for the “flaws” profile will be set up. You can use this profile by specifying the --profile flaws option with other AWS CLI commands to use the credentials and configurations associated with that profile.
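Under the hood, " aws configure --profile flaws " simply writes these values into plain-text files in your home directory. The resulting entries look roughly like this (the key values below are placeholders, not real credentials):

```ini
# ~/.aws/credentials
[flaws]
aws_access_key_id = AKIAEXAMPLEKEYID0000
aws_secret_access_key = exampleSecretAccessKeyPlaceholderOnly

# ~/.aws/config
[profile flaws]
region = us-west-2
output = json
```

Knowing this layout is handy: you can inspect or edit these files directly instead of re-running the interactive prompt.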

  • After configuring the credentials, run the simple command " aws --profile flaws s3 ls " to explore the S3 buckets.
    • Here’s a breakdown of the command components:
      • aws: This is the base command that signals the start of an AWS command.
      • --profile flaws: This part specifies the named AWS CLI profile to use.
      • s3: This refers to AWS Simple Storage Service (S3), a scalable object storage service that you can use to store and retrieve any amount of data.
      • ls: Stands for “list.” With no arguments, it lists all S3 buckets the credentials can see; with a bucket or path, it lists the contents of that bucket or path.

As we mentioned above, after solving the challenge we will discuss the impact & mitigation of this vulnerability.

Impact of an S3 bucket that allows access to everyone:

  • Unrestricted access allows anyone to view, download, or modify the data in the S3 bucket.
  • Sensitive information, such as personal data or proprietary business data, may be at risk.
  • Unauthorized access could lead to data corruption or unintended modifications.
  • Breaches and data leaks can harm the organization’s reputation and erode customer trust.

Mitigation for an S3 bucket that allows access to everyone:

  • Implement and enforce proper access controls using AWS Identity and Access Management (IAM).
  • Restrict access to specific IP addresses or IP ranges.
  • Use bucket policies and IAM policies to define who can access the bucket and what actions they can perform.
  • Enable server-side encryption to protect data at rest.
  • Set up AWS CloudTrail to log all S3 bucket-related activities.
  • Enable S3 bucket logging to capture access logs for further analysis.
  • Implement real-time monitoring and alerts for suspicious activities.
  • Avoid using overly permissive “public” or “everyone” access in policies.

Here we have successfully solved the LEVEL3 challenge. In the next blog, we will solve the LEVEL4 challenge.