
Amazon (AWS) S3 Bucket Take Over


Let’s try something a bit different and take a look at some of Trustwave SpiderLabs’ Open Source Intelligence (OSINT) research findings on the exploitation of vulnerable buckets and domains. I originally published this research internally on February 3, 2023. Today, I will share how deleted S3 buckets can become a liability or threat to your organization, and highlight the importance of cybersecurity in data and asset management.

Amazon Simple Storage Service (S3) is a public cloud storage service available on the Amazon Web Services (AWS) platform. It provides object-based storage, where data is kept inside S3 buckets in distinct units called objects. I will discuss the benefits and risks of using cloud-based applications, particularly S3 buckets, as evidenced by our research findings.
While cloud-based applications have many benefits, they also pose various security risks. Some cloud-based applications can increase operational efficiency by allowing organizations to access software and data quickly, and by allowing employees, colleagues, and clients to communicate robustly, and arguably more securely, on any device. Storing all data in the cloud can reduce the need for costly hardware and software maintenance. As one can see with AWS, organizations can leverage cloud-hosted applications to benefit from improved scalability, reliability, and security.

However, there are risks when using cloud-based applications or storage, and it is essential to acknowledge that reliance on cloud-based solutions is not without challenges. Common security pitfalls include poor asset tracking and the absence of adequate policies or best practices, which can ultimately result in data leakage and exploitation of assets.

Let us explore the nexus between cloud-based application security benefits and risks based on our OSINT AWS S3 bucket research findings.




Below, Figure 1 shows an illustration of the attack. Deleting a bucket is not in itself an issue or a direct vulnerability. The issue stems from deleted buckets that are still referenced by applications or systems.
Figure 1. Illustration of the Amazon (AWS) S3 Take Over Attack

According to Amazon, when a bucket is deleted, its name becomes available for reuse, so someone else may be able to claim it. Therefore, if you want to keep the same name for future use, it is advisable not to delete the bucket.

Figure 2. Amazon User Guide on reusing bucket names


In my previous post, “To OSINT and Beyond,” I demonstrated how OSINT and enumeration can help find these kinds of vulnerabilities. Discovering deleted S3 buckets is easy using OSINT tools like Shodan, Censys, and Google, as shown below.


Figure 3. Using Shodan (OSINT tools) to find unclaimed or deleted S3 buckets
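To make probing candidate names from such OSINT output concrete, here is a minimal sketch based only on public S3 addressing conventions; the helper name and the default region are mine, chosen for illustration, and are not part of the original tooling:

```python
# Hypothetical helper: build the endpoint URL variants an S3 bucket name can
# be reached at, so each candidate from OSINT output can be probed for the
# "NoSuchBucket" error that marks a deleted or unclaimed bucket.
def s3_candidate_urls(bucket: str, region: str = "us-east-1") -> list[str]:
    return [
        f"https://{bucket}.s3.amazonaws.com/",           # global virtual-hosted style
        f"https://{bucket}.s3.{region}.amazonaws.com/",  # regional virtual-hosted style
        f"https://s3.{region}.amazonaws.com/{bucket}/",  # regional path style
    ]
```

Each URL can then be requested with any HTTP client and the response body inspected for the telltale error code.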

We can also use an intercepting proxy like Burp Suite and its extensions to detect whether any deleted buckets are being used by target systems or applications.


Figure 4. Detecting deleted S3 Buckets using Burp Suite

Domain Takeover

Before diving any further, let me highlight the “S3 domain takeover” vulnerability. So, what is a subdomain or domain takeover? It is a security vulnerability that occurs when a threat actor gains control over a domain or subdomain that they do not own. In the AWS S3 context, domain takeover specifically refers to a scenario in which a threat actor takes control of a domain that is supposed to point to an S3 bucket but is misconfigured, deleted, or left unclaimed.
Spotting a domain with a deleted bucket is simple. We can check whether a bucket is deleted by inspecting the HTTP/S responses for the keyword “NoSuchBucket.”
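That check can be sketched in a few lines of Python. This is a rough sketch, not the actual tooling: the function name and state labels are my own, and the logic simply mirrors the error codes S3 returns.

```python
# Minimal sketch: classify a bucket's state from the HTTP status code and
# response body returned by its S3 endpoint.
def classify_bucket_response(status: int, body: str) -> str:
    if "NoSuchBucket" in body:
        return "deleted-or-unclaimed"   # the name is free to re-register
    if status == 403 or "AccessDenied" in body:
        return "exists-private"         # bucket exists but is not public
    if status == 200:
        return "exists-public"          # a listing or object was returned
    return "unknown"
```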


Figure 5. The example domain’s DNS records point to an S3 bucket

So, let me give you a quick example and recreate the deleted bucket that the vulnerable domain above points to.


Figure 6. Commands for recreating the deleted bucket and setup for hosting static web content on AWS S3

We hit refresh, and there you go.


Figure 7. The domain now resolves to our recreated bucket.

In another scenario, we have a large set of S3 buckets to test, which is more challenging since we don’t know whether any domains point to, or applications reference, resources from the deleted buckets. Below is a truncated output of the reconnaissance conducted.

In summary, over a hundred buckets were identified as deleted.


Figure 8. Truncated output of enumeration with deleted buckets

Figure 9. Another example, where a publicly accessible bucket and its contents are discovered



Years ago, my son posted on Facebook about a seemingly stubborn hen named Estrelia that had gone missing. Estrelia is a native hen. My son, a natural animal lover, formed a special bond with Estrelia when he first saw her as a chick. To make the story short, to our relief, after a few days, Estrelia managed to find her way back home.

In simple ways, the story of Estrelia draws striking parallels with the world of cybersecurity. Just as my son maintained a “never give up” mindset in finding Estrelia, we are equally dedicated to researching and uncovering potential issues that could benefit our clients. It underscores the importance of perseverance and patience, which are key factors in finding vulnerable assets and systems. Taking inspiration from this humbling story, I named the simple enumeration tool we used above “S3lia.”

Today Estrelia continues to live peacefully and comfortably on our small farm, under the watchful care of my son. Thanks to him, she's closely guarded and safe from harm.

Figure 10. A stubborn, semi-angry-looking hen

Setup & Monitoring


Now that we have a list of deleted buckets, we will create a Proof of Concept (PoC) to reverse-check whether those deleted buckets are still in use elsewhere. To start, let’s try to recreate one of the vulnerable buckets in our AWS instance.

There are some configurations we need to apply, such as public access, static website hosting, and logging. It is also important to identify the former region of the targeted buckets in order to collect data.
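The public-access piece, for instance, can be expressed as a standard S3 bucket policy document. Here is a sketch assuming the usual policy format for public-read static hosting; the helper name and bucket name are chosen for illustration:

```python
import json

# Sketch: generate the public-read bucket policy that static-website hosting
# on S3 requires, allowing anonymous GetObject on every object in the bucket.
def public_read_policy(bucket: str) -> str:
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }],
    }
    return json.dumps(policy)
```

The generated document would then be attached to the recreated bucket, e.g., via `put-bucket-policy`.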


Figure 11. Available AWS Regions

To recreate the deleted bucket, we can use two methods: the Web GUI and the CLI, which was already demonstrated above. If we have multiple targets, we need to script most of the process, since we have to switch regions to increase the chances of detecting requests within the allotted period; therefore, the CLI is the best way to go.
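The region-switching part of such a script might look like the following sketch, assuming boto3-style `create_bucket` keyword arguments. The function names are my own; note that us-east-1 is special-cased because the S3 API rejects an explicit LocationConstraint for that region.

```python
# Sketch: build region-aware create_bucket arguments for a candidate bucket.
def create_bucket_kwargs(bucket: str, region: str) -> dict:
    kwargs = {"Bucket": bucket}
    if region != "us-east-1":
        # All regions except us-east-1 require an explicit LocationConstraint.
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

def kwargs_for_all_regions(bucket: str, regions: list[str]):
    """Yield one (region, kwargs) pair per region, to retry creation region by region."""
    for region in regions:
        yield region, create_bucket_kwargs(bucket, region)
```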


Figure 12. Recreating a vulnerable bucket in our S3 instance  via CLI

Figure 13. Creating an S3 bucket  via Web GUI

As you might have guessed, the pending question is how an attacker can know whether applications elsewhere are still referencing or using the bucket we just recreated. The answer is to monitor the logs: we watch for any request for a resource pointing to the bucket.

Let’s create another bucket dedicated to storing logs. We can set up CloudTrail, or even a regular bucket would do.


Figure 14. CloudTrail -> Trails log configuration

If everything is lined up, parsing the logs is easy. As we can see, a file ‘jquery.js’ is being requested, and since it does not exist, the server returns ‘AccessDenied.’


Figure 15. A review of logs shows a request to a JavaScript file.

Figure 16. Additional example of parsing S3 logs
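For reference, here is a rough sketch of that parsing step, based on the space-delimited S3 server access log format AWS documents (bracketed timestamp, quoted request line). The regex group names are my own, and this is a simplification rather than the full field list:

```python
import re

# Sketch: pull the fields of interest (operation, key, status, error code)
# out of an S3 server access log line.
LOG_RE = re.compile(
    r'^(?P<owner>\S+) (?P<bucket>\S+) \[(?P<time>[^\]]+)\] (?P<ip>\S+) '
    r'(?P<requester>\S+) (?P<request_id>\S+) (?P<operation>\S+) (?P<key>\S+) '
    r'"(?P<request_uri>[^"]*)" (?P<status>\S+) (?P<error>\S+)'
)

def parse_access_log_line(line: str):
    """Return a dict of named fields, or None if the line doesn't match."""
    m = LOG_RE.match(line)
    return m.groupdict() if m else None
```

Filtering the parsed records for GET operations against keys we have not uploaded yet reveals exactly which resources outside applications still expect the bucket to serve.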



Here is a demo application we monitored making a GET request to a JavaScript resource “jquery.js.”


Figure 17. The demo web application was observed fetching an external resource that was supposed to be hosted on the deleted bucket.  

Now what’s left is to upload the test payload ‘jquery.js,’ a recreated malicious version of the file the application is requesting.


Figure 18. Uploading of a supposed malicious script 

After reloading the page, we can see that our payload is successfully executed.


Figure 19. Successful execution of JavaScript test payload

In summary, as technology continues to evolve rapidly, organizations must remain vigilant and adaptable to stay ahead of potential risks. Threat actors may exploit these weaknesses for malicious purposes, including phishing attacks, data breaches, unauthorized access, malware distribution, and other attack vectors. Such attacks not only compromise your organization's overall security but can also tarnish your reputation and that of your clients. 

To reinforce your defenses, consider incorporating security testing engagements into your regular practices. Moreover, maintaining an up-to-date asset inventory can significantly reduce the likelihood of encountering these issues. By proactively addressing vulnerabilities and bolstering your security measures, you can safeguard both your organization and your clients' security perimeters, ensuring a safer and more resilient digital landscape.
