A report recently published by Imperva found that more than half of web traffic comes from bots rather than human visitors. It also notes changes in the types of bots observed, including a predictable yet worrying rise in impersonator bots, which now account for 22% of bot traffic. Overall, 29% of all visits come from ‘bad bots’ and 27% from ‘good bots’. Impersonator bots are particularly troublesome because they are harder to spot and block: they disguise themselves as ‘good’ bots such as search engine crawlers.
With Distributed Denial of Service (DDoS) attacks on the rise and botnets continuing to be profitable for cybercriminals, both website admins and those surfing the internet should be wary. Sites with vulnerabilities such as Cross-Site Scripting (XSS) can be exploited to enrol visitors into a botnet, with malware set to download to their machines. Gone are the days of infection by email; now all you need to do is visit a site that has been compromised.
Mega vulnerabilities such as Shellshock have also fuelled the growth of botnets. Following Shellshock’s discovery and the release of a patch, exploit attempts grew from around 400 offending IPs at zero day to over 15,000 four weeks after discovery. Most of these were attempts by hackers to hijack vulnerable Linux and Unix servers. This can be expected to increase in the coming year, as more mega vulnerabilities are likely to be discovered.
Popular content management systems such as WordPress put website owners and visitors at particular risk: their large ecosystems of third-party plugins offer a wide range of hacking opportunities. One DDoS attack last year was carried out using 162,000 WordPress sites.
Botnets can even be configured to search for other vulnerable servers, automatically recruiting them so that the size of the botnet snowballs. Conversely, a compromised website can be used to attack visiting machines.
Botnets can also be ‘rented out’ for profit, as with the recently launched Lizard Stresser service from hacker group Lizard Squad. In this case the bot herders demonstrated their service by launching an attack on large gaming networks over the Christmas period. Such services let other hackers knock a chosen target offline without having to carry out the bulk of the DDoS work themselves. Thanks to the growth of these botnets, the volume and intensity of DDoS attacks has increased dramatically: the largest attack of 2014 peaked at 400 Gbps, and the average now sits between 100 and 200 Gbps.
So, how can you protect your website from becoming a zombie?
The first step for a website admin is to make sure their security measures are sufficient. Are your login credentials strong enough? Do you have any vulnerabilities for hackers to exploit? Make sure your site uses two-factor authentication and strong passwords. Also run a web application vulnerability scan to find any weaknesses in your site that attackers could exploit, and patch them immediately, especially in the case of mega vulnerabilities such as Shellshock.
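As an illustration of the ‘strong passwords’ point, here is a minimal sketch of a password policy check. The specific rules (at least 12 characters, with upper-case, lower-case, digit and symbol) are an assumed example policy, not a recommendation from the report:

```python
import re

# Hypothetical policy: length >= 12 plus all four character classes.
def is_strong_password(password: str) -> bool:
    if len(password) < 12:
        return False
    required = [
        r"[a-z]",         # lower-case letter
        r"[A-Z]",         # upper-case letter
        r"[0-9]",         # digit
        r"[^a-zA-Z0-9]",  # symbol
    ]
    return all(re.search(pattern, password) for pattern in required)

print(is_strong_password("password123"))             # False: no upper-case or symbol
print(is_strong_password("C0rrect-Horse-Battery!"))  # True: meets all the rules
```

A real deployment would also check candidate passwords against lists of known-breached passwords rather than rely on composition rules alone.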
There are also methods for checking whether your website has already become part of a particular botnet, but as each botnet is different the checks vary. For example, for the Stealrat botnet you would need to look for spammer scripts and malicious PHP files. Since zombie servers are most often used in DDoS attacks or to send spam, you should also look out for an increase in outbound traffic.
An increase in CPU usage might also indicate that the server is part of a botnet and being used for bitcoin mining, although this is less profitable for bot herders and thus less common. Zombie servers need to call back to the command-and-control (C&C) server to receive new commands or to upload stolen data. This is often done via IRC channels, and sometimes using custom protocols. Keep an eye out for connections on unusual ports.
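The ‘unusual ports’ check can be sketched as a simple allow-list filter. The port list and sample connections below are assumptions for illustration; on a real server you would feed in live connection data (for example, output from `ss -tn`) and tune the allow-list to the services your server actually uses:

```python
# Hypothetical allow-list of remote ports a typical web server talks to.
EXPECTED_PORTS = {25, 53, 80, 443}

def unusual_connections(connections):
    """connections: iterable of (remote_ip, remote_port) tuples.
    Returns those whose remote port is outside the allow-list."""
    return [(ip, port) for ip, port in connections if port not in EXPECTED_PORTS]

sample = [
    ("93.184.216.34", 443),   # ordinary HTTPS traffic
    ("198.51.100.7", 6667),   # classic IRC port: a possible C&C channel
    ("203.0.113.9", 31337),   # high, uncommon port worth investigating
]
print(unusual_connections(sample))
# [('198.51.100.7', 6667), ('203.0.113.9', 31337)]
```

An allow-list works better than a block-list here: C&C traffic can use any port, so it is easier to enumerate the handful of ports your server legitimately needs and flag everything else.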
As with any technology, it’s important to stay up to date: if you use a content management system, make sure you are running the latest version, as this should be the most secure. Also keep your web vulnerability scanner up to date and run regular scans to locate any new vulnerabilities. A good anti-malware tool can also check whether any known malware has been downloaded to your machine.