One of the most common causes of a slow scan relates to a large response time between the scanner and the target website or web application. The average response time is an average of the total time it takes for a web server to respond to a request made by the scanner.
An average response time of 200ms (0.2 seconds) or less is considered good, while a response time over 300ms (0.3 seconds) is considered excessively high, and scans against such a server may take very long to complete.
Taking the above screenshot into consideration, a scan with an average response time of 1797.86ms (about 1.8 seconds) will take roughly 9 times longer to complete than it would with a good response time (1797.86ms / 200ms ≈ 8.99 times longer). In addition, response times this high tend to deteriorate further as the scan progresses.
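The arithmetic above can be sketched as a simple estimate. Note that the linear scaling is an approximation: it assumes scan duration is dominated by per-request latency, and the 200ms baseline is the "good" figure from this article.

```python
# Rough estimate of how much longer a scan takes at a given average
# response time, relative to the 200 ms "good" baseline. The linear
# scaling is an approximation that assumes scan duration is dominated
# by per-request latency.

GOOD_RESPONSE_MS = 200.0

def slowdown_factor(avg_response_ms: float) -> float:
    """Return the approximate scan-duration multiplier."""
    return avg_response_ms / GOOD_RESPONSE_MS

print(round(slowdown_factor(1797.86), 2))  # prints 8.99
```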
Unfortunately, there is no single cause of a high response time. The following are some things to check when the server's response time is high:
- Web Server performance. Check CPU, memory, and disk I/O; most often a simple server upgrade solves the problem.
- Database performance. Once again, check CPU, memory, and disk I/O, and check whether any queries are taking too long and can be optimized.
- Bandwidth and Network performance. Ideally, test from different locations to confirm whether the server's network is the bottleneck.
- Check whether your Web Application Firewall (WAF), Intrusion Detection System (IDS), or network firewall is sending your site's response time through the roof.
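To test from different locations, you can sample the average response time yourself with a short standard-library script such as the sketch below. The URL is a placeholder for the site you intend to scan; run the script from a few different networks and compare the results.

```python
# Sample a site's average response time from the machine you run this
# on, using only the Python standard library. Run it from several
# locations to see whether the network is the bottleneck.
# 'https://example.com' is a placeholder -- substitute your own target.

import time
import urllib.request

def average_response_ms(url: str, samples: int = 5) -> float:
    """Average wall-clock time, in ms, to fetch the URL `samples` times."""
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # include the body transfer in the timing
        total += time.perf_counter() - start
    return total / samples * 1000.0

# Example usage:
# print(f"{average_response_ms('https://example.com'):.1f} ms")
```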
There may be situations where improving the response time is not possible (at least not immediately). In such cases, you have the following options:
- There are situations where you only need to scan part of the website, in which case you can exclude some pages or full directories from the scan. This can be done by starting the scan from the Scan Wizard: in the second step, enable ‘After crawling let me choose files to scan’. After the crawl, Acunetix WVS presents a list of the files found, and you can deselect any files that you do not want to scan.
- You can also exclude scanning parts of your website by using Directory and File Filters, which can be configured from Configuration > Scan Settings > Crawling Options > Directory and File Filters.
- If the site is template based, consider scanning only a few pages from each template, rather than scanning pages that are practically identical except for their text content.
- Consider using a Scanning Profile other than the ‘Default’ Scanning Profile, which scans for all web vulnerabilities. A good alternative is to scan using the ‘High Risk Alerts’ Scanning Profile, or to create a custom Scanning Profile targeted at a specific set of vulnerabilities.
- Ensure that you are not scanning using the Extensive Scanning mode. This mode increases the scanning depth by trying all possible combinations on every parameter, generating far more HTTP requests to your server. You can check this from Configuration > Scan Settings > Scanning Options. If needed, create a new Scan Settings template for sites with a slow response time.
Finally, if your site’s response time is below 200ms and the target server has adequate resources, you may want to consider increasing the number of parallel connections made to the server, allowing the scan to complete in less time.
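The benefit of parallel connections comes from overlapping request latency. The sketch below illustrates the principle only; it simulates a fixed 200ms server delay rather than modelling the scanner's internals, and assumes the server can absorb the extra concurrent load.

```python
# Why more parallel connections shorten total scan time: concurrent
# requests overlap their latency. This is a sketch of the principle,
# not of the scanner's internals; time.sleep(0.2) stands in for a
# 200 ms server response.

import time
from concurrent.futures import ThreadPoolExecutor

def simulated_request() -> None:
    time.sleep(0.2)  # stand-in for a 200 ms server response

def total_time(n_requests: int, parallel: int) -> float:
    """Wall-clock time to issue n_requests with `parallel` workers."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=parallel) as pool:
        for _ in range(n_requests):
            pool.submit(simulated_request)
    # Exiting the `with` block waits for all submitted requests.
    return time.perf_counter() - start

# 10 requests, one at a time vs. 5 at a time:
# total_time(10, 1) takes roughly 2.0 s; total_time(10, 5) roughly 0.4 s
```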