One of the most common causes of a slow scan is a high response time between the scanner and the target website or web application. The average response time is the mean of the total time elapsed between the scanner sending a request and receiving the web server's response.
An average response time of 200ms (0.2 seconds) or less is considered good, while a response time over 300ms (0.3 seconds) is considered prohibitively high, and scans may take very long to complete.
For example, compare a scan with an average response time of 200ms, the optimal value, to the same scan with an average response time of 400ms, double the optimal value. Since every request-response cycle takes twice as long, the scan duration roughly doubles: if the scan completes in 30 minutes at 200ms, it would take around 1 hour at 400ms.
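As a rough rule of thumb, scan duration scales with the average response time. A minimal sketch of this estimate (the function name and the linear model are illustrative, not part of Acunetix):

```python
def estimate_scan_minutes(baseline_minutes: float,
                          baseline_rt_ms: float,
                          new_rt_ms: float) -> float:
    """Roughly estimate scan duration if the average response time changes,
    assuming duration scales linearly with response time (an approximation)."""
    return baseline_minutes * (new_rt_ms / baseline_rt_ms)

# A 30-minute scan at 200ms average response time, rerun at 400ms:
print(estimate_scan_minutes(30, 200, 400))  # → 60.0
```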
Unfortunately, no single factor causes a high response time. The following are some areas to look into when the server's response time is too high.
Web Server performance
Check factors such as CPU, memory, hard disk access and other resources. Most often, a simple server upgrade may solve the problem.
Database Server performance
Once again, check CPU, memory, and hard disk access in relation to the resources being used by the database. Additionally, check whether any queries are taking too long to execute and whether these can be optimized.
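If the database is MySQL or MariaDB, for instance, the slow query log can help identify queries worth optimizing. A hedged configuration sketch (the log file path and the one-second threshold below are illustrative values, not recommendations):

```ini
# my.cnf — log any query that takes longer than 1 second (illustrative values)
[mysqld]
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/slow.log
long_query_time     = 1
```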
Bandwidth and Network performance
Check the bandwidth available to the server and the general network performance. Ideally, perform this test from different locations to confirm whether the server's network is the bottleneck.
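To get a quick reading of the average response time from a given location, you can time a few plain HTTP requests yourself. A minimal sketch in Python (the helper function is illustrative; in practice you would point it at your target's URL rather than the throwaway local server used here to keep the demo self-contained):

```python
import statistics
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

def average_response_time_ms(url: str, samples: int = 5) -> float:
    """Time several GET requests and return the mean response time in ms."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        urlopen(url).read()
        timings.append((time.perf_counter() - start) * 1000)
    return statistics.mean(timings)

# --- self-contained demo against a throwaway local server ---
class _Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
avg = average_response_time_ms(f"http://127.0.0.1:{server.server_port}/")
print(f"average response time: {avg:.1f} ms")
server.shutdown()
```

Running this from several vantage points (office, home, a cloud VM in another region) makes it easier to tell network latency apart from slow server-side processing.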
Web Application Firewall (WAF), Intrusion Detection System (IDS) or Network Firewall interference
Check whether any of these systems are in use and whether they are contributing to the increase in your average response time. This may be the case because such systems thoroughly analyse every incoming request to your web server, which increases the response time.
Ideally, the machine you are running Acunetix on should be whitelisted in any of these security mechanisms, so that its requests and responses pass through unhindered.
There may be situations where improving the response time is not possible, at least not immediately. In such cases, you have the following options.
Excluding Paths
There are situations where you only need to scan part of the website, in which case you can exclude some pages or full directories from the scan. When the scan is run, these paths will not be crawled. For more information, please refer to the Path Exclusions guide.
Using Scan Types
You may also consider using Scan Types, which run a more specific set of scripts aimed at finding a narrower group of vulnerabilities. For example, if you want to start by addressing the most important, high-risk vulnerabilities, you may choose the High Risk Vulnerabilities Scan Type. For more information, please refer to the Scan Types guide.
Adjusting the Scan Speed
The Scan Speed is set to Fast by default, but this configuration might not suit your needs, and in some cases the Scan Speed should be set to a slower setting. This might seem counter-intuitive for shortening scan times, but a fast scan sends multiple concurrent requests to the web server. This can congest the web server with requests, making it take a long time to send back each response. A slower scan speed may mitigate this if your web server cannot keep up with the number of requests being sent by Acunetix. On the other hand, if your web server can easily handle multiple concurrent requests, make sure that you are using the Fast Scan Speed setting.
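The trade-off can be illustrated with a deliberately simple toy model (entirely illustrative; this is not how Acunetix schedules requests): assume the server handles up to `capacity` concurrent requests at its base response time, and slows down superlinearly once pushed beyond that.

```python
import math

def toy_scan_time_ms(num_requests: int, base_rt_ms: float,
                     concurrency: int, capacity: int) -> float:
    """Toy model: over-driving the server inflates each response time
    quadratically (an assumed congestion penalty), so past a point,
    more concurrency makes the whole scan slower, not faster."""
    overload = max(1.0, concurrency / capacity)
    effective_rt = base_rt_ms * overload ** 2  # assumed congestion penalty
    batches = math.ceil(num_requests / concurrency)
    return batches * effective_rt

# A server that copes with 4 concurrent requests; 1000 requests at 200ms base:
print(toy_scan_time_ms(1000, 200, concurrency=4, capacity=4))   # → 50000.0
print(toy_scan_time_ms(1000, 200, concurrency=10, capacity=4))  # → 125000.0
```

Under this model, throttling back to what the server can actually sustain finishes the scan in less than half the time of the congested run, which is the effect the slower Scan Speed settings aim for.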
In the case of very large or complex web applications (thousands of directories and files, or thousands of inputs), the scanner will take longer to complete the scan, since it needs to analyse much more information than it would with fewer pages and inputs. For this reason, it is recommended to split large websites or web applications across several Targets and Scans.