How Response Time Affects a Scan’s Performance

One of the most common causes of a slow scan is a high response time between the scanner and the target website or web application. The average response time is the mean time elapsed between the scanner sending a request and receiving the web server's response.

An average response time of 200ms (0.2 seconds) or less is considered good, while a response time over 300ms (0.3 seconds) is considered prohibitively high and may cause scans to take very long to complete.

For example, an average response time of 200ms is considered optimal, while an average response time of 400ms is considered high: double the optimal value.

Following this example, if a scan takes 30 minutes to complete with an average response time of 200ms, the same scan with an average response time of 400ms would take roughly 1 hour to complete. Scan duration scales approximately linearly with response time: the number of requests stays the same, but each one takes twice as long.
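For a rough sense of these numbers, the sketch below (Python, using the third-party requests library) times a few GET requests to estimate a target's average response time and then scales a hypothetical 30-minute baseline scan linearly. The target URL, sample count, and baseline figures are placeholder assumptions; this is a back-of-the-envelope estimate, not how Acunetix computes scan duration.

    import time
    import requests

    def average_response_time(url, samples=10):
        """Time several GET requests and return the mean response time in ms."""
        total = 0.0
        for _ in range(samples):
            start = time.perf_counter()
            requests.get(url, timeout=30)
            total += time.perf_counter() - start
        return (total / samples) * 1000

    # Hypothetical baseline: the same scan took 30 minutes at 200 ms.
    baseline_rt_ms, baseline_minutes = 200, 30

    rt = average_response_time("https://example.com")  # placeholder target
    # Scan duration scales roughly linearly with response time, since the
    # number of requests stays the same but each one takes longer.
    estimate = baseline_minutes * (rt / baseline_rt_ms)
    print(f"Average response time: {rt:.0f} ms")
    print(f"Estimated scan duration: {estimate:.0f} minutes")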

Unfortunately, no single factor causes a high response time. The following are some areas to investigate when the server's response time is too high.

Main Optimization Tips

Web Server Performance

Check factors such as CPU, memory, hard disk access, and other resources. In many cases, a simple server upgrade solves the problem.
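As a starting point, a quick resource snapshot on the web server can reveal an obvious bottleneck. The sketch below uses the third-party psutil package; the 80% threshold is an arbitrary illustrative value, not an Acunetix recommendation.

    import psutil

    # Snapshot basic server resources; sustained high values suggest
    # the server itself is the bottleneck.
    cpu = psutil.cpu_percent(interval=1)    # % CPU over a 1-second sample
    mem = psutil.virtual_memory().percent   # % RAM in use
    disk = psutil.disk_usage("/").percent   # % disk space used

    for name, value in (("CPU", cpu), ("Memory", mem), ("Disk", disk)):
        status = "HIGH" if value > 80 else "ok"  # arbitrary 80% threshold
        print(f"{name}: {value:.0f}% [{status}]")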

Database Performance

Once again, check CPU, memory, and hard disk access in relation to the resources being used by the database. Additionally, check whether any queries are taking too long to execute and whether they can be optimized.
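To spot slow queries, a simple timing wrapper around your database driver can help. The sketch below uses Python's built-in sqlite3 as a self-contained stand-in, but the same pattern applies to any DB-API driver; the 1-second threshold is an arbitrary example.

    import sqlite3
    import time

    def timed_query(conn, sql, params=()):
        """Run a query, report its duration, and flag slow ones."""
        start = time.perf_counter()
        rows = conn.execute(sql, params).fetchall()
        elapsed = time.perf_counter() - start
        flag = "SLOW, consider optimizing" if elapsed > 1.0 else "ok"
        print(f"{elapsed * 1000:.1f} ms [{flag}]: {sql}")
        return rows

    # Stand-in database; in practice, point this at your real driver.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO users (name) VALUES (?)",
                     [(f"user{i}",) for i in range(1000)])
    timed_query(conn, "SELECT * FROM users WHERE name = ?", ("user500",))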

Bandwidth and Network Performance

Measure the server's response time over the network; ideally, perform this test from several different locations to confirm whether the server's network connection is the bottleneck.
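One way to separate network latency from server processing time is to compare the raw TCP connect time with the full HTTP round trip, as in the sketch below (hostname and port are placeholders). A high connect time points at the network; a fast connect but slow full response points at the application or database.

    import socket
    import time
    import requests

    host, port = "example.com", 443   # placeholder target

    # Raw TCP connect time: network latency only.
    start = time.perf_counter()
    sock = socket.create_connection((host, port), timeout=10)
    connect_ms = (time.perf_counter() - start) * 1000
    sock.close()

    # Full HTTP round trip: network latency plus server processing.
    start = time.perf_counter()
    requests.get(f"https://{host}/", timeout=30)
    total_ms = (time.perf_counter() - start) * 1000

    print(f"TCP connect:   {connect_ms:.0f} ms (network)")
    print(f"Full response: {total_ms:.0f} ms (network + server processing)")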

Web Application Firewall (WAF), Intrusion Detection System (IDS), or Network Firewall Interference

Check if any of these systems are in use and whether they are contributing to the elevated average response time. This can happen because every incoming request to your web server is thoroughly analysed in-line, which increases the response time.

It is recommended to whitelist the machine running the Acunetix scans in any of these security mechanisms, so that its requests and responses pass through unhindered.
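A rough way to check whether such a device is inspecting traffic in-line is to compare a benign request with one carrying an obviously suspicious-looking parameter, as in the sketch below. The URL and payload are placeholders, and you should only run this against systems you are authorized to test; a blocked status or a markedly slower response to the second request suggests in-line inspection.

    import time
    import requests

    url = "https://example.com/search"   # placeholder target

    def timed_get(params):
        start = time.perf_counter()
        resp = requests.get(url, params=params, timeout=30)
        return resp.status_code, (time.perf_counter() - start) * 1000

    # Benign request vs. one with a classic attack-looking payload.
    benign = timed_get({"q": "hello"})
    suspect = timed_get({"q": "' OR 1=1 --"})

    print(f"Benign:     status {benign[0]}, {benign[1]:.0f} ms")
    print(f"Suspicious: status {suspect[0]}, {suspect[1]:.0f} ms")
    # A 403/406 status or a much slower second response suggests a
    # WAF/IDS is analysing requests in-line.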

Secondary Optimization Tips

There may be situations when improving the response time is not possible (at least not immediately) using the main optimization tips above. In such cases, the following additional options are available.

Excluding Paths

There are situations where you only need to scan part of the website. In this case, you can exclude specific pages or entire directories from the scan; these paths are then automatically skipped during crawling. For more information, please refer to the Path Exclusions guide.
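If you manage Targets through the Acunetix REST API rather than the UI, exclusions could be set programmatically along the lines sketched below. Note that the endpoint path, the excluded_paths field, and all identifiers here are illustrative assumptions, not confirmed by this article; consult the Acunetix API reference for the exact request shape.

    import requests

    # All values below are placeholders; check the Acunetix API reference.
    API_BASE = "https://acunetix.example.com:3443/api/v1"
    API_KEY = "YOUR_API_KEY"
    TARGET_ID = "your-target-id"

    # Hypothetical call: update the target configuration with paths to skip.
    resp = requests.patch(
        f"{API_BASE}/targets/{TARGET_ID}/configuration",
        headers={"X-Auth": API_KEY, "Content-Type": "application/json"},
        json={"excluded_paths": ["/blog/*", "/static/*"]},  # example patterns
        verify=False,  # Acunetix often uses a self-signed certificate
    )
    resp.raise_for_status()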

Using Scan Types

You may also consider using Scan Types, which run a more specific set of scripts aimed at finding a narrower group of vulnerabilities. For example, if you wish to start by finding the most important and highest-risk vulnerabilities, you may choose the High Risk Vulnerabilities Scan Profile. For more information, please refer to the Scan Types guide.

Scan Speed

The Scan Speed is set to Fast by default, but this configuration may not suit your environment. In some cases, the Scan Speed should be set to a slower setting. This might seem counterintuitive when trying to shorten scan times, since a fast scan sends multiple concurrent requests to the web server. However, that load can congest the web server, making each response take much longer to come back. A slower scan speed mitigates this when your web server cannot keep up with the number of requests Acunetix sends. On the other hand, if your web server can easily handle multiple simultaneous requests, make sure you are using the Fast Scan Speed setting.
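To check whether your server actually degrades under concurrent load, and hence whether a slower Scan Speed would help, you can compare per-request response times at increasing concurrency levels, as in the sketch below; the URL and request counts are placeholders.

    import time
    import requests
    from concurrent.futures import ThreadPoolExecutor

    url = "https://example.com/"   # placeholder target

    def fetch(_):
        start = time.perf_counter()
        requests.get(url, timeout=30)
        return (time.perf_counter() - start) * 1000

    # Average per-request time at increasing concurrency. If the average
    # climbs sharply, the server congests under load and a slower Scan
    # Speed setting may actually shorten the overall scan.
    for workers in (1, 5, 10):
        with ThreadPoolExecutor(max_workers=workers) as pool:
            times = list(pool.map(fetch, range(20)))
        print(f"{workers:>2} concurrent: avg {sum(times) / len(times):.0f} ms")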

Splitting Scans

In the case of very large or complex web applications (thousands of directories and files, or thousands of inputs), scans take longer to complete because the scanner needs to analyse far more information than it would for fewer pages and inputs. For such scenarios, it is recommended to split large websites or web applications across several Targets and Scans.
