Scanning Options in the Acunetix Scanning Engine

The “Scanning Options” allow you to define the general scanning behaviour of the Acunetix scanning engine.

Disable alerts generated by crawler – Enable this option to suppress crawler-related alerts, such as broken links, file inputs, and files whose names suggest they may be dangerous.

Scanning Mode – From this node you can select the Scanning Mode which will be used during both the crawling and scanning stage of the target website. The scan mode will determine how both the crawler and the scanner will treat website parameters (also known as inputs), which will affect the number of security checks launched against the website. The following scanning mode options are available:

  • Quick – In this mode, the crawler fetches only a very limited number of variations of each parameter, since none of them are treated as action parameters. Action parameters are parameters that control the execution flow of the server-side scripts. This scanning mode should only be used with small and static websites.
  • Heuristic – In this mode, the crawler makes heuristic decisions about which parameters should be treated as action parameters, and intelligently selects which variations to fetch in order to conduct an effective scan. This results in a larger number of variations, so the scanner launches a high number of security checks against the website. This scanning mode is the most efficient and accurate, and is the recommended choice unless there are specific reasons to use another mode.
  • Extensive – In this mode, the crawler fetches all possible values and combinations of all parameters. This leads to a much larger number of variations, so the scanner launches an extensive number of security checks against the website. This scanning mode should only be used for specialized security audits, since it can take a considerable amount of time to finish.
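
To make the trade-off between the modes concrete, here is a minimal sketch (not Acunetix code) of how a scanning mode might limit which observed values of a parameter are fetched as crawl variations. The function name, the length-based heuristic, and the sample values are all illustrative assumptions:

```python
# Hypothetical sketch: how a scanning mode might limit which observed
# values of a parameter become crawl variations. Not Acunetix code.
def select_variations(observed_values, mode, quick_limit=1):
    """Return the parameter values a crawler would fetch in each mode."""
    unique = list(dict.fromkeys(observed_values))  # de-duplicate, keep order
    if mode == "quick":
        return unique[:quick_limit]    # very limited number of variations
    if mode == "heuristic":
        # Stand-in heuristic: keep short token-like values, which are more
        # likely to be action parameters steering server-side execution flow.
        return [v for v in unique if len(v) <= 16] or unique[:quick_limit]
    if mode == "extensive":
        return unique                  # all possible values
    raise ValueError(f"unknown mode: {mode}")

values = ["view", "edit", "delete", "a-very-long-free-text-value", "view"]
print(select_variations(values, "quick"))      # ['view']
print(select_variations(values, "extensive"))  # all four unique values
```

The number of variations returned drives the number of security checks the scanner launches, which is why Extensive mode takes so much longer than Quick mode.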

Limit crawl recursions to X iterations – After a site is crawled and vulnerability scanning has started, the scanner may still discover new objects, for which a new crawl is started. This is called a crawl iteration. Select this option to configure the maximum number of crawl iterations allowed during a website scan.
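
The iteration limit described above can be sketched as a simple loop: each pass scans the newly discovered objects, and anything new they reveal seeds the next pass, up to the configured cap. This is an illustrative sketch, not Acunetix's implementation:

```python
# Illustrative sketch of crawl iterations: scanning discovered objects can
# reveal new ones, each batch triggering another crawl, up to a limit.
def crawl_and_scan(start_urls, scan, max_iterations=3):
    """scan(url) returns any newly discovered URLs; returns (seen, iterations)."""
    seen = set(start_urls)
    frontier = list(start_urls)
    iterations = 0
    while frontier and iterations < max_iterations:
        iterations += 1
        new = []
        for url in frontier:
            for found in scan(url):
                if found not in seen:
                    seen.add(found)
                    new.append(found)
        frontier = new  # newly discovered objects start the next iteration
    return seen, iterations
```

With the cap set to 2, a chain of pages that each link to one more page is only followed two hops deep, no matter how long the chain is.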

Enable Port Scanning – Enable this option to port scan the web server hosting the target website during a web security scan. More information is available about the Port Scanner and Network Alerts.
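
A port scan of this kind boils down to attempting connections against the host serving the target site. A minimal TCP connect-scan sketch (the port list and timeout are illustrative, and Acunetix's scanner is far more sophisticated):

```python
import socket

# Minimal sketch of a TCP connect port scan such as a web scanner might
# run against the host serving the target website. Illustrative only.
def scan_ports(host, ports, timeout=0.5):
    open_ports = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)  # connection succeeded: port is open
        except OSError:
            pass                         # closed, filtered, or unreachable
    return open_ports
```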

Collect uncommon HTTP Requests – Acunetix Web Vulnerability Scanner can report any uncommon server responses that might include sensitive data, such as internal server errors. These alerts are reported under the ‘Knowledge Base’ node in the Scan Results window.

Abort scan if the server stops responding – If Acunetix encounters a number of network errors, it will automatically abort the scan. Configure the maximum number of network errors the scanner may encounter before aborting the scan, or disable this option if you do not want the scan to abort automatically.
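
The abort logic can be sketched as a counter against a threshold. Whether the real scanner counts consecutive errors or a running total is not stated in the documentation; this sketch assumes consecutive errors that reset on a healthy response, and the default threshold is made up:

```python
# Sketch of the abort behaviour: count network errors and stop the scan
# once a configured threshold is reached (None disables the abort).
# Assumption: errors are counted consecutively and reset on success.
class NetworkErrorMonitor:
    def __init__(self, max_errors=25):   # threshold value is illustrative
        self.max_errors = max_errors
        self.errors = 0

    def record_error(self):
        self.errors += 1

    def record_success(self):
        self.errors = 0                  # reset on a healthy response

    @property
    def should_abort(self):
        return self.max_errors is not None and self.errors >= self.max_errors
```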

Use cookies set by the site during scanning – By default, Acunetix Web Vulnerability Scanner ignores cookies sent by the website during the scan and instead uses the cookies discovered during the crawling process. Enable this option to do the opposite: ignore the cookies discovered during the crawl and always use the latest cookies the website sends during the scan.
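
The two cookie policies reduce to a simple choice of which cookie jar a request draws from. A minimal sketch, assuming cookies are modeled as name/value dictionaries (not Acunetix code):

```python
# Minimal sketch of the two cookie policies described above.
# Default: replay cookies captured at crawl time.
# Option enabled: ignore crawl cookies, use whatever the site set last.
def cookies_for_request(crawl_cookies, latest_site_cookies,
                        use_site_cookies=False):
    source = latest_site_cookies if use_site_cookies else crawl_cookies
    return dict(source)  # copy, so the stored jar is never mutated
```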

List of hosts allowed – By default, Acunetix Web Vulnerability Scanner will not crawl links outside the target URL. However, some websites contain links to external locations outside the target URL that may be required for a complete scan. Configure Acunetix Web Vulnerability Scanner to include and follow these links using the ‘List of hosts allowed’ field: enter the hostname or IP address of the domain to be included in the crawl/scan and click the ‘+’ button to add the entry.

Note: Hostnames can be specified using wildcards. An asterisk (‘*’) matches any sequence of characters, so a pattern beginning with ‘*’ includes all hostnames ending with the given suffix. A question mark (‘?’) matches a single character; e.g. ‘host?’ includes all hostnames with one character added after ‘host’.
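
These wildcard rules match shell-style globbing, so an allowed-host check can be sketched with Python's standard `fnmatch` module. The patterns below are examples of my own, not values from the documentation:

```python
from fnmatch import fnmatch

# Sketch of allowed-host matching with the wildcards described above:
# '*' matches any run of characters, '?' matches exactly one character.
def host_allowed(hostname, allowed_patterns):
    return any(fnmatch(hostname, pat) for pat in allowed_patterns)

allowed = ["*.example.com", "host?"]   # example patterns, not from the doc
print(host_allowed("static.example.com", allowed))  # True
print(host_allowed("host1", allowed))               # True
print(host_allowed("hostname", allowed))            # False ('?' is one char)
```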

  • How can I continue a scan that stopped because of an internet connection problem or low PC power?

    • If the scan has been paused, you can click the Resume button to continue it. However, if the scan completed without covering the whole site, you will need to restart it.

  • In one of my scans, Acunetix reported a missing CSRF protection error from the crawl. I’ve since fixed the problem, but Acunetix continues to report the error. I understand this is because I’m reusing my crawl file during the scan, and since the CSRF error comes from the crawl, it is reported again. Is there any way to update the crawl file for this page alone so that the error is no longer reported?
