External Tools Support in v10

Acunetix WVS v10 has introduced several new features, including an entirely re-engineered Login Sequence Recorder. The feature we’re going to be focusing on in this post is the ability to import the output of other tools into Acunetix WVS to facilitate the testing process of complex web applications and web services.

The crawler can automatically crawl practically your entire web application, even if the application you're testing is built with the latest and greatest JavaScript frameworks and relies heavily on HTML5. For most users, that will suffice. However, since a crawler works off the links on a page, it cannot crawl what is not linked.

This is an issue pen-testers sometimes face with web applications, but even more so with RESTful web services: the way a web application or web service is structured does not always provide the crawler with links or references that allow it to crawl the entire application.

Consider the example below. If the page ‘/secret-admin’ is not linked from anywhere in the site structure of the website or web application being crawled, it will never be picked up by the crawler, and what isn't crawled can't be scanned, because the scanner simply does not know the page exists.
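To make the limitation concrete, here is a minimal sketch of how a link-following crawler discovers URLs. The page markup and paths below are hypothetical; the point is simply that an unlinked page leaves no trace for the crawler to find.

```python
from html.parser import HTMLParser

# Hypothetical home page of the site being crawled. Note that
# '/secret-admin' exists on the server but is never linked here.
HOMEPAGE = """
<html><body>
  <a href="/">Home</a>
  <a href="/products">Products</a>
  <a href="/contact">Contact</a>
</body></html>
"""

class LinkCollector(HTMLParser):
    """Collects href targets from anchor tags, as a link-following crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

collector = LinkCollector()
collector.feed(HOMEPAGE)
print(collector.links)                      # ['/', '/products', '/contact']
print("/secret-admin" in collector.links)   # False: the crawler never sees it
```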



This is even more common in RESTful web services that do not use a WADL definition. A WADL definition is a machine-readable description of a web service (much as WSDL is to SOAP), and when supplied to Acunetix WVS, it eliminates the need for crawling.
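To illustrate why a WADL definition removes the need for crawling, the sketch below parses a minimal, hypothetical WADL fragment and enumerates every method and endpoint it declares. The service URL and resource paths are invented for the example; a real WADL file exposes the same information without any crawling.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical WADL fragment describing two REST resources.
WADL = """<?xml version="1.0"?>
<application xmlns="http://wadl.dev.java.net/2009/02">
  <resources base="http://api.example.com/v1/">
    <resource path="users">
      <method name="GET"/>
      <method name="POST"/>
    </resource>
    <resource path="users/{id}">
      <method name="DELETE"/>
    </resource>
  </resources>
</application>
"""

NS = {"wadl": "http://wadl.dev.java.net/2009/02"}
root = ET.fromstring(WADL)
resources = root.find("wadl:resources", NS)
base = resources.get("base")

# Every endpoint is declared up front, so a scanner can enumerate
# them directly instead of discovering them by following links.
endpoints = []
for res in resources.findall("wadl:resource", NS):
    path = res.get("path")
    for method in res.findall("wadl:method", NS):
        endpoints.append((method.get("name"), base + path))

for verb, url in endpoints:
    print(verb, url)
```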

Acunetix WVS version 10 introduces the ability to import results from its own HTTP Sniffer (.SLG) as well as from external tools such as PortSwigger Burp Suite (Burp Suite XML), Telerik Fiddler (.SAZ), and any tool that can export an HTTP Archive file (.HAR).
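Of these formats, HAR is the simplest to inspect: it is plain JSON with a standardized layout (`log.entries[].request`), so it is easy to see what an import of this kind contributes. The sketch below uses a hypothetical capture with made-up URLs; note that it records the unlinked ‘/secret-admin’ page a crawler alone would miss.

```python
import json

# A minimal HAR (HTTP Archive) document, as exported by browser developer
# tools or recording proxies; only the fields relevant here are shown.
HAR = json.dumps({
    "log": {
        "version": "1.2",
        "entries": [
            {"request": {"method": "GET",  "url": "http://example.com/"}},
            {"request": {"method": "GET",  "url": "http://example.com/secret-admin"}},
            {"request": {"method": "POST", "url": "http://example.com/api/login"}},
        ],
    }
})

def har_urls(har_text):
    """Return the (method, url) pairs recorded in a HAR capture."""
    log = json.loads(har_text)["log"]
    return [(e["request"]["method"], e["request"]["url"]) for e in log["entries"]]

for method, url in har_urls(HAR):
    print(method, url)
```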

This lets pen-testers further extend their manual testing workflow and automate more of the security testing process, leaving more time and focus for discovering logical vulnerabilities.
[Screenshot: Scan Wizard]


Importing results from external tools is easy and is available both as part of the Scan Wizard and as a tool inside the Site Crawler. Once a file is imported, its entries are used to pre-seed the crawl with the data collected manually from external tools; the crawler will still run and attempt to find even more pages.
