Latest Comparison Report from Larry Suto

Last week, Larry Suto published a report entitled “Accuracy and Time Costs of Web Application Security Scanner Report”.  I’ve started to investigate the results from this report in detail, and I’ve found a number of inaccuracies.  Here is a direct quote from his paper:

Methodology

In order to cover as many bases as possible it was decided to run each scanner in two ways:

1. Point and Shoot (PaS): This includes nothing more than run default scanning options and provide credentials if the scanner supported it and the site used any.

2. Trained: This includes any configurations, macros, scripts or other training determined to be required to get the best possible results. As needed help was requested from the vendors or from acquaintances with expertise in each scanner to make sure that each was given all possible opportunity to get its best possible results.

Therefore he defines two modes: Point and Shoot and Trained. In the Point and Shoot mode he’s supposed to use the default scanning options AND provide credentials if the scanner supports them.

Except that for our scanner, he didn’t do this. Let’s take our PHP test website, testphp.acunetix.com.

Here is a quick excerpt from his results:

Acunetix is listed as not finding any of the 4 XSS vulnerabilities in userinfo.php (trained or untrained).
That came as a big surprise to me. I quickly ran a test and, sure enough, the vulnerabilities were found by Acunetix WVS.

The file “userinfo.php” is only available after you provide valid credentials; it’s not possible to access this file unauthenticated.

They were not found because Larry didn’t authenticate our scanner (didn’t provide any credentials). No wonder Acunetix didn’t find the vulnerabilities. The same applies to the SQL injection vulnerability in cart.php (the shopping cart is only available when you are authenticated). He didn’t authenticate our scanner in either the Point and Shoot mode or the Trained mode. That’s not fair to us.
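To make the coverage point concrete, here is a toy model of why an unauthenticated crawl can never even reach those vulnerable pages. The site map, link structure, and auth flags below are invented for illustration; they are not the real test site:

```python
# Toy model of a scanner's crawl phase: pages behind a login are
# unreachable unless the crawler carries a valid session, so their
# parameters are never fuzzed at all.

SITE = {
    "/index.php":    {"links": ["/login.php"], "auth": False},
    "/login.php":    {"links": ["/userinfo.php"], "auth": False},
    "/userinfo.php": {"links": ["/cart.php"], "auth": True},   # XSS lives here
    "/cart.php":     {"links": [], "auth": True},              # SQLi lives here
}

def crawl(start, authenticated):
    """Breadth-first crawl; auth-only pages are invisible to anonymous sessions."""
    seen, queue = set(), [start]
    while queue:
        page = queue.pop(0)
        if page in seen or (SITE[page]["auth"] and not authenticated):
            continue  # the scanner never gets to test this page's parameters
        seen.add(page)
        queue.extend(SITE[page]["links"])
    return seen

print(sorted(crawl("/index.php", authenticated=False)))
# only /index.php and /login.php are reachable
print(sorted(crawl("/index.php", authenticated=True)))
# now /userinfo.php and /cart.php are in scope as well
```

Without a session, the crawl stops at the login page, so "0 vulnerabilities found" says nothing about the scanner's detection engine.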

I then moved on to the Cenzic test website (http://crackme.cenzic.com). Here, Acunetix is listed as not finding a number of XSS vulnerabilities in various files, such as /Kelev/php/transfer.php (parameters Amount, ToAccountNo), /kelev/php/accttransaction.php (parameters FromDate, ToDate) and so on.

I started a scan of crackme.cenzic.com and guess what?  All those vulnerabilities were found by Acunetix WVS. I think it’s the same situation as before: the scanner was not authenticated and therefore couldn’t access those pages.

Below, I’ve attached a screenshot of those vulnerabilities found by Acunetix WVS:

Therefore, Acunetix WVS was clearly disadvantaged in this comparison report.  It’s not possible to find vulnerabilities in authenticated pages without providing the right credentials.

Finally, I would like to point out a very suspicious log entry from our test website. While analyzing the logs of testphp.acunetix.com, I found the following entry:

72.25.78.35 - - [20/Jan/2010:08:44:58 +0100] "GET /Flash/add.swf HTTP/1.1" 200 17418 "file:///C:/NTOBJECTIVES/SOURCE/ntospider_5_0/ntospider/NTOGUI/NtoGui/Debug/Reports/acunetix/2010_01_19_23_43/DF4D21797A665BCA9B48B5B5F5C37C2" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; Trident/4.0; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; .NET CLR 1.1.4322; .NET CLR 3.5.30729; .NET CLR 3.0.30618)"

This log entry was generated by NTOSpider while scanning our test website. What’s suspicious about this log entry is the Referer field:

file:///C:/NTOBJECTIVES/SOURCE/ntospider_5_0/ntospider/NTOGUI/NtoGui/Debug/Reports/acunetix/2010_01_19_23_43/DF4D21797A665BCA9B48B5B5F5C37C2

Notice the directory: C:/NTOBJECTIVES/SOURCE/ntospider_5_0/? SOURCE? Debug?
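For anyone who wants to check their own logs for similar visits: in an Apache combined-format entry, the Referer is the second quoted field. A minimal extraction sketch (the long path and user agent are abbreviated with `...` here for readability):

```python
# Extract the Referer from an Apache "combined" log line. The three
# quoted fields are the request, the referer, and the user agent.
import re

line = ('72.25.78.35 - - [20/Jan/2010:08:44:58 +0100] '
        '"GET /Flash/add.swf HTTP/1.1" 200 17418 '
        '"file:///C:/NTOBJECTIVES/SOURCE/ntospider_5_0/..." '
        '"Mozilla/4.0 (compatible; MSIE 7.0; ...)"')

quoted = re.findall(r'"([^"]*)"', line)   # [request, referer, user_agent]
referer = quoted[1]
print(referer.startswith("file:///C:/NTOBJECTIVES/SOURCE"))  # True
```

A `file:///` Referer like this one usually means the request came from a locally opened HTML file, in this case a locally generated scan report.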

Only NTObjectives employees should have access to the NTOSpider source code. I don’t have enough evidence to directly accuse NTObjectives, however, that log entry looks suspicious to me.

  • file:///C:/NTOBJECTIVES/SOURCE/ntospider_5_0/ntospider/NTOGUI/NtoGui/Debug/Reports/acunetix/2010_01_19_23_43/DF4D21797A665BCA9B48B5B5F5C37C2

    LOL %))

  • It’s really unusual that NTOSpider beat Acunetix on its own test web sites.

    I agree, something smells bad…

  • Bogdan, That’s a pretty outrageous assertion. We did run some tests from our side as well when we heard about what Larry was doing. That internal file that is referenced is from the report generated from a scan and then we click the Validate button from our reports. Just because one of our developers ran a scan against your test site, you want to accuse us of foul play… that is really low.

    I also asked Larry about the vulns you missed on your test site, and he said that your tool does not have automated form login support, and that users are required to generate a Login Macro. So when he did Point and Shoot, it meant just that. His Trained scan was the one that he did macros for, and this was consistent for all the scanners. As far as I understand, all the other scanners have automated form based login support, including ours.

  • Hi,

    Someone brought this to my attention. I just want to reiterate that many of the vendors knew about the study and nothing prevented them from researching the sites.

    Also point and shoot did not include any login macros…many of the scanners work with just entering username and password…I tried to keep it consistent in that way

  • @Dan Kuykendall: Like I mentioned in the article, I don’t have enough evidence to directly accuse NTObjectives. However, that log entry looks suspicious to me. That’s just my opinion, take it as you want.

    Yes, Larry told me that he didn’t record a Login Sequence for our test website. Therefore, our scanner had to find vulnerabilities in an authenticated area without any credentials. We don’t have automated form-based login support because I don’t see the point of it: the application can log you out at any time, so there isn’t much gain in implementing it. Our Login Sequence Recorder is much more flexible because it automatically detects when the session is invalidated and reruns the login sequence.

    @Larry Suto: By trying to keep it consistent you made it unfair for us.

  • Bogdan,

    I think you bring up an issue that needs to be addressed in further comparisons…the point and shoot category seems to cause the most controversy…maybe for future tests the vendor can supply a point and shoot config or something. And if you look at HP’s recent response they seem to indicate that these sites are unrepresentative of typical applications as they create unusual security situations that are designed to show off scanner capabilities…so are we saying point and shoot is a meaningless category and the only fair test is to have the vendor expertly tune the site beforehand?

    Larry

  • Larry, I’m not saying that Point and Shoot is a meaningless category. Actually, my opinion is that Point and Shoot is the best way to test an automated scanner. Everything else will cause controversy because some vendor will say that his scanner is not configured correctly and so on.

    However, what you did was not Point and Shoot. That’s where we disagree. For some scanners you entered the credentials and for our scanner you didn’t. If it’s Point and Shoot then let it be Point and Shoot: just enter the URL, hit enter and let the scanner do its job.

    Having the vendor expertly tune the site beforehand isn’t good either, because some vendors might go too far and do more than tuning.

    In conclusion, I think the best comparison would be to take a list of real open source web applications (so everybody can reproduce the results) on different platforms (PHP, .NET, Java, …) and scan them using Point and Shoot. The scanner that finds the most vulnerabilities wins. The vendors should be informed after the results are completed, not before.

  • Oh the drama! A couple questions come to mind:

    1. Is the log entry timestamp meaningful? Is that when Suto was conducting his tests? If not, was it ‘coincidence’ that NTO had a developer testing a competitor’s web site?

    2. Dan Kuykendall’s mock outrage is actually very telling. Read between his lines. Are you honestly NOT that up on your competitors? Are you not using their utilities to determine their capability? If your developer was running this scan, why from a “dsl extreme” IP and not from an NTO owned IP to be clear what was going on? If an NTO dev was running your product against a competitor, I think it is safe to assume it is so that your product performs well against a competitor demo app, specifically for the purpose of selling your product. If not, you can install a handful of 3+ year old PHP apps and get better diagnostics and tuning data. Last, a lot of “we” that is implied, but why not sign in a manner that is clear you work for NTO in some capacity?

    3. Suto, why not publish all the results of your testing, including timestamps / reports of the scanners. Let people see the raw data to know it wasn’t you using a special copy of NTO. That would put one issue to rest, yes?

  • @ Bogdan:

    Please pay attention to the conversation by reading what others have to say. It would also be great if you think before you post a comment.

    Honestly, the way that you describe Point-and-Shoot is basically equivalent to seeing if the website owner has malformed HTML. Without logging into a web application? What are you thinking?

    @ jericho: it’s a DSL-E prefix according to ARIN, RADB, and live BGP

  • @Andre: You don’t need to provide credentials to test the capabilities of an automated scanner. You can test the unauthenticated part of the application. What I was proposing is to reduce the testing complexity. Adding authentication into the mix just adds unnecessary complexity to the testing process. Larry’s comparison is a good example.

  • I have been evaluating scanners for our firm and I would like to understand the complexity and results of the point and shoot. I am not only interested in the vulnerabilities found, but also the ease and time required to set up and prepare the scan for each product.

    HP’s response to their results in Larry’s report was to encourage us to run WebInspect ourselves and make our own conclusions and not to make accusations.

  • @Jeff

    We do recommend the same procedure; you should always try the product against the website or web application it will be scanning. If you are interested in a trial version of Acunetix WVS, contact our sales team at sales@acunetix.com.
