False Negatives in Web Application Security

What are false negatives, and what causes automated web application security scanners to miss a vulnerability? In this web application security blog post, Robert Abela explains what false negatives are and what to look for when choosing an automated web vulnerability scanner, to ensure that it detects as many vulnerabilities as possible and leaves no false negatives behind for malicious attackers to exploit.

As we have seen in a previous blog post, false positives in web application security have a long-term negative effect on the security of your web applications and on the procedures you use for web application security. False negatives, i.e. web application vulnerabilities that an automated vulnerability scanner fails to detect, can have the same long-term effect: your web applications will contain vulnerabilities that hackers can exploit.

What Causes False Negatives

The main reason an automated web application security scanner fails to detect a web vulnerability is that it did not crawl the vulnerable object. This scanner limitation again raises a number of questions, such as: should automated web application security scanners be trusted and used in a web application penetration test?

Choosing the Right Web Application Security Scanner

The answer is yes, automated web application security scanners should be used, but you must do your homework properly when choosing one. One of the main features to look for when choosing a web application security scanner for your business is the crawler's ability to crawl your web applications. Before the scanner starts attacking a web application, the crawler crawls all of the web application's content to identify all the inputs and attack surfaces. If a single attack surface is not crawled, it will not be scanned for vulnerabilities, and if it contains a vulnerability, that vulnerability won't be reported.
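To make the crawling step concrete, here is a minimal sketch of what a crawler enumerates on a single page: the links it would follow next and the form fields a scanner would later attack. The class and structure are illustrative assumptions, not any particular scanner's implementation.

```python
# Minimal sketch: enumerate links and form inputs (the "attack surface")
# found in one page of HTML, using only the standard library.
from html.parser import HTMLParser

class AttackSurfaceParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []    # URLs the crawler would follow next
        self.inputs = []   # form fields the scanner would later attack

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "form":
            self.inputs.append(("form", attrs.get("action", "")))
        elif tag in ("input", "textarea", "select"):
            self.inputs.append((tag, attrs.get("name", "")))

page = """
<a href="/products?id=1">Product</a>
<form action="/search"><input name="q"></form>
"""
parser = AttackSurfaceParser()
parser.feed(page)
print(parser.links)   # links to crawl next
print(parser.inputs)  # candidate injection points
```

Every entry in `inputs` is a potential injection point; a link or form the parser never sees is an attack surface the scanner never tests, which is exactly how false negatives arise.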

Many of today's modern web applications use custom 404 error pages, URL rewrite rules for search engine friendly URLs, and anti-CSRF mechanisms to protect against cross-site request forgery attacks. Although these features make a website more user friendly and secure, they typically hinder the crawler from identifying all attack surfaces.
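Custom 404 pages are a good example of why. They often come back with an HTTP 200 status, so a naive crawler cannot tell "page not found" from real content. The sketch below shows one common heuristic workaround, under illustrative assumptions (the `fetch` function and site are simulated, and `difflib` stands in for the more robust response signatures real scanners use): request a URL that cannot exist, remember what the "not found" page looks like, and compare other responses against it.

```python
# Sketch of soft-404 detection: fingerprint the site's custom error
# page, then flag responses that look the same as it.
import difflib
import uuid

def not_found_signature(fetch, base_url):
    """Fetch a guaranteed-missing URL and return its body as the
    site's custom-404 fingerprint (it may still return HTTP 200)."""
    random_path = f"{base_url}/{uuid.uuid4().hex}"
    return fetch(random_path)

def looks_like_custom_404(body, signature, threshold=0.9):
    ratio = difflib.SequenceMatcher(None, body, signature).ratio()
    return ratio >= threshold

# Simulated site: every unknown path returns the same "Oops" page,
# which is exactly the soft-404 problem described above.
PAGES = {"/": "<h1>Home</h1>", "/about": "<h1>About us</h1>"}
def fetch(url):
    path = url.replace("http://example.test", "")
    return PAGES.get(path, "<h1>Oops, page not found!</h1>")

sig = not_found_signature(fetch, "http://example.test")
print(looks_like_custom_404(fetch("http://example.test/missing"), sig))  # True
print(looks_like_custom_404(fetch("http://example.test/about"), sig))    # False
```

A crawler that skips this step will either index the error page as real content or, worse, discard genuine pages that happen to resemble it.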

The crawlers of several automated web application security scanners can be configured and tweaked to crawl such web applications, but there is no guarantee they will identify all attack surfaces. Configuring crawlers is also difficult and time consuming; unless you are a seasoned penetration tester with years of experience, it is virtually impossible to understand what is happening under the hood of a web scanner. So unless your business can afford an experienced penetration tester, such scanners are of limited use.

The Purpose of a Web Application Security Scanner

The whole point of investing in an automated web application security scanner is to ease the process of penetration testing and automate as much of it as possible, not to spend countless hours tweaking it. A good web application security scanner should support at least 90 to 95 percent of web application technologies out of the box. The best test of whether investing in an automated web application security scanner gives your business a proper return on investment is simply to try several different scanners against a web application of your own and see whether they crawl it properly without any tweaking.

Automatically Detect Web Vulnerabilities and Avoid False Negatives

Netsparker web application security scanner has a crawler that can crawl web applications built with any type of framework. You do not need to configure custom 404 error pages or URL rewrite rules; it heuristically detects them and auto-configures itself to ease the job for you. The crawler of Netsparker also has a built-in AJAX and JavaScript engine which parses, executes and analyses the scripts commonly used in today's web applications. Last but not least, Netsparker also supports anti-CSRF tokens, ensuring that web applications which use such technologies can still be crawled automatically and all attack surfaces in a web application are identified. For more information about the Netsparker crawler refer to the Advanced Web Application Security Scanning section.
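To see why anti-CSRF support matters for crawling, consider what a crawler has to do before it can replay a form submission: re-fetch the form, pull out the fresh token, and include it in the request, otherwise the application rejects every test request. The sketch below illustrates that token-extraction step; the field name `csrf_token`, the regex, and the payload are illustrative assumptions, not how any particular scanner works internally.

```python
# Sketch of the anti-CSRF handling a crawler needs: extract the hidden
# token from a form body so it can be sent back with the next request.
import re

def extract_csrf_token(html, field="csrf_token"):
    """Pull the hidden anti-CSRF value out of a form body."""
    m = re.search(rf'name="{field}"\s+value="([^"]+)"', html)
    return m.group(1) if m else None

form_html = '<form><input type="hidden" name="csrf_token" value="abc123"></form>'
token = extract_csrf_token(form_html)

# A test request that omits or reuses a stale token would be rejected
# before the injected input is ever processed, producing a false negative.
payload = {"comment": "test input", "csrf_token": token}
print(payload)
```

Without this step, every attack request against a token-protected form bounces off the CSRF defence, and the scanner silently reports the form as safe.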
