Performance Analysis in Netsparker

Sometimes, scans take longer than anticipated. Scan performance depends on various factors:

  • URL Rewrites
  • Dynamic links
  • Hardware that Netsparker Standard is running on
  • Network bandwidth
  • Other factors

First of all, routing traffic through a proxy (Burp, Fiddler, or OWASP ZAP, for example) will definitely affect the performance of a scan. But even without a proxy, scanning can still be slow because of the target website itself. This can generally be fixed.

Is Your Website Public?

  • First, consider whether it is possible for our Q/A Team to reach your website, simply to understand its structure. (We do NOT perform any scans without your explicit permission.)
  • If your website is not public, you can send us the scan files (.nss, .ndb, and .csv) that you have. (A sketch for bundling these files into one archive appears after this list.)
  • (These files can be found in the C:\Users\[YOUR_USERNAME]\Documents\Netsparker\Scans\[WEBSITE_NAME-RANDOM_ID] folder.)
  • From there, we can analyze why it is slow and recommend some configuration changes.
  • Then you can restart the scan or run a new scan with performance analysis enabled, and send us the related activitySummary.csv file. This will help us assist you in further improving your scan performance.
  • (The file is located in C:\Users\[YOUR_USERNAME]\Documents\Netsparker\Scans\[WEBSITE_NAME-RANDOM_ID].)
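
If it helps, the scan files can be collected into a single archive before sending them to us. The following Python sketch is illustrative only, not a Netsparker feature; the folder name is a placeholder that you should replace with your actual WEBSITE_NAME-RANDOM_ID directory.

```python
import zipfile
from pathlib import Path

# Illustrative helper (not part of Netsparker): bundle the scan files
# (.nss, .ndb, .csv) from a scan folder into one archive for support.
# Replace the folder name with your actual WEBSITE_NAME-RANDOM_ID directory.
scan_dir = Path.home() / "Documents" / "Netsparker" / "Scans" / "WEBSITE_NAME-RANDOM_ID"
archive = scan_dir.parent / (scan_dir.name + ".zip")

with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for pattern in ("*.nss", "*.ndb", "*.csv"):
        for path in scan_dir.glob(pattern):
            zf.write(path, arcname=path.name)

print(f"Wrote {archive}")
```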

How to Enable Performance Analysis

  1. Open Netsparker Standard.
  2. From the Home tab, click Options. The Options dialog is displayed.
  3. Select Logging. The Logging Levels section is displayed.
  4. Enable the Performance Analysis checkbox.
  5. Click Save.

Most Common Reasons for Performance Issues

The most common reasons for performance issues, and the actions to take for each, are explained below.

Server Performance

Log in to the target web application’s server and database server to ensure that the CPU load is in the expected range. If it is unexpectedly high, stop the scan until the load returns to normal. Then test different engines and URLs separately to identify which part of the application is causing the problem, and fix it.
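
As a rough way to verify server-side load, you could sample CPU usage on the server while the scan runs. This is a minimal sketch using the third-party psutil package (pip install psutil); the 80% threshold is an arbitrary example, not a Netsparker recommendation.

```python
import psutil  # third-party: pip install psutil

# Minimal sketch: sample overall CPU load every 5 seconds for one minute
# and flag readings above an example threshold.
THRESHOLD = 80.0  # percent; an arbitrary example value, tune for your server

for _ in range(12):
    load = psutil.cpu_percent(interval=5)  # blocks for 5s, returns average %
    flag = "  <-- above threshold" if load > THRESHOLD else ""
    print(f"CPU: {load:5.1f}%{flag}")
```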

Connection Between You and the Target Application

If your internet connection, or the connection between Netsparker’s system and the target web server, is slow, your scan will be slow too. Netsparker carries out a large number of requests, so if the target application’s response time is poor, there is not much you can do other than finding a faster connection or optimizing your scan settings.

One way to gauge network performance is to scan our test application at http://aspnet.testsparker.com/. If your scan performance is good for http://aspnet.testsparker.com/ but poor for your application, the problem is either the connection or something specific to the target application.
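
Another rough comparison is to time a handful of plain HTTP requests against the test application and against your own target. This sketch uses only the Python standard library; the target URL below is a placeholder to replace with your own.

```python
import time
import urllib.request

# Rough comparison sketch: average round-trip time for a few GET requests.
TEST_APP = "http://aspnet.testsparker.com/"
TARGET = "http://your-application.example/"  # placeholder: your own target

def average_response_time(url, attempts=5):
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as response:
            response.read()
        total += time.perf_counter() - start
    return total / attempts

for url in (TEST_APP, TARGET):
    print(f"{url}: {average_response_time(url):.2f}s average")
```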

Client-Side Performance Issues

The easiest way to check for this is to open your computer's Task Manager and watch Netsparker’s CPU usage. If it’s too high during the Crawling stage, do the following:

  1. Open Netsparker Standard.
  2. From the Home tab (while a scan is running), click Scan Policy Editor. The Scan Policy Editor dialog is displayed.
  3. Select JavaScript. The JavaScript section is displayed.
  4. Deselect Analyze JavaScript / AJAX. (Note that to change this setting, you must create a new policy or clone one of the default policies, because Netsparker’s default policies are read-only.)

This will speed up the scan, though it’ll decrease the crawling quality if the website includes lots of JavaScript and AJAX code.  
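
If you prefer a log over watching Task Manager, a short script can sample Netsparker’s CPU usage over time. This is a sketch using the third-party psutil package; the process-name fragment is an assumption, so check Task Manager for the actual executable name.

```python
import time
import psutil  # third-party: pip install psutil

# Sketch: sample CPU usage of processes whose name contains "netsparker".
# The name fragment is an assumption; check Task Manager for the real name.
procs = [p for p in psutil.process_iter(["name"])
         if p.info["name"] and "netsparker" in p.info["name"].lower()]

for p in procs:
    p.cpu_percent(None)  # prime the per-process counter

for _ in range(10):
    time.sleep(2)
    for p in procs:
        try:
            print(f"{p.info['name']} (pid {p.pid}): {p.cpu_percent(None):.1f}%")
        except psutil.NoSuchProcess:
            pass  # the process exited between samples
```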

If CPU usage is also high during the Attacking phase, and this does not happen on any other website, it is possible that Netsparker cannot handle certain code in that web application. Please report this to us so we can investigate the problem in detail and address it, or offer you a solution. You can also check the Scan Performance and Slowest Pages nodes in the Knowledge Base panel to see how the different security check engines are behaving.

URL Rewrite Configuration  

If your website has extensive URL Rewrites, you can configure the rules manually (see How to Configure URL Rewrite Rules in Netsparker).

Heuristic URL Rewrite is generally good enough, but it might not pick up your rules as quickly as manual configuration.
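
To illustrate the idea (this is not Netsparker’s rule syntax), a URL rewrite rule is essentially a pattern that maps path segments back to the parameters they encode, so rewritten URLs are treated as one page with parameters rather than thousands of distinct pages:

```python
import re

# Illustration only, NOT Netsparker's rule syntax: a rewrite rule maps path
# segments back to the parameters they encode, so /blog/42/hello-world is
# treated as one page with "id" and "title" parameters, not a distinct URL.
rule = re.compile(r"^/blog/(?P<id>\d+)/(?P<title>[^/]+)$")

for url in ("/blog/42/hello-world", "/blog/7/scan-performance"):
    match = rule.match(url)
    if match:
        print(url, "->", match.groupdict())
```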

Scan Policy Optimization  

Ensure that you have optimized the policy with the optimization wizard for your environment (see Automatically Optimize Scan Policies for Quicker and More Efficient Scans).

DOM-Based Cross Site Scripting (XSS) Checks  

DOM XSS checks can be extremely slow, especially on large, complex pages. After optimization, you can disable them in your Scan Policy by opening the Scan Policy Editor and unchecking Cross-site Scripting (DOM Based) (see Automatically Optimize Scan Policies for Quicker and More Efficient Scans).

Also, remember to select this optimized policy in the Start a New Website or Web Service Scan dialog, when starting the scan.

Overall, these steps should help you fix the performance issues. We are also happy to take a look at your scan or website to suggest a more precise configuration.

Number of Links Crawled

The Crawled URLs List report will help you diagnose performance issues. For example, if there are dozens of similar repeating requests with the same set of POST parameters, the Recurring Parameters option can help eliminate those redundant requests.

For further information, see Lists.
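
If you export the crawled URLs to a CSV file, a short script can surface the most-repeated request shapes. This is a hedged sketch: the file name and the "Url"/"Parameters" column names are assumptions, so adjust them to match your actual export.

```python
import csv
from collections import Counter

# Sketch for spotting recurring requests in an exported crawled-URLs CSV.
# The file name and column names ("Url", "Parameters") are assumptions;
# adjust them to match the columns in your actual export.
counts = Counter()
with open("crawled-urls.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[(row.get("Url", ""), row.get("Parameters", ""))] += 1

for (url, params), n in counts.most_common(10):
    if n > 1:
        print(f"{n:4d}x {url}  [{params}]")
```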
