You can crawl a website with the Netsparker Desktop web application security scanner and use the crawl results as a base for future scans. This is handy if you want to scan the same target website again later, but only attack the links identified during this crawl. To do so:
Launch the Web Application Crawl
- Specify the URL of the target website and, from the Scan Mode drop-down menu, select Crawl & Wait. In this mode Netsparker Desktop crawls the target web application but does not attack it.
- Once the crawl is complete, select Export from the File drop-down menu and save the crawl results.
At this stage you can shut down Netsparker Desktop.
Scan a Web Application from a Saved Crawl
To launch a scan from a saved crawl:
- Run Netsparker Desktop and select Import from the File drop-down menu.
- Navigate to and select the saved crawl file.
- Click the Resume button in the top-left scan menu to start the attack phase.