Set CrawlId to make this crawl resumable. If a previous crawl with the same CrawlId exists, it will be resumed.
Providing a CrawlId also causes the WebScraper to auto-save its state every 5 minutes, guarding against crashes, system failures, and power outages. This is particularly useful for long-running web-scraping tasks, allowing hours, days, or even weeks of work to be recovered.
public void Start( string CrawlId = null )
Public Sub Start ( Optional CrawlId As String = Nothing )
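As a minimal usage sketch, assuming a `WebScraper` subclass named `MyScraper` has already been defined (the subclass name and the CrawlId value are illustrative, not part of this API):

```csharp
// Hypothetical subclass name for illustration only.
var scraper = new MyScraper();

// Passing a CrawlId makes the crawl resumable: if the process is
// interrupted, running the same code again with the same CrawlId
// resumes from the last auto-saved state instead of starting over.
scraper.Start("product-catalog-crawl");
```

Choose a CrawlId that is stable across runs (a fixed string rather than a timestamp), since the id is how a restarted process finds the saved state of the earlier crawl.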