Can I get WebAuditr to obey or ignore my robots.txt file when it crawls my site?
WebAuditr obeys the robots.txt file live on your site, based on the user agent you have selected for the crawl. You can also use the WebAuditr Robots Overwrite feature to ignore your live robots.txt file during a crawl and use an alternative version you specify instead. If WebAuditr is specifically disallowed in a robots.txt file, we will always respect this.
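For example, a robots.txt rule that specifically disallows WebAuditr would look like the sketch below (this assumes the crawler identifies itself with the user agent token `WebAuditr`; check the crawl settings for the exact token your account uses):

```
# Block the WebAuditr crawler from the entire site
# (assumes "WebAuditr" is the user agent token used by the crawler)
User-agent: WebAuditr
Disallow: /

# All other crawlers remain unrestricted
User-agent: *
Disallow:
```

Because this rule names WebAuditr directly, it would be respected even if the Robots Overwrite feature is enabled.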
Will WebAuditr slow down my site when it’s crawling?
Most sites experience no slowdown while WebAuditr is crawling. A slowdown can occur if your server lacks the capacity to handle its normal user demand, or if a spike in user demand coincides with a WebAuditr crawl running at the same time.