Using Robots.txt

A robots.txt file tells search engine crawlers which URLs they can (or can’t) access on your site.
This is used mainly to avoid overloading your site with requests that might affect its performance.

If you don’t already have one, create it using this link.
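For reference, here is a minimal sketch of what a robots.txt file can contain; the paths below are only illustrative placeholders:

    # Apply these rules to all crawlers
    User-agent: *
    # Block crawling of these (illustrative) paths
    Disallow: /private/
    Disallow: /tmp/
    # Everything else may be crawled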

After creating it, and before initiating the scan, go to the “Crawler” tab in New Scan, click Browse in the Robots.txt field, and choose your file.

That’s it! If you don’t have any other modifiers to configure, initiate your scan.