The generator supports the following robots.txt directives and per-policy fields:

- Sitemap
- Host
- Policy
  - User agent
  - Allow
  - Disallow
  - Crawl delay
  - Clean param
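
For illustration, a robots.txt built from these fields might look like the example below. The paths, hostnames, and values are placeholders; note that Host, Crawl-delay, and Clean-param are non-standard directives that some crawlers (notably Yandex) honor and others ignore.

```
User-agent: *
Allow: /public/
Disallow: /admin/
Crawl-delay: 10
Clean-param: ref /articles/

Sitemap: https://example.com/sitemap.xml
Host: https://example.com
```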

Generate robots.txt files.

A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of search engines. To keep a web page out of search engines, block indexing with a noindex meta tag or password-protect the page.
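
For readers who want to assemble the file programmatically, here is a minimal TypeScript sketch that builds a robots.txt string from a policy-style config. The `RobotsConfig` shape and `buildRobotsTxt` function are illustrative assumptions, not the API of any particular generator.

```typescript
// Illustrative types mirroring the fields listed above.
interface Policy {
  userAgent: string;
  allow?: string[];
  disallow?: string[];
  crawlDelay?: number;   // non-standard; ignored by some crawlers
  cleanParam?: string[]; // Yandex-specific directive
}

interface RobotsConfig {
  policy: Policy[];
  sitemap?: string;
  host?: string;         // Yandex-specific directive
}

// Build the robots.txt text from the config, one directive per line.
function buildRobotsTxt(config: RobotsConfig): string {
  const lines: string[] = [];

  for (const p of config.policy) {
    lines.push(`User-agent: ${p.userAgent}`);
    for (const path of p.allow ?? []) lines.push(`Allow: ${path}`);
    for (const path of p.disallow ?? []) lines.push(`Disallow: ${path}`);
    if (p.crawlDelay !== undefined) lines.push(`Crawl-delay: ${p.crawlDelay}`);
    for (const param of p.cleanParam ?? []) lines.push(`Clean-param: ${param}`);
    lines.push(""); // blank line between policy blocks
  }

  if (config.sitemap) lines.push(`Sitemap: ${config.sitemap}`);
  if (config.host) lines.push(`Host: ${config.host}`);

  return lines.join("\n").trim() + "\n";
}

// Example usage with placeholder values.
const robotsTxt = buildRobotsTxt({
  policy: [
    { userAgent: "*", allow: ["/"], disallow: ["/admin/"], crawlDelay: 10 },
  ],
  sitemap: "https://example.com/sitemap.xml",
  host: "https://example.com",
});

console.log(robotsTxt);
```

The resulting string can then be written to a `robots.txt` file at the root of the site.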