Robots.txt generator

Create robots.txt files to guide web crawlers




Supported directives: User-agent, Crawl-delay, Clean-param

A robots.txt file tells search engine crawlers which URLs they may access on your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of search engines. To keep a page out of search results, block indexing with a noindex meta tag or password-protect the page.
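As a sketch of the kind of file the generator produces, here is a minimal robots.txt using the directives listed above (the domain, paths, and parameter name are hypothetical examples):

```
# Example robots.txt — domain and paths are illustrative only
User-agent: *
Disallow: /admin/
Crawl-delay: 10

# Clean-param tells supporting crawlers to ignore the listed
# query parameter on the given path (Yandex-specific directive)
Clean-param: ref /catalog/

Sitemap: https://www.example.com/sitemap.xml
```

Note that Crawl-delay and Clean-param are nonstandard: Google ignores both, while Yandex supports Clean-param. To keep a specific page out of search results, add `<meta name="robots" content="noindex">` to that page's HTML rather than relying on robots.txt.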
