robots.txt Generator
Generate an effective robots.txt file that tells Google and other search engines which parts of your site they may crawl. Create the file, then upload it to the root directory of your website (e.g. https://example.com/robots.txt).
By default, all robots are:
Specify user-agents (comma-separated, optional):
Crawl-delay (seconds):
Enter Sitemap URLs (one per line):
Allowed directories or files:
Disallowed directories or files:
Block specific query strings:
Specify the preferred host (optional):
Your robots.txt:
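A file built from the fields above might look like the sketch below. All user-agent names, paths, query strings, and URLs are placeholder examples, not values the tool produces:

```
# Default rule for all robots
User-agent: *
Disallow: /admin/            # disallowed directory
Disallow: /*?sessionid=      # block a specific query string
Allow: /public/              # explicitly allowed directory
Crawl-delay: 10              # seconds between requests

# Rules for a specific user-agent
User-agent: Bingbot
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
Host: example.com            # preferred host (Yandex-specific)
```

Note that Googlebot ignores the Crawl-delay directive (crawl rate is managed through Google Search Console instead), and the Host directive is a non-standard extension historically used by Yandex.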