🤖 Robots.txt Generator
Create a valid robots.txt file with user-agent rules, sitemaps, and crawl-delay directives.
User-agent: *
Allow: /
How to Use the Robots.txt Generator
Use the Quick Presets for common configurations, or build custom rules by adding user-agent blocks with allow/disallow paths. Add your sitemap URL(s) so search engines can discover all your pages. When finished, click Download to save the file and upload it to the root of your domain, so it is served at /robots.txt.
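The generator's output can be sketched as a small function that assembles user-agent blocks and sitemap URLs into a valid file. The function name, rule format, and paths below are illustrative assumptions, not the tool's actual internals:

```python
# Hypothetical sketch of what a robots.txt generator emits:
# one block per user agent, then any Sitemap directives.
def build_robots_txt(blocks, sitemaps=()):
    """blocks: list of (user_agent, rules); rules are (directive, path) pairs."""
    lines = []
    for agent, rules in blocks:
        lines.append(f"User-agent: {agent}")
        lines.extend(f"{directive}: {path}" for directive, path in rules)
        lines.append("")  # blank line separates blocks
    lines.extend(f"Sitemap: {url}" for url in sitemaps)
    return "\n".join(lines).rstrip() + "\n"

robots = build_robots_txt(
    [("*", [("Disallow", "/admin/"), ("Allow", "/")])],
    sitemaps=["https://example.com/sitemap.xml"],
)
print(robots)
```

Writing the returned string to a file named robots.txt gives you the same artifact the Download button produces.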
Robots.txt Syntax Reference
- User-agent: * — Starts a rule block that applies to all crawlers
- Disallow: /private/ — Blocks crawling of the /private/ directory
- Allow: /private/public-page.html — Overrides a Disallow for a specific page
- Crawl-delay: 10 — Asks crawlers to wait 10 seconds between requests (ignored by Google)
- Sitemap: https://example.com/sitemap.xml — Points crawlers to your XML sitemap; unlike the other directives, it is independent of any user-agent block
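You can check how these directives interact using Python's standard-library robots.txt parser. The file content below is a made-up example combining the directives above (example.com is a placeholder domain):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt mirroring the directives in the reference above.
# Note: Python's parser applies rules in file order, so the more specific
# Allow line is listed before the broader Disallow it overrides.
robots_txt = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "https://example.com/private/secret.html"))       # False
print(parser.can_fetch("*", "https://example.com/private/public-page.html"))  # True
print(parser.can_fetch("*", "https://example.com/index.html"))                # True
print(parser.crawl_delay("*"))                                                # 10
```

Google resolves Allow/Disallow conflicts by longest matching path rather than file order, so the same file behaves consistently there too.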
Common Mistakes to Avoid
- Blocking CSS/JS files that Google needs to render your pages
- Using robots.txt to hide sensitive content (use authentication or noindex instead)
- Forgetting to add a Sitemap directive so crawlers can find your sitemap
- Placing robots.txt in a subdirectory instead of the root domain
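The first mistake is easy to catch automatically. Here is a minimal lint sketch, assuming hypothetical asset paths under /static/, that flags any render-critical files a robots.txt would block:

```python
from urllib.robotparser import RobotFileParser

# Assumed locations of render-critical assets; adjust for your site.
ASSET_PATHS = ["/static/app.css", "/static/app.js"]

def blocked_assets(robots_txt, site="https://example.com"):
    """Return the asset paths that this robots.txt blocks for Googlebot."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [p for p in ASSET_PATHS if not parser.can_fetch("Googlebot", site + p)]

bad = "User-agent: *\nDisallow: /static/\n"
good = "User-agent: *\nDisallow: /admin/\n"
print(blocked_assets(bad))   # both asset paths flagged
print(blocked_assets(good))  # []
```

Running a check like this before uploading helps ensure Google can still fetch the CSS and JavaScript it needs to render your pages.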