[Interactive builder: rule groups, sitemap entries, and a live preview. A Known Bots panel quick-adds block rules for common crawlers: search engines (Googlebot, Bingbot, Yandex, Baidu, DuckDuckGo), AI crawlers (GPTBot, ChatGPT User, Google AI, Anthropic, ClaudeBot, Common Crawl, Perplexity, ByteDance), social media (Twitter, Facebook, LinkedIn), and SEO tools (Ahrefs, Semrush, Majestic, Moz).]
Robots.txt Generator
Create valid robots.txt files using a visual builder. Configure user-agent rules, allow/disallow paths, crawl-delay settings, and sitemap references — with a live preview that updates as you type.
Features
- Visual rule builder with add/remove support
- Allow and Disallow path directives per user-agent
- Crawl-delay configuration with validation
- Sitemap URL references with absolute URL validation
- Live preview with one-click copy to clipboard
- Follows RFC 9309 (Robots Exclusion Protocol); see the output sketch below
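The output structure can be pictured with a small TypeScript sketch. The `RuleGroup` type and `renderRobotsTxt` function are illustrative names, not the tool's actual internals; they just show how rule groups, optional Crawl-delay lines, and trailing Sitemap directives combine into an RFC 9309-style file.

```typescript
// A minimal sketch of how rule groups could serialize to robots.txt,
// per RFC 9309: one block per user-agent group, Sitemap lines at the end.
// RuleGroup and renderRobotsTxt are illustrative, not the tool's real API.

interface RuleGroup {
  userAgents: string[];   // e.g. ["*"] or ["GPTBot", "CCBot"]
  allow: string[];        // paths emitted as Allow: directives
  disallow: string[];     // paths emitted as Disallow: directives
  crawlDelay?: number;    // non-standard; emitted only when set
}

function renderRobotsTxt(groups: RuleGroup[], sitemaps: string[]): string {
  const blocks = groups.map((g) => {
    const lines = g.userAgents.map((ua) => `User-agent: ${ua}`);
    for (const p of g.allow) lines.push(`Allow: ${p}`);
    for (const p of g.disallow) lines.push(`Disallow: ${p}`);
    if (g.crawlDelay !== undefined) lines.push(`Crawl-delay: ${g.crawlDelay}`);
    return lines.join("\n");
  });
  // Sitemap directives sit outside all rule groups, at the end of the file.
  const sitemapBlock = sitemaps.map((url) => `Sitemap: ${url}`).join("\n");
  return [...blocks, sitemapBlock].filter(Boolean).join("\n\n") + "\n";
}
```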
How It Works
- Configure one or more user-agent rule groups
- Add Allow or Disallow path directives to each group
- Optionally set a Crawl-delay value (note: not all crawlers support this)
- Add your sitemap URLs
- Copy the generated output to your clipboard (worked example below)
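Walking those steps through the hypothetical `renderRobotsTxt` sketch above, one group that blocks GPTBot site-wide plus a single sitemap reference would produce:

```typescript
const output = renderRobotsTxt(
  [{ userAgents: ["GPTBot"], allow: [], disallow: ["/"], crawlDelay: 10 }],
  ["https://example.com/sitemap.xml"], // assumed example URL
);
console.log(output);
// User-agent: GPTBot
// Disallow: /
// Crawl-delay: 10
//
// Sitemap: https://example.com/sitemap.xml
```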
Notes
- Wildcard patterns (* and $ in paths) originated as Google extensions and may not be supported by all crawlers
- Crawl-delay is not part of the official RFC 9309 standard and is ignored by some major crawlers
- Sitemap directives appear at the end of the file, outside all rule groups, per the specification
- Non-ASCII characters in paths are automatically percent-encoded in the output (sketched below)
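The percent-encoding note can be illustrated with a small helper. This is a sketch of one reasonable approach (encode any character outside the ASCII range, leave everything else untouched), not necessarily the exact rule the tool applies:

```typescript
// Percent-encode non-ASCII characters in a path, leaving ASCII as-is
// (which preserves the / separator and the * and $ pattern characters).
// Illustrative only; the tool's exact encoding rules may differ.
function encodePath(path: string): string {
  return Array.from(path) // iterate by code point, not UTF-16 unit
    .map((ch) => (ch.codePointAt(0)! < 0x80 ? ch : encodeURIComponent(ch)))
    .join("");
}

console.log(encodePath("/café/menü")); // "/caf%C3%A9/men%C3%BC"
```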