Robots.txt Generator

Create valid robots.txt files using a visual builder. Configure user-agent rules, allow/disallow paths, crawl-delay settings, and sitemap references, with a live preview that updates as you type.

Known Bots

Click a bot name to add a block rule for that crawler:

Search Engines: Googlebot, Bingbot, Yandex, Baidu, DuckDuckGo

AI Crawlers: GPTBot, ChatGPT User, Google AI, Anthropic, ClaudeBot, Common Crawl, Perplexity, ByteDance

Social Media: Twitter, Facebook, LinkedIn

SEO Tools: Ahrefs, Semrush, Majestic, Moz
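Each preset inserts a block group for the matching crawler. Selecting GPTBot, for example, would add a group like this (a sketch of the generated output; Disallow: / blocks the entire site):

User-agent: GPTBot
Disallow: /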

Preview

The Rules and Sitemaps panels feed a live preview of the generated file. With no rules configured, the default output permits all crawlers (an empty Disallow value disallows nothing):

User-agent: *
Disallow:


Features

  • Visual rule builder with add/remove support
  • Allow and Disallow path directives per user-agent (see the example after this list)
  • Crawl-delay configuration with validation
  • Sitemap URL references with absolute URL validation
  • Live preview with one-click copy to clipboard
  • Follows RFC 9309 (Robots Exclusion Protocol)
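As a sketch of per-agent directives (the paths here are hypothetical), the group below blocks everything except one directory; under RFC 9309 the longest matching rule takes precedence, so /public/ stays crawlable:

User-agent: *
Disallow: /
Allow: /public/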

How It Works

  1. Configure one or more user-agent rule groups
  2. Add Allow or Disallow path directives to each group
  3. Optionally set a Crawl-delay value (note: not all crawlers support this)
  4. Add your sitemap URLs
  5. Copy the generated output to your clipboard
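Following those steps might produce a file like the one below (all agents, paths, and URLs are illustrative):

User-agent: *
Disallow: /admin/
Crawl-delay: 10

User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml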

Notes

  • Wildcard patterns (* and $ in paths) originated as Google extensions; RFC 9309 now defines them, but not every crawler honors them (see the first example below)
  • Crawl-delay is not part of RFC 9309 and is ignored by some major crawlers, including Googlebot
  • Sitemap directives appear at the end of the file, outside all rule groups, per the specification
  • Non-ASCII characters in paths are automatically percent-encoded in the output (see the second example below)
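For the wildcard note: a rule like the one below (an illustrative pattern) blocks every URL ending in .pdf for Googlebot, where * matches any sequence of characters and $ anchors the match to the end of the path:

User-agent: Googlebot
Disallow: /*.pdf$

For the percent-encoding note: a path entered as /café/ would be written to the output with its UTF-8 bytes percent-encoded (é is 0xC3 0xA9):

Disallow: /caf%C3%A9/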