Robots.txt Generator

Create robots.txt files for search engine crawlers

The generator lets you choose which user agents the rules apply to, add a sitemap URL, and configure crawl settings such as the time delay between requests (0 = no delay), entered in requests/seconds format (e.g. 1/1s).

What is Robots.txt?

Robots.txt is a text file that tells search engine crawlers which pages or files they can or cannot request from your site.

It's used to manage crawl traffic and avoid overloading your site with requests. Note that it is not a reliable way to keep pages out of search results: a blocked page can still be indexed if other sites link to it, so use noindex or authentication for genuinely sensitive content.

Common Directives (see the sample file after this list):

  • User-agent - Specifies which crawler the following rules apply to (* matches all crawlers)
  • Disallow - Blocks crawling of the listed path
  • Allow - Explicitly permits a path, e.g. to carve out an exception inside a disallowed directory
  • Crawl-delay - Sets a delay in seconds between requests (not honored by every crawler)
  • Sitemap - Points to the full URL of your XML sitemap
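
For illustration, a minimal robots.txt combining these directives might look like the following; the paths and sitemap URL are placeholders, not recommendations:

    # Rules for all crawlers
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/
    Allow: /admin/public/
    Crawl-delay: 10

    # Absolute URL of the sitemap
    Sitemap: https://example.com/sitemap.xml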

Best Practices

Place at Root

The file must live at the root of the domain (e.g. https://example.com/robots.txt); crawlers will not look for it in a subdirectory.

Use Lowercase

The file name itself must be lowercase (robots.txt). Keep in mind that path rules are case-sensitive, so Disallow: /private/ will not match /Private/.

Test Thoroughly

Validate your rules with Google Search Console's robots.txt tester (or another validator) before deploying changes.
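
You can also sanity-check rules locally with Python's standard-library urllib.robotparser. A minimal sketch, with made-up rules and paths used purely for illustration:

    from urllib.robotparser import RobotFileParser

    # Hypothetical rules, parsed from a string instead of being fetched over HTTP
    rules = """
    User-agent: *
    Disallow: /private/
    Crawl-delay: 5
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    print(rp.can_fetch("*", "/private/page.html"))  # False - path is disallowed
    print(rp.can_fetch("*", "/public/page.html"))   # True - not blocked
    print(rp.crawl_delay("*"))                      # 5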

Be Specific

Only block what actually needs to be blocked. An overly broad rule such as Disallow: / stops crawling of the entire site; prefer narrow, specific paths (see the sketch below).
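
A quick contrast, using made-up paths:

    # Too broad: blocks crawling of the whole site
    User-agent: *
    Disallow: /

    # Better: block only the sections that should not be crawled
    User-agent: *
    Disallow: /cart/
    Disallow: /internal-search/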