Robots.txt Generator

Create robots.txt rules to control search engine crawling.

What is robots.txt?

The robots.txt file tells search engine crawlers which URLs they can access on your site. It's placed in the root directory (e.g., example.com/robots.txt). While most well-behaved crawlers respect these rules, robots.txt is advisory — it doesn't enforce access control. For sensitive content, use proper authentication instead.
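For illustration, a minimal robots.txt of the kind this generator produces might look like the following (the paths and sitemap URL are hypothetical examples, not defaults):

```
# Apply these rules to all crawlers
User-agent: *
# Block crawling of an admin area (advisory only — not access control)
Disallow: /admin/
# Allow everything else
Allow: /

# Point crawlers to the sitemap (absolute URL required)
Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` group applies to the named crawler (`*` matches all), and `Disallow`/`Allow` rules are matched against URL paths relative to the site root.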