Robots.txt Generator

Build a robots.txt with common rules, the right syntax, and your sitemap line.

Standard syntax · Sitemap line · No data leaves your device

How it works

Pick the user-agents you want to address, the paths you want to disallow, and your sitemap URL. The tool assembles a syntactically correct robots.txt file you can copy and place at the root of your domain (it must be served at /robots.txt).
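A generated file for a typical setup might look like the sketch below; the disallowed paths and sitemap URL are placeholders, not defaults the tool produces:

```
# Hypothetical example output; substitute your own paths and sitemap URL
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://example.com/sitemap.xml
```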

Methodology

Robots.txt is a politeness convention: well-behaved crawlers respect it, while malicious ones ignore it. It is not a security control, so anything sensitive needs authentication, not a Disallow rule. Always include a Sitemap line so crawlers can find your URL list quickly.
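Before deploying a generated file, you can sanity-check its rules locally with Python's standard-library parser. The ruleset, paths, and domain below are hypothetical:

```python
# Sketch: verify a robots.txt ruleset with Python's stdlib parser
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())  # parse in-memory text instead of fetching

# Disallowed path is blocked; everything else is allowed
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
```

This is the same parser many crawlers built on Python use, so it is a reasonable proxy for how a well-behaved bot will read your file.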

Frequently asked questions

Should I block AI training crawlers?

It's a policy choice, not a technical requirement. To block known AI crawlers, add a User-agent group for each bot (GPTBot, ClaudeBot, Google-Extended, and others) with Disallow: /.
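Using the bot names mentioned above, the groups would look like this; vendors occasionally add or rename user-agent tokens, so check their crawler documentation for the current list:

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```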

What if I need to block a page from indexing?

Robots.txt blocks crawling, not indexing: a page disallowed in robots.txt can still appear in search results if other sites link to it. To prevent a page from appearing in search results, allow it to be crawled and use a noindex meta tag in the page itself (or an X-Robots-Tag response header).
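A minimal example of the tag, placed in the page's head:

```
<meta name="robots" content="noindex">
```

Crawlers must be able to fetch the page to see this tag, which is why combining it with a Disallow rule for the same URL defeats the purpose.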
