Robots.txt Generator

SEO-friendly robots.txt • Presets • Custom rules • Sitemaps

Standard / SEO: allows crawling of public content while blocking common admin and internal-search pages. A good starting point for most sites.
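For illustration, a preset in this spirit might emit something like the following (the paths here are examples, not the tool's exact output):

```text
User-agent: *
Disallow: /admin/
Disallow: /search
Allow: /
```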

Used to build Sitemap URLs and the Host directive. Include a trailing slash.

Defaults to * (all crawlers). You can add sections for specific bots later in the Custom rules box.

Not all search engines support Crawl-delay (Google, for example, ignores it). Use it with care and only if needed.
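If you do need it, a Crawl-delay line sits inside a user-agent group; the bot name and delay below are only examples:

```text
User-agent: Bingbot
Crawl-delay: 10
```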

Common rules
Sitemaps & host

Leave empty to use a single default sitemap at {site-root}/sitemap.xml (if enabled below).
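Each sitemap you list becomes its own Sitemap line in the output; the URL below is illustrative:

```text
Sitemap: https://example.com/sitemap.xml
```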

Some search engines (mainly Yandex, which has since deprecated it) support a Host directive. It is usually the bare domain.
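Where supported, the directive names the preferred domain; example.com below is a placeholder:

```text
Host: example.com
```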

Custom rules

Preview & Code

Review the generated robots.txt, then copy it to your server. It must live at /robots.txt in the root of your site, since crawlers only look for it there. This tool does not write files automatically.
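Before deploying, you can sanity-check the generated rules with Python's standard-library robots.txt parser; the rules and URLs below are illustrative, not the tool's output:

```python
from urllib.robotparser import RobotFileParser

# Paste the generated robots.txt here (these rules are just an example).
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a given crawler may fetch specific URLs.
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
```

This catches typos like a missing leading slash before the file goes live.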

robots.txt preview

Saved Configurations

Saved locally in this browser only; nothing is uploaded. Handy for keeping separate staging and production templates.

No saved configurations yet. Generate a robots.txt and click Save to keep it here.