robots.txt Generator

Complete robots.txt generator with AI bot control

AI Bot Control
GPTBot (ChatGPT)
OpenAI's crawler for ChatGPT training
ClaudeBot (Claude)
Anthropic's crawler for Claude training
PerplexityBot (Perplexity)
Perplexity's search crawler
Google-Extended (Gemini)
Google's AI training crawler
Googlebot (Standard)
Google's standard crawler for search
Bingbot
Microsoft Bing crawler
Blocked Paths (Disallow)
Allowed Paths (Allow)
Advanced Settings
Optional: delay between requests
Preferred domain (for Yandex)

More than a form — full control over crawlers and AI bots

Most robots.txt generators offer only a simple text field or a rigid template. The result is often an incomplete file that omits important crawler rules or gives AI bots unrestricted access to content that should not be used for training.

This generator covers all relevant directives — from standard search engines to the latest AI crawlers — and produces a ready-to-deploy robots.txt without manual post-editing.

What the generator covers:

  • AI bot control — GPTBot (ChatGPT), ClaudeBot (Claude), PerplexityBot and Google-Extended (Gemini) are each individually toggleable. Blocked bots automatically receive a Disallow: / entry.
  • Disallow & allow paths — Any number of paths can be blocked or explicitly allowed. Typical examples like /admin/ or /private/ are pre-filled.
  • Crawl delay — Optional delay between crawler requests. Useful for websites with limited server capacity or to reduce aggressive crawling.
  • Sitemap declaration — One or more sitemap URLs can be entered directly. Google and Bing read this information automatically when crawling the robots.txt.
  • Standard & search engine crawlers — Googlebot and Bingbot are also individually controllable — for special cases like temporary de-indexing or relaunch scenarios.
  • Host directive — Optional specification of the preferred domain for Yandex and other search engines that support this directive.
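As an illustration (the paths and sitemap URL are placeholders), a generated file with all four AI bots blocked and a few standard rules might look like this:

```
# AI crawlers — blocked from the entire site
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# All other crawlers
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /admin/public/

Sitemap: https://example.com/sitemap.xml
```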

The result is a clean, commented robots.txt — ready to download or copy directly to your server. Free, no account, instantly usable.

Common questions about the robots.txt Generator

What is a robots.txt and why do I need it?

A robots.txt is a text file in the root directory of a website (accessible at domain.com/robots.txt). It tells search engine crawlers which areas may be crawled and which may not. Without a robots.txt, all crawlers have full access to the entire website.
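For illustration, a minimal robots.txt that grants every crawler full access (equivalent to having no file at all) consists of just two lines — an empty Disallow permits everything:

```
User-agent: *
Disallow:
```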

What makes this generator different from others?

This generator offers complete control over AI bots like GPTBot, ClaudeBot, PerplexityBot and Google-Extended — each individually toggleable. Additionally, disallow and allow paths, crawl delay and sitemap URLs can be configured.

Can AI bots like ChatGPT be blocked?

Yes. Using the toggle switches, GPTBot (ChatGPT), ClaudeBot (Claude), PerplexityBot and Google-Extended (Gemini) can be specifically blocked. Important: AI bots respect robots.txt voluntarily — it does not replace legal protection.
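To illustrate how a compliant crawler interprets such a file, Python's standard `urllib.robotparser` can evaluate the rules locally. The rules and URLs below are placeholders, sketching a file in which only GPTBot is fully blocked:

```python
from urllib import robotparser

# Hypothetical generator output: GPTBot blocked site-wide,
# all other crawlers allowed except for /admin/.
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

rp.can_fetch("GPTBot", "https://example.com/article")     # False
rp.can_fetch("Googlebot", "https://example.com/article")  # True
rp.can_fetch("Googlebot", "https://example.com/admin/")   # False
```

Note that this only models well-behaved crawlers: a bot that ignores robots.txt entirely will not be stopped by any of these rules.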

How do I deploy the generated robots.txt?

The generated robots.txt must be uploaded to the root directory of the website — accessible at https://domain.com/robots.txt. Google reads this file automatically at the next crawl. No manual submission is required.

What is crawl delay and when should I set it?

Crawl-delay specifies how many seconds a crawler should wait between requests. Useful for websites with limited server resources. Important: Googlebot ignores this directive — crawl budget for Google is managed via Google Search Console.
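For example, asking Bingbot to wait ten seconds between requests takes just two lines (the same directive addressed to Googlebot would simply be ignored):

```
User-agent: Bingbot
Crawl-delay: 10
```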

Related Tools

robots.txt Validator
Check robots.txt for errors
Sitemap Generator
Generate XML sitemaps automatically
Schema.org Generator
Create structured data as JSON-LD