
ClaudeBot robots.txt – The 2025 Playbook


Kaden Ewald
Founder & SEO Strategist
January 24, 2025 · 14 min read

TL;DR

Add explicit robots.txt Allow rules for ClaudeBot, Claude-SearchBot, and Claude-User to unlock Claude.ai citations and shopping cards. Anthropic's bots respect the non-standard Crawl-delay, so you can throttle bursts without blocking visibility. Follow the quick-start snippet below, test with cURL, and enrich pages with Product/FAQ schema.

Why ClaudeBot Matters to Revenue

Anthropic's conversational assistant Claude leapt from 18% to 29% enterprise market share in 2025. Its crawler, ClaudeBot, now sits among the five busiest AI spiders—logging million-hit days on knowledge bases and e-commerce catalogs. If your products or guides aren't fetchable, a rival's will fill the answer pane.

2025 Market & Traffic Stats

| Metric | Value (May 2024 → May 2025) | Why It Matters |
|---|---|---|
| AI & search crawler traffic share | +18% YoY | Non-human visits now rival human traffic |
| ClaudeBot support for Crawl-delay | Yes | Rare among AI bots; lets you pace requests |
| Sites explicitly allowing ClaudeBot | 43% of top-10K domains | Early adopters own Claude answer boxes |

Meet Anthropic's Crawlers

ClaudeBot vs Claude-SearchBot vs Claude-User

| Purpose | User-agent | Behaviour | Robots.txt Support |
|---|---|---|---|
| Model training | ClaudeBot | Wide, steady crawl | Allow/Disallow, Crawl-delay |
| Search index | Claude-SearchBot | Spike bursts tied to queries | Allow/Disallow |
| On-demand fetch | Claude-User | Low-volume, real-time pulls | Allow/Disallow |

Key point: ClaudeBot is the only major AI crawler that honours the Crawl-delay directive.

How to Spot Them in Logs

```bash
grep -E "ClaudeBot|Claude-SearchBot|Claude-User" access.log
```

Look for bursts exceeding 10 req/s as a signal you may need throttling.
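To quantify those bursts, a short awk helper can flag any second in which Anthropic crawlers exceeded 10 requests. This is a sketch that assumes the NCSA combined log format, where the fourth whitespace-separated field carries the per-second timestamp:

```shell
# burst_seconds: print each second in which Anthropic crawlers made
# more than 10 requests. Assumes NCSA combined log format, where the
# 4th field is "[day/month/year:HH:MM:SS".
burst_seconds() {
  awk '/ClaudeBot|Claude-SearchBot|Claude-User/ { hits[$4]++ }
       END { for (t in hits) if (hits[t] > 10) print t, hits[t] " req/s" }' "$1"
}
```

Run it as `burst_seconds access.log`; any line of output is a candidate for the throttling techniques below.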

Robots.txt Configuration

Quick-Start Allow Block

```
# — Anthropic allow-list (training + search) —
User-agent: ClaudeBot
Allow: /

User-agent: Claude-SearchBot
Allow: /

User-agent: Claude-User
Allow: /
```

Crawlers that follow RFC 9309 match the most specific User-agent group regardless of position, but placing this block above any wildcard User-agent: * rules keeps intent obvious and protects against simpler parsers that stop at the first match.
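Put together with existing rules, a complete file might look like this (the `/admin/` path is only a placeholder for whatever your wildcard group already disallows):

```
# — Anthropic allow-list (training + search) —
User-agent: ClaudeBot
Allow: /

User-agent: Claude-SearchBot
Allow: /

User-agent: Claude-User
Allow: /

# Default rules for everyone else
User-agent: *
Disallow: /admin/
```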

Crawl-delay & Burst Throttling

ClaudeBot honours the non-standard Crawl-delay directive. Start conservative:

```
User-agent: ClaudeBot
Crawl-delay: 2
```

If spikes persist, return HTTP 429 with a Retry-After header or apply CDN rules.
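At the web-server layer, the same policy can be expressed with nginx's rate-limiting module. This is a sketch, not Anthropic guidance: the zone name `claude` and the 5 r/s budget are illustrative, and an empty map value means the request is simply not rate-limited.

```
# Rate-limit only Anthropic crawlers; everyone else gets an empty key
# and is therefore exempt from this zone.
map $http_user_agent $anthropic_key {
    default                          "";
    ~*(ClaudeBot|Claude-SearchBot)   $binary_remote_addr;
}

limit_req_zone $anthropic_key zone=claude:10m rate=5r/s;

server {
    location / {
        limit_req zone=claude burst=10 nodelay;
        limit_req_status 429;              # surface throttling as HTTP 429
        add_header Retry-After 30 always;  # simple sketch: header on all responses
    }
}
```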

Troubleshooting Flowchart

  1. Add rules to robots.txt
  2. Test: curl -A "ClaudeBot" https://yoursite.com/robots.txt — check 200 vs 403
  3. Observe logs for Claude-SearchBot within 24 hours
  4. Burst >10 req/s? Add Crawl-delay or 429 gating
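Steps 2 and 3 can be scripted: fetch robots.txt with curl, then sanity-check that the ClaudeBot group actually allows crawling. The parser below is a minimal sketch that only recognises a literal `Allow: /` inside a `User-agent: ClaudeBot` group; full robots.txt matching per RFC 9309 is more involved.

```shell
# claudebot_allowed: print "allowed" if the given robots.txt file has a
# "User-agent: ClaudeBot" group containing "Allow: /".
# Minimal sketch; does not implement full RFC 9309 matching.
claudebot_allowed() {
  awk '/^User-agent: ClaudeBot[[:space:]]*$/ { ua=1; next }
       /^User-agent:/                        { ua=0 }
       ua && /^Allow: \/[[:space:]]*$/       { print "allowed"; exit }' "$1"
}

# Typical use after fetching:
#   curl -sA "ClaudeBot" https://yoursite.com/robots.txt -o /tmp/robots.txt
#   claudebot_allowed /tmp/robots.txt
```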

Schema & Content Optimisation

Product / Article JSON-LD Essentials

Embed Product, Offer, and AggregateRating within 32 KB to avoid truncation. Claude uses this markup to build price-and-rating cards in chat snippets:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Eco-Blend Running Socks",
  "sku": "ECO-SOCK-001",
  "offers": {
    "@type": "Offer",
    "price": "18.00",
    "priceCurrency": "USD"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "127"
  }
}
```

For blog and documentation pages, add Article plus an FAQPage block. Claude-SearchBot pulls FAQ answers verbatim—prime real estate for your messaging.
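A minimal FAQPage block might look like this; the question and answer text are illustrative, not required wording:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Does ClaudeBot respect robots.txt?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes. ClaudeBot honours Allow and Disallow rules, plus the Crawl-delay directive."
      }
    }
  ]
}
```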

Cross-Industry Mini Cases

| Sector | Quick Win | Result |
|---|---|---|
| D2C Retail | Serve variant URLs with stock schema | 8% uplift in Claude-sourced sessions (GWA client, Q2 2025) |
| B2B SaaS | Allow public API docs, throttle bursts | 38% less bot bandwidth, citations maintained |
| Healthcare | Peer-review citations + HIPAA notice | Complies with June 2025 core update trust focus |

Risk & Compliance

Bandwidth — Use Crawl-delay plus 429 gating to prevent hammering.

IP Ranges — Not published; robots.txt is the main control mechanism.

Licensing — State AI-training permissions clearly in your Terms of Service.

CAPTCHA / Auth — Claude bots do not bypass CAPTCHAs or login walls.

Implementation Checklist

  1. Back up your current robots.txt
  2. Insert allow block for all three Anthropic agents
  3. Run curl -A "ClaudeBot" tests
  4. Monitor logs for new hits
  5. Create GA4 filter for utm_source=claude.ai
  6. Audit schema coverage across key pages
  7. Review server load after 14 days
  8. Document steps in SOP
  9. Schedule quarterly crawl audit
  10. Book an expert SEO Audit to benchmark AI readiness

FAQs

What is ClaudeBot?

ClaudeBot is Anthropic's primary web crawler, gathering public pages for model training and Claude.ai answer generation.

Is ClaudeBot safe for SEO?

Yes. It respects robots.txt—including Crawl-delay—so Google rankings remain unaffected.

How do I block ClaudeBot?

Add User-agent: ClaudeBot plus Disallow: /. Leave Claude-SearchBot allowed if you still want visibility in Claude answers.
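For example, to opt out of training crawls while keeping search visibility:

```
User-agent: ClaudeBot
Disallow: /

User-agent: Claude-SearchBot
Allow: /
```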

Does ClaudeBot ignore crawl-delay?

No. Anthropic confirms compliance with this directive, making it unique among major AI crawlers.

Where can I see ClaudeBot traffic?

Filter server logs or create a GA4 custom dimension for utm_source=claude.ai.

Next Steps

Ready to translate AI crawler traffic into revenue? Start with a comprehensive SEO Audit—our team benchmarks crawl health, schema depth, and Claude eligibility in two weeks. Need content that earns citations? Explore our data-driven SEO programs that turn insights into demand.
