
BingBot robots.txt – The 2025 Playbook


Kaden Ewald
Founder & SEO Strategist
January 24, 2025 · 14 min read

TL;DR

Add explicit robots.txt Allow rules for bingbot and BingPreview to capture Microsoft Copilot referrals without jeopardising Google rankings. The bots ignore crawl-delay, so throttle bursts with HTTP 429 or CDN rate caps. Test with cURL, enrich pages with Product/FAQ schema, and track sessions from utm_source=copilot.microsoft.com.

Why BingBot & Copilot Matter to Revenue

Microsoft's Copilot now serves 14% of global AI-chat traffic and fuels the only major AI assistant bundled into Windows and Office. Business-of-Apps reports 1.1 billion Copilot queries in May 2025 alone. Your content appears only if Bing's crawlers can fetch it.

2025 Adoption & Traffic Stats

| Metric | 2024 → 2025 | Why It Matters |
| --- | --- | --- |
| Copilot queries / month | 280M → 1.1B | Huge discovery funnel |
| Bing's share of legitimate bot traffic | ~16% of good crawls | Second only to Google |
| Domains allowing BingBot | 48% of top-10K | Early movers own answer cards |

Meet Microsoft's Crawlers

BingBot vs BingPreview

| Purpose | User-agent | Behaviour | Robots.txt Support |
| --- | --- | --- | --- |
| Classic index & Copilot | bingbot | Wide, steady crawl | Allow/Disallow only |
| Snapshot / sidebar cards | BingPreview | Renders JS, bursty | Allow/Disallow only |

Key: BingBot does not honour crawl-delay. Rely on server-side throttling.

How to Spot Them in Logs

grep -Ei "bingbot|BingPreview" access.log | awk '{print $1,$12}' | head

Look for spikes above 15 req/s and add 429 gating if needed. (The awk fields assume the default combined log format; adjust $12 if your format differs.)
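To find those spikes rather than eyeball them, a short sketch can tally Microsoft-crawler hits per second and flag any second over the 15 req/s threshold (field $4 is assumed to hold the timestamp, as in the default combined log format):

```shell
# Count bingbot/BingPreview hits per second and flag bursts above 15 req/s.
# Assumes combined log format, where $4 is "[24/Jan/2025:10:15:01".
grep -Ei "bingbot|BingPreview" access.log \
  | awk '{ sec = substr($4, 2); count[sec]++ }
         END { for (s in count) if (count[s] > 15) print s, count[s] " req/s" }'
```

Any line this prints is a candidate window for 429 gating.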

Robots.txt Configuration

Quick-Start Allow Block

# — Microsoft crawlers —
User-agent: bingbot
Allow: /

User-agent: BingPreview
Allow: /

Place the block above any wildcard User-agent: * group. Strictly, compliant crawlers pick the most specific matching group regardless of order (RFC 9309), but listing named bots first keeps intent obvious and guards against lenient parsers.
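Deploys and CMS plugins have a habit of rewriting robots.txt, so it is worth asserting the block is still there. A minimal sketch (the check_bingbot_allowed helper is hypothetical) that checks a local copy for an explicit bingbot group containing a root Allow rule:

```shell
# Does robots.txt contain an explicit bingbot group that allows the root path?
check_bingbot_allowed() {
  awk 'tolower($0) ~ /^user-agent: *bingbot/ { in_group = 1; next }
       /^[Uu]ser-agent:/                     { in_group = 0 }
       in_group && /^[Aa]llow: *\/$/         { found = 1 }
       END { exit !found }' "$1"
}

check_bingbot_allowed robots.txt && echo "bingbot explicitly allowed" \
                                 || echo "no explicit bingbot Allow rule"
```

Run it in CI after every release to catch an accidental wildcard override.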

Throttling & Burst Control

Bing ignores crawl-delay. Return HTTP 429 with a Retry-After header when sessions exceed 15 req/s. Apply CDN rules capping bingbot and BingPreview rate to 10 req/s per IP.
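One way to implement that cap, sketched for Nginx using the standard ngx_http_limit_req_module (zone size, rate, and the 30-second Retry-After value are assumptions to tune against your own traffic):

```nginx
# Map Microsoft crawler user-agents to a per-IP rate-limit key
# (an empty key means the request is not rate-limited).
map $http_user_agent $msbot {
    default        "";
    ~*bingbot      $binary_remote_addr;
    ~*BingPreview  $binary_remote_addr;
}

# 10 req/s per IP for matched bots; answer overflow with 429.
limit_req_zone $msbot zone=msbots:10m rate=10r/s;
limit_req_status 429;

server {
    location / {
        limit_req zone=msbots burst=20 nodelay;
        error_page 429 = @throttled;
    }

    location @throttled {
        add_header Retry-After 30 always;  # tell well-behaved bots when to retry
        return 429;
    }
}
```

The named @throttled location keeps the Retry-After header off normal 200 responses and attaches it only to throttled requests.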

Troubleshooting Flowchart

  1. Add rules to robots.txt
  2. Test: curl -sI -A "bingbot" https://yoursite.com/robots.txt — confirm a 200 status, not 403
  3. Observe logs for BingPreview within 24 hours
  4. Burst >15 req/s? Enable 429 gating

Schema & Content Optimisation

Product / Article JSON-LD Essentials

BingPreview lifts price, rating, and imagery into Copilot shopping cards directly from Product schema:

{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Recycled-Aluminium Water Bottle",
  "sku": "ALU-WB-01",
  "offers": {
    "@type": "Offer",
    "price": "29.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "412"
  }
}

Add Article plus FAQPage schemas. Copilot quotes FAQ answers verbatim, linking back to your site.
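For reference, a minimal FAQPage sketch in the same JSON-LD style (the question and answer text are placeholders, not client copy):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Does the bottle keep drinks cold?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes. Double-wall insulation keeps drinks cold for up to 24 hours."
    }
  }]
}
```

Keep each answer self-contained, since Copilot may quote it without surrounding page context.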

Cross-Industry Mini Cases

| Sector | Quick Win | Result |
| --- | --- | --- |
| Retail / D2C | Expose variant URLs & stock schema | 6.5% lift in Copilot-referral sessions (GWA client, Q2 2025) |
| B2B SaaS | Allow docs, 429 gating at 12 req/s | 31% less bot bandwidth, citations intact |
| Healthcare | Peer-review citations | Meets E-E-A-T trust focus post-June 2025 core update |

Risk, Compliance & Core-Update Alignment

Bandwidth — Throttle with 429; no crawl-delay support.

IP Ranges — Microsoft periodically publishes IP lists in Bing Webmaster Tools.

Licensing — Add “AI-training permitted for search only” clause in your Terms if needed.

E-E-A-T — Display author credentials, inline citations, and original photos to align with Google's June 2025 core update.

Implementation Checklist

  1. Back up robots.txt
  2. Insert allow block above wildcard
  3. Test with curl -A "bingbot"
  4. Watch logs for bursts
  5. Create GA4 dimension for utm_source=copilot.microsoft.com
  6. Audit schema coverage
  7. Review server load after 14 days
  8. Update SOP
  9. Schedule quarterly crawl audit
  10. Book an expert SEO Audit for full AI readiness
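Step 5's referral tracking can be sanity-checked server-side before the GA4 dimension populates. This sketch counts landing pages hit with the Copilot UTM parameter (the log path and combined-format field layout, with $7 as the request path, are assumptions):

```shell
# Tally requests whose query string carries the Copilot UTM source,
# grouped by landing URL. Assumes combined log format ($7 = request path).
awk '$7 ~ /utm_source=copilot\.microsoft\.com/ { hits[$7]++ }
     END { for (p in hits) print hits[p], p }' access.log | sort -rn | head
```

If this shows traffic but GA4 does not, check that the UTM parameter survives any redirects.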

FAQs

What is BingBot?

BingBot is Microsoft's primary web crawler, fetching pages for Bing Search and Copilot answers.

Is BingBot safe for SEO?

Yes. Allowing BingBot does not influence Google rankings; Google crawls with its own bot, Googlebot.

How do I block BingBot?

Add User-agent: bingbot plus Disallow: / in robots.txt. Keep BingPreview allowed if you still want image snapshots.

Does BingBot honour crawl-delay?

No. Throttle with HTTP 429 or CDN limits instead.

Where can I see BingBot traffic?

Filter logs for its user-agent or track utm_source=copilot.microsoft.com in GA4.

Is Your Site Ready for AI Search?

Configuring robots.txt is just one piece of the puzzle. Our AI Search Optimization service ensures your site is structured, cited, and visible across Google AI Overviews, ChatGPT, Perplexity, and Gemini.

Get a Free AI Search Audit

Next Steps

Ready to capture Copilot visibility? Start with a comprehensive SEO Audit—our team benchmarks crawl health, schema depth, and Bing eligibility in two weeks. Need content that wins citations? Explore our data-backed SEO programs that turn insight into demand.
