TL;DR
Add explicit robots.txt Allow rules for bingbot and BingPreview to capture Microsoft Copilot referrals without jeopardising Google rankings. The bots ignore crawl-delay, so throttle bursts with HTTP 429 or CDN rate caps. Test with cURL, enrich pages with Product/FAQ schema, and track sessions from utm_source=copilot.microsoft.com.
Why BingBot & Copilot Matter to Revenue
Microsoft's Copilot now serves 14% of global AI-chat traffic and fuels the only major AI assistant bundled into Windows and Office. Business-of-Apps reports 1.1 billion Copilot queries in May 2025 alone. Your content appears only if Bing's crawlers can fetch it.
2025 Adoption & Traffic Stats
| Metric | 2024 → 2025 | Why It Matters |
|---|---|---|
| Copilot queries / month | 280M → 1.1B | Huge discovery funnel |
| Bing legitimate bot traffic share | ~16% of good crawls | Second only to Google |
| Domains allowing BingBot | 48% of top-10K | Early movers own answer cards |
Meet Microsoft's Crawlers
BingBot vs BingPreview
| Purpose | User-agent | Behaviour | Robots.txt Support |
|---|---|---|---|
| Classic index & Copilot | bingbot | Wide, steady crawl | Allow/Disallow only |
| Snapshot / sidebar cards | BingPreview | Renders JS, bursty | Allow/Disallow only |
Key: BingBot does not honour crawl-delay. Rely on server-side throttling.
How to Spot Them in Logs
```shell
grep -Ei "bingbot|BingPreview" access.log | awk '{print $1,$12}' | head
```

Look for spikes above 15 req/s and add 429 gating if needed.
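To see what that filter does in practice, here is a self-contained run against a small synthetic log. The log format is standard combined log format; the IPs, paths, and timestamps are invented for illustration only.

```shell
# Build a three-line sample access log (combined log format;
# IPs, paths, and timestamps are illustrative only).
cat <<'EOF' > /tmp/sample_access.log
40.77.167.1 - - [01/Jun/2025:10:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
13.66.139.2 - - [01/Jun/2025:10:00:01 +0000] "GET /pricing HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (Windows Phone 8.1) BingPreview/1.0b"
66.249.66.1 - - [01/Jun/2025:10:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"
EOF

# Same filter, reduced to a per-second hit count so bursts stand out:
# field 4 is the timestamp (down to the second) in combined log format.
grep -Ei "bingbot|BingPreview" /tmp/sample_access.log \
  | awk '{print $4}' | sort | uniq -c
```

Both Microsoft lines above share the same second, so the count for that timestamp is 2; on a live log, a sustained count above ~15 in a single second is the burst signal to gate with 429s.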
Robots.txt Configuration
Quick-Start Allow Block
```
# — Microsoft crawlers —
User-agent: bingbot
Allow: /

User-agent: BingPreview
Allow: /
```

Place this block above any wildcard User-agent: * group to avoid overrides.
Throttling & Burst Control
Bing ignores crawl-delay. Return HTTP 429 with a Retry-After header when sessions exceed 15 req/s. Apply CDN rules capping bingbot and BingPreview rate to 10 req/s per IP.
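One way to enforce that cap at the edge, assuming an Nginx front end (the zone name microsoft_bots and the burst/Retry-After figures are our choices, not Microsoft requirements):

```nginx
# http context: key on client IP only for Microsoft crawler user-agents.
map $http_user_agent $is_microsoft_bot {
    default          "";
    ~*bingbot        $binary_remote_addr;
    ~*BingPreview    $binary_remote_addr;
}

# 10 req/s per IP for matched bots; an empty key exempts all other clients.
limit_req_zone $is_microsoft_bot zone=microsoft_bots:10m rate=10r/s;

server {
    location / {
        limit_req zone=microsoft_bots burst=20 nodelay;
        limit_req_status 429;
        # Crude: "always" attaches Retry-After to every response, including
        # 429s. Scope this more tightly in production.
        add_header Retry-After 30 always;
        # ... normal proxy/static handling ...
    }
}
```

The same pattern maps onto most CDN rate-limiting rules: match the two user-agents, cap per-IP rate, and return 429 rather than 403 so Bing backs off instead of treating pages as blocked.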
Troubleshooting Flowchart
- Add rules to robots.txt
- Test:
  curl -A "bingbot" https://yoursite.com/robots.txt — check 200 vs 403
- Observe logs for BingPreview within 24 hours
- Burst >15 req/s? Enable 429 gating
Schema & Content Optimisation
Product / Article JSON-LD Essentials
BingPreview lifts price, rating, and imagery into Copilot shopping cards directly from Product schema:
```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Recycled-Aluminium Water Bottle",
  "sku": "ALU-WB-01",
  "offers": {
    "@type": "Offer",
    "price": "29.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "412"
  }
}
```

Add Article plus FAQPage schema as well; Copilot quotes FAQ answers verbatim, linking back to your site.
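A minimal FAQPage block to pair with the Product markup above (the question and answer text are placeholders to adapt to your own content):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Is the bottle dishwasher safe?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes, the recycled-aluminium body is dishwasher safe up to 60 °C."
      }
    }
  ]
}
```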
Cross-Industry Mini Cases
| Sector | Quick Win | Result |
|---|---|---|
| Retail / D2C | Expose variant URLs & stock schema | 6.5% lift in Copilot-referral sessions (GWA client, Q2 2025) |
| B2B SaaS | Allow docs, 429 at 12 req/s | 31% less bot bandwidth, citations intact |
| Healthcare | Peer-review citations | Meets EEAT trust focus post-June 2025 core update |
Risk, Compliance & Core-Update Alignment
Bandwidth — Throttle with 429; no crawl-delay support.
IP Ranges — Microsoft periodically publishes IP lists in Bing Webmaster Tools.
Licensing — Add “AI-training permitted for search only” clause in your Terms if needed.
EEAT — Display author creds, in-line citations, and original photos to align with Google's June 2025 core update.
Implementation Checklist
- Back up robots.txt
- Insert allow block above wildcard
- Test with curl -A "bingbot"
- Watch logs for bursts
- Create GA4 dimension for utm_source=copilot.microsoft.com
- Audit schema coverage
- Review server load after 14 days
- Update SOP
- Schedule quarterly crawl audit
- Book an expert SEO Audit for full AI readiness
FAQs
What is BingBot?
BingBot is Microsoft's primary web crawler, fetching pages for Bing Search and Copilot answers.
Is BingBot safe for SEO?
Yes. Allowing BingBot does not influence Google rankings; Google and Bing use separate crawlers.
How do I block BingBot?
Add User-agent: bingbot plus Disallow: / in robots.txt. Keep BingPreview allowed if you still want image snapshots.
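In robots.txt form, that is the earlier allow block flipped for bingbot only:

```
User-agent: bingbot
Disallow: /

User-agent: BingPreview
Allow: /
```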
Does BingBot honour crawl-delay?
No. Throttle with HTTP 429 or CDN limits instead.
Where can I see BingBot traffic?
Filter logs for its user-agent or track utm_source=copilot.microsoft.com in GA4.
Is Your Site Ready for AI Search?
Configuring robots.txt is just one piece of the puzzle. Our AI Search Optimization service ensures your site is structured, cited, and visible across Google AI Overviews, ChatGPT, Perplexity, and Gemini.
Get a Free AI Search Audit

Next Steps
Ready to capture Copilot visibility? Start with a comprehensive SEO Audit—our team benchmarks crawl health, schema depth, and Bing eligibility in two weeks. Need content that wins citations? Explore our data-backed SEO programs that turn insight into demand.