ClaudeBot

By Paul Brock · Updated on 24-04-2026
TL;DR

ClaudeBot is Anthropic's web crawler. It collects public web content primarily for training Claude models; companion agents power Claude's search and browsing features.

ClaudeBot (user-agent ClaudeBot) respects robots.txt. Anthropic runs several crawlers, each with its own user-agent string:

- ClaudeBot: collects content for model training.
- Claude-User: fetches pages on demand when a user browses the web in Claude.ai or via the API.
- Claude-SearchBot: indexes pages for Claude's web search features.

For site owners: allowing ClaudeBot increases the chance of being cited when Claude users ask questions that touch your expertise. Blocking it protects your content against training, but does not always stop on-demand fetches by Claude-User.

Example

In robots.txt:

User-agent: ClaudeBot
Allow: /blog/
Disallow: /customer-portal/

This allows training on the public blog but not on the gated customer portal. Transparent and specific.
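You can check how rules like these resolve for a given path with Python's standard-library robots.txt parser. A minimal sketch, using the example rules above (the paths are placeholders):

```python
from urllib import robotparser

# The same rules as the robots.txt example above.
rules = """\
User-agent: ClaudeBot
Allow: /blog/
Disallow: /customer-portal/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Public blog content is fetchable for ClaudeBot...
print(rp.can_fetch("ClaudeBot", "/blog/post-1"))           # True
# ...while the gated portal is not.
print(rp.can_fetch("ClaudeBot", "/customer-portal/home"))  # False
```

Running this before you deploy a robots.txt change is a quick way to confirm the rules do what you intend.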

Frequently asked questions

Claude-SearchBot vs ClaudeBot?

ClaudeBot crawls for model training. Claude-SearchBot indexes pages for Claude's real-time web search feature. Both respect robots.txt and use separate user-agent strings, so you can set different rules for each.
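Because the user-agents are separate, a robots.txt can, for example, opt out of training while staying visible in Claude's search. A sketch of such a policy:

```text
# Block the training crawler site-wide
User-agent: ClaudeBot
Disallow: /

# Allow the search indexer everywhere
User-agent: Claude-SearchBot
Allow: /
```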

How do I get cited in Claude?

Ensure ClaudeBot can access your site, publish on topics where you have genuine expertise, structure content clearly (H2/H3 headings, short paragraphs), provide an /llms.txt file, and strengthen your E-E-A-T signals.
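llms.txt is a proposed convention: a markdown file at the site root that gives AI systems a curated overview of your best content. A minimal sketch, where every name and URL is a placeholder:

```text
# Example Co

> Example Co publishes guides on Bitcoin, AI and fintech marketing.

## Guides

- [How AI crawlers work](https://example.com/blog/ai-crawlers): overview of ClaudeBot and related agents
- [Getting cited in AI answers](https://example.com/blog/ai-citations): practical checklist for site owners
```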

Need help with SEO or GEO?

We help Bitcoin, AI and fintech companies get found in Google and in AI search engines.

Book a call