llms.txt
llms.txt is a file, placed by convention in a website's root, that gives AI engines a structured overview of which content is worth reading.
llms.txt is a proposal for a new web standard file, similar to robots.txt or sitemap.xml, but aimed specifically at AI engines. The file, placed at /llms.txt in the site root, contains a machine-readable summary of the site: what the company does, which pages matter most, and which services are offered, with direct links and short descriptions. The proposal was initiated by Jeremy Howard in 2024 and has been widely adopted since (Anthropic, Zapier, Vercel and thousands of other sites). For GEO it is valuable because it gives AI engines a 'fast index' of the site.
Example
A simplified llms.txt for a webshop might start with the heading '# Example B.V.' followed by the blockquote summary '> Online sale of sustainable sportswear.', and then a section such as '## Products' containing bullet-point links. An AI engine reads this in seconds and instantly has context that would otherwise take dozens of page loads.
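Written out in full, such a file might look like the sketch below. It follows the conventional layout (H1 title, blockquote summary, H2 link sections); the URLs and section contents are invented for illustration.

```markdown
# Example B.V.

> Online sale of sustainable sportswear.

## Products

- [Running](https://example.com/products/running): Shirts and shorts made from recycled polyester
- [Yoga](https://example.com/products/yoga): Mats and apparel in organic cotton

## Company

- [About us](https://example.com/about): Mission, team and sustainability certifications
```

Each bullet pairs a direct link with a one-line description, so an engine can decide what to fetch without crawling the whole site.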
Frequently asked questions
Do I need to have llms.txt?
llms.txt is not a ranking factor or a formal standard, and no AI engine mandates it. But it is low-effort to add and adoption is growing; for brands in knowledge-intensive sectors it is strong GEO hygiene.
Are llms.txt files actually used by AI engines?
It varies, and public evidence is limited. Anthropic's Claude is reported to use llms.txt in its web-search flow. OpenAI and Google have not publicly committed to it, though some observers believe it informs their internal pipelines. Third-party engines such as Perplexity and You.com also appear to respect it.
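Whatever individual engines do today, the format is trivial for any crawler to consume. A minimal Python sketch of such a parser, assuming the conventional layout (one H1 title, an optional blockquote summary, then H2 sections of bullet links; the function and field names are ours, not part of any spec):

```python
import re

# Matches a conventional llms.txt bullet: "- [Title](url): optional description"
LINK = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?$")

def parse_llms_txt(text):
    """Parse an llms.txt body into (site_name, summary, {section: [links]})."""
    name, summary, sections = None, None, {}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("# ") and name is None:
            name = line[2:].strip()            # H1 title: the site or company name
        elif line.startswith("> ") and summary is None:
            summary = line[2:].strip()         # blockquote: one-line summary
        elif line.startswith("## "):
            current = line[3:].strip()         # H2: start a new link section
            sections[current] = []
        elif current and (m := LINK.match(line)):
            sections[current].append(
                {"title": m["title"], "url": m["url"], "desc": m["desc"] or ""}
            )
    return name, summary, sections

sample = (
    "# Example B.V.\n"
    "> Online sale of sustainable sportswear.\n"
    "## Products\n"
    "- [Running](https://example.com/running): Shirts and shorts\n"
)
name, summary, sections = parse_llms_txt(sample)
```

This is the whole appeal for GEO: a few dozen lines of parsing replace dozens of page loads of crawling.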
Further reading
- Our service: GEO