# llms.txt — Tell AI Systems What Your Website Is About
llms.txt is a plain-text file at /llms.txt that gives AI systems a structured, LLM-friendly summary of your website — what it does, who it's for, and where the important pages are.
## What is llms.txt?
llms.txt is an emerging convention, inspired by robots.txt, that places a human- and machine-readable Markdown file at the root of your domain. Unlike robots.txt, which controls crawl access, llms.txt describes your site's purpose, structure, and key content in a format that large language models can parse efficiently.
The file uses standard Markdown: a top-level heading with your site name, a blockquote with a brief description, and sections listing important pages with short annotations. AI systems that encounter llms.txt can use it to understand your site before — or instead of — crawling individual pages.
## Why llms.txt Matters
AI systems like Claude look for llms.txt to understand a website's purpose, structure, and key content. When an AI assistant is asked about your company, product, or topic, it may consult your llms.txt as a fast, authoritative source of context.
Without llms.txt, AI systems must infer your site's purpose from individual pages, metadata, and links — a process that is less reliable and may lead to incomplete or inaccurate representations. A well-written llms.txt:
- Helps AI systems understand your site's purpose at a glance
- Directs AI crawlers to your most important pages
- Reduces the chance of misrepresentation in AI-generated answers
- Provides a stable reference that persists even as individual page content changes
## Example llms.txt
Place this file at https://yourdomain.com/llms.txt and adapt it for your site. The format is plain Markdown — no special syntax required.
```markdown
# Example Company

> A brief description of what your company does and what this website offers.

## Main Pages

- [Home](https://example.com/): Landing page with product overview
- [About](https://example.com/about): Company background and team
- [Blog](https://example.com/blog): Latest articles and updates
- [Docs](https://example.com/docs): Technical documentation

## Key Information

- Founded: 2020
- Industry: SaaS / Developer Tools
- Contact: [email protected]
```
Keep the description in the blockquote concise and factual — one to three sentences that an AI could use verbatim to describe your site. The page list should focus on your most important, publicly accessible pages.
## Where to Place llms.txt
llms.txt must be served at the root of your domain: https://yourdomain.com/llms.txt. This mirrors the convention established by robots.txt. A file placed in a subdirectory (e.g. /blog/llms.txt) will not be discovered by AI crawlers following the standard.
The file should be served with a 200 HTTP status and a Content-Type of text/plain or text/markdown. It should be publicly accessible without authentication.
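These serving requirements are easy to verify programmatically. The sketch below is a minimal checker using Python's standard library; the function name, accepted content types, and timeout are assumptions made for illustration.

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

# Content types the convention described above considers acceptable.
ACCEPTED_TYPES = ("text/plain", "text/markdown")

def check_llms_txt(url):
    """Fetch url and report whether the response meets the serving
    requirements: HTTP 200 and a text/plain or text/markdown body."""
    try:
        with urlopen(url, timeout=10) as resp:
            status = resp.status
            ctype = resp.headers.get_content_type()  # strips any charset param
    except (HTTPError, URLError):
        return {"ok": False, "status": None, "content_type": None}
    return {
        "ok": status == 200 and ctype in ACCEPTED_TYPES,
        "status": status,
        "content_type": ctype,
    }
```

Point it at `https://yourdomain.com/llms.txt` after deploying to confirm the file is publicly reachable with the right headers.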
## Common Mistakes
- Not using Markdown format: llms.txt relies on Markdown structure (headings, lists, blockquotes) for AI systems to parse it. Plain prose without formatting is less useful.
- Too much marketing language: Avoid promotional fluff in the description. AI systems perform better when the description is factual and specific rather than filled with superlatives.
- Missing key pages: Include your most important pages — the ones you want AI systems to know about and link to. Omitting your docs, pricing, or contact page means AI systems may not surface them.
- Stale content: Update your llms.txt when major pages are added, renamed, or removed. An outdated llms.txt with dead links undermines trust.
- Wrong file location: Just like robots.txt, the file must be at the domain root — not in a subdirectory.
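The stale-content problem above can be caught automatically. The sketch below is one possible dead-link checker, assuming Python's standard library; the `fetch` parameter is an illustrative addition so the logic can be exercised without making real network requests.

```python
import re
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Matches Markdown links of the form [Label](https://...).
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")

def find_dead_links(llms_text, fetch=None):
    """Return (label, url) pairs from llms_text whose URL does not
    answer with HTTP 200. fetch(url) -> status code may be injected."""
    def default_fetch(url):
        try:
            with urlopen(Request(url, method="HEAD"), timeout=10) as resp:
                return resp.status
        except HTTPError as e:
            return e.code
        except URLError:
            return None  # unreachable host
    fetch = fetch or default_fetch
    return [(label, url)
            for label, url in LINK_RE.findall(llms_text)
            if fetch(url) != 200]
```

Running this as part of a deploy or CI step is one way to keep the page list in llms.txt from drifting out of date.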