llms.txt: The Honest Guide - What It Does, What It Doesn't, and Whether You Need One
Everyone's talking about llms.txt. But does it actually work? Here's what we know, what we don't, and whether it's worth your time.
Dani
I have no strong opinion on llms.txt. There, I said it.
I know that's a weird thing to read in a guide about llms.txt, especially from someone who built an AI readiness audit tool. But after auditing a bunch of sites and reading everything I could find on the topic, I think the honest answer is: nobody really knows how much this matters yet. And I'd rather tell you that upfront than pretend I have certainty I don't.
Here's what I do know.
What llms.txt actually is
The concept is simple: a plain-text Markdown file at your domain root (yoursite.com/llms.txt) that gives AI systems a structured summary of what your site is about. Think of it as a table of contents for machines - here's who we are, here's what matters, here's where to find it.
The proposal came from Jeremy Howard of Answer.AI in 2024. Instead of forcing LLMs to crawl your entire site (expensive, slow, error-prone), you give them a concise, token-efficient entry point.
The format is straightforward:
# Your Site Name
> Brief description of what you do
## Main Sections
- [Products](/products): Overview of product offerings
- [API Documentation](/docs/api): REST API reference
- [Pricing](/pricing): Plans and pricing details
There's also an extended format (llms-full.txt) with more detail for each linked page, but the base file is all most sites need.
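Because the format is just Markdown with a fixed shape, it's easy to parse programmatically. Here's a minimal sketch of a parser for the basic layout shown above; the function name and regular expressions are my own, not part of the proposal:

```python
import re

def parse_llms_txt(text: str) -> dict:
    """Parse an llms.txt file into its title, description, and links.

    Assumes the basic layout: an H1 title, a blockquote description,
    and '- [name](url): description' link lines under section headings.
    """
    title = re.search(r"^# (.+)$", text, re.MULTILINE)
    desc = re.search(r"^> (.+)$", text, re.MULTILINE)
    links = re.findall(r"^- \[([^\]]+)\]\(([^)]+)\)(?::\s*(.*))?$", text, re.MULTILINE)
    return {
        "title": title.group(1) if title else None,
        "description": desc.group(1) if desc else None,
        "links": [{"name": n, "url": u, "note": d or ""} for n, u, d in links],
    }

sample = """# Your Site Name
> Brief description of what you do

## Main Sections
- [Products](/products): Overview of product offerings
- [API Documentation](/docs/api): REST API reference
"""
parsed = parse_llms_txt(sample)
```

The same structure is what makes the file cheap for an LLM to consume: a handful of lines yields a title, a description, and a curated link list.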
The adoption numbers look impressive - but context matters
Over 844,000 websites have implemented llms.txt as of late 2025, according to BuiltWith. Anthropic, Cloudflare, Stripe, and other major tech companies have one. That sounds like momentum.
But here's what you won't read in most guides: no major AI platform has officially confirmed they read this file.
John Mueller of Google addressed this directly in mid-2025: "No AI system currently uses llms.txt." Google's AI Overviews don't consult it when deciding what to cite. And Semrush ran a controlled test, published on Search Engine Land, that found no correlation between implementing llms.txt and improved performance in AI results. From mid-August to late October 2025, their llms.txt page received zero visits from GPTBot, ClaudeBot, PerplexityBot, or Google-Extended.
Zero. Visits.
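You can run the same check on your own site: look through your server access logs for requests to /llms.txt from the documented AI crawler user agents. A rough sketch - the log lines here are invented for illustration, and the user-agent substrings are the publicly documented crawler names:

```python
# Documented AI crawler user-agent substrings.
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def count_bot_hits(log_lines, path="/llms.txt"):
    """Count requests to `path` per AI bot in access-log lines."""
    hits = {bot: 0 for bot in AI_BOTS}
    for line in log_lines:
        if path not in line:
            continue
        for bot in AI_BOTS:
            if bot in line:
                hits[bot] += 1
    return hits

# Hypothetical combined-format log lines, just to show the shape.
sample_log = [
    '1.2.3.4 - - [10/Oct/2025] "GET /llms.txt HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; GPTBot/1.0)"',
    '5.6.7.8 - - [10/Oct/2025] "GET /index.html HTTP/1.1" 200 900 "-" "Mozilla/5.0"',
]
hits = count_bot_hits(sample_log)
```

If your own numbers come back all zeros, that matches what Semrush observed.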
So why bother?
Because the cost-benefit math is wildly lopsided.
Implementation takes about 20 minutes. There's zero demonstrated downside. And the upside scenario is significant: if even one major AI platform starts reading llms.txt (which is increasingly likely as the standard matures), every site that already has one is ahead.
More importantly, the process of creating an llms.txt forces you to think about how your site looks to a machine. What are your 20 most important pages? How would you describe your business in two sentences? What content actually matters for something trying to understand what you do?
That exercise alone is valuable, even if no AI ever reads the file.
Where llms.txt genuinely helps today
There's one use case where it provides clear value: documentation-heavy products and developer tools.
If you maintain API docs, SDKs, or technical reference material, llms.txt gives LLMs a structured entry point. When a developer asks Claude to "explain how the Stripe API handles webhooks," the model benefits from having a clean summary pointing to the right resources. It reduces hallucination and improves accuracy.
For SaaS products, e-commerce sites, and content publishers? The value is more speculative. It's a bet on the future.
How to implement it properly
If you're going to add one, do it right:
Keep it focused. 20-50 links maximum. This isn't a sitemap - it's a curated overview. Link to your most important pages: homepage, key product pages, pricing, about, API docs, and any content hubs.
Write useful descriptions. Don't just list URLs. Each link should have a brief description of what the page contains and why it matters:
- [Schema.org Guide](/blog/schema-org-guide): Step-by-step guide to implementing JSON-LD structured data for product, organization, and FAQ types
UTF-8 encoding only. LLMs may misinterpret or reject files with other encodings.
Reference it in your HTML. Add a <link> tag in your document <head>:
<link rel="alternate" type="text/markdown" href="/llms.txt" title="LLM-readable site summary">
Update it when your site changes. A stale llms.txt with broken links is worse than no llms.txt.
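The checklist above can be turned into a quick lint pass. This is a sketch of my own, not an official validator, and the thresholds are the article's recommendations rather than anything in the spec:

```python
def lint_llms_txt(raw: bytes) -> list[str]:
    """Flag common llms.txt problems per the guidelines above."""
    try:
        text = raw.decode("utf-8")  # UTF-8 encoding only
    except UnicodeDecodeError:
        return ["file is not valid UTF-8"]
    warnings = []
    lines = text.splitlines()
    if not any(ln.startswith("# ") for ln in lines):
        warnings.append("missing H1 title")
    if not any(ln.startswith("> ") for ln in lines):
        warnings.append("missing blockquote description")
    links = [ln for ln in lines if ln.startswith("- [")]
    if len(links) > 50:
        warnings.append(f"{len(links)} links - keep it to a curated 20-50")
    bare = [ln for ln in links if "): " not in ln]  # links without descriptions
    if bare:
        warnings.append(f"{len(bare)} links have no description")
    return warnings

ok = lint_llms_txt(b"# Site\n> What we do\n- [Docs](/docs): API reference\n")
```

Running something like this on every deploy also covers the last point: it catches a file that has drifted out of shape before an AI system (or an auditor) does.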
What AgentReady checks
When we audit your site, we check three things about llms.txt: does the file exist at your domain root, is it valid Markdown with proper formatting, and does it contain useful structure - a title, description, and linked sections.
Most sites we've scanned don't have one at all. Of those that do, a surprising number are malformed or contain stale links.
The bottom line
llms.txt is not a magic bullet for AI visibility. Anyone who tells you "add llms.txt and watch your AI traffic soar" is overselling it. The honest answer: nobody knows how much it matters yet, because the AI platforms haven't been transparent about their crawling behavior.
But it costs almost nothing to implement, it forces good thinking about your site structure, and it positions you for a future where AI systems do start reading it. In a landscape where 47% of brands still don't have any GEO strategy at all, even small moves matter.
My take? I think the adoption numbers will eventually force AI platforms' hands. When nearly a million sites have implemented a standard, ignoring it becomes harder to justify. But I wouldn't make llms.txt your top priority. Fix your Schema.org first - that's where the proven impact is. Then add llms.txt as a quick win.
Want to know if your site has a valid llms.txt? Run a free audit - we check it along with 50+ other AI readiness signals.