`llms.txt` is sometimes discussed in AI visibility circles, but it is a proposed convention, not a standard ranking or citation requirement.
The durable work is still the same: publish useful public pages, keep sitemaps accurate, add schema, and make important content crawlable without JavaScript.
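"Keep sitemaps accurate" is concrete work, not a slogan. As a minimal sketch (the URL and date are placeholders), a valid `sitemap.xml` needs only a `urlset` root and one `url` entry per page you want crawled:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one entry per canonical, crawlable page -->
    <loc>https://example.com/guides/getting-started</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>
```

Entries should list canonical URLs only, and `lastmod` should reflect real content changes so crawlers can prioritize recrawls.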
How to apply it
Do not treat it as the strategy
`llms.txt` should not replace crawlable pages, structured data, canonical URLs, or a strong sitemap.
Prioritize public source pages
Create pages that humans can visit and AI systems can cite: guides, comparisons, FAQs, product pages, and /insights/ content.
Use standards first
Invest in schema.org, robots.txt, sitemap.xml, semantic HTML, and internal linking before experimenting with non-standard files.
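To illustrate the standards-first approach, here is a minimal schema.org `FAQPage` block in JSON-LD, as it would appear in a page's `<script type="application/ld+json">` tag (the question and answer text are placeholders drawn from this article):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Is llms.txt required?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "No. Crawlable public content, structured data, and accurate sitemaps matter more."
      }
    }
  ]
}
```

The structured data should mirror the visible FAQ content on the page, never extend beyond it.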
Best practices
- Do not rely on non-standard files as your AI visibility foundation.
- Keep public pages and schema as the source of truth.
- Avoid publishing summaries that contradict visible site content.
Common mistakes
- Assuming `llms.txt` alone will solve AI visibility.
- Stuffing it with promotional copy.
- Letting it contradict your site and structured data.
Frequently asked questions
Is `llms.txt` required?
No. It is not required, and Appear does not rely on it. Crawlable public content, structured data, and accurate sitemaps matter more.
How is `llms.txt` different from `robots.txt`?
`robots.txt` controls crawler access; `llms.txt` is a proposed file that summarizes a site's content for language models. One is about permissions; the other is about meaning.
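To make the contrast concrete, here are minimal sketches of each file (URLs and section names are placeholders; the `llms.txt` shape follows the informal proposal, which uses markdown with an H1 title, a blockquote summary, and sections of annotated links):

```text
# robots.txt — permissions: what crawlers may fetch
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```

```markdown
# Example Co
> One-line summary of what the site covers and who it is for.

## Guides
- [Getting started](https://example.com/guides/getting-started): overview of the product
```

If you do publish an `llms.txt`, keep its summaries consistent with the pages it links to, for the reasons listed under common mistakes above.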