There is an ongoing debate among professionals and top SEO agencies about the future of SEO. How will we do SEO, and, more importantly, where will SEO take place in the future? Some, like Neil Patel, have even renamed the acronym SEO (Search Engine Optimization) to Search Everywhere Optimization. Within that "everywhere" we find social networks, Amazon, and Artificial Intelligence, especially LLMs, or Large Language Models. In this article, we'll explain how to technically connect your website to LLMs so they consider you and include your brand in their responses.
Before diving in, it’s important to understand the context. Large Language Models (LLMs) are AI systems designed to understand, process, and generate human language in a coherent and contextualized way. They are trained on massive amounts of data to recognize patterns in language and can answer questions, write content, code, generate images and videos, and much more. Well-known LLMs include Google Gemini, ChatGPT, Claude, DeepSeek, and Grok. These are tools people already use regularly—and they account for a significant share of today’s online queries. Naturally, businesses want to appear in those results, and to do so, they need to be part of the models’ information sources.
The key to doing SEO for AI is understanding where LLMs get their information from—and how you can be included. They access websites, blogs, social networks, encyclopedias like Wikipedia, and structured databases. They also analyze millions of texts, conversations, and interactions with users. Your website is your main tool to appear in results from ChatGPT, Gemini, or Grok.
“Content is king” was first coined by Bill Gates in a 1996 essay. It remains the most iconic phrase in SEO and will likely remain so for years. Before ensuring LLMs like ChatGPT or Gemini can access your content, you need to make sure it’s valuable. If your content isn’t interesting, original, fresh, and relevant, no matter how well you technically enable access, the models will likely ignore it.
So start by creating content that’s well-structured, clearly written, and easy to understand. It helps to include references, links to reputable sources, and author names. In short: apply the classic EEAT (Experience, Expertise, Authoritativeness, Trustworthiness) principles.
Once your website has relevant content, the next step is to allow AI systems to access it. Ask any SEO expert what the most important .txt file is, and they'll say robots.txt. This file tells crawlers like Googlebot what they can and cannot access. Misconfiguring it is a leading cause of SEO issues during website migrations.
Blocking AI crawlers can be intentional or not. For example, if you check the New York Times robots.txt file, you'll see that it contains specific rules blocking AI crawlers such as OpenAI's GPTBot and Anthropic's ClaudeBot. This means the newspaper is deliberately preventing AI models from accessing and using its content.
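For reference, here is a sketch of what such rules look like. The user-agent tokens below (GPTBot for OpenAI, ClaudeBot for Anthropic, Google-Extended for Google's AI training) are the ones those companies have published, but check each vendor's documentation for the current names before relying on them:

```text
# robots.txt — example rules for AI crawlers

# Block OpenAI's crawler from the entire site
User-agent: GPTBot
Disallow: /

# Block Anthropic's crawler
User-agent: ClaudeBot
Disallow: /

# Opt out of Google's AI training without affecting normal Google Search
User-agent: Google-Extended
Disallow: /
```

To welcome these bots instead, simply leave these groups out of your robots.txt, or give them an empty `Disallow:` directive, which permits access to everything.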
If you ask ChatGPT whether it can access the New York Times, it confirms that it can’t—and cites the robots.txt file as the reason.
So once you’ve published valuable content, make sure AI bots can reach it by properly configuring your robots.txt file. Then comes the next step: guiding them to the content you want them to read. That’s where the llms.txt file comes in.
The llms.txt file acts as a bridge between your website and AI systems, helping them understand your site’s structure and relevance.
When LLMs try to parse a site, they often struggle with navigation menus, JavaScript, CSS, and other elements. The llms.txt file removes these barriers by presenting your key content in a clean, structured format that LLMs can easily understand. This helps AI tools better reference your brand and include you as a trusted source.
The way you use the llms.txt file depends on your site type. An e-commerce site might highlight product categories and descriptions, while a corporate website might focus on services. Blogs might emphasize article categories and authors.
This file was introduced by Jeremy Howard, co-founder of Answer.AI. It’s written in Markdown, a lightweight formatting language that’s easy to learn and doesn’t require coding. For example, use # for H1 titles, ## for H2 subtitles, and **bold** for emphasis.
The structure of the llms.txt file consists of four main parts:

1. An H1 heading with the name of your site or project (the only required element).
2. A blockquote (a line starting with >) giving a short summary of the site.
3. One or more paragraphs with additional details and context.
4. H2 sections containing Markdown lists of links to your key pages, each with a brief description.
For advanced support, there’s also an optional llms-full.txt file, but in most cases the standard version is enough.
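Putting those parts together, a minimal llms.txt for a hypothetical company site might look like this (the company name, URLs, and descriptions below are invented for illustration):

```markdown
# Acme Consulting

> Acme Consulting is a digital marketing agency specializing in technical SEO and content strategy.

We publish in-depth guides on SEO, analytics, and AI-driven search.

## Services

- [Technical SEO audits](https://www.acme-consulting.example/services/seo-audits): Full crawl, indexing, and performance review
- [Content strategy](https://www.acme-consulting.example/services/content): Editorial planning built around EEAT principles

## Blog

- [How LLMs crawl your site](https://www.acme-consulting.example/blog/llm-crawling): What AI bots can and cannot read
```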
You can create this file with any text editor and upload it to your domain root directory via FTP. It should be named exactly llms.txt.
There are also tools to generate the file automatically, such as Wordlift or Firecrawl. WordPress users can find dedicated plugins that help create the llms.txt file as well.
To avoid common mistakes, once the file is uploaded, make sure:

- The file is reachable at yourdomain.com/llms.txt (the domain root, not a subfolder).
- The name is exactly llms.txt, in lowercase.
- The server returns it directly (HTTP 200), without redirecting to an error page.
- The links it contains are absolute URLs that actually work.
- Your robots.txt file doesn't block the bots you want to read it.
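As a quick sanity check before uploading, here is a small Python sketch that validates the basic shape of an llms.txt document. The structural rules it enforces (a leading H1 title and absolute Markdown links under H2 sections) follow the llms.txt proposal described above; adapt them to your own file:

```python
import re

def validate_llms_txt(text: str) -> list[str]:
    """Return a list of problems found in an llms.txt document (empty list = OK)."""
    problems = []
    lines = [line.rstrip() for line in text.strip().splitlines()]

    # The proposal's only required element: an H1 title on the first line.
    if not lines or not lines[0].startswith("# "):
        problems.append("missing H1 title on the first line")

    # Link entries inside H2 sections should be absolute Markdown links.
    link_pattern = re.compile(r"^- \[[^\]]+\]\(https?://[^)]+\)")
    in_section = False
    for line in lines:
        if line.startswith("## "):
            in_section = True
        elif in_section and line.startswith("- ") and not link_pattern.match(line):
            problems.append(f"malformed link entry: {line!r}")

    return problems

sample = """# Acme Consulting

> A digital marketing agency.

## Services

- [SEO audits](https://www.acme.example/seo): Full site review
"""

print(validate_llms_txt(sample))  # → []
```

A check like this won't catch every issue (it doesn't fetch the URLs, for example), but it quickly flags the two mistakes that most often make the file useless to an AI crawler: a missing title and broken link syntax.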
Using llms.txt helps models like ChatGPT or Gemini access and understand your website. This improves the likelihood of your brand being mentioned in their responses, which can drive visibility, authority, and business growth. As SEO professionals, our job is expanding. We now aim to be present across every search platform—whether AI or traditional engines like Google.
SEO is evolving into Search Everywhere Optimization. We must ensure our brand is discoverable not only in Google, but also in AI-powered answers. That means creating relevant, structured content and using tools like robots.txt and llms.txt to make it easy for AI systems to index and interpret it. Done right, this will secure our presence in the future of digital search.