The shift from "rank on Google" to "get cited by AI"
For about twenty years, SEO meant one thing: get your website to appear on the first page of Google for the keywords your customers type in. The whole industry was built around this. Backlinks, keyword density, page authority, meta descriptions, sitemaps — the entire toolkit existed to please one algorithm.
That algorithm still matters. Google still has the largest share of search traffic on Earth. But here's what's changed: Google itself isn't really showing ten blue links anymore. Open Google today and search anything substantive. What you'll usually see at the top is a Google AI Overview — an AI-generated answer that summarises content from multiple sources, lets you read the answer in-place, and mostly stops you ever needing to click through. The blue links are still there, but they're below the fold, and click-through rates have collapsed in categories where AI Overviews appear.
Meanwhile, ChatGPT, Claude, Perplexity, Microsoft Copilot, Apple Intelligence and a dozen other AI search tools are doing the same thing in their own interfaces. None of them care about your meta description. They care about whether your content can be extracted, understood and cited. That's a completely different game.
The good news for businesses paying attention right now is that this is the most uncrowded SEO opportunity in two decades. Most agencies are still selling the old playbook. The new one is wide open, and the few of us who are doing it properly are getting outsized results from a tiny fraction of the effort.
The numbers that should be on every business owner's radar
I want to give you four numbers that, taken together, explain why this matters now and not "in a few years":
- ~31% of US searchers regularly use AI tools instead of Google for at least some queries (early 2026 data, up from under 5% two years ago)
- ~9× higher conversion rate from ChatGPT referral traffic compared to Google referral traffic, because the AI has effectively pre-qualified the user before sending them to your site
- ~3× more citations for sites with proper structured content (FAQ schema, llms.txt, definition boxes) compared to sites without
- 2 to 6 weeks for AI engines to start citing newly optimised content, vs. 6 to 12 months for traditional SEO ranking changes
Look at those four numbers together. The audience is moving fast, the conversion rate is much higher, the optimisation work pays off in weeks not months, and most of your competitors haven't started yet. This is the sort of window that closes quickly. The agencies that figure this out in the next 12 months are the ones that get to charge premium prices for it in 2027 and beyond.
What makes a website "extractable" by AI
OK, let's get into the actual substance. When an AI search engine like ChatGPT or Perplexity processes your website, it isn't reading it the way a human does. It's looking for chunks of structured information it can lift and use. The websites that get cited the most are the ones that make this lifting as easy as possible.
There are six broad categories of work that make a real difference. I'm going to talk about what each one is and why it matters, but I'm not going to write you a tutorial — partly because the implementation details vary by site and partly because, honestly, this is the work my clients pay me for. What I want you to come away with is the ability to look at any website (including your own) and tell whether it's been built with AI search in mind.
1. Structured data (Schema.org)
This is the foundation. Schema.org is a shared vocabulary for describing what the things on a webpage actually are. Is this paragraph an FAQ answer? A product price? A business address? A how-to step? Schema markup tells the crawler explicitly, in a format AI engines were literally trained to understand. FAQPage schema in particular is the single highest-leverage thing you can add — Google actively flags pages that have it as eligible for rich results, and AI engines treat properly marked-up Q&A as ready-to-cite content.
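To make this concrete, here is a minimal FAQPage block in JSON-LD, the format Schema.org markup usually takes (it goes inside a `<script type="application/ld+json">` tag in the page's HTML; the question and answer text here are hypothetical placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What does GEO stand for?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "GEO stands for Generative Engine Optimisation: structuring website content so AI search engines can extract and cite it."
      }
    }
  ]
}
```

The markup duplicates nothing visible on the page; it just labels the visible Q&A so a crawler knows exactly what it is.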
2. The llms.txt file
This is a relatively new convention, and most sites still don't have one. It's a simple text file at the root of your website that tells AI crawlers what the site is about, what the key URLs are, and what the most important content is. Where robots.txt tells crawlers what they may access, llms.txt tells them what the site actually contains — think of it as a README written for language models. Sites that have a well-written llms.txt are getting cited about three times more often than sites that don't. It costs nothing and takes an hour to create. The fact that most agencies aren't doing it tells you everything about how slow the industry has been to wake up.
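The proposed convention (see llmstxt.org) is a short markdown file. A sketch for a fictional business, with made-up URLs and copy, looks something like this:

```txt
# Acme Plumbing
> Emergency and scheduled plumbing services in Manchester, available 24/7.

## Key pages
- [Services and pricing](https://example.com/services): full price list with callout fees
- [FAQ](https://example.com/faq): common questions about response times and costs
- [About](https://example.com/about): who we are and our accreditations
```

One H1 title, a one-line summary, and a curated list of the URLs you most want an AI to read — that's the whole file.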
3. AI crawler permissions
Most websites have a robots.txt file. Almost none of them explicitly grant permission to AI crawlers like GPTBot, ClaudeBot, PerplexityBot, Google-Extended and Applebot-Extended. If you don't explicitly allow them, some of these crawlers will respect the silence and skip your site entirely. You can't be cited if you weren't crawled. A few lines of text in robots.txt make the difference between being part of the conversation and being invisible.
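Using the crawler names above, the relevant robots.txt additions look roughly like this — one User-agent/Allow pair per bot (the sketch assumes you want all of them crawling everything; tighten the paths if not):

```txt
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: Applebot-Extended
Allow: /
```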
4. Answer-first writing
This is the content discipline that separates good GEO writers from bad ones. The principle is simple: answer the question in the very first sentence, then explain. Don't bury the answer two paragraphs in. Don't open with throat-clearing context ("In today's fast-paced digital landscape..."). Lead with the answer, then back it up. This is exactly how AI engines extract content, and it's also, coincidentally, how good writing has always worked. The marketing-fluff school of SEO writing was always wrong; AI search just made it expensive to keep doing.
5. Proof signals: stats, citations, comparisons
AI engines disproportionately cite content that contains specific numbers, named sources, and comparison tables. Vague generic copy ("our industry-leading solutions deliver outstanding results") gets ignored. Concrete claims ("31% of US searchers used AI search in early 2026, up from under 5% two years ago") get extracted and reused. Comparison tables in particular are gold — they're structured, scannable, and decision-relevant, exactly the sort of content an AI is happy to lift and credit you for. If your website doesn't contain a single number with a source, that's the first thing to fix.
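As an illustration, the numbers from earlier in this piece already make exactly the kind of comparison table an AI engine likes to lift:

| | Traditional SEO | AI search (GEO) |
|---|---|---|
| Time to see results | 6 to 12 months | 2 to 6 weeks |
| Citation lift from structured content | baseline | ~3× |
| Conversion rate of referral traffic | baseline (Google) | ~9× higher (ChatGPT) |

Structured, scannable, sourced — everything the vague-copy paragraph isn't.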
6. Freshness and authorship
AI engines prefer content that's clearly maintained and clearly attributed to a real person. A dateModified timestamp, a real author byline with credentials, and recent stats all signal "this is current and trustworthy." Anonymous, undated content from a brand with no human face gets weighted lower. This is the bit of GEO that overlaps most with what Google calls E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness). Real names beat brand names. Real dates beat "© 2024-2026" footers. Real expertise beats stock photos of smiling consultants.
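These authorship and freshness signals can also be declared in markup. A minimal Article block in JSON-LD Schema.org markup — with hypothetical name, title, and dates standing in for your real ones — looks like this:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to choose an emergency plumber",
  "author": {
    "@type": "Person",
    "name": "Jane Smith",
    "jobTitle": "Director, Acme Plumbing"
  },
  "datePublished": "2026-01-10",
  "dateModified": "2026-03-02"
}
```

A real byline on the page plus this markup behind it is the combination that reads as "current and trustworthy" to both Google and the AI engines.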
How to spot a GEO-ready website in 30 seconds
Right — practical checklist. The next time you visit any website (your own, a competitor's, or an agency you're considering hiring), run through these six questions. You can answer all of them in about 30 seconds, no technical knowledge required:
- Is there a clearly visible FAQ section, with answers that get straight to the point?
- Are there comparison tables anywhere on the site (this vs that, before vs after, us vs them)?
- Are there specific statistics with sources, or just vague marketing language?
- Is there a real author name attached to the content, or is it all anonymous "we"?
- Does the site load fast and look clean on mobile?
- If you visit thesite.com/llms.txt, does anything appear, or do you get a 404?
If the answer to most of those is "no," that website is invisible to the AI search era. It might still rank on Google for now, but it's not going to be cited by ChatGPT, Claude, Perplexity or Google AI Overviews — and that's where the audience is moving.
If you ran that checklist on the website of an agency that's trying to sell you "AI SEO services," and they failed it, you have your answer about whether to hire them.
The honest summary
GEO isn't a buzzword and it isn't a fad. It's the natural evolution of SEO for an era where the answer engine has replaced the results page. The techniques are real, measurable, and producing real results today — not in some hypothetical future. The agencies that have figured it out are quietly building moats. The ones that haven't are still selling 2019's playbook at 2026's prices.
You don't need to become an expert in any of this. You just need to know enough to tell the difference between an agency that actually does GEO and one that just put it on their homepage. That alone puts you ahead of about 90% of business owners shopping for web services right now.