🔍 GEO · Cornerstone Post

How to Optimise Your Website for AI Search in 2026

Graham Kennedy · 11 April 2026 · 8 min read

Written by Graham Kennedy · Founder of Draxiq · Southport tech guy since 2009

Here's a number that should bother every business owner with a website: roughly 31% of US searchers now regularly use AI tools like ChatGPT, Claude or Perplexity instead of Google — at least for some of their queries. Two years ago that number was under 5%.

That's not a slow drift. That's a stampede.

And the bit most people miss: when one of those AI tools answers a question, it doesn't just hand the user a list of blue links. It hands them an actual answer — often citing one or two specific websites as sources. The user reads the answer, maybe clicks through to one of the sources, and that's it. The other ten websites that would have ranked on Google's results page? Never seen, never visited, never made any money.

Welcome to the new search game. It's called GEO — Generative Engine Optimisation — and most agencies haven't caught up yet.

I'm going to walk you through what GEO actually is, why it matters more than people realise, and what makes the difference between a website that gets cited by AI engines and one that gets ignored. I'm not going to give you a step-by-step recipe — partly because every site is different, mostly because if I gave you the recipe you wouldn't actually use it. But by the end of this post you'll know exactly what to look for in your own site, and exactly what to ask any agency you're thinking of hiring.

The shift from "rank on Google" to "get cited by AI"

For about twenty years, SEO meant one thing: get your website to appear on the first page of Google for the keywords your customers type in. The whole industry was built around this. Backlinks, keyword density, page authority, meta descriptions, sitemaps — the entire toolkit existed to please one algorithm.

That algorithm still matters. Google still has the largest share of search traffic on Earth. But here's what's changed: Google itself isn't really showing ten blue links anymore. Open Google today and search anything substantive. What you'll usually see at the top is a Google AI Overview — an AI-generated answer that summarises content from multiple sources, lets you read the answer in-place, and mostly stops you ever needing to click through. The blue links are still there, but they're below the fold, and click-through rates have collapsed in categories where AI Overviews appear.

Meanwhile, ChatGPT, Claude, Perplexity, Microsoft Copilot, Apple Intelligence and a dozen other AI search tools are doing the same thing in their own interfaces. None of them care about your meta description. They care about whether your content can be extracted, understood and cited. That's a completely different game.

The good news for businesses paying attention right now is that this is the most uncrowded SEO opportunity in two decades. Most agencies are still selling the old playbook. The new one is wide open, and the few of us who are doing it properly are getting outsized results from a tiny fraction of the effort.

The numbers that should be on every business owner's radar

Four facts, taken together, explain why this matters now and not "in a few years": the audience is moving fast (31% of US searchers, up from under 5% two years ago); visitors who arrive via an AI citation convert at a much higher rate; the optimisation work pays off in weeks, not months; and most of your competitors haven't started yet. This is the sort of window that closes quickly. The agencies that figure this out in the next 12 months are the ones that get to charge premium prices for it in 2027 and beyond.

What makes a website "extractable" by AI

OK, let's get into the actual substance. When an AI search engine like ChatGPT or Perplexity processes your website, it isn't reading it the way a human does. It's looking for chunks of structured information it can lift and use. The websites that get cited the most are the ones that make this lifting as easy as possible.

There are six broad categories of work that make a real difference. I'm going to talk about what each one is and why it matters, but I'm not going to write you a tutorial — partly because the implementation details vary by site and partly because, honestly, this is the work my clients pay me for. What I want you to come away with is the ability to look at any website (including your own) and tell whether it's been built with AI search in mind.

1. Structured data (Schema.org)

This is the foundation. Schema.org is a shared vocabulary for describing what the things on a webpage actually are. Is this paragraph an FAQ answer? A product price? A business address? A how-to step? Schema markup tells the crawler explicitly, in a format AI engines were literally trained to understand. FAQPage schema in particular is the single highest-leverage thing you can add: Google has scaled back FAQ rich results in its own listings, but AI engines treat properly marked-up Q&A as ready-to-cite content.
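To make that concrete, here is a minimal sketch of FAQPage markup, a block of JSON-LD dropped into the page's head. The question and answer are invented placeholders; your real FAQ content goes in their place:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you serve the whole North West?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes. We cover Southport, Liverpool and the surrounding area, with same-day call-outs available."
    }
  }]
}
</script>
```

Each additional question is just another object in the mainEntity array, and the answer text should match what's visible on the page.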

2. The llms.txt file

This is a relatively new convention, and most sites still don't have one. It's a simple text file at the root of your website that tells AI crawlers what the site is about, what the key URLs are, and what the most important content is. Think of it as robots.txt for the AI era. Sites that have a well-written llms.txt are getting cited about three times more often than sites that don't. It costs nothing and takes an hour to create. The fact that most agencies aren't doing it tells you everything about how slow the industry has been to wake up.
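As a sketch of the convention, an llms.txt is ordinary Markdown served at the site root: an H1 for the name, a blockquote summary, then sections of annotated links. The business and URLs below are invented placeholders:

```markdown
# Acme Plumbing

> Family-run plumbing firm serving Southport and the North West.
> Emergency call-outs, boiler installs, bathroom fitting.

## Key pages

- [Services](https://example.com/services): what we do, with fixed prices
- [Emergency call-outs](https://example.com/emergency): availability and response times
- [FAQ](https://example.com/faq): answers to the questions customers actually ask

## About

- [About us](https://example.com/about): who we are, credentials, service area
```

The point is curation: a short, human-written map of what matters, not a dump of every URL on the site.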

3. AI crawler permissions

Most websites have a robots.txt file. Almost none of them say anything about AI crawlers like GPTBot, ClaudeBot, PerplexityBot, Google-Extended and Applebot-Extended. Silence technically means "allowed", but a blanket Disallow rule, a security plugin or an over-eager firewall will lock these bots out without you ever noticing. You can't be cited if you weren't crawled. A few explicit lines in robots.txt make the difference between being part of the conversation and being invisible.
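You can check this for yourself with Python's standard-library robots.txt parser. This is a sketch; the sample policy and URL are invented for illustration:

```python
from urllib import robotparser

# The five AI crawlers named above.
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot",
           "Google-Extended", "Applebot-Extended"]

def ai_crawl_report(robots_txt: str, url: str = "https://example.com/") -> dict:
    """Map each AI crawler to whether this robots.txt lets it fetch `url`."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, url) for bot in AI_BOTS}

# A policy that names the AI crawlers explicitly while restricting everyone else:
sample = """\
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: PerplexityBot
User-agent: Google-Extended
User-agent: Applebot-Extended
Allow: /

User-agent: *
Disallow: /private/
"""

print(ai_crawl_report(sample))                        # all five crawlers allowed
print(ai_crawl_report("User-agent: *\nDisallow: /"))  # a blanket block shuts them all out
```

Paste your own robots.txt into it and you'll see immediately whether a blanket Disallow is quietly locking the AI crawlers out.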

4. Answer-first writing

This is the content discipline that separates good GEO writers from bad ones. The principle is simple: answer the question in the very first sentence, then explain. Don't bury the answer two paragraphs in. Don't open with throat-clearing context ("In today's fast-paced digital landscape..."). Lead with the answer, then back it up. This is exactly how AI engines extract content, and it's also, coincidentally, how good writing has always worked. The marketing-fluff school of SEO writing was always wrong; AI search just made it expensive to keep doing.
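A before-and-after makes the discipline obvious (both versions are invented for illustration):

```text
Before (buried answer):
  In today's fast-paced digital landscape, choosing the right boiler can feel
  overwhelming. There are many factors to consider, and every home is
  different... A combi boiler is usually the right choice for a two-bedroom
  house.

After (answer-first):
  A combi boiler is usually the right choice for a two-bedroom house. It heats
  water on demand, needs no tank, and costs less to install. The main
  exceptions are homes with two or more bathrooms.
```

Same information, but only the second version hands an AI engine a citable answer in its opening sentence.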

5. Proof signals: stats, citations, comparisons

AI engines disproportionately cite content that contains specific numbers, named sources, and comparison tables. Vague generic copy ("our industry-leading solutions deliver outstanding results") gets ignored. Concrete claims ("31% of US searchers used AI search in early 2026, up from under 5% two years ago") get extracted and reused. Comparison tables in particular are gold — they're structured, scannable, and decision-relevant, exactly the sort of content an AI is happy to lift and credit you for. If your website doesn't contain a single number with a source, that's the first thing to fix.
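For instance, a decision-relevant comparison table is exactly the shape of content AI engines like to lift. The rows and figures below are invented placeholders, not real market data:

```markdown
| Option        | Typical cost   | Install time | Best for                      |
|---------------|----------------|--------------|-------------------------------|
| Combi boiler  | £1,800–£2,500  | 1–2 days     | Smaller homes, one bathroom   |
| System boiler | £2,200–£3,000  | 2–3 days     | Multiple bathrooms            |
| Heat pump     | £7,000–£13,000 | 3–5 days     | New builds, long-term savings |
```

Structured, scannable, and directly useful to someone making a decision, which is why it gets extracted and credited.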

6. Freshness and authorship

AI engines prefer content that's clearly maintained and clearly attributed to a real person. A dateModified timestamp, a real author byline with credentials, and recent stats all signal "this is current and trustworthy." Anonymous, undated content from a brand with no human face gets weighted lower. This is the bit of GEO that overlaps most with what Google calls E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness). Real names beat brand names. Real dates beat "© 2024-2026" footers. Real expertise beats stock photos of smiling consultants.
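In schema terms, that means Article markup carrying a real author and a dateModified. A minimal sketch, using this post's own byline (the URL is a placeholder):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Optimise Your Website for AI Search in 2026",
  "author": {
    "@type": "Person",
    "name": "Graham Kennedy",
    "url": "https://example.com/about"
  },
  "datePublished": "2026-04-11",
  "dateModified": "2026-04-11"
}
</script>
```

Bump dateModified whenever the content genuinely changes; a timestamp that never moves is its own staleness signal.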

How to spot a GEO-ready website in 30 seconds

Right — practical checklist. The next time you visit any website (your own, a competitor's, or an agency you're considering hiring), run through these six questions. You can answer all of them in about 30 seconds:

1. Does the site have an llms.txt? (Type /llms.txt after the domain and hit enter.)
2. Does its robots.txt mention AI crawlers like GPTBot or PerplexityBot?
3. View the page source: can you find "FAQPage" or "application/ld+json" anywhere?
4. Does the first sentence under each heading actually answer the heading's question?
5. Is there a single concrete number with a named source, or a comparison table?
6. Is there a named author and a visible last-updated date?

If the answer to most of those is "no," that website is invisible to the AI search era. It might still rank on Google for now, but it's not going to be cited by ChatGPT, Claude, Perplexity or Google AI Overviews — and that's where the audience is moving.

If you ran that checklist on the website of an agency that's trying to sell you "AI SEO services," and they failed it, you have your answer about whether to hire them.

The honest summary

GEO isn't a buzzword and it isn't a fad. It's the natural evolution of SEO for an era where the answer engine has replaced the results page. The techniques are real, measurable, and producing real results today — not in some hypothetical future. The agencies that have figured it out are quietly building moats. The ones that haven't are still selling 2019's playbook at 2026's prices.

You don't need to become an expert in any of this. You just need to know enough to tell the difference between an agency that actually does GEO and one that just put it on their homepage. That alone puts you ahead of about 90% of business owners shopping for web services right now.

Frequently asked questions

What is AI search optimisation in plain English?
AI search optimisation, also called GEO (Generative Engine Optimisation) or AEO (Answer Engine Optimisation), is the practice of structuring website content so that AI tools like ChatGPT, Claude, Perplexity and Google AI Overviews can extract it and cite it in their answers. Where traditional SEO tries to get you ranked on a results page, GEO tries to get you quoted directly inside the AI's reply.

Is AI search actually replacing Google?
Not replacing — but eating into it fast. As of early 2026, around 31% of US searchers regularly use AI search tools instead of Google for at least some of their queries, up from under 5% two years ago. Google still dominates total search volume, but AI search is the fastest-growing category and shows no signs of slowing.

How do I know if my website is GEO-ready?
A GEO-ready website typically has FAQPage schema markup, an llms.txt file at the root, AI crawler permissions in robots.txt, definition boxes that answer questions in the first sentence, comparison tables, statistics with sources, and a clean heading hierarchy. Most websites built before 2024 are missing all of these.

How long does GEO take to work?
Schema markup gets recognised by Google Search Console within days. AI search engines typically start citing optimised content within 2 to 6 weeks of a site being crawled. This is much faster than traditional SEO, which can take 6 to 12 months for visible ranking changes.

Can GEO be added to an existing website?
Yes, GEO can be retrofitted to any existing website regardless of platform. WordPress, Shopify, Wix, custom builds — they all support the underlying techniques. The work involves adding schema markup, restructuring content into answer-first format, creating an llms.txt file, and updating robots.txt for AI crawlers.

What does Draxiq charge for GEO work?
Draxiq's GEO Pack is £150 one-off, no subscription. It includes a full GEO audit, custom Schema.org markup, llms.txt file, AI crawler verification, answer capsule optimisation, definition box content, and a follow-up GEO scorecard. Works on any platform. Brand new Draxiq websites get all of this baked in by default at no extra cost.

Want this done properly on your site?

Draxiq builds GEO-optimised websites from the ground up — and adds the full GEO Pack to existing sites for £150 one-off. No retainers, no monthly fees, no fluff.

See the GEO Pack →
Graham Kennedy · Founder of Draxiq · Building AI websites and tools from Southport, UK · Get in touch