
Why Your Website Might Be Invisible to Google in 2026

Graham Kennedy · 20 April 2026 · 8 min read
Written by Graham Kennedy, Founder of Draxiq · Southport tech guy since 2009

Try something for me right now. Open Google and type site:yourdomain.com — replacing that with your actual domain. How many results come back? If the answer is zero, or far fewer pages than you know your website has, then congratulations: you've been paying for a website that Google doesn't know exists.

I see this constantly. A sole trader in Manchester, a mobile hairdresser in Bristol, an electrician in Edinburgh — they've all got a website. They're proud of it. They assume that having a website means people can find them. It doesn't. Having a website that Google has indexed, that search engines can read, that AI models can crawl — that's what makes you findable. And the gap between those two things is enormous.

The worst part? The reasons your site is invisible are usually tiny, technical, and completely hidden from you. Nobody told you about them when you bought the website. So let's fix that.

The robots.txt problem nobody warned you about

Every website has a file called robots.txt. It sits at yourdomain.com/robots.txt and it's essentially a set of instructions telling search engine crawlers what they're allowed to look at and what they're not. Think of it as a bouncer list for your website.

Here's the issue: a surprising number of website builders and developers leave this file in a state that blocks Google entirely. Sometimes it's a leftover from when the site was in development — a Disallow: / line that tells every crawler to stay away. Sometimes it's a default setting on a hosting platform that nobody unchecked. Either way, Google obeys the instruction. It doesn't index your pages. You don't appear in search results. And you have no idea it's happening.
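To make that concrete, here's the kind of leftover block I mean, next to what a live site's file usually looks like. Both are illustrative; your platform's actual file will differ:

```txt
# The development leftover: tells EVERY crawler to stay away.
User-agent: *
Disallow: /

# What a live site normally wants instead: an empty Disallow
# allows everything, and the Sitemap line helps crawlers find pages.
User-agent: *
Disallow:

Sitemap: https://yourdomain.com/sitemap.xml
```

You can check yours right now: visit yourdomain.com/robots.txt in a browser and see which version it resembles.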

In 2026, this problem has actually got worse, not better, because it's not just Google you need to worry about. AI crawlers — the ones that power ChatGPT, Claude, Perplexity, and the rest — are blocked by default on many popular website platforms. The platform adds a block for GPTBot, ClaudeBot, PerplexityBot, and others unless you explicitly opt in. Which means even if Google can see your site, the AI models that are increasingly answering people's questions cannot. You're invisible in a whole new way.
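If you'd rather not read the file by eye, Python's standard library can parse a robots.txt and tell you whether a given crawler is allowed in. A minimal sketch; the sample file below is hypothetical, so for a real check, paste in the contents of yourdomain.com/robots.txt:

```python
import urllib.robotparser

# A hypothetical robots.txt of the kind some site builders generate:
# AI crawlers blocked by default, everyone else allowed.
SAMPLE_ROBOTS = """
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: *
Disallow:
"""

def crawler_allowed(robots_txt: str, bot: str, path: str = "/") -> bool:
    """Return True if the named crawler may fetch the given path."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(bot, path)

for bot in ("GPTBot", "ClaudeBot", "Googlebot"):
    print(bot, "allowed" if crawler_allowed(SAMPLE_ROBOTS, bot) else "BLOCKED")
```

Run against the sample above, GPTBot and ClaudeBot come back blocked while Googlebot is allowed, which is exactly the "visible to Google, invisible to AI" situation described here.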

Empty title tags: the silent killer

Title tags are probably the single most important on-page SEO element. They're what Google reads first to understand what a page is about. They're what shows up as the clickable blue link in search results. And on a staggering number of small business websites, they're either empty, duplicated across every page, or set to something utterly useless like "Home" or "Page 1".

I audited a batch of 40 local UK business sites last quarter. Twenty-three of them had duplicate title tags across every page. Nine had completely empty title tags on at least one key page. This isn't an edge case — it's the norm.

When Google sees a page with no title tag, or a title tag that matches five other pages on the same site, it has very little reason to rank any of them. It doesn't know which page to show for which query. So it often shows none.

The fix is straightforward in theory — every page needs a unique, descriptive title tag that includes relevant keywords and location if you're a local business. But if you're running a DIY site builder template, these fields are often buried three menus deep, and nobody tells you they matter. This is one of the first things we look at in our AI search optimisation audits because it's low effort, high impact, and almost always neglected.
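For a sense of what "unique and descriptive" means in practice, here's a sketch of the title tag for a local trade's page. The business name and wording are made up:

```html
<head>
  <!-- Bad: the default many builders ship with -->
  <!-- <title>Home</title> -->

  <!-- Better: unique per page, names the service and the location
       (hypothetical business): -->
  <title>Emergency Electrician in Leeds | Smith Electrical</title>
  <meta name="description"
        content="Fast-response electrician covering Leeds. Call-outs, rewires and safety checks.">
</head>
```

Every page on the site gets its own version of this, not a copy of the homepage's.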

No schema markup means Google is guessing

Schema markup is structured data you add to your website's code that tells search engines exactly what your business is, where it's located, what services you offer, your opening hours, your reviews — all in a format machines can parse instantly. Without it, Google has to infer all of that from your page content, and it's not always good at inferring.

For local businesses, LocalBusiness schema is basically table stakes in 2026. Google's own documentation recommends it. AI search engines rely on it heavily to decide which businesses to cite when someone asks "best plumber near me" or "electrician in Leeds open on Saturdays".

Yet most small business websites have zero schema markup. None. The page might say "We're a plumbing company in Leeds" in the footer, but there's no structured data telling Google's systems that with certainty. You're relying on a crawler to read natural language and guess correctly. Sometimes it does. Often it doesn't.

Adding schema isn't something most sole traders can do themselves — it lives in the code, and it needs to be valid JSON-LD that matches Google's specifications. But it's not expensive to implement, and the difference it makes to how search engines understand your site is dramatic. If you're curious about the full picture of what AI-era search actually requires, I wrote a detailed breakdown in our guide on how to optimise your website for AI search in 2026.
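To show the shape of it, here's a minimal JSON-LD block using the Leeds plumber from earlier as a stand-in. Every detail is a placeholder, and a real one should be checked against Google's structured data guidelines:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Example Plumbing Ltd",
  "url": "https://yourdomain.com",
  "telephone": "+44 113 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Leeds",
    "postalCode": "LS1 1AA",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Sa 08:00-18:00"
}
</script>
```

Paste the result into Google's Rich Results Test to confirm it parses before it goes live.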

Google Business Profile: the free listing you probably haven't claimed

This one baffles me every time I encounter it, and I encounter it weekly. Google Business Profile (GBP) is free. Completely free. It's the listing that appears in the map pack when someone searches for a local service. It shows your address, phone number, reviews, opening hours, photos, and a link to your website. For local businesses, it often drives more traffic than the website itself.

And yet a huge number of sole traders either haven't claimed their GBP listing, have claimed it but left it half-finished, or have one with an old address and wrong phone number. Google treats an unclaimed or incomplete GBP as a low-trust signal. If your competitors have complete, verified profiles with recent reviews and you don't, you're not in the race.

If you don't have a verified, complete Google Business Profile, you are handicapping yourself for free. It costs nothing to fix. Go do it today.

Your website platform might be working against you

Not all website builders are created equal when it comes to search visibility. Some generate clean, crawlable HTML with sensible URL structures. Others produce bloated JavaScript-heavy pages that Google's crawler struggles to render, with auto-generated URLs like /page/5f3a2b1c that mean nothing to anyone.

I'm not going to name specific platforms; you can probably guess which ones I'm thinking of. But here are warning signs that your platform might be part of the problem:

- You can't view or edit your robots.txt file
- You can't set a unique title tag and meta description for each page
- Your page URLs are auto-generated strings rather than readable slugs
- Your page content only appears after JavaScript runs (view the page source and search for your own text)
- There's no way to add structured data or custom code to a page

If three or more of those apply, your website platform is actively harming your search visibility. This is exactly why we build sites the way we do at Draxiq: our website builds are designed from the ground up to be crawlable, fast, schema-rich, and visible to both traditional search engines and AI models. It's not a bolt-on. It's the foundation.

The AI search problem is real and growing

Here's the part most people haven't caught up to yet. In 2026, a meaningful and growing percentage of search queries never reach a traditional search results page. People ask ChatGPT. They ask Claude. They use Perplexity. They use Google's own AI Overviews. And these systems decide which businesses to mention based on a completely different set of signals than old-school Google ranking.

AI models prioritise structured data, clear entity definitions, authoritative citations, and — crucially — whether they can actually access your site at all. If your robots.txt blocks AI crawlers, if your pages lack schema, if there's no clear semantic structure to your content, then AI models won't cite you. They'll cite your competitor who got this right.

This isn't a future problem. It's a right-now problem. Over 40% of UK adults used an AI assistant for a search-style query at least once in the past month, according to recent Ofcom data. That number is climbing fast. And most small business websites are completely unprepared for it.

If you're thinking "I don't even know where to start with this," that's normal. The landscape has shifted quickly. But the businesses that get the technical foundations right now — schema, crawlability, structured content, AI accessibility — are the ones that will dominate local search for the next several years. The ones that don't will keep wondering why their website isn't bringing in any enquiries.

The point

Having a website is not the same as being findable. In 2026, there are more ways to be invisible than ever — blocked crawlers, missing schema, empty title tags, unclaimed business profiles, platforms that prioritise pretty templates over technical soundness. The good news is that most of these problems are fixable, often quickly and cheaply. The bad news is that nobody's going to fix them for you unless you know they exist. Now you do.

Go type site:yourdomain.com into Google. See what comes back. If it's not what you expected, you know where to find me.

— Graham

Frequently asked questions

How do I check if Google has indexed my website?
Go to Google and search site:yourdomain.com (replacing yourdomain.com with your actual domain). The number of results shows how many of your pages Google has indexed. If it returns zero, Google hasn't indexed your site at all. If it returns far fewer pages than your site actually has, some pages are being missed or blocked.
What is robots.txt and why does it matter?
Robots.txt is a small text file on your website that tells search engine crawlers which pages they're allowed to access. If it contains a 'Disallow: /' rule, it blocks all crawlers from your entire site. Many website builders leave this in a restrictive state by default, especially during development, and it never gets updated when the site goes live.
Are AI crawlers like ChatGPT blocked from my website?
On many popular website platforms, yes — AI crawlers such as GPTBot (ChatGPT) and ClaudeBot are blocked by default in the robots.txt file. This means AI models cannot access your content and therefore cannot cite or recommend your business when users ask questions. You can check by visiting yourdomain.com/robots.txt and looking for these bot names in the disallow rules.
Do I really need schema markup for a small business website?
In 2026, yes. Schema markup — particularly LocalBusiness schema — tells search engines and AI models exactly what your business is, where it's located, and what you offer. Without it, these systems have to guess based on your page content, which often leads to your business being overlooked in favour of competitors who have structured data in place.
Is Google Business Profile still important in 2026?
Absolutely. Google Business Profile remains the primary driver of local map pack results, and it's free. Businesses with complete, verified profiles that include reviews, photos, and accurate information consistently outperform those without. If you're a local business and you haven't claimed and completed your GBP listing, you're missing one of the easiest wins available.

Find out what Google actually sees

We'll audit your site's crawlability, schema, and AI search readiness — and show you exactly what's holding you back.

See the GEO Pack →
Graham Kennedy · Founder of Draxiq · Building AI websites and tools from Southport, UK · Get in touch