The robots.txt problem nobody warned you about
Almost every website has a file called robots.txt. It sits at yourdomain.com/robots.txt and is essentially a set of instructions telling search engine crawlers what they're allowed to look at and what they're not. Think of it as a bouncer list for your website.
Here's the issue: a surprising number of website builders and developers leave this file in a state that blocks Google entirely. Sometimes it's a leftover from when the site was in development — a Disallow: / line that tells every crawler to stay away. Sometimes it's a default setting on a hosting platform that nobody unchecked. Either way, Google obeys the instruction. It doesn't index your pages. You don't appear in search results. And you have no idea it's happening.
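That development-era leftover looks deceptively innocent. As a hypothetical example, these two lines are all it takes to tell every crawler to stay away, and the empty-Disallow version is what an open site should have instead:

```text
# The leftover that blocks everything:
User-agent: *
Disallow: /

# What an open site should say (an empty Disallow allows everything):
User-agent: *
Disallow:
```

Pull up your own yourdomain.com/robots.txt in a browser and check which of the two you're serving.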
In 2026, this problem has actually got worse, not better, because it's not just Google you need to worry about. AI crawlers — the ones that power ChatGPT, Claude, Perplexity, and the rest — are blocked by default on many popular website platforms. The platform adds a block for GPTBot, ClaudeBot, PerplexityBot, and others unless you explicitly opt in. Which means even if Google can see your site, the AI models that are increasingly answering people's questions cannot. You're invisible in a whole new way.
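Platform defaults vary, but a typical AI-crawler block looks like the sketch below. The user-agent names are the publicly documented crawlers for OpenAI, Anthropic, and Perplexity; your platform's exact list may differ:

```text
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /
```

If you want AI models to be able to read and cite your site, those Disallow: / lines need to go, either removed entirely or changed to an empty Disallow.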
Empty title tags: the silent killer
Title tags are probably the single most important on-page SEO element. They're what Google reads first to understand what a page is about. They're what shows up as the clickable blue link in search results. And on a staggering number of small business websites, they're either empty, duplicated across every page, or set to something utterly useless like "Home" or "Page 1".
I audited a batch of 40 local UK business sites last quarter. Twenty-three of them had duplicate title tags across every page. Nine had completely empty title tags on at least one key page. This isn't an edge case — it's the norm.
When Google sees a page with no title tag, or a title tag that matches five other pages on the same site, it has very little reason to rank any of them. It doesn't know which page to show for which query. So it often shows none.
The fix is straightforward in theory — every page needs a unique, descriptive title tag that includes relevant keywords and location if you're a local business. But if you're running a DIY site builder template, these fields are often buried three menus deep, and nobody tells you they matter. This is one of the first things we look at in our AI search optimisation audits because it's low effort, high impact, and almost always neglected.
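For illustration, here's what the difference looks like in a page's head. The business name, service, and location below are made up:

```html
<!-- Useless: tells Google nothing about the page -->
<title>Home</title>

<!-- Unique and descriptive: service + location + brand -->
<title>Emergency Plumber in Leeds | Smith &amp; Sons Plumbing</title>
<meta name="description" content="24/7 emergency plumbing across Leeds. Boiler repairs, leaks and blocked drains. Call for a same-day callout.">
```

Every page on the site gets its own version of this, not a copy of the homepage's.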
No schema markup means Google is guessing
Schema markup is structured data you add to your website's code that tells search engines exactly what your business is, where it's located, what services you offer, your opening hours, your reviews — all in a format machines can parse instantly. Without it, Google has to infer all of that from your page content, and it's not always good at inferring.
For local businesses, LocalBusiness schema is basically table stakes in 2026. Google's own documentation recommends it. AI search engines rely on it heavily to decide which businesses to cite when someone asks "best plumber near me" or "electrician in Leeds open on Saturdays".
Yet most small business websites have zero schema markup. None. The page might say "We're a plumbing company in Leeds" in the footer, but there's no structured data telling Google's systems that with certainty. You're relying on a crawler to read natural language and guess correctly. Sometimes it does. Often it doesn't.
Adding schema isn't something most sole traders can do themselves — it lives in the code, and it needs to be valid JSON-LD that matches Google's specifications. But it's not expensive to implement, and the difference it makes to how search engines understand your site is dramatic. If you're curious about the full picture of what AI-era search actually requires, I wrote a detailed breakdown in our guide on how to optimise your website for AI search in 2026.
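As a sketch, a minimal LocalBusiness-family block looks like this. Every detail below is invented (the phone number is from Ofcom's reserved drama range), and it goes in a script tag of type application/ld+json in the page's head:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Smith & Sons Plumbing",
  "url": "https://www.example.co.uk",
  "telephone": "+44 113 496 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Leeds",
    "postalCode": "LS1 1AA",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Sa 08:00-18:00"
}
</script>
```

Plumber is one of schema.org's LocalBusiness subtypes; pick the one that matches your trade, then validate the markup with Google's Rich Results Test before going live.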
Google Business Profile: the free listing you probably haven't claimed
This one baffles me every time I encounter it, and I encounter it weekly. Google Business Profile (GBP) is free. Completely free. It's the listing that appears in the map pack when someone searches for a local service. It shows your address, phone number, reviews, opening hours, photos, and a link to your website. For local businesses, it often drives more traffic than the website itself.
And yet a huge number of sole traders either haven't claimed their GBP listing, have claimed it but left it half-finished, or have one with an old address and wrong phone number. Google treats an unclaimed or incomplete GBP as a low-trust signal. If your competitors have complete, verified profiles with recent reviews and you don't, you're not in the race.
Some stats that should worry you:
- Businesses with complete GBP profiles are 2.7x more likely to be considered reputable by consumers (Google's own data).
- 70% of local searches result in a visit to a business within 8 km — but only if that business actually appears in results.
- GBP listings with more than 10 reviews and a response from the owner rank materially higher in local pack results.
If you don't have a verified, complete Google Business Profile, you are handicapping yourself for free. It costs nothing to fix. Go do it today.
Your website platform might be working against you
Not all website builders are created equal when it comes to search visibility. Some generate clean, crawlable HTML with sensible URL structures. Others produce bloated JavaScript-heavy pages that Google's crawler struggles to render, with auto-generated URLs like /page/5f3a2b1c that mean nothing to anyone.
I'm not going to name specific platforms — you can probably guess which ones I'm thinking of. But here are warning signs that your platform might be part of the problem:
- Your page URLs contain random strings of characters instead of readable words.
- You can't edit meta titles and descriptions without upgrading to a paid plan.
- Your site takes more than 4 seconds to load on mobile (test it at PageSpeed Insights).
- There's no way to add custom code or schema markup.
- The robots.txt file blocks AI crawlers and there's no setting to change it.
If three or more of those apply, your website platform is actively harming your search visibility. This is exactly why we build sites the way we do at Draxiq — our website builds are designed from the ground up to be crawlable, fast, schema-rich, and visible to both traditional search engines and AI models. It's not a bolt-on. It's the foundation.
The AI search problem is real and growing
Here's the part most people haven't caught up to yet. In 2026, a meaningful and growing percentage of search queries never reach a traditional search results page. People ask ChatGPT. They ask Claude. They use Perplexity. They use Google's own AI Overviews. And these systems decide which businesses to mention based on a completely different set of signals than old-school Google ranking.
AI models prioritise structured data, clear entity definitions, authoritative citations, and — crucially — whether they can actually access your site at all. If your robots.txt blocks AI crawlers, if your pages lack schema, if there's no clear semantic structure to your content, then AI models won't cite you. They'll cite your competitor who got this right.
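You can check the access question yourself with nothing but Python's standard library. This sketch parses a robots.txt (here an inline example of a typical platform default) and asks whether Googlebot and GPTBot may fetch a page — swap in your own domain's file to test for real:

```python
from urllib.robotparser import RobotFileParser

# A typical platform default: open to everyone except AI crawlers.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow:
"""

def crawler_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Return True if `user_agent` may fetch `url` under `robots_txt`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

page = "https://www.example.co.uk/services"
print("Googlebot allowed:", crawler_allowed(ROBOTS_TXT, "Googlebot", page))  # True
print("GPTBot allowed:", crawler_allowed(ROBOTS_TXT, "GPTBot", page))        # False
```

With this file, Google sees everything and the AI crawler sees nothing — exactly the split-visibility problem described above.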
This isn't a future problem. It's a right-now problem. Over 40% of UK adults used an AI assistant for a search-style query at least once in the past month, according to recent Ofcom data. That number is climbing fast. And most small business websites are completely unprepared for it.
If you're thinking "I don't even know where to start with this," that's normal. The landscape has shifted quickly. But the businesses that get the technical foundations right now — schema, crawlability, structured content, AI accessibility — are the ones that will dominate local search for the next several years. The ones that don't will keep wondering why their website isn't bringing in any enquiries.
The point
Having a website is not the same as being findable. In 2026, there are more ways to be invisible than ever — blocked crawlers, missing schema, empty title tags, unclaimed business profiles, platforms that prioritise pretty templates over technical soundness. The good news is that most of these problems are fixable, often quickly and cheaply. The bad news is that nobody's going to fix them for you unless you know they exist. Now you do.
Go type site:yourdomain.com into Google. See what comes back. If it's not what you expected, you know where to find me.
— Graham