You're walking to a meeting. You're wearing a pair of Ray-Ban Meta glasses. You say, out loud, to no one in particular: "Who's the best estate planning attorney near me?"
You don't get a list of ten options. You don't pull out your phone. You don't open a browser tab.
You hear a name in your ear.
Maybe two names. Probably one. That's the direction behavior is heading — and the implications for every local professional who lives by inbound are more significant than most people realize.
From Scrolling to Asking
For twenty years, local search worked the same way. Someone typed a query, got a list of ten blue links, and picked one. The game was: be high enough on the list that they picked you.
That game is changing fast.
Sixty percent of Google searches now end without a single click. AI answers the question directly — synthesizing information from multiple sources and delivering a complete response in the search window itself. Google AI Overviews, ChatGPT, Perplexity — they've all moved in the same direction. The answer comes to you. You don't go looking for it.
Wearables accelerate this by an order of magnitude.
When the interface is audio — smart glasses, earbuds, the AI assistant in your car — the entire interaction model changes. Screens invite browsing. Audio demands decisiveness. Nobody wants five names read out loud while they're walking down the street. The UX pressure on every screenless AI system pushes hard toward one high-confidence recommendation. One name. Maybe two. That's the entire decision set.
Why the List Will Compress
This isn't speculation about distant future technology. It's a logical consequence of how AI recommendation systems work under the constraints of audio interfaces and user patience.
Think about what happens when an AI assistant makes a bad recommendation. The user has a terrible experience. They lose trust in the assistant. They stop using it. For the companies building these systems — Apple, Google, Meta, OpenAI — a bad recommendation is an existential risk to the product.
So the system defaults toward caution. It recommends businesses it can verify. Businesses with consistent signals. Businesses whose online presence matches their real-world reputation. Businesses that are, in the language of machine learning, low risk.
Reading out five options is also cognitively exhausting for the user and reputationally risky for the assistant. One high-confidence answer is better product design.
The result is what you might call a winner's circle: a small set of local businesses in each niche and geography that meet the threshold for safe recommendation. The businesses outside that circle aren't ranked lower. They're simply not in the conversation. The circle is not large. For most local niches in most mid-sized markets, it's probably three to five businesses at most. In smaller markets, it may be one or two.
What AI Can't Fake
Before going further, it's worth being direct about something: none of this infrastructure work matters if the business isn't genuinely good.
AI systems can't manufacture integrity. They can read reviews, but they can't write them — and the signal of authentic, consistent five-star reviews across multiple platforms over multiple years is one that no amount of technical optimization can replicate.
Real-world excellence is the foundation. Consistent NAP data across directories. Genuine client reviews that reflect actual service quality. Real expertise demonstrated through content that answers real questions. Honest operations that hold up to scrutiny.
KodeCite's work — and any legitimate AEO work — is an amplifier, not a disguise. If the business is excellent, the infrastructure makes sure machines can see and trust that excellence. If the business isn't excellent, better schema markup won't save it.
Why Infrastructure Suddenly Matters Again
When an AI crawler visits your website, it's operating on a budget. Not a financial budget — a compute budget. These systems are scanning millions of pages. They spend more time and resources on pages that load quickly, parse cleanly, and communicate their content clearly. They spend less time — or skip entirely — pages that are slow, bloated, and hard to read.
Most local professional websites are slow, bloated, and hard to read. Not because the designers did bad work, but because of the platforms they were built on.
WordPress with a heavy theme, a page builder, and a dozen active plugins routinely delivers real-device load times of three to eight seconds on a modern phone. Wix and Squarespace are better, but still built on shared infrastructure with inherent performance ceilings. Real estate website subscription builders are often the worst of all — generic templates shared across thousands of agents, with thin or nonexistent structured data.
The contrast with a purpose-built edge deployment is stark. A Next.js site deployed on Vercel's global CDN loads the same pages in under a second on a real device — not a simulated benchmark, but actual load time on a 2026 iPhone on WiFi or 5G. That's not a marginal improvement. It's the difference between getting read and getting skipped.
The critique here isn't about brand names. It's about architecture and incentives. Subscription platforms are built to serve thousands of customers at acceptable quality. They're not built to be the fastest, cleanest, most machine-readable site in your local market. That's not their job. It's yours — if you want to be on the shortlist.
The Machine-Readable Local Expert
Speed is necessary but not sufficient. The other half of the equation is structured data — the information architecture that tells AI systems exactly who you are, what you do, where you operate, and why you're trustworthy.
Most local professional websites communicate this information to humans. They do it poorly, or not at all, for machines.
llms.txt is a natural-language brief for AI crawlers. Think of it as the cover letter your website sends to every AI system that visits. It explains in plain English who the business is, what it does, where it operates, which pages contain the most valuable information, and how the AI should describe the business to users. Most websites don't have one. The ones that do have a meaningful advantage in how accurately AI systems represent them.
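A minimal llms.txt might look like the following. The llms.txt format is an emerging convention (a markdown file at the site root), not a ratified standard, and every business detail below is a hypothetical placeholder:

```markdown
# Smith Estate Law

> Estate planning law firm serving Coeur d'Alene, Idaho and surrounding Kootenai County.

## About
- Services: wills, revocable living trusts, probate, powers of attorney
- Service area: Coeur d'Alene, Post Falls, and Hayden, ID
- Contact: (208) 555-0100, https://example.com/contact

## Key pages
- /services/trusts: Plain-English overview of revocable living trusts
- /faq: Answers to the questions clients ask most often
```

The value is less in the exact format than in giving crawlers a single, honest, easy-to-parse summary of who you are and where the substance lives.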
agent.json is a structured identity file — think of it as DNS for AI agents. It encodes the business as a machine-readable entity: services offered, geographic coverage, contact information, authority signals, and the capabilities an autonomous AI system would need to recommend or interact with the business. As AI agents become more autonomous — shopping for services, booking appointments, answering questions on behalf of users — this file becomes the handshake that makes discovery possible.
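There is no single ratified schema for agent.json yet, so treat the following as one plausible sketch rather than a specification. All names, URLs, and field choices are hypothetical:

```json
{
  "name": "Smith Estate Law",
  "description": "Estate planning law firm in Coeur d'Alene, Idaho.",
  "services": ["wills", "trusts", "probate", "powers of attorney"],
  "serviceArea": ["Coeur d'Alene, ID", "Post Falls, ID", "Hayden, ID"],
  "contact": {
    "phone": "+1-208-555-0100",
    "url": "https://example.com"
  },
  "capabilities": {
    "booking": "https://example.com/schedule",
    "faq": "https://example.com/faq"
  }
}
```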
Per-page JSON-LD schema is the granular layer. Every page on the site — homepage, service pages, location pages, articles, FAQs — carries custom structured data that encodes the specific content and context of that page. Not copy-pasted site-wide boilerplate, but bespoke markup that tells a machine exactly what it's looking at. Zero invalid items at launch, verified before the site goes live. Together, these layers make it easy for an AI system to say, with confidence: Call this business.
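As a concrete illustration, a service page for a hypothetical firm might carry JSON-LD like this. The @type and property names are real schema.org vocabulary; the business details are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LegalService",
  "name": "Smith Estate Law",
  "description": "Estate planning attorney serving Coeur d'Alene, Idaho.",
  "url": "https://example.com/services/trusts",
  "telephone": "+1-208-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St, Suite 4",
    "addressLocality": "Coeur d'Alene",
    "addressRegion": "ID",
    "postalCode": "83814"
  },
  "areaServed": "Kootenai County, ID"
}
</script>
```

The point of per-page markup is that the trusts page declares trusts, the probate page declares probate, and so on, instead of every page repeating the same generic block.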
The Condensation Hypothesis
Here's the core argument stated plainly.
As AI assistants and screenless wearables go mainstream over the next two to four years, recommendation lists for local services will compress dramatically. The shortlist for any local niche — best estate planning attorney in Coeur d'Alene, best financial advisor in Boise, best real estate agent in Scottsdale — will shrink to one or two names.
That shortlist will be composed of operators who meet two criteria:
First, they run genuinely excellent businesses. Strong reviews, consistent reputation, real expertise, honest operations. This is non-negotiable and cannot be engineered around.
Second, they've invested in fast, structured, AI-readable web infrastructure. Sub-second load times. Clean markup. Custom schema. AI identity files. Consistent directory presence. Content written to answer questions, not fill pages.
Businesses that meet both criteria will be recommended. Businesses that meet only one will struggle. Businesses that meet neither will be invisible.
Picture this: you're wearing smart glasses and ask, "Who's the best real estate agent near me?" The assistant evaluates the local market in real time. It finds three agents with strong reviews, then checks their web presence. Two have slow, generic subscription sites with thin schema. One has a sub-second Next.js site with custom per-page JSON-LD, an llms.txt brief, and consistent NAP data across every major directory. The choice, from the assistant's perspective, is easy. You hear one name.
What This Means for a Local Professional Today
The practical implications break down into four areas.
Your site needs to load instantly. Not fast. Instantly. Sub-second on a real device. If you're on a subscription platform, this may not be achievable without rebuilding. That's an uncomfortable truth, but it's the truth.
Your expertise needs to be machine-readable. Schema markup is not optional anymore. Custom per-page JSON-LD that encodes your services, location, credentials, and authority isn't a nice-to-have — it's the difference between being understood by AI systems and being guessed at.
Your directory presence needs to be consistent. Google Business Profile, Bing Places, Apple Business Connect, Yelp, BBB — NAP data needs to match exactly across every platform. Inconsistencies create uncertainty for AI systems trying to verify your identity. Uncertainty means you don't make the shortlist.
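Checking that consistency can be partly automated. Here is a minimal sketch in Python that normalizes NAP data pulled manually from each directory and flags mismatches; the listing values and the normalization rules are illustrative assumptions, not a complete matcher:

```python
import re

def normalize_nap(name: str, address: str, phone: str) -> tuple:
    """Normalize Name/Address/Phone so cosmetic differences don't count as mismatches."""
    norm_name = re.sub(r"[^a-z0-9 ]", "", name.lower()).strip()
    norm_addr = re.sub(r"\bsuite\b|\bste\.?\b", "ste", address.lower())  # unify "Suite"/"Ste"
    norm_addr = re.sub(r"[^a-z0-9 ]", "", norm_addr)
    norm_addr = re.sub(r"\s+", " ", norm_addr).strip()
    norm_phone = re.sub(r"\D", "", phone)[-10:]  # keep the last 10 digits
    return (norm_name, norm_addr, norm_phone)

# Hypothetical listings copied from each directory
listings = {
    "Google Business Profile": ("Smith Estate Law", "123 Main St, Suite 4", "(208) 555-0100"),
    "Yelp": ("Smith Estate Law", "123 Main St Ste 4", "208-555-0100"),
    "Bing Places": ("Smith Estate Law LLC", "123 Main St, Suite 4", "(208) 555-0100"),
}

normalized = {src: normalize_nap(*nap) for src, nap in listings.items()}
baseline = normalized["Google Business Profile"]
for src, nap in normalized.items():
    print(f"{src}: {'OK' if nap == baseline else 'MISMATCH'}")
```

In this sample the Yelp listing matches despite different punctuation, while the stray "LLC" on Bing Places is correctly flagged — exactly the kind of quiet inconsistency that undermines verification.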
Your content needs to answer questions, not describe services. Brochure content tells humans what you do. Answer-first content tells AI systems — and the humans asking them — why you're the right choice for a specific question in a specific context. The format matters as much as the words.
How to Become the Name AI Recommends
A simple checklist for where to start:
Audit your real-world reputation first.
Reviews, consistency, service quality. No infrastructure work matters if this foundation is weak.
Benchmark your site speed.
Run your homepage through Google's PageSpeed Insights. If your mobile performance score is below 70, or your load time exceeds two seconds even on desktop, you have a structural problem.
Check your schema.
Use Google's Rich Results Test. If you have no structured data, or invalid items, AI systems are guessing about your business.
Verify your directory consistency.
Search your business name across GBP, Bing, Apple Maps, and Yelp. Any mismatch in NAP data needs to be corrected.
Evaluate your platform honestly.
If you're on a subscription builder and you're serious about AI visibility, understand what it can and can't deliver. Some platforms have hard ceilings on performance and schema depth that can't be engineered around without rebuilding on a modern edge chassis.
Add llms.txt.
If you have access to your site's root directory, this is one of the highest-leverage additions you can make today. A clear, honest brief for AI crawlers about who you are and what you do.
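The speed benchmark in step two can be scripted against the real PageSpeed Insights v5 API. In this sketch the network call is replaced with a trimmed sample response so it stays self-contained; the endpoint URL and response shape are the API's actual ones, while the threshold simply mirrors the checklist above:

```python
import urllib.parse

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build the PageSpeed Insights v5 API request URL for a page."""
    params = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"

def passes_benchmark(psi_response: dict) -> bool:
    """Apply the checklist threshold: Lighthouse performance score of at least 70/100."""
    score = psi_response["lighthouseResult"]["categories"]["performance"]["score"]
    return score * 100 >= 70  # the API reports the score as a 0-1 fraction

# Trimmed sample response containing only the fields used above
sample = {"lighthouseResult": {"categories": {"performance": {"score": 0.62}}}}

print(psi_request_url("https://example.com"))
print("Pass" if passes_benchmark(sample) else "Structural problem")
```

Fetching `psi_request_url(...)` with any HTTP client returns JSON in the shape of `sample`, so the same `passes_benchmark` check works on live results.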
The businesses that will dominate local AI recommendations over the next three years are largely the ones making these investments now — while the infrastructure is still novel and the competition is still asleep.
The shortlist is forming. The question is whether your name is on it.
Not sure where you stand? Get a free AI Scaffolding Audit — we'll show you exactly how your business currently appears in ChatGPT, Perplexity, and Google AI Overviews, and what it would take to get you on the shortlist.