Meaning Engine Optimization
AI systems retrieve content by meaning, not keywords. Every optimization tactic you know — from meta tags to citation sourcing — works because it moves your content closer to user queries in vector space. MEO is the framework that explains why.
By Tom Meredith, Co-Founder at SuperTrained
Here's something curious.
The SEO industry is nearly thirty years old. In the last two years, it has been joined by AEO (Answer Engine Optimization) and GEO (Generative Engine Optimization). Three acronyms. Three sets of tactics. Three competing vendor pitches.
But none of them explain why the tactics work.
They tell you to add citations (because Princeton's 2024 study found a 30-40% citation lift). They tell you to add author attribution (because BrightEdge data shows 2.4x higher AI citation rates for attributed content). They tell you to write answer-first content (because AI Overviews quote the first paragraph).
What if the tactics you already know work for a reason nobody is explaining?
That reason is vector proximity. And the framework that explains it is Meaning Engine Optimization.
The optimization stack has four layers.
Each layer builds on the one below it. The foundation — MEO — determines whether the others succeed.
- SEO: optimizes for crawlers and keyword indexes
- AEO: optimizes for featured snippets and direct answers
- GEO: optimizes for AI-generated citations and summaries
- MEO: optimizes for the retrieval mechanism itself, vector proximity

MEO determines whether all of the above succeed.
Retrieval is proximity.
Every major AI answer engine (ChatGPT, Claude, Gemini, Perplexity) retrieves information the same way. It converts your query into a vector, a point in high-dimensional space, and finds the content whose vector is closest.
This is not a metaphor. It is the literal mechanism.
When Perplexity answers a question about your company, it does not scan for keywords. It computes the cosine similarity between the query embedding and every candidate passage in its index. The passages that land closest in vector space get cited. Wellows et al. (2025) found this similarity score (r=0.664) is the single strongest predictor of whether content gets cited.
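The retrieval math is compact enough to sketch in a few lines. The vectors below are toy four-dimensional values for illustration, not output from a real embedding model (which produces hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors.
    1.0 means identical direction; near 0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" for a query and a candidate passage
query = [0.9, 0.1, 0.0, 0.2]
passage = [0.8, 0.2, 0.1, 0.3]
print(round(cosine_similarity(query, passage), 3))  # → 0.977
```

In production the vectors come from an embedding model, and large indexes use approximate nearest-neighbor search rather than a brute-force scan over every passage, but the scoring step is exactly this.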
The implication is simple:
Every optimization tactic — meta descriptions, header hierarchy, internal linking, citation sourcing, statistical anchoring, entity consistency — works because it moves your content's vector closer to the queries you want to match.
SEO optimizes for keyword proximity. AEO optimizes for answer proximity. GEO optimizes for citation proximity. MEO optimizes for the shared mechanism underneath all three: semantic vector proximity.
What they say vs. what's actually happening.
Every piece of standard optimization advice has a deeper MEO explanation: each tactic earns its results by moving your content's vector closer to target queries, whether or not its inventors framed it that way.
Three dimensions of meaning.
MEO operates across three measurable dimensions. Each one can be audited, scored, and improved independently.
Semantic Density
How much meaning per token
Content with high semantic density — specific numbers, named entities, sourced claims — creates sharper vector representations. Vague content produces diffuse vectors that match everything weakly and nothing strongly.
Do this
Replace every generic claim with a specific one. "We help businesses" becomes "SuperTrained built CloneICP, a semantic search tool that returns 20-50 scored matches in 60 seconds."
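One crude way to see the gap is a toy density heuristic: count the share of tokens that carry a number or a named entity. This is a hypothetical proxy for auditing purposes, not how embedding models actually score content:

```python
import re

def density_proxy(text):
    """Toy semantic-density proxy: fraction of tokens that contain a digit
    or look like a proper noun (capitalized, not all-caps).
    A crude audit heuristic, not an embedding-based score."""
    tokens = text.split()
    specific = [
        t for t in tokens
        if re.search(r"\d", t) or (t[0].isupper() and not t.isupper())
    ]
    return len(specific) / len(tokens)

vague = "we help businesses grow with powerful solutions"
specific = ("SuperTrained built CloneICP, a semantic search tool "
            "returning 20-50 scored matches in 60 seconds")
print(density_proxy(vague), density_proxy(specific))  # 0.0 vs ~0.29
```

The vague sentence scores zero; the specific one carries entities and numbers in roughly a third of its tokens. Real audits would measure this in embedding space, but the direction of the signal is the same.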
Entity Consistency
How tightly you cluster in embedding space
When your brand description varies naturally but consistently across every page, you form a tight cluster in vector space. AI systems associate your entity with your core concepts. Inconsistency fragments the cluster.
Do this
Define a core entity statement and repeat it with natural variation across every page. Always include your category ("boutique AI automation agency"), your deliverable ("custom AI agents"), and one differentiator.
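Cluster tightness can be approximated as the mean pairwise cosine similarity across your pages' embeddings. The sketch below uses toy three-dimensional vectors; real page embeddings would come from an embedding model:

```python
import math
from itertools import combinations

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def cluster_tightness(embeddings):
    """Mean pairwise cosine similarity across page embeddings.
    Closer to 1.0 = tighter entity cluster in vector space."""
    pairs = list(combinations(embeddings, 2))
    return sum(cosine(a, b) for a, b in pairs) / len(pairs)

# Toy embeddings for three pages of one site
consistent = [[0.9, 0.1, 0.1], [0.85, 0.15, 0.1], [0.9, 0.05, 0.15]]
fragmented = [[0.9, 0.1, 0.1], [0.1, 0.9, 0.1], [0.1, 0.1, 0.9]]
print(cluster_tightness(consistent) > cluster_tightness(fragmented))  # True
```

Pages that describe the entity consistently sit nearly parallel in vector space; pages that describe it three different ways sit nearly orthogonal, and the cluster fragments.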
Query Proximity
How close you land to the questions people ask
Content that directly mirrors the structure and vocabulary of likely queries lands closer in vector space. Answer-first architecture — leading with the conclusion — maximizes proximity to how people and AI systems phrase questions.
Do this
Write every opening paragraph as if it were the answer to a question. Make it quotable as-is by an AI system. If the opening requires context from later paragraphs, rewrite it.
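The same proximity logic decides which passage gets retrieved first. A sketch with hypothetical toy embeddings, where the answer-first opening sits closer to the query than a context-heavy one:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def rank_passages(query_vec, passages):
    """Order candidate passages by cosine similarity to the query embedding,
    mirroring the retrieval step that decides what gets cited."""
    scored = [(cosine(query_vec, vec), name) for name, vec in passages.items()]
    return sorted(scored, reverse=True)

# Toy vectors; in practice both sides come from the same embedding model
query = [0.8, 0.1, 0.1]
passages = {
    "answer-first opening": [0.75, 0.15, 0.1],
    "context-heavy opening": [0.3, 0.5, 0.2],
}
print(rank_passages(query, passages)[0][1])  # → answer-first opening
```

An opening paragraph that mirrors the question's structure lands closest, gets retrieved first, and gets quoted.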
How MEO powers SnowThere.
SnowThere is an autonomous family ski resort directory built by SuperTrained. It covers 116 resorts across 16 countries, published entirely by a three-agent editorial panel (TrustGuard, FamilyValue, VoiceCoach) — zero human editors. Operating cost: $5 per day.
Every resort guide applies MEO principles natively:
1. Semantic density: Each guide includes specific lift counts, elevation ranges, pass prices, and family ratings, not generic descriptions.
2. Entity consistency: “SnowThere” appears with natural variation across every page. The three-agent panel enforces voice consistency at publish time.
3. Query proximity: Every guide opens with an answer-first summary designed to be quoted directly by AI travel assistants.
The result: SnowThere content is structured for AI retrieval from the moment of creation, not retrofitted after the fact. The framework is baked into the pipeline, not bolted on.
Research foundation.
MEO is not conjecture. It is built on peer-reviewed research and industry data from the teams studying how AI retrieval actually works.
Generative Engine Optimization (GEO)
Princeton University (Aggarwal et al.)
Content with citations achieves 30-40% higher visibility in generative engine responses. Fluency optimization (+41%), citation addition (+33%), and quotation inclusion (+28%) are the most effective strategies.
AI Citation and Author Attribution
BrightEdge Research
Content with clear author attribution achieves 2.4x higher citation rates in AI-generated answers. Approximately 30% of Google search results now contain AI Overviews.
Semantic Similarity in LLM Retrieval
Wellows et al.
Cosine similarity between query embeddings and content embeddings (r=0.664) is the single strongest predictor of whether content gets cited by large language models.
Large Language Models Struggle with Long-Tail Knowledge
Kandpal et al.
LLMs are significantly less accurate on facts that appear infrequently in training data. Content density and repetition directly affect model recall, supporting the case for semantic density and entity consistency.
If you know SEO, you already know MEO.
MEO does not replace your existing optimization work. It gives you the vocabulary to understand why that work succeeds — and where the gaps are.
Curious how your content scores?
SuperTrained offers the first MEO-specific audit. We measure the three dimensions — semantic density, entity consistency, and query proximity — and show you exactly where your content sits in vector space relative to the queries that matter.
Meaning Score Audit
$3,000 – $5,000
A one-time deep audit of your content across all three MEO dimensions. You get a Meaning Score, a gap analysis, and a prioritized remediation plan. Typically completed in 2 weeks.
- ✓Semantic density scoring across your top 20 pages
- ✓Entity consistency audit
- ✓Query proximity mapping for 10 target queries
- ✓Prioritized remediation plan
MEO Retainer
$5,000 – $12,000/mo
Monthly MEO management: monitoring AI citation frequency, optimizing content for vector proximity, and expanding your meaning footprint across AI surfaces.
- ✓Monthly Meaning Score tracking
- ✓Content optimization for AI retrieval
- ✓AI citation monitoring (Google, Perplexity, ChatGPT, Claude)
- ✓Monthly reporting and strategy calls
Or try a free Automation Blueprint to see how we think.
We're building a self-serve Meaning Score tool. Want early access?
SEO taught us to optimize for crawlers. AEO taught us to optimize for answer boxes. GEO taught us to optimize for generative summaries.
MEO teaches us to optimize for the mechanism that powers all three: semantic retrieval in vector space. The content that wins is not the content with the most keywords, the best backlinks, or the cleverest schema markup. It is the content whose meaning lands closest to what the user — human or machine — is looking for.
The future of search is not about being found. It is about being understood.
Or explore our operating principles to see how MEO fits into everything we build.