
The Zero-Sum Game of AI Recommendations

AI search is a zero-sum game with limited citation slots. Here's how the three pillars of AI visibility determine who gets recommended.

Author: Shanal Govender
Contributors: Vlad Shvets, Leon Claassen, Teddy Cipolla
Date: March 6, 2026

Every search engine you've ever used has operated on the same generous principle: here are ten links, figure it out yourself. Google gave you a buffet. Bing gave you a slightly sadder buffet. But the point was always the same. Multiple options, multiple winners, plenty of room at the table.

AI search engines flipped that table over.

When someone asks ChatGPT "What's the best project management tool for remote teams?" it doesn't return ten blue links. It returns an answer. A synthesized, confident, citation-backed recommendation. And that answer typically mentions three to five tools. Maybe eight if it's feeling generous.

Which means if your competitor is in that answer, you're probably not.

[Image: ChatGPT Search recommending project management tools. In AI search, only a handful of brands make the answer.]

The Old World vs. The New World

The shift from traditional search to AI search isn't an evolution. It's a species change. To understand why, look at what actually happens on each side.

In traditional search, you competed for attention. In AI search, you compete for existence. If you're not cited, you don't exist in the answer.

The table below lays out the differences. It's not a subtle shift. It's a different sport entirely.

Traditional Search (Google, Bing) vs. AI Search (ChatGPT, Google AI Mode):

What the user sees: 10 blue links per page vs. 1 synthesized answer with 3-8 citations
How you win: Rank on page 1 vs. Get cited in the answer
Competition model: Positive-sum (10 slots per query) vs. Zero-sum (3-8 citation slots per query)
What matters: Keywords, backlinks, domain authority vs. Entity authority, mentions, UGC signals
User behavior: Click through, compare, decide vs. Read the answer, maybe click 1 citation
KPI: Rankings and clicks vs. Citations and share of voice

In the old model, ranking #7 on Google still meant something. You were on page one. Somebody might scroll down. In AI search, if you're the 9th most relevant source, you're invisible. The AI already gave its answer and moved on. (It doesn't even feel bad about it.)

Vlad Shvets
CEO @ Empact Partners
We've tracked AI search citations across 124 partnerships. The pattern is clear: AI engines typically cite 3 to 8 sources per response. That's not a ranking. That's a shortlist. You're either on it or you're not.

How AI Search Actually Decides Who Gets Cited

Understanding the zero-sum nature of AI search requires understanding how these engines actually work under the hood. Spoiler: it's not PageRank with a chatbot skin.

AI search engines use a process called Retrieval-Augmented Generation (RAG). Here's the simplified version that doesn't require a PhD to follow:

Step 1, Query fan-out: Your single query gets decomposed into 5-10+ sub-queries. Ask "best CRM for startups" and the AI is actually searching for pricing comparisons, feature lists, user reviews, integration capabilities, and competitor analyses, all at once.
Step 2, Retrieval: Each sub-query triggers a web search. The system pulls specific passages from indexed pages, not entire articles. A 3,000-word blog post might contribute a single 200-word chunk. If your content doesn't contain a passage that directly answers one of those sub-queries, it never enters the candidate pool.
Step 3, Reranking: Retrieved passages get scored and reranked by relevance, authority, and freshness. This is where entity recognition matters. AI engines know who you are, what you do, and how often other sources mention you.
Step 4, Synthesis: The top passages get woven into a coherent answer with citations. Three to eight sources make the cut. Everything else gets left on the cutting room floor.
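The four steps above can be sketched in code. This is a simplified illustration, not how any engine actually implements RAG: the facet list, scoring weights, and Passage fields are all hypothetical, and real systems use learned retrievers and rerankers rather than a fixed weighted sum.

```python
from dataclasses import dataclass


@dataclass
class Passage:
    source: str      # domain the chunk came from
    text: str        # the ~200-word extracted chunk, not the full article
    relevance: float # hypothetical 0-1 scores; real systems learn these
    authority: float
    freshness: float


def fan_out(query: str) -> list[str]:
    # Step 1: decompose one query into intent-specific sub-queries.
    facets = ["pricing", "features", "user reviews", "integrations", "alternatives"]
    return [f"{query} {facet}" for facet in facets]


def rerank(passages: list[Passage]) -> list[Passage]:
    # Step 3: score candidates on relevance, authority, and freshness
    # (illustrative weights only).
    score = lambda p: 0.5 * p.relevance + 0.3 * p.authority + 0.2 * p.freshness
    return sorted(passages, key=score, reverse=True)


def synthesize(passages: list[Passage], max_citations: int = 8) -> list[str]:
    # Step 4: only the top handful of distinct sources make the answer.
    cited: list[str] = []
    for p in rerank(passages):
        if p.source not in cited:
            cited.append(p.source)
        if len(cited) == max_citations:
            break
    return cited
```

The part that matters for content strategy is the Passage object: retrieval operates on chunks, so a page only enters the candidate pool if some self-contained section of it answers a sub-query on its own.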

This is why traditional keyword optimization alone won't save you. The AI isn't matching keywords. It's understanding intent, evaluating entity authority, and selecting the most credible passages from across the entire web. It's like the difference between stuffing your resume with buzzwords and actually being qualified for the job.

The Three Pillars of AI Engine Visibility

So how do you actually get cited? After working across more than a hundred partnerships, from pre-seed startups to publicly traded companies, we've identified three pillars that determine whether AI engines recommend you or your competitor.

Pillar 1: Your Website (The Foundation You Control)

Your website is still the starting point, but "optimized" means something entirely different now. AI engines don't care about your keyword density. They care about whether your content clearly communicates what you are, what you do, and why anyone should trust you.

That means entity-rich content that explicitly states your product category, use cases, and differentiators. Schema markup that tells AI crawlers exactly what each page represents. Clean URL architecture that makes your site easy to parse. And structured content with clear headings, comparison tables, and definitive statements, because AI engines retrieve passages, not vibes.
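One concrete form of "schema markup that tells AI crawlers what a page represents" is JSON-LD using schema.org vocabulary. The sketch below builds a minimal SoftwareApplication object; the product name, description, and pricing are placeholders, not a real product.

```python
import json

# Hypothetical JSON-LD for a product page; every value here is a placeholder.
schema = {
    "@context": "https://schema.org",
    "@type": "SoftwareApplication",
    "name": "ExampleTool",  # your actual product name
    "applicationCategory": "BusinessApplication",
    "description": "Project management tool for remote teams.",
    "operatingSystem": "Web",
    "offers": {"@type": "Offer", "price": "29.00", "priceCurrency": "USD"},
}

# Embedded in the page as <script type="application/ld+json">...</script>,
# this states the entity's category and use case unambiguously.
json_ld = json.dumps(schema, indent=2)
```

The point isn't the specific fields; it's that the page's category, use case, and offer are stated as structured facts rather than left for a crawler to infer from prose.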

We helped flair grow organic traffic by 1,600% over three years, starting from near zero. A significant part of that was restructuring their content to be passage-retrievable, with every section answering a specific question a prospect might ask an AI engine.

Pillar 2: Third-Party Mentions (The Reputation Layer)

Here's where most teams get it wrong. They hear "mentions" and think "backlinks." They're related, but they're not the same thing.

AI engines don't need a hyperlink to understand that a G2 review mentions your product. They don't need an anchor text to know that a TechCrunch article discussed your latest funding round. They read. They comprehend. They connect the dots. A brand mention without a link still registers as a signal of authority and relevance.

This changes the entire calculus of off-page strategy. You're not just chasing links for domain authority. You're building a web of mentions across authoritative sources that AI engines use to validate whether your brand deserves to be in the answer. Think of it as reputation that an AI can actually measure, instead of just something your CEO talks about at conferences.

Leon Claassen
Senior GTM Consultant @ Empact Partners
We've seen partners with fewer backlinks outperform competitors in AI citations because they had more unlinked brand mentions across high-authority sources. AI engines don't count links. They count credibility signals. A mention on a relevant industry publication carries weight whether it's hyperlinked or not.

Pillar 3: UGC (The Wildcard That's Actually the Ace)

User-generated content is the most underestimated pillar. And by "user-generated content," we mostly mean one thing: Reddit.

Reddit accounts for approximately 20% of all citations in AI search results. Twenty percent. From a single platform. If that statistic doesn't make you reconsider your content strategy, nothing will. (Though if nothing will, you should probably stop reading and go back to optimizing meta descriptions.)

Why Reddit? Because AI engines are desperate for authentic, unfiltered human opinions. When someone on r/SaaS writes "We switched from Tool X to Tool Y and our team productivity doubled," that carries more weight in an AI's synthesis than a polished case study on Tool Y's website. AI engines know the difference between marketing and genuine user experience. They've read enough of both.

We ran a Reddit strategy for KKday that generated over 3 million post views and 4,000+ upvotes. That kind of authentic engagement doesn't just drive direct traffic. It feeds directly into the corpus that AI engines draw from when synthesizing answers about travel booking platforms.

Reddit is not a social media strategy. It's an AI search strategy that happens to live on a social platform.

Why This Is Urgent (Not Just Important)

The uncomfortable truth is that AI search behavior is cementing right now. Every day that an AI engine answers a query and cites your competitor instead of you, that pattern gets reinforced. The AI learns that your competitor is the authoritative source for that topic. It gets harder to displace them, not easier.

This isn't like traditional search where you could publish a better blog post next month and outrank someone. AI engines build knowledge graphs. They develop entity associations. They remember who the reliable sources are. If your competitor occupies the "best project management tool" citation slot for six months straight, you're not just losing traffic. You're losing position in the AI's understanding of your market.

Does that sound dramatic? Good. It should. We've been helping partners build AI engine visibility across more than a hundred engagements, and the pattern is consistent: early movers in GEO are compounding their advantage while everyone else is still debating whether AI search matters. (It does. We checked.)

Teddy Cipolla
Senior GTM Consultant @ Empact Partners
The partners who started GEO work six months ago are already seeing compound returns. They're getting cited consistently, which reinforces their authority, which leads to more citations. It's a flywheel, and the longer you wait to start spinning it, the harder it is to catch up.

What Winning Actually Looks Like

The formula isn't complicated. It just requires doing three things well simultaneously, which is harder than doing zero things well, which is what most companies are currently doing about AI search.

Audit your AI visibility: Before you optimize anything, figure out where you stand. Ask AI engines the queries your prospects ask. Count how often you get cited versus your competitors. You can't fix what you can't measure.
Restructure your website for passage retrieval: Every key page should have clear, self-contained sections that can be extracted and cited independently. Think FAQ-style clarity, not stream-of-consciousness thought leadership.
Build your mention footprint: Get mentioned, with or without links, across industry publications, review sites, comparison articles, and expert roundups. AI engines triangulate authority from multiple sources.
Invest in authentic UGC: Especially Reddit. Real users talking about real experiences with your product is the strongest citation signal AI engines have. You can't fake this, and you shouldn't try.
Monitor and iterate: AI search is dynamic. Track your citations weekly, identify gaps, and close them. The brands that treat this as a one-time project will lose to the ones that treat it as an ongoing program.

At Empact Partners, our GEO formula is straightforward: GEO = UGC + Mentions. It takes 3-6 months for initial momentum, and the results compound from there. We've seen it work across SaaS verticals from HR tech to developer tools to e-commerce platforms. Yes, we're biased, but we're also sitting on data from 124 partnerships that says we're right.

The Bottom Line

AI search is a zero-sum game. There are a limited number of citation slots per query, and every slot your competitor occupies is one you don't. The brands that understand this and act on it now will own their categories in AI search. The ones that wait will spend the next three years wondering why their traffic is declining despite "doing everything right."

The good news: the playbook exists. The three pillars work. And the window to build a compounding advantage is still open, just not for much longer.

If you want to understand where your brand stands in AI search and what it would take to start winning those citation slots, let's talk.
