The Structural Dependency
AI depends on a content ecosystem it’s not paying enough to sustain. The open web is contracting. The question isn’t what happens to publishers — it’s what happens to everything built on top of them.

Dan Muirhead · Co-Founder, Head of Strategy
In The Invisible Publisher, we mapped the structural rupture in how information reaches people. In The Earned Media Equation, we traced the implications for brand visibility — how the disruption of digital publishing threatens the earned media channel that AI engines depend on most.
Both pieces documented what’s happening. This one asks what happens next.
The research we’ve done across these volumes keeps circling back to a single structural dependency: AI platforms depend on a content ecosystem they’re not paying enough to sustain. The platforms that consume publisher content to generate answers are simultaneously destroying the economics that produce that content. And the publishers fighting to survive are rationally gating, restricting, and moving their best content off the open web — which further degrades the information ecosystem AI is built on.
This isn’t a market inefficiency. It’s a feedback loop.
What follows is our attempt to think through the second and third-order consequences of this loop — grounded in the constants of human behavior, culture, and economics that aren’t changing, even as the technology around them changes rapidly. We’re not making predictions with false precision. We’re mapping likely trajectories, gaming out how different actors will respond, and identifying the structural dynamics that brands and publishers need to understand whether the horizon is three years out or fifteen.
Some of this will be wrong. All of it is honest.
Before mapping futures, it’s worth naming the things that won’t change. Technology cycles come and go. The following are rooted in human nature, culture, and economic fundamentals. They’re the load-bearing walls of the analysis.
The demand for information predates the internet, search engines, and AI. It's as old as language. People asked friends, then checked encyclopedias, then searched Google; now they ask ChatGPT. The medium changes. The drive doesn't. Digital publishing does not die, because the demand for information does not die. The probability of digital publishing going away is near zero — it's not a product category that can be obsoleted. It's a behavior expressed through technology.
Trust is the most durable currency in information. It’s why brands exist. It’s why bylines matter. It’s why people pay for subscriptions to specific writers rather than just accessing “news.”
AI has a trust problem that’s getting worse, not better. Sprout Social’s Q1 2026 Pulse Survey found that 88% of consumers say AI video generation tools have reduced their trust in news on social media. Nearly a third say they’re less likely to choose a brand that uses AI ads. Only 26% of consumers prefer AI-generated content — down from 60% in 2023.
The supply of AI content is exploding. The demand for human-vetted content is holding steady or growing.
Time spent on social media has declined over the past two years to 2 hours and 21 minutes daily — down 10 minutes despite platforms doing everything to retain attention. Gen Z actively wants to spend less time on devices. Digital detox behavior is rising, especially among younger users.
There’s a ceiling on how much screen time humans will tolerate, and we’re bumping against it. This creates a counterweight to AI adoption: even as AI interfaces become more convenient, the total attention budget doesn’t expand to accommodate them. AI absorbs time from other digital activities. It doesn’t create new time.
The need for belonging drives every social platform, every Substack, every Discord server, every live event. It's why newsletter open rates outperform algorithmic feeds. It's why publisher events revenue is growing faster than any other line item. AI can organize information. It cannot create belonging.
The music industry lost 50% of its revenue between 1999 and 2014 due to digital piracy and streaming disruption. Then streaming created a new equilibrium where total industry revenue eventually exceeded the pre-disruption peak — but the value distribution changed radically. Artists earn less per play. Labels that adapted thrive. The ones that didn’t are gone.
Digital publishing is following the same arc with a roughly ten-year lag. The disruption is real, but "the industry dies" is almost never the outcome. Digital publishing doesn't die, because the demand for information, trust, community, and belonging doesn't die. What changes is the business model, the distribution, and who captures the value.
The behavioral shift to AI is not a trend that might reverse. The adoption velocity makes reversal structurally impossible.
ChatGPT added roughly 600 million weekly users in fourteen months, going from 300 million in December 2024 to 900 million by February 2026. Google AI Mode reached 75 million daily active users by early 2026 — a 4x increase since its May 2025 launch — and expanded to 53 languages across 40+ markets. Perplexity’s daily usage crossed 35–45 million queries per day, with usage growing 5–6x in under two years. AI tools now handle 56% of global search-related sessions.
ChatGPT: ~600 million weekly active users added in 14 months; roughly 80% AI chatbot market share. Mainstream behavior, not early adoption.
Google AI Mode: 4x increase since its May 2025 launch; 53 languages, 40+ markets. AI Overviews reach 1.5B monthly users.
Perplexity: 35–45 million queries per day; 5–6x growth in 2 years. Research-focused users with high commercial intent.
These aren’t early-adopter numbers. This is mainstream behavior in the same way that Google Search was mainstream behavior by 2005 — the inflection point has passed. The convenience gap between “ask a question and get a synthesized answer” and “scan ten blue links and click through to find the answer” is too large to close.
Social media provides a useful parallel for how content consumption evolves. Social usage is massive — 5.66 billion users globally, roughly 69% of the world’s population — but the quality of engagement is shifting. Time spent per day has actually declined slightly. Users are fragmenting across more platforms (7–8 on average). And skepticism about content quality is rising: 56% of social media users report seeing “AI slop” often or very often.
The social signal: usage scales, but trust doesn’t scale with it. People don’t stop using AI or social platforms. But they become more selective, more skeptical, and more interested in authentic human voices as a counterweight. The demand for trustworthy content increases in proportion to the supply of untrustworthy content.
This is the market reality brands and publishers are navigating. People use AI for convenience. They seek out human content for trust. Both behaviors coexist, and will continue to coexist, because they serve different psychological needs.
The structural dependency at the center of AI visibility isn’t static. It’s a self-reinforcing cycle, and the inputs are already in motion.
1. AI platforms absorb publisher content.
2. Publisher traffic and revenue decline.
3. Publishers die, gate content, or shift to non-web formats.
4. Less quality content is available for AI training and retrieval.
5. AI-generated content fills the gaps.
6. Future models train on lower-quality data.
7. AI answer quality degrades in specific domains.
8. The cycle repeats — back to step 1 — unless AI platforms are forced to invest in content access: licensing, revenue share, direct production.
AI platforms consume publisher content — both for real-time retrieval (citations) and for foundational model training. This consumption undermines publisher economics because it satisfies user queries without sending traffic back to the source. Publishers lose revenue and either die, gate their content behind paywalls, block AI crawlers, or shift their best content to formats AI can’t easily index. The pool of quality content available for AI shrinks.
Simultaneously, AI-generated content is flooding the web to fill the gaps. In 2025, AI-generated articles surpassed human-written content online for the first time. This matters because AI models trained on the outputs of other AI models progressively degrade — a phenomenon researchers call “model collapse.”
The result is a slow-moving quality degradation that most users won’t feel immediately. AI answers don’t suddenly become wrong. They gradually become thinner, less specific, less grounded in domain expertise, and more generic. The degradation shows up first in niches where coverage was historically produced by small and mid-tier publishers — the exact publishers dying fastest.
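The model-collapse mechanism is easy to see in a toy experiment: stand in for a generative model with a one-dimensional Gaussian, repeatedly draw a finite "training set" from it, and refit the model to its own output. This is a deliberately simplified sketch of the dynamic, not a simulation of real model training; every number in it is arbitrary.

```python
import random
import statistics

def collapse_demo(generations=300, sample_size=20, seed=7):
    """Toy illustration of model collapse: a Gaussian repeatedly
    refit to samples drawn from its own previous fit. Sampling
    error compounds and the fitted spread shrinks, i.e. the
    'model' progressively loses the diversity of the original
    human-generated data."""
    random.seed(seed)
    mu, sigma = 0.0, 1.0  # generation 0: the original data distribution
    spreads = [sigma]
    for _ in range(generations):
        # Draw a finite "training set" from the current model...
        data = [random.gauss(mu, sigma) for _ in range(sample_size)]
        # ...then refit the model to its own output (maximum likelihood).
        mu = statistics.fmean(data)
        sigma = statistics.pstdev(data)
        spreads.append(sigma)
    return spreads

spreads = collapse_demo()
print(f"generation   0: sigma = {spreads[0]:.4f}")
print(f"generation 300: sigma = {spreads[-1]:.4f}")
```

Run it and the fitted spread shrinks toward zero across generations: each refit keeps the sampling noise and discards tail diversity, which is the statistical intuition behind the degradation described above.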
The disruption isn’t limited to what AI retrieves in real time to answer a query. Publisher content is what these models were trained on. It’s the foundational layer of how AI understands products, categories, brands, and the world. The editorial web is the substrate.
And that substrate is being withdrawn. 79% of top news sites now block AI training bots. Cloudflare blocks AI crawlers by default for new sites. Over a million Cloudflare customers have opted in to block crawlers. The publishers that survive are increasingly restricting access for future model training, even as current models still benefit from historical crawls.
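In practice, "blocking AI training bots" often starts with a robots.txt file that disallows the major AI training crawlers by user agent while leaving ordinary search crawlers alone. The sketch below uses real, documented crawler tokens (GPTBot is OpenAI's, Google-Extended governs Google's AI training use, CCBot is Common Crawl's, anthropic-ai is Anthropic's); which agents a publisher actually lists is their own choice:

```
# Block crawlers used for AI training; allow everything else.
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: *
Allow: /
```

Note that robots.txt compliance is voluntary, which is why publishers increasingly pair it with network-level blocking of the kind Cloudflare now offers by default.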
So the impact is layered: AI is losing both the real-time citation sources (as publishers die or shift formats) and the training data pipeline (as surviving publishers block crawler access). The models that exist today were trained on a more open web than the one that exists now. Future models will be trained on a more restricted, more AI-generated, lower-quality dataset — unless the economics change.
The feedback loop isn’t fate. Every actor in the system is responding, and their responses create new dynamics. Here’s how we see the response chains playing out.
AI platforms are signing licensing deals to secure access to quality content. OpenAI has deals with the NYT, Washington Post, Guardian, Condé Nast, and AP. Microsoft launched the Publisher Content Marketplace. Perplexity created a publisher revenue share program with over 100 partners. These deals will expand and become table stakes.
Licensing works for Tier 1 and some Tier 2 publishers. For niche and local content, three responses emerge:
AI platforms that pay for content produce measurably better answers. Content access becomes a moat that reinforces the existing AI oligopoly. Quality becomes a function of content partnerships, not just model architecture. Only the largest AI companies can afford comprehensive licensing.
Tier 1 publishers become acquirers. Premium publishers will buy Tier 2 and Tier 3 properties to expand their content footprint, making their licensing packages more valuable to AI platforms. The NYT has already done this — acquiring The Athletic, Wordle, and Wirecutter. We expect meaningful consolidation in the next three to five years.
Tier 2–3 survivors become hybrid businesses. They monetize through events, community, newsletter subscriptions, and branded partnerships. The web presence becomes a storefront for the relationship, not the primary revenue engine. This mirrors the music industry: musicians stopped making money from album sales and started making money from touring, merch, and direct fan relationships.
Tier 4 splits into two outcomes. Ad-supported independent publishers largely die. Creator-driven publishers who build direct audience relationships survive. The model is durable because it’s powered by individual human connection, not search traffic.
Some of today’s Tier 2–3 publishers may pivot to producing structured, machine-readable content designed specifically for AI retrieval. Think “content as a service” for AI platforms — product databases, structured industry analysis, verified factual data feeds. This isn’t journalism. It’s information infrastructure. The publishers that recognize this early have a path to recurring revenue that doesn’t depend on human readers visiting their website.
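To make "structured, machine-readable content" concrete: one plausible shape is schema.org-style JSON-LD, where facts live in typed fields an AI retrieval system can consume and verify without parsing prose. The types and field names below are real schema.org vocabulary; the product, company, and reviewer are invented for illustration:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "ExampleCam X2",
  "description": "Hypothetical mirrorless camera used for illustration.",
  "brand": { "@type": "Brand", "name": "ExampleCo" },
  "review": {
    "@type": "Review",
    "author": { "@type": "Person", "name": "Jane Editor" },
    "reviewRating": { "@type": "Rating", "ratingValue": "4.5", "bestRating": "5" },
    "datePublished": "2026-01-15"
  }
}
```

Content packaged this way can be licensed as a feed and checked field by field, which is what makes it information infrastructure rather than journalism.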
Consumer behavior won’t split into “uses AI” or “doesn’t use AI.” It will segment by stakes.
Low stakes: factual lookups, quick comparisons, how-to queries.
Medium stakes: choosing a product, restaurant, or service provider.
High stakes: health, legal, financial, and career decisions.
The threshold for “low stakes” rises over time as AI quality improves. But it doesn’t disappear.
Nobody “goes back to Google” in the traditional sense. But Google doesn’t disappear either. Google retains dominance in product search, local search, and visual search. The Google of 2028 won’t look like the Google of 2023. It’ll be an AI answer engine with traditional search as a fallback.
Nobody “goes back to visiting websites” as the primary behavior. But people don’t stop visiting websites entirely. They visit fewer sites, more intentionally, for higher-stakes purposes. The website becomes a trust destination, not a discovery surface.
We hold these with moderate confidence. The trajectories are established. The specific timing could shift.
Publisher content moves behind paywalls, into newsletters, into video, into community platforms. The freely accessible, ad-supported editorial web shrinks significantly. But it doesn’t reach zero. Some publishers — particularly local, civic, and publicly funded — continue to publish openly. And AI platforms create economic incentives (licensing, revenue share) to keep some content accessible, because they need it.
The platforms with licensing deals produce better answers in content-rich domains. The ones without produce increasingly generic, outdated, or AI-citing-AI responses. This stratification becomes a competitive differentiator in the AI market itself. “Which AI gives better answers about [specific domain]” becomes a meaningful comparison axis, just as “which search engine gives better results” was in the 2000s.
As the pool of independent, trusted publisher content shrinks, the remaining sources become more influential in AI citation. Brands that invested early in earned media relationships across surviving publishers compound their advantage. Brands that waited find fewer outlets, higher costs, and less AI visibility. The window for building earned media authority at reasonable cost is open now. It narrows every quarter as publishers consolidate and the surviving ones gain pricing power.
The “middle” of publishing — the Tier 2–3 publishers that are neither premium enough to command licensing deals nor niche enough to survive on community — gets hollowed out. What remains is a barbell: large institutional publishers on one end, individual creator-publishers on the other, and less in between. This is the same pattern that played out in retail (Walmart and boutiques survive; mid-tier department stores die) and in music.
Something between publishers and AI platforms. Think content aggregation with licensing built in. Maybe it’s Microsoft PCM at scale. Maybe it’s a new company. But the market needs a mechanism for connecting quality content with AI retrieval that works for publishers smaller than Condé Nast. This is the biggest structural gap right now, and gaps this large tend to get filled.
Just as data companies became essential infrastructure for the financial industry, content companies become essential infrastructure for AI platforms. The business model isn’t “sell ads against pageviews.” It’s “produce authoritative content that AI platforms license to improve their answers.” The publisher’s customer shifts from “the reader” (via ads) to “the AI platform” (via licensing) — with the reader relationship maintained through subscriptions, events, and community.
As AI-generated content proliferates and model collapse becomes empirically documented, a premium emerges for demonstrably human-created, editorially vetted content. This may take the form of certification, labeling, or simply brand reputation — similar to how “organic” became a market category in food.
The demand signal is already here: consumer preference for AI-generated content has dropped from 60% to 26% in three years. Publishers that can credibly certify their content as human-researched, human-written, and editorially vetted will command premium licensing rates from AI platforms that need quality training data.
AI models trained on AI output degrade. This creates a market correction: AI platforms are forced to invest more in human-generated training data, which means paying publishers more, which stabilizes the ecosystem at a new equilibrium. But this correction takes five to ten years to play out because the degradation is gradual and hard to measure in real time. The correction arrives. It just arrives after significant damage has already been done.
Large institutional publishers: NYT, FT, Bloomberg, The Economist, a consolidated Condé Nast/Hearst. Ten to twenty major global publishers with subscription, licensing, and events models. The "studios" of the content industry. They produce premium, human-vetted content that AI platforms pay to access and human audiences pay to subscribe to.
Individual creator-publishers: Substack writers, YouTube creators, podcast hosts, community builders. Millions of individuals with direct audience relationships through newsletters, video, podcasts, and communities. The "independent artists." Some earn well. Most don't. But the model is durable because it's powered by individual human connection.
Content infrastructure companies: product databases, review aggregation, structured analysis, real-time data feeds. Companies that produce structured, machine-readable content designed specifically for AI retrieval. This category doesn't fully exist yet. But it will, because the demand is structural. The "data providers" of the content industry.
The distinction between “social media,” “search engine,” and “AI assistant” blurs to the point of meaninglessness. Instagram already has search. Google has AI. ChatGPT has social features and browsing. TikTok functions as a search engine for Gen Z. In fifteen years, these are all just “information interfaces” with different interaction modalities.
The question for publishers and brands isn’t “which platform to optimize for.” It’s “how to be the trusted source that every interface draws from.” The publishers and brands that are the ground truth — the ones AI engines cite, social algorithms surface, and consumers seek out directly — win regardless of which interface mediates the interaction.
The structural dependency is real: AI platforms depend on content they’re not paying enough to sustain.
AI adoption is mainstream and irreversible at current scale.
Consumer trust in AI-generated content is declining even as usage grows.
Publisher economics are genuinely under threat, most severe for small and mid-tier publishers.
Markets find equilibrium after disruption — the music industry precedent is directionally applicable.
We project, with moderate confidence: that AI answer quality will measurably degrade in specific verticals as source content disappears; that publisher consolidation will accelerate meaningfully in the next three years; that "verified human content" will emerge as a formal market category; that AI platforms will invest directly in content production within five years; and that the new equilibrium will produce total industry revenue comparable to the pre-disruption peak.
What remains genuinely uncertain: the speed of the correction — regulation could accelerate it, or AI companies could resist paying longer than expected.
The degree of consumer behavioral change — AI quality improvements could compress verification behavior more than we expect.
Whether AI platforms will build or license content — the balance could tip either way.
Whether consolidation produces healthy institutions or monopolistic gatekeepers.
The internet is transitioning from a destination model to an infrastructure model. For twenty-five years, the internet was a place you went. You visited websites. You browsed. You clicked. Publishers built businesses on that behavior.
Now the internet is becoming infrastructure that powers AI interfaces, social feeds, and agent-driven commerce. Content still flows through it. But people interact with it through intermediary layers that abstract away the destination experience.
Publishing doesn’t die because the demand for content doesn’t die. But the business of publishing changes because the relationship between content creation and content consumption is being permanently intermediated. The publishers that survive own the relationship with either the audience (subscriptions, community) or the AI platform (licensing, structured data feeds) — not the ones trying to capture attention at the intermediary layer.
For brands, this means AI visibility is not a channel you optimize. It’s a structural dependency you manage. The content ecosystem your brand’s visibility is built on is shifting beneath you. The brands that understand the dependency — and invest in the health and diversity of their earned media ecosystem — will have more durable visibility than the ones waiting for the dust to settle.
Sources and methodology notes: This analysis draws on AI platform usage data from OpenAI, Google, and Perplexity public disclosures and independent tracking (Similarweb, Semrush, Business of Apps). Consumer sentiment data from Sprout Social Q1 2026 Pulse Survey (2,000+ respondents, US/UK/Australia), Journal of Business Research, and Hootsuite Social Trends 2026. Social media usage data from DataReportal Digital 2025, GWI, and platform-specific reporting. Music industry revenue data from RIAA. Publisher economics data from sources cited in Volumes 1 and 2. Projections are clearly labeled as such and represent the authors' analysis of structural trends, not quantitative forecasts.