---
title: "Is AI Really Eating the World? [1/2]"
description: "Hyperscalers spend $400B on AI, API prices drop 97%, and DeepSeek builds frontier models for $500M. Value is flowing to applications, not model providers."
date: 2025-11-23
updated: 2026-05-04
author: "Philipp D. Dubach"
categories:
  - "AI"
keywords:
  - "AI model commoditization"
  - "generative AI platform shift"
  - "AI value chain where value accrues"
  - "hyperscaler AI capex spending"
  - "enterprise AI adoption rate"
  - "LLM pricing decline"
  - "frontier AI model cost"
  - "AI infrastructure investment"
  - "open-source AI models"
  - "DeepSeek frontier model"
  - "AI competitive moat"
  - "technology platform cycle"
  - "model layer vs application layer"
  - "AI consulting revenue"
  - "AI capex sustainability"
  - "enterprise AI deployment"
  - "labor-augmenting AI"
  - "AI integration services"
  - "generative AI investment thesis"
  - "LLM API cost reduction"
type: "Analysis"
canonical_url: "https://philippdubach.com/posts/is-ai-really-eating-the-world-1/2/"
source_url: "https://philippdubach.com/posts/is-ai-really-eating-the-world-1/2/index.md"
content_signal: search=yes, ai-input=yes, ai-train=yes
---

# Is AI Really Eating the World? [1/2]

*Philipp D. Dubach · Published November 23, 2025 · Updated May 4, 2026*


## Key Takeaways

- Hyperscalers are spending $400B on AI infrastructure in 2025, more than global telecom capex, while API pricing has dropped 97% since GPT-3, pointing to rapid commoditization.
- 92% of developers now use AI coding tools, but 40% of CIOs do not expect production AI agent deployment until 2026 or later, showing adoption is deep in pockets but shallow overall.
- Consulting firms like Accenture are booking $3B+ in GenAI revenue, but the money comes from integration and process redesign, not from the models themselves.
- DeepSeek proved a frontier model can be built for $500M, collapsing the assumption that only the richest labs can compete at the capability frontier.


---

In August 2011, Marc Andreessen wrote ["Why Software Is Eating the World"](https://a16z.com/why-software-is-eating-the-world/), an essay about how software was transforming industries, disrupting traditional businesses, and revolutionizing the global economy. Recently, [Benedict Evans](https://www.ben-evans.com/benedictevans/2014/1/18/a16z), a former a16z partner, gave a presentation on the generative AI platform shift three years after ChatGPT's launch. His argument, in short:

> we know this matters, but we don't know how.

In this article I will try to explain why I find his framing fascinating but incomplete, and why the evidence points toward AI model commoditization rather than durable competitive advantages at the model layer. Evans structures technology history in cycles. Every 10-15 years, the industry reorganizes around a new platform: [mainframes](https://en.wikipedia.org/wiki/Mainframe_computer) (1960s-70s), PCs (1980s), web (1990s), smartphones (2000s-2010s). Each shift pulls all innovation, investment, and company creation into its orbit. Generative AI appears to be the next platform shift, or it could break the cycle entirely. The range of outcomes spans from "just more software" to a single unified intelligence that handles everything. The pattern recognition is smart, but I think the current evidence points more clearly toward commoditization than Evans suggests, with value flowing up the AI value chain to applications rather than to model providers.

The hyperscalers are spending historic amounts on AI infrastructure. In 2025, [Microsoft, Google, Amazon, and Meta will invest roughly $400 billion](https://techblog.comsoc.org/2025/11/01/ai-spending-boom-accelerates-big-tech-to-invest-invest-an-aggregate-of-400-billion-in-2025-more-in-2026/) in AI capex, more than global telecommunications capex. Microsoft now spends over 30% of revenue on capex, double what Verizon spends. What has this produced? Models that are simultaneously more capable and less defensible. When ChatGPT launched in November 2022, OpenAI had a massive quality advantage. Today, dozens of models cluster around similar performance. [DeepSeek proved that anyone with $500 million can build a frontier AI model](https://newsletter.semianalysis.com/p/deepseek-debates). LLM pricing has collapsed. [OpenAI's API pricing has dropped by 97% since GPT-3's launch](https://techcrunch.com/2025/08/08/openai-priced-gpt-5-so-low-it-may-spark-a-price-war/), and every year brings an order of magnitude decline in inference cost.
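As a rough sanity check on how that 97% drop compounds (the five-year window is my assumption, spanning roughly GPT-3's 2020 pricing to GPT-5's 2025 pricing; the figures are illustrative, not from the sources):

```python
# Back-of-the-envelope: what annual price decline does a 97% total
# drop imply over ~5 years? Assumes a constant compounding rate.
total_retained = 1 - 0.97        # 3% of the original price survives
years = 5                        # assumed window, GPT-3 -> GPT-5 pricing
annual_factor = total_retained ** (1 / years)

print(f"price retained per year: {annual_factor:.2f}")  # ~0.50
print(f"implied annual decline:  {1 - annual_factor:.0%}")  # ~50%
```

Note that a strict order-of-magnitude (10x) decline every year would compound to well over 97% across five years, so the two figures in the paragraph above describe somewhat different slopes; either way, the direction is steeply down.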

*Related: [AI Capex Arms Race: Who Blinks First?](https://philippdubach.com/posts/ai-capex-arms-race-who-blinks-first/)*

Now, $500 million is still an enormous barrier. Only a few dozen entities globally can deploy that capital with acceptable risk. [GPT-4's performance on complex reasoning tasks](https://arxiv.org/abs/2303.08774), [Claude's extended context windows of up to 200,000 tokens](https://www.anthropic.com/news/claude-2-1), [Gemini's multimodal capabilities](https://blog.google/technology/ai/google-gemini-ai/): these represent genuine breakthroughs. But the economic moat isn't obvious to me (yet). Open-source AI models from Meta and Mistral keep narrowing the gap, and if the model layer commoditizes fully, the competitive advantage shifts to data, distribution, and integration.

Evans uses an extended metaphor: automation that works disappears. In the 1950s, automatic elevators were AI. Today they're just elevators. As [Larry Tesler](https://en.wikipedia.org/wiki/Larry_Tesler) noted in 1970,

> AI is whatever machines can't do yet. Once it works, it's just software.

The question: will LLMs follow this pattern, or is this different?

*Related: [The Most Expensive Assumption in AI](https://philippdubach.com/posts/the-most-expensive-assumption-in-ai/)*

Current enterprise AI deployment shows clear winners but also real constraints. Software development has seen massive adoption, with [GitHub reporting that 92% of developers now use AI coding tools](https://github.blog/news-insights/research/survey-ai-wave-grows/). Marketing has found immediate uses generating ad assets at scale. Customer support has attracted investment, though with the caveat that LLMs produce plausible answers, not necessarily correct ones. Beyond these areas, the enterprise AI adoption rate looks scattered. [Deloitte surveys from June 2025 show that roughly 20% of U.S. consumers use generative AI chatbots daily](https://www.deloitte.com/us/en/insights/industry/telecommunications/connectivity-mobile-trends-survey.html), with another 34% using them weekly or monthly. Enterprise deployment is further behind. [McKinsey data shows most AI "agents" remain in pilot or experimental stages](https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-state-of-ai). A quarter of CIOs have launched something. Forty percent don't expect production deployment until 2026 or later.

But here's where I think Evans' "we don't know" approach misses something important. Consulting firms are booking billions in AI integration contracts right now. [Accenture alone expects $3 billion in GenAI bookings for fiscal 2025](https://www.crn.com/news/ai/2025/accenture-s-3b-ai-bet-is-paying-off-inside-a-massive-transformation-fueled-by-advanced-ai). The revenue isn't coming from the models. It's coming from integration projects, change management, and process redesign. The pitch is simple: your competitors are moving on this, and you can't afford to wait. If your competitors invest and you don't, you risk being left behind. If everyone invests and AI delivers modest gains, you've maintained relative position. If everyone invests and AI delivers nothing, you've wasted money but haven't lost competitive ground.

Evans notes that cloud adoption took 20 years to reach 30% of enterprise workloads and is still growing. New technology platform cycles always take longer than advocates expect. His most useful analogy is spreadsheets. [VisiCalc](https://en.wikipedia.org/wiki/VisiCalc) in the late 1970s transformed accounting: if you were an accountant, you had to have it; if you were a lawyer, you thought "that's nice for my accountant." ChatGPT today has the same dynamic. Certain people with certain jobs find it immediately essential. Everyone else sees a demo and doesn't know what to do with the blank prompt. This is right, and it suggests we're early. But it doesn't tell us where value will accumulate in the AI value chain.
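The "can't afford to wait" logic has the shape of a simple two-player game, which a toy payoff table makes explicit. This is my illustrative sketch of the scenarios described above, with labels rather than real payoffs:

```python
# Toy payoff table for the consulting pitch: whatever rivals do,
# not investing carries the worse downside risk. Outcomes are
# illustrative labels, not data from any survey.
payoffs = {
    # (you_invest, rivals_invest): outcome for you
    (True,  True):  "relative position held (minus the capex)",
    (True,  False): "possible edge, if AI delivers",
    (False, True):  "risk of being left behind",
    (False, False): "status quo preserved",
}

def outcome(you_invest: bool, rivals_invest: bool) -> str:
    """Look up the qualitative outcome for your firm."""
    return payoffs[(you_invest, rivals_invest)]

print(outcome(False, True))  # risk of being left behind
```

The asymmetry is the point: the downside of sitting out while rivals invest dominates the downside of investing in something that delivers nothing, which is why the pitch lands regardless of whether AI ultimately pays off.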

The standard pattern for deploying a new technology goes in stages:

1. Absorb it: make it a feature, automate obvious tasks.
2. Innovate: create new products, unbundle incumbents.
3. Disrupt: redefine what the market is.

We're mostly in stage one. Stage two is happening in pockets: [Y Combinator's recent batches are overwhelmingly AI-focused](https://www.ycombinator.com/companies), betting on thousands of new companies unbundling existing software, with startups attacking specific enterprise problems like converting COBOL to Java or reconfiguring telco billing systems. Stage three remains speculative. From an economic perspective, there's the automation question: do you do the same work with fewer people, or more work with the same people? This is the standard economics of [labor-augmenting technical change](https://en.wikipedia.org/wiki/Technological_change#Labor-augmenting_technological_change), and it probably holds here too. Companies whose competitive advantage was "we can afford to hire enough people to do this" face real pressure. Companies whose advantage was unique data, customer relationships, or distribution may get stronger.

_Continue reading [Is AI Really Eating the World? AGI, Networks, and Value [2/2]](/posts/is-ai-really-eating-the-world-agi-networks-value-2/2/)_



---

## Frequently Asked Questions


### Is generative AI a true platform shift like mobile and cloud?

Based on historical technology platform cycles, generative AI appears to be the next platform shift following mainframes (1960s-70s), PCs (1980s), web (1990s), and smartphones (2000s-2010s). However, the range of outcomes spans from "just more software" to something more fundamental. The hyperscalers are spending $400 billion in 2025 on AI infrastructure, more than global telecommunications capex, which signals they believe this is a platform shift.


### Are AI models becoming commoditized?

Evidence strongly suggests yes. When ChatGPT launched in November 2022, OpenAI had a massive quality advantage. Today, dozens of models cluster around similar performance. DeepSeek proved that $500 million can build a frontier model, and OpenAI's API pricing has dropped 97% since GPT-3's launch, with LLM inference costs declining by an order of magnitude annually. Open-source AI models from Meta and Mistral are further accelerating the commoditization trend.


### Where will value accrue in the AI value chain?

Current evidence points toward value flowing up the stack to the application layer rather than to model providers. Consulting firms are booking billions in AI integration contracts, but the revenue comes from enterprise deployment, change management, and process redesign rather than from the models themselves. Accenture alone expects $3 billion in GenAI bookings for fiscal 2025. This mirrors historical patterns where platform and application companies ultimately capture more value than infrastructure providers.


### What does enterprise AI adoption actually look like?

Software development has seen massive adoption with 92% of developers using AI coding tools. Marketing generates ad assets at scale, and customer support has attracted investment. Beyond these areas, enterprise AI adoption is scattered. McKinsey data shows most AI agents remain in pilot stages, with 40% of CIOs not expecting production deployment until 2026 or later. Companies spent $37 billion on generative AI in 2025, up from $11.5 billion in 2024.


### How much are hyperscalers spending on AI infrastructure?

In 2025, Microsoft, Google, Amazon, and Meta will invest roughly $400 billion in AI infrastructure, more than global telecommunications capex. Microsoft now spends over 30% of revenue on capex, double what Verizon spends. Goldman Sachs projects cumulative hyperscaler capex from 2025 through 2027 will reach $1.15 trillion.


### Will open-source AI models replace proprietary ones?

Open-source AI models are closing the gap rapidly. DeepSeek proved a frontier model can be built for $500 million, and its V3 model was trained for just $5.5 million on 2,048 GPUs. The performance gap between open-source and proprietary models shrank significantly in 2025. However, proprietary models still lead on complex reasoning tasks, and the competitive moat may lie in integration, enterprise deployment, and application-layer services rather than in model capabilities alone.



