
ChatGPT has been showing ads since February 9.

Sam Altman called advertising in AI "uniquely unsettling" in May 2024. A "last resort," he said.

The last resort arrived in under two years.

Why? Because only 5% of ChatGPT’s 800 million users pay for a subscription. And OpenAI generated $20 billion in revenue in 2025 — against $9 billion in losses.

Anthropic raised $30 billion in February 2026. At a $380 billion valuation. On $9 billion in annualized revenue. That's a 42x revenue multiple.

We talk a lot about models, benchmarks, tokens per second.

Rarely about the only question that should obsess decision-makers: is anyone actually making money?

The short answer: no.

The long answer: Deutsche Bank estimates OpenAI’s cumulative losses at $143 billion before reaching profitability. In 2029. Maybe.

The paradox is structural.

Unlike cloud computing — where a single server handles thousands of clients at near-zero marginal cost — every AI query is compute-intensive. Every token generated requires GPU cycles. There are no classical economies of scale on inference.

The result: OpenAI spends $1.69 for every dollar earned.

The more you sell, the more you lose.
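When every unit sold carries a negative margin, scale multiplies the loss instead of diluting it. A toy illustration (the $1.69 ratio is the figure above; the revenue levels are hypothetical):

```python
COST_PER_DOLLAR = 1.69  # estimated spend per $1 of revenue, per the figure above

for revenue_bn in (5, 20, 80):
    # With costs above revenue, the shortfall grows linearly with sales.
    loss_bn = revenue_bn * (COST_PER_DOLLAR - 1)
    print(f"${revenue_bn}B revenue -> ${loss_bn:.1f}B lost")
```

Quadruple the revenue and, absent a change in unit economics, you quadruple the hole.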

And the price war is accelerating. The cost of one million GPT-4-equivalent tokens dropped from $20 in late 2022 to $0.40 in early 2026: a fiftyfold collapse. Across tasks, prices for equivalent capability are falling at a median rate of roughly 50x per year.

Meanwhile, DeepSeek delivers comparable performance at 1/30th the cost. Funded by a Chinese hedge fund that returned 56% in 2025. And isn’t even trying to be profitable.

Alibaba distributes Qwen for free under Apache 2.0 — the most downloaded open-source model in the world — and monetizes through its cloud. Zhipu AI (GLM) just completed the largest foundation model IPO in history in Hong Kong. $45 million in revenue, $350 million in losses. GLM-4.7-Flash is free. No quotas.

The Chinese ecosystem is attacking on every front: price, open source, volume.

When your competitors aren’t even trying to make money on the model itself, you have a structural problem.

Your $20/month ChatGPT subscription runs on the same business model as a gym membership. It works because most subscribers barely use it. Light users subsidize power users.

Except a gym counts on people quitting in February. AI is the opposite — the better the models get, the more people use them. More every day.
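A back-of-envelope sketch of that cross-subsidy, with every number hypothetical: a $20/month plan, an assumed one-cent inference cost per message, and a skewed usage mix:

```python
# Hypothetical flat-rate subscription economics: light users subsidize heavy users.
PRICE = 20.00        # monthly subscription price, $
COST_PER_MSG = 0.01  # assumed inference cost per message, $ (illustrative)

# Assumed usage mix: (share of subscribers, messages per month)
cohorts = [
    (0.70, 50),      # light users: barely touch it
    (0.25, 1_000),   # regular users
    (0.05, 10_000),  # power users: each costs $100/month to serve
]

def avg_margin(cohorts):
    """Average monthly profit per subscriber across the usage mix."""
    return sum(share * (PRICE - msgs * COST_PER_MSG) for share, msgs in cohorts)

print(f"margin today: ${avg_margin(cohorts):+.2f}/subscriber")

# If better models triple everyone's usage, the same price flips negative.
heavier = [(share, msgs * 3) for share, msgs in cohorts]
print(f"margin at 3x usage: ${avg_margin(heavier):+.2f}/subscriber")
```

Under these made-up numbers the mix is profitable today and loss-making at triple the usage: the gym model breaks precisely when the product gets good.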

95% of organizations have not measured any return on their generative AI investments, according to a study by the MIT Media Lab.

OpenAI launched ads. When a company with $20 billion in revenue turns to advertising, it means subscriptions aren't enough.

The only player getting rich in this story is Nvidia.

$57 billion in revenue in a single quarter. 86% of the AI chip market. $500 billion order backlog.

The shovel seller during the gold rush.

People often compare AI to AWS — which lost money for 3 years before becoming Amazon’s cash machine.

Except AWS has near-zero marginal costs. AI doesn’t.

Except AWS didn’t face an entire Chinese ecosystem — DeepSeek, Qwen, GLM — attacking simultaneously on price, open source, and volume.

What this means if you’re integrating AI into your operations:

Current API prices are likely subsidized. Budget for a 30–50% increase in the medium term.
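One way to stress-test that assumption against your own budget (the spend figure is a placeholder):

```python
# Project next year's API bill under the assumed 30-50% price increase.
current_monthly_spend = 4_000  # $, placeholder: substitute your actual API bill

for increase in (0.30, 0.50):
    projected_annual = current_monthly_spend * (1 + increase) * 12
    print(f"+{increase:.0%}: ${projected_annual:,.0f}/year")
```

If the higher figure breaks your budget, that's a signal to negotiate longer price locks or keep a fallback ready.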

Can your provider survive a venture capital drought? That’s a legitimate question when OpenAI is negotiating a $100 billion round to cover its losses.

And the alternative exists. A well-quantized local model — 32B at Q8 on a Mac Studio with 64GB of RAM — covers 80% of use cases at a fraction of the recurring cost. Not for everything. But as a safety net.
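The arithmetic behind that hardware claim, roughly; the overhead figure is a loose assumption, and real usage varies with runtime and context length:

```python
# Rough memory budget for a 32B-parameter model quantized to 8 bits (Q8).
params = 32e9               # 32 billion weights
bytes_per_param = 1.0       # Q8: about one byte per weight, ignoring metadata
weights_gb = params * bytes_per_param / 1e9  # ~32 GB just for the weights

overhead_gb = 8             # assumed: KV cache, activations, runtime buffers
total_gb = weights_gb + overhead_gb

print(f"~{total_gb:.0f} GB required; fits in 64 GB unified memory: {total_gb < 64}")
```

On a shared-memory machine like a Mac Studio, that estimate leaves headroom for the OS, which is why 64 GB is a comfortable floor for this class of model.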

AI will transform organizations. But lasting transformation doesn't depend on a single provider burning billions of dollars a year.

The question isn’t whether AI will transform the world. The question is: who will still be standing when the bill comes due?