Is the AI Boom a Rational Market or a Speculative Bubble?
- Tom Hansen
- Aug 13
- 5 min read

A market caught between promise and pressure
Artificial intelligence has become the gravitational center of the global tech economy. Market valuations are rising faster than even the most exuberant forecasts can justify. Capital floods into model labs and infrastructure providers with unprecedented speed. Nvidia briefly reached a $4 trillion valuation. OpenAI is reported to command figures north of $300 billion. Venture funding for AI now outstrips what entire other industries attract.
The narrative is breathless. Yet beneath the surface, core unit economics look broken. Many AI leaders spend more on compute than they earn from customers. Margins erode as usage scales. And the infrastructure costs - electricity, chips, and talent - are not falling along a Moore's Law curve; they are running into the hard limits of physics. This raises the central question: are we in a moment of rational pricing for a general-purpose technology, or in the early stages of a speculative correction?
Anatomy of exuberance: how bubbles form and burst
To make that judgment, we need a disciplined lens. Economist Hyman Minsky outlined the lifecycle of speculative bubbles in five phases: displacement, boom, euphoria, profit-taking and panic. These stages are not just psychological. They manifest in traceable market behaviors: invented metrics, declining attention to fundamentals, leveraged bets, herd behavior and narrative dominance over earnings. This lens provides the diagnostic framework for the rest of this analysis.
The displacement moment: from Netscape to ChatGPT
Every bubble begins with a rupture in expectations. In 1995, it was Netscape’s IPO and the birth of the commercial internet. In 2022, it was the public launch of ChatGPT. This shift re-priced AI’s perceived timeline by a decade and triggered capital flows at a pace and scale the web era never reached. In less than 18 months, funding for AI startups exploded. Over $100 billion was raised in 2024 alone. Unlike the web, this phase occurred within an already digital economy, which meant the reflex to allocate capital was faster, larger, and more concentrated.
Revenue multiples on shaky ground
A defining feature of this phase is the application of high software-style revenue multiples to businesses with fundamentally different cost structures. AI model providers are often valued at 25 to 30 times revenue despite facing rising marginal costs. Gross margin after compute — essentially what’s left once the cloud bill is paid — is a more sober metric. In many cases it is negative or flatlining.
The assumption that scale leads to efficiency is challenged by the nature of inference. Every user interaction burns compute, and the capacity to serve additional demand is capped by GPU availability and energy supply. In this context, revenue growth can deepen losses. That is not a startup problem. It is a structural pricing problem embedded in the economics of model usage.
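A minimal sketch, using purely hypothetical per-request prices and compute costs, makes both points concrete: gross margin after compute is simply revenue minus the compute bill, expressed as a share of revenue, and when each request costs more to serve than it earns, growth only scales the loss.

```python
# Hypothetical unit-economics sketch: gross margin after compute.
# All figures are illustrative assumptions, not reported numbers.

def gross_margin_after_compute(revenue: float, compute_cost: float) -> float:
    """Fraction of revenue left once the compute bill is paid."""
    return (revenue - compute_cost) / revenue

# Assume a metered product priced at $0.0020 per request while each
# request consumes $0.0026 of GPU time, power and serving overhead.
price_per_request = 0.0020
cost_per_request = 0.0026

for monthly_requests in (10_000_000, 100_000_000, 1_000_000_000):
    revenue = monthly_requests * price_per_request
    compute = monthly_requests * cost_per_request
    margin = gross_margin_after_compute(revenue, compute)
    print(f"{monthly_requests:>13,} requests: revenue ${revenue:>11,.0f}, "
          f"compute ${compute:>11,.0f}, margin {margin:+.0%}, "
          f"monthly loss ${compute - revenue:>9,.0f}")
```

Running the loop keeps the margin pinned near minus 30 percent at every volume tier while the absolute monthly loss grows a hundredfold. That is the structural pricing problem in miniature.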
A market structured for consolidation, not competition
This pricing distortion is reinforced by a structural concentration that rarely gets discussed outside financial circles. The AI market is not competitive in any classical sense. It is a highly concentrated oligopoly. Microsoft, Amazon, Google, Meta and Nvidia control most of the infrastructure, capital and distribution.
This concentration creates a paradox. On one hand, the dominant firms are resilient. On the other, the system is fragile. Bargaining power is asymmetrically distributed. Smaller firms rely on platforms they cannot disrupt. And the dominant actors shape the field through bundling, closed data loops and integration into core enterprise infrastructure. Antitrust may slow the trend, but history suggests it rarely reverses structural dominance.
The upstream trilemma: hardware, power and policy
Every AI interaction begins not with code, but with electrons. As demand scales, the sector faces a physical trilemma. Chips, energy and regulatory throughput are becoming upstream chokepoints. GPUs are scarce and production is bottlenecked through TSMC. Energy demand is surging. Data centers may consume 12 percent of U.S. electricity by 2028. In some regions, grid capacity and water access already limit deployment.
This means revenue is now dependent on physical and political variables, not just on software progress. Permitting cycles, power contracts and hardware availability sit upstream of monetization. The economics of AI are increasingly coupled to infrastructure constraints rather than user growth curves.
Friction at the edge: regulation, chips and geopolitics
The external environment is not neutral. Compliance regimes, geopolitical tensions and regulatory scrutiny are reshaping the landscape. In Europe, the AI Act imposes escalating fixed costs, particularly for general-purpose models. In the United States, antitrust pressure is intensifying. And the US-China chip conflict is fragmenting supply chains, driving up costs and introducing discontinuities into global infrastructure. This friction raises the cost of doing business and complicates scaling strategies. Markets that assume global expansion may find themselves slowed or splintered by political and legal realities.
Who owns the margin?
The question of value capture has shifted from models to infrastructure. Durable margins increasingly accrue to players who control power, chips, or distribution rather than to those who build the models themselves. Microsoft can absorb OpenAI’s costs and monetize through Azure and Office. Nvidia profits from the hardware layer regardless of which model succeeds. In contrast, model labs compete on quality but often lack leverage. Unless they secure proprietary distribution, differentiated contracts or defensible integration paths, they risk becoming commoditized. Scale without pricing power is not advantage. It is exposure.
Three plausible futures
Strategically, three outcomes are now in play.
The first is a soft landing, where valuations compress modestly while revenues catch up. This scenario depends on steady enterprise uptake, manageable cost growth and benign macroeconomic conditions. It is plausible, but fragile.
The second is a structural squeeze, where growth is capped by energy, hardware and regulatory limits. Valuations plateau, capital rotates out, and a sector known for hyper-expansion enters a long operational grind.
The third is a contained correction, in which speculative-tier companies are repriced or fail, but core infrastructure players remain strong. This scenario mirrors past sector rotations and is arguably the most probable: capital leaves the narrative layer but consolidates around durable function.
Each path comes with signals. These include widening spreads between price-to-revenue and price-to-earnings multiples, rising GPU lead times, stalling enterprise productivity gains, and policy actions affecting infrastructure or compliance. Leaders who ignore these indicators may mistake friction for delay, or volatility for validation.
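As an illustration of the first signal, here is a small sketch (company names and figures are hypothetical placeholders, not real data) that computes a price-to-revenue multiple alongside a price-to-earnings multiple; the wider the gap, or the longer the earnings multiple stays undefined, the more the price rests on narrative rather than earnings.

```python
# Illustrative tracker for the spread between price-to-revenue and
# price-to-earnings multiples.
# Names and figures are hypothetical placeholders, not real data.

def price_to_revenue(market_cap: float, revenue: float) -> float:
    return market_cap / revenue

def price_to_earnings(market_cap: float, earnings: float) -> float | None:
    # A loss-making business has no meaningful earnings multiple.
    return market_cap / earnings if earnings > 0 else None

watchlist = {
    # name: (market cap $, trailing revenue $, trailing net income $)
    "HypotheticalModelLab": (300e9, 10e9, -4e9),
    "HypotheticalChipVendor": (4e12, 130e9, 70e9),
}

for name, (cap, revenue, net_income) in watchlist.items():
    p_r = price_to_revenue(cap, revenue)
    p_e = price_to_earnings(cap, net_income)
    if p_e is None:
        print(f"{name}: {p_r:.0f}x revenue, earnings multiple undefined (loss-making)")
    else:
        print(f"{name}: {p_r:.0f}x revenue vs {p_e:.0f}x earnings "
              f"(spread {p_e - p_r:.0f}x)")
```

Tracked over time rather than as a snapshot, a widening spread is one of the quantitative tells that the story is outrunning the fundamentals.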
Navigating the fog: operator and investor cues
What matters now is not general optimism, but operational realism. For investors, that means moving beyond storytelling and tracking hard indicators: gross margin after compute, power purchase agreements, contract terms and energy constraints. For operators, it means FinOps discipline built for metered usage, renegotiated cloud dependencies, and governance treated not as a constraint but as a monetizable feature.
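As one example of that FinOps discipline, the sketch below derives a serving cost per thousand tokens from an assumed GPU hourly price, throughput and overhead multiplier, then compares it with an assumed metered price; every parameter is a placeholder, not a quoted rate.

```python
# Illustrative FinOps check for metered usage: what do 1,000 output tokens
# cost to serve, and how does that compare with the price charged for them?
# Every parameter below is an assumption for the sketch, not a quoted rate.

def cost_per_1k_tokens(gpu_hour_price: float,
                       tokens_per_second: float,
                       overhead_multiplier: float = 1.3) -> float:
    """Serving cost per 1,000 tokens, padding the raw GPU rental price with
    an overhead multiplier for power, networking, idle capacity and ops."""
    tokens_per_hour = tokens_per_second * 3600
    return (gpu_hour_price * overhead_multiplier) / tokens_per_hour * 1000

serve_cost = cost_per_1k_tokens(gpu_hour_price=2.50, tokens_per_second=100)
metered_price = 0.0100  # assumed price charged per 1,000 tokens

print(f"cost to serve 1,000 tokens: ${serve_cost:.4f}")
print(f"price charged per 1,000 tokens: ${metered_price:.4f}")
print(f"gross margin after compute: {(metered_price - serve_cost) / metered_price:+.0%}")
```

With these assumed inputs the metered price clears the compute bill by roughly ten percent before training, talent or go-to-market costs, and modest shifts in GPU pricing or throughput flip the sign. That is exactly why the calculation belongs in routine review rather than in a pitch deck.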
The successful AI firm of this cycle will not be defined solely by model quality, but by cost discipline, margin architecture and the ability to manage upstream risk as a core competency. That is not a return to austerity. It is a return to precision.
Conclusion: endurance, not exuberance
The arc of AI technology remains intact. Its impact on knowledge work, automation and productivity is unfolding. But the capital story is already changing. Exuberance is being replaced by endurance. Hype is giving way to infrastructure, physics and contracts. Scarcity is no longer about ideas, but about electrons, permits and chips.
Valuations will correct. Narratives will evolve. The deeper truth is this: markets do not reward imagination alone. They reward execution that survives constraint. In the coming phase, watching megawatts and gross margins will tell us more about AI’s future than any headline.