AI: Look for Air Pockets, Not Bubbles

Plenty of ink has been spilled on whether we’re in the midst of an “AI bubble.” While many are debating this hot topic, we see a different concern on the horizon: air pockets. What’s the difference? Imagine you’re on a flight. An air pocket feels like hitting severe turbulence—your plane jolts a few feet but stays aloft. A bubble, on the other hand, is like getting knocked from the sky.

So what exactly are these air pockets? Temporary disruptions and disconnections where one part of the AI ecosystem races ahead, leaving others to catch up. Hitting an air pocket might mean facing a sudden shortfall in demand—for compute, chips, or data center capital expenditures—that would eventually realign after some near-term pain. On a smaller scale, this could emerge as disjointed growth, while at a larger scale, we could see significant capital cycles in certain parts of the value chain.

Overall, we view generative AI as a truly transformative technology—a view we’ve held since our first take in 2023. Put simply, the world has only begun to scratch the surface of its potential. The more immediate concerns stem from temporary imbalances in chips, power, and other resources—not systemic collapse.

We’ll Never Use So Little AI Again

In 1900, the world consumed around 12,000 terawatt-hours (TWh) of energy. Today, that figure has skyrocketed to around 190,000 TWh. Remarkably, the US power grid alone (even excluding energy sources like car engines) can now produce nearly as much as the entire world did 125 years ago.

Likewise, today’s AI usage is almost certainly the lowest level we’ll see in our lifetimes.

There are many ways to gauge future AI growth—whether it’s the number of queries or inferences, tokens processed, FLOPs (“floating point operations,” or computations), labor hours replaced, annual dollars spent on AI, GDP generated, or gigawatts of energy required. By any of these measures, AI potential is simply massive.

We’d put the opportunity for companies and the economy in the trillion-dollar range. McKinsey research suggests it could reach around $4.4 trillion from corporate use cases, and our findings align with that figure.[1] While much of today’s AI usage comes from hobbyists or free-tier users, we expect a significant shift as enterprise adoption grows. As AI becomes more embedded in the daily workflows of billions of workers globally, the value generated is set to soar.

Gauging AI Demand

The sheer volume of compute power being used to train models boggles the mind. We estimate that around 2.5 x 10^27 FLOPs were performed in 2025 for training alone. For context, the GPU in the laptop used to write this piece would take 23.5 million years to complete that many calculations. Consider that 23.5 million years ago, the Alps and Himalayas were still rising, and the first ape species were evolving from monkeys.
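
That figure can be sanity-checked with simple arithmetic. This is a minimal sketch, assuming (hypothetically—the text doesn't state it) a laptop GPU that sustains roughly 3.4 TFLOPs:

```python
# Back-of-envelope check of the "23.5 million years" claim.
# ASSUMPTION (not from the text): a typical laptop GPU sustains
# roughly 3.4e12 floating point operations per second (3.4 TFLOPs).
TOTAL_TRAINING_FLOPS = 2.5e27          # estimated 2025 frontier-training compute
LAPTOP_FLOPS_PER_SEC = 3.4e12          # assumed laptop GPU throughput
SECONDS_PER_YEAR = 365.25 * 24 * 3600

years = TOTAL_TRAINING_FLOPS / (LAPTOP_FLOPS_PER_SEC * SECONDS_PER_YEAR)
print(f"{years / 1e6:.1f} million years")  # ~23 million years
```

A faster or slower assumed GPU shifts the answer proportionally, but the order of magnitude—tens of millions of years—holds.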

What’s more, over the past 15 years, the computational intensity of frontier models has surged at an astonishing rate of 4.6x/year (Display). While we still expect rapid growth from here, we also foresee potential resource constraints on the horizon.
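
To see how striking that 4.6x/year figure is, a short sketch of the compounding arithmetic (the 15-year window is taken from the text; everything else follows from it):

```python
import math

GROWTH_PER_YEAR = 4.6  # frontier-model training compute multiple per year
YEARS = 15             # roughly 2010 to 2025

# How quickly does training compute double at this rate?
doubling_time_months = 12 * math.log(2) / math.log(GROWTH_PER_YEAR)
# What does the rate compound to over the full window?
cumulative = GROWTH_PER_YEAR ** YEARS

print(f"doubling time: {doubling_time_months:.1f} months")        # ~5.5 months
print(f"growth over {YEARS} years: {cumulative:.1e}x")            # ~8.7e9x
```

In other words, a doubling roughly every five and a half months, and a cumulative increase of billions-fold since 2010.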

Chart: Training compute for frontier AI models has grown 4.6x/year since 2010

Inferencing could also grow sharply, as more people ask AI to perform more—and more complex—tasks. Depending on how AI capabilities evolve, these models could save hundreds of billions (or even trillions) of labor hours per year, justifying trillions of dollars in annual corporate spending. For some perspective, if AI technology is deployed globally over the next decade, eventually saving 20%–25% of workers’ time, that could justify annual enterprise spending of over $10 trillion, in our view. While those time savings seem a little optimistic right now, in another three years, they could look conservative. It remains to be seen just how much of that “willingness-to-pay” makes its way into revenue streams. 
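
The $10 trillion figure follows from simple arithmetic. Here is a rough sketch, assuming (illustratively—the figure is not stated in the text) global labor compensation of about $50 trillion per year:

```python
# Rough sketch of the implied willingness-to-pay arithmetic.
# ASSUMPTION (illustrative, not from the text): global labor
# compensation of roughly $50 trillion per year.
GLOBAL_LABOR_COMP_T = 50.0                    # $ trillions per year, assumed
TIME_SAVED_LOW, TIME_SAVED_HIGH = 0.20, 0.25  # share of work time saved

wtp_low = GLOBAL_LABOR_COMP_T * TIME_SAVED_LOW
wtp_high = GLOBAL_LABOR_COMP_T * TIME_SAVED_HIGH
print(f"implied willingness-to-pay: ${wtp_low:.1f}T-${wtp_high:.1f}T per year")
```

Under that assumption, 20%–25% time savings maps to roughly $10 trillion–$12.5 trillion of labor value per year, consistent with the "over $10 trillion" estimate above.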

Demand could still become a constraint, and we remain attuned to stories of failed AI adoption. Yet we see increasing evidence of corporate AI uptake, along with more signs of its value-add from chief information officers’ points of view (Display). To put it plainly, if AI demand caps out at $1 trillion per year or less, then yes, we’re in a bubble. You can’t spend hundreds of billions or even trillions of dollars annually while recouping your investment with a $1 trillion revenue stream. Yet, we think reasonable estimates of AI’s value-add and the implied willingness-to-pay exceed that hurdle.

Chart: Enterprise AI is just hitting an inflection point

Hurdles to AI Adoption

Chip production and power generation/transmission are the two likeliest hurdles to AI adoption. All those calculations rely on high-powered chips, but producing those chips requires some of the world’s most advanced manufacturing technology, which has recently faced scaling challenges.

In the US at least, the emerging bottleneck is the electricity to power those chips in data centers. Expanding power-generation infrastructure remains a major challenge for the AI boom. Despite rising demand, large-scale utility projects can only be greenlit and built so quickly—and those projects are being penciled in rapidly. BloombergNEF’s December 2025 update raised its projection of US data center power needs by 36% versus its April forecast, implying 10% annual growth in data center power demand over the next decade, which could lead to around $3 trillion in data center capex (Display). Given that data centers currently represent a small portion of total power demand, that growth could account for about one-tenth of the overall increase in power production, which seems reasonable.
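
A quick sanity check of what 10% annual growth compounds to over a decade (both figures are from the projection cited above):

```python
# Sanity check: what does 10% annual growth in data center power
# demand compound to over a decade?
ANNUAL_GROWTH = 0.10
YEARS = 10

multiple = (1 + ANNUAL_GROWTH) ** YEARS
print(f"demand multiple after {YEARS} years: {multiple:.2f}x")  # ~2.59x
```

That is, data center power demand roughly two-and-a-half to three times today’s level by the mid-2030s.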

Chart: Data centers are expected to proliferate rapidly

Unfortunately, these projects typically take about seven years to complete, and we may need more power sooner to keep up with demand. To address this, fresh datacenter projects are increasingly turning to “behind the meter” power options that produce electricity onsite. Bernstein Research estimates that this could account for 20%–25% of new capacity by 2027–2028 and reach 40% by the end of the decade, mostly in the US, where power constraints loom largest.

Beyond chips and power, data centers face other potential constraints, including their significant water needs, their demands for high-bandwidth memory chips, and social and political pressures. We’re already witnessing pushback against the proliferation of AI-generated media (so-called “AI slop”) as well as concerns about the site selection for new datacenter projects due to their impact on local water supplies and electricity costs. Fears of job displacement could lead to even stronger public resistance.

A First Look for Investors

Given these dynamics, how can investors position accordingly? Look for companies that can address bottlenecks—particularly those that enjoy a competitive advantage in tackling them.

In terms of risks, we’re wary of potential coordination issues among the various elements of the ecosystem. While each element faces its own technological/production hurdles and supply dynamics, they remain intertwined.

We’ve been navigating a path initially shaped by AI models’ capabilities, then by the surge in chip production. Now, power limits are creating turbulence. As each part of the system advances at its own pace, the pressures build and shift. The real risk is sudden dislocation, with one element racing ahead while the others lag behind. This might look like building models that can’t be trained, producing chips that can’t be powered, investing in datacenters without assured supply, or adding power without demand. The journey isn’t seamless, but it’s still within reach.

Ultimately, we expect at least one capital cycle to play out, as we’ve seen in other major technological rollouts throughout history—railroads, the power grid, automobiles, and the internet (Display). History suggests that the pieces of the value chain that carry high fixed costs, long lead times, and poor potential for repurposing are most prone to overbuilds, as firms aggressively invest to capture future demand. 

Chart: The lessons from past boom-bust industrial cycles

With so many competitors vying for success, overall demand may not be sufficient for all players to earn high returns on their investments, leading to price collapses. With that in mind, we see some risks in the development of large language models, which may ultimately become an oligopolistic market. We share similar concerns about the build-outs of data center capacity, though responsible developers will only proceed with construction once they have secured tenants and reliable power sources. Investors should keep their industry-analyst hats on, carefully considering the barriers to entry in each market, the required levels of aggregate future demand, and the potential end-state industry structure.

Authors
Matthew D. Palazzolo
Senior National Director, Investment Insights—Investment Strategy Group
Christopher Brigham
Senior Research Analyst—Investment Strategy Group

[1] https://www.mckinsey.com/mgi/media-center/ai-could-increase-corporate-profits-by-4-trillion-a-year-according-to-new-research

The views expressed herein do not constitute research, investment advice or trade recommendations, do not necessarily represent the views of all AB portfolio-management teams and are subject to change over time.
