
What’s Really Holding Back the AI Boom? Computing Power or Electricity?

  • Writer: FBD GROUPS
  • Mar 19
  • 4 min read

Updated: Apr 22



Why is the Conversation Shifting from “Computing Power” to “Electricity”? 


In the past, discussions around Artificial Intelligence (AI) typically centered on the capabilities of AI models, chip speeds, and overall computing power. By 2026, investors, policymakers, and energy executives are focused on a different question altogether: How much electricity does AI actually require, and can the grid keep up with it? 


According to the International Energy Agency (IEA), global data center power consumption is projected to reach approximately 945 terawatt-hours (TWh) by 2030. That is more than double current levels and would account for roughly 3% of total global electricity demand. At the same time, electricity usage from data centers is expected to grow at around 15% annually, more than four times faster than overall electricity demand across all other industries combined.
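As a rough sanity check, the two figures above are mutually consistent. The baseline of roughly 415 TWh for 2024 is an assumption drawn from IEA reporting, not a number stated in this article; treat the calculation as illustrative only.

```python
# Sanity check of the IEA projection cited above.
# baseline_twh (~415 TWh in 2024) is an assumed figure, not from this article.
baseline_twh = 415          # assumed 2024 data center consumption
projected_twh = 945         # IEA projection for 2030
growth_rate = 0.15          # ~15% annual growth cited above

multiple = projected_twh / baseline_twh
print(f"2030 vs. 2024: {multiple:.2f}x")   # more than double

# Compounding 15% per year over the six years 2024 -> 2030
compounded = baseline_twh * (1 + growth_rate) ** 6
print(f"15% CAGR from {baseline_twh} TWh: {compounded:.0f} TWh")
```

Under these assumptions, 15% annual growth compounds to roughly 960 TWh by 2030, close to the 945 TWh projection, so the two numbers describe the same trajectory.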


AI represents a new paradigm of power demand: high-capacity loads that must run 24/7 without interruption. In that context, a new consensus is taking shape. If data, algorithms, and computing power once defined the foundations of AI, then electricity has become the fourth pillar today.

  

Electricity is profoundly shaping the pace of AI development, dictating where data centers are located, and even influencing the competitive standing of nations.



How is Data Center Demand Straining the U.S. Power Grid?


As data center development accelerates, electricity demand is rising sharply. The pressure is already visible in the United States. Northern Virginia, home to the world’s largest cluster of data centers, has seen contracted power capacity approach 40 GW, according to Dominion Energy. That’s roughly equivalent to the consumption of 10 million homes. 
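The "10 million homes" comparison implies a per-home figure of about 4 kW, which is closer to a home's peak demand than its average load. The per-home number below is derived from the article's own figures, not an independently sourced value.

```python
# What the "40 GW ~ 10 million homes" equivalence implies per home.
# The derived ~4 kW is an implied figure; actual household loads vary widely.
contracted_gw = 40            # contracted capacity cited by Dominion Energy
homes_millions = 10           # household equivalence cited above

kw_per_home = (contracted_gw * 1e6) / (homes_millions * 1e6)
print(f"Implied load per home: {kw_per_home:.1f} kW")
```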


The grid, however, wasn’t built for this kind of surge. PJM Interconnection, one of the largest grid operators in North America, projects electricity demand to grow at an average rate of about 4.8% annually over the next decade. This signifies a fundamental shift in how the grid must be planned and operated.


The load capacity of transmission lines and substations, along with the limitations of distribution networks, has become the primary bottleneck in data center development. Aging infrastructure and protracted construction timelines have made these challenges even more intractable. Meanwhile, demand is moving much faster.


At the World Economic Forum in Davos, leaders across both the tech and energy sectors made the same point: AI may drive digital progress, but it also exposes the physical limits of the energy systems that support it. 



Are Tech Giants Becoming Energy Companies? 


Faced with energy shortages, tech giants have shifted their strategy. They are no longer mere consumers of electricity but are increasingly moving upstream to secure their own energy sources: 

  • Signing long-term power purchase agreements; 

  • Investing directly in energy generation; 

  • Building dedicated energy systems. 

The direction is becoming clearer at the policy level as well. The U.S. government has signaled that large-scale AI data centers cannot rely indefinitely on the public grid; they will need to secure their own energy. The implication is straightforward: AI competition is no longer just about compute capacity. It’s about access to energy.



How Is the Industry Responding?


  • Natural Gas 

Natural gas remains the most practical short-term option, often paired with renewable energy to support data center growth. But as supply chains tighten, gas turbines and related infrastructure are facing longer lead times, and project timelines are starting to stretch.  


  • Nuclear 

Nuclear energy has quickly moved to the forefront of the discussion. Microsoft has signed a 20-year power purchase agreement tied to the restart of Three Mile Island; Amazon has secured access to nearly 2 GW of nuclear capacity; Google is working with partners on small modular reactors (SMRs). Nuclear power offers the constant, reliable energy supply that AI infrastructure demands. Yet, due to protracted development cycles and complex permitting hurdles, it remains a strategic long-term play rather than a quick fix for current energy shortages. 

 

  • Renewables 

Solar and wind energy continue to expand rapidly as part of the energy mix for data centers. While renewable energy has seen remarkable growth in recent years, its inherent intermittency remains a significant drawback for grid stability. 



How Is the AI Industry Chain Being Redefined?


As electricity becomes a constraint, the focus of the AI industry is shifting. It’s no longer just about GPUs, servers, or cloud platforms. The feasibility of projects today often hinges on the timely delivery of transformers, power distribution equipment, energy storage systems, and cooling infrastructure. 


The International Energy Agency (IEA) estimates that by 2030, data centers could account for nearly half of the growth in U.S. electricity demand. In that environment, the competitive edge will belong to those who can deliver electricity reliably, efficiently, and at scale. 


From multiple perspectives, AI infrastructure is increasingly resembling an industrial system rather than just an information system. AI runs on electricity, and electricity depends on a deeply interconnected supply chain. 



AI Runs on Power, and Power Runs on Infrastructure 


In the AI era, a new benchmark is emerging: how much usable intelligence can be generated per kilowatt-hour of electricity. The computing race is no longer just about chips; it has evolved into a systemic competition centered on power and infrastructure. 
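One way to make "intelligence per kilowatt-hour" concrete is tokens generated per kWh of electricity. The energy-per-token figure below is a purely hypothetical assumption for illustration; only the joules-per-kWh conversion is exact.

```python
# Hypothetical illustration of an "intelligence per kWh" metric.
# joules_per_token is an assumed value, not a measured figure.
joules_per_token = 0.5          # assumed inference energy per generated token
joules_per_kwh = 3.6e6          # 1 kWh = 3.6 million joules (exact conversion)

tokens_per_kwh = joules_per_kwh / joules_per_token
print(f"Tokens per kWh: {tokens_per_kwh:,.0f}")
```

Whatever the real per-token figure turns out to be, the point stands: two operators with identical chips but different energy efficiency and energy access will produce very different amounts of usable intelligence.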

  

In the end, the question isn’t simply who has the most computing power. It’s this: Who can actually power it? 

 
 
 
