Opinion
AI doesn’t need more power, it needs a smarter grid
AI's impact on the energy grid could be transformative, allowing us to harness vast amounts of unused or idle capacity. Image: REUTERS/Norlys Perez
- Stanford research reveals advanced economy grids operate at 30% utilization, leaving vast capacity idle due to outdated coordination systems.
- A 1% improvement in system flexibility could unlock 100 GW in the US alone, equivalent to $500 billion in avoided infrastructure.
- Portland General Electric's partnership demonstrates hundreds of megawatts can be accelerated years ahead through latent capacity activation.
As nations compete to power the AI revolution, a counterintuitive strategy is emerging: the infrastructure bottleneck constraining technological leadership is being solved not through copper and construction, but through code. What’s known as flexible grid optimization could double effective capacity faster than any building programme, with profound implications for competitiveness and climate goals.
When Satya Nadella acknowledged that Microsoft has GPU clusters sitting idle – depreciating assets waiting for power that may not arrive for years – he crystallized the defining constraint of the AI era. Sam Altman's assessment that OpenAI requires a gigawatt of power daily, roughly 20 times what the entire United States added in new generation capacity last year, reveals the scale of misalignment between AI ambitions and energy reality.
This is no longer simply a utility planning issue. It's a question of national competitiveness and human prosperity.
Data centres are driving growth in electricity demand
The conventional narrative frames this as an infrastructure deficit requiring massive capital deployment and decade-long construction timelines. Yet Stanford researchers studying grid utilization patterns across advanced economies have identified a paradox that reframes the entire challenge: these sophisticated electricity systems operate at approximately 30% utilization, with two-thirds of installed capacity sitting idle most hours.
The grid reaches capacity constraints for perhaps 100 hours annually, under rare peak conditions. The rest of the time, vast capacity goes unused.
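The arithmetic behind that claim is simple enough to verify. As a back-of-envelope sketch, using the illustrative figures above rather than any utility's actual data:

```python
# Back-of-envelope sketch of the latent-capacity argument.
# Figures are the article's illustrative round numbers, not utility data.

HOURS_PER_YEAR = 8760

avg_utilization = 0.30   # grids run at roughly 30% of installed capacity on average
peak_hours = 100         # hours per year the grid is actually constrained

idle_share = 1 - avg_utilization
constrained_share = peak_hours / HOURS_PER_YEAR

print(f"Capacity idle on average: {idle_share:.0%}")        # ~70%
print(f"Share of the year at peak: {constrained_share:.1%}")  # ~1.1%
```

In other words, the system is genuinely constrained for about 1% of the hours in a year, while roughly two-thirds of its capacity sits idle the rest of the time.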
Diverging strategies in the global AI race
The world is witnessing fundamentally different approaches to solving the AI power challenge, each reflecting distinct institutional capabilities and strategic priorities.
China deployed nearly 550 GW of new power capacity last year while the United States added 53 GW. This rapid deployment model leverages state coordination, centralized decision-making and the ability to fast-track major infrastructure projects. For an economy building out its grid to serve expanding urban populations and industrial growth, this infrastructure-first approach makes strategic sense.
Advanced Western economies face a different context. Their mature grids already serve developed populations, but patterns of use reveal massive untapped potential. Managing flexibility for less than 100 hours annually could unlock 100 GW of effective capacity nationwide – doubling the grid without doubling the infrastructure.
The critical insight is recognizing which strategy aligns with institutional strengths and infrastructure realities. For countries with advanced grid coordination capabilities, market structures that price flexibility and regulatory frameworks that enable rapid innovation, the pathway of intelligent optimization offers compelling advantages. These include deployment timelines measured in months rather than decades, dramatically lower capital requirements and natural alignment with decarbonization goals.
From energy scarcity to energy abundance
This shift means using AI to make the grid itself more intelligent, rather than relying only on making AI workloads flexible. It's a challenge of optimization, rather than scarcity.
Power systems evolved around rigid patterns, sized for worst-case peak hours that occur barely 1% of the time, leaving massive installed capacity underused the rest of the year. The opportunity lies not in changing when data centres consume power, but in deploying AI to orchestrate every component of the system in real time.
GridCARE's partnership with AI infrastructure developers and Portland General Electric demonstrates the concept at scale. By deploying predictive AI models that forecast renewable output and demand hours in advance, coordinating batteries and backup systems strategically, and dynamically managing loads across the grid, PGE has accelerated hundreds of megawatts of computing capacity years ahead of original timelines, without building new generation or transmission.
The approach combines proven technologies into one AI-native orchestrated system:
- Generative AI forecasting that predicts grid conditions with unprecedented accuracy.
- Automated coordination of battery storage and virtual power plants.
- Real-time optimization of data centre consumption based on system constraints.
- When needed, geographically distributed computing that shifts workloads to where power is abundant.
All of these technologies exist today. The innovation is using AI to integrate and deploy them at scale.
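The coordination logic these components share can be sketched in a few lines. The following is a toy illustration of the idea, not any vendor's system: given a forecast of spare grid headroom, decide each interval whether flexible data-centre load runs locally, bridges a shortfall from battery storage, or shifts to another region. All names, figures and thresholds are hypothetical.

```python
# Toy sketch of flexible-load orchestration. Given forecast grid headroom
# per interval, choose a coarse action for a flexible data-centre load.
# All names and numbers are hypothetical, for illustration only.

def dispatch(headroom_mw: float, flexible_load_mw: float,
             battery_mw: float) -> str:
    """Return a coarse action for one scheduling interval."""
    if headroom_mw >= flexible_load_mw:
        return "run-local"             # spare grid capacity covers the load
    if headroom_mw + battery_mw >= flexible_load_mw:
        return "run-local+battery"     # bridge the gap from storage
    return "shift-workload"            # move the compute to where power is abundant

forecast = [900, 400, 150, 50]         # forecast headroom (MW) for four hours
plan = [dispatch(h, flexible_load_mw=300, battery_mw=200) for h in forecast]
print(plan)
# ['run-local', 'run-local', 'run-local+battery', 'shift-workload']
```

A real system replaces the hand-written forecast with the generative AI models described above and the three-way decision with continuous optimization, but the structure – forecast, then coordinate storage and load against constraints – is the same.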
The reality of AI, data centres and energy
People fear that data centres will drive up electricity rates. The reality is precisely the opposite.
The fixed costs of electricity infrastructure – transmission networks, distribution systems, substations – exist whether they're operating at 30% or 90% utilization. When data centres consume power during off-peak periods and low-demand hours, those fixed costs spread across more kilowatt-hours, lowering average rates for all customers rather than raising them.
Recent analysis by GridCARE shows that a 1 GW data centre utilizing spare grid capacity can reduce rates for the average consumer by as much as 5%, or $100/year. By turning data centres into flexible partners that absorb renewable surpluses and share grid expenses, utilities can expand capacity while reducing bills.
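The mechanism is ordinary division: fixed costs spread over more kilowatt-hours mean a lower average rate per kilowatt-hour. A minimal sketch with made-up round numbers (not GridCARE's model):

```python
# Sketch of the fixed-cost-spreading argument: the same infrastructure
# cost divided by more kilowatt-hours yields a lower average rate.
# All figures are illustrative, not GridCARE's analysis.

fixed_cost = 1_000_000_000   # $/year of transmission and distribution cost
base_kwh   = 20_000_000_000  # existing annual consumption (kWh)
dc_kwh     =  5_000_000_000  # added off-peak data-centre consumption (kWh)

rate_before = fixed_cost / base_kwh             # $0.050/kWh
rate_after  = fixed_cost / (base_kwh + dc_kwh)  # $0.040/kWh

print(f"Fixed-cost share of the rate falls from "
      f"{rate_before * 100:.1f}¢ to {rate_after * 100:.1f}¢ per kWh")
```

The effect holds only if the new load genuinely uses spare capacity; a data centre that adds to the peak forces new construction and pushes rates the other way, which is why the off-peak coordination above matters.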
The principle extends beyond data centres. As transportation and buildings electrify, the same coordination capabilities that optimize AI workloads can manage vehicle charging and heat pump operation. A grid optimized for one source of flexible demand becomes more valuable for all flexible loads.
The fastest, cheapest, and cleanest megawatt is the one you don't need to build.
The most important infrastructure project of our generation
This is the infrastructure challenge that will determine who leads the AI era. Whoever solves the power puzzle first will shape the next decade of technological development and economic competitiveness.
Doubling the grid's effective capacity in five years, primarily through code rather than construction, isn't merely possible. It may represent the highest-leverage strategy available for mature economies seeking to maintain technological leadership while accelerating climate goals.
Abundance doesn't start with more resources. It starts with using what we already have intelligently. That's an opportunity that could reshape how we power AI – as well as how we think about infrastructure, abundance and the relationship between technological progress and resource constraints.
License and Republishing
World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.
The views expressed in this article are those of the author alone and not the World Economic Forum.
