"To compete globally, we must expand energy production and reduce energy costs for American families and businesses. America must lead the world in innovation and technology breakthroughs."
Chris Wright, U.S. Secretary of Energy
Last month I attended the Energy Imperatives Summit in Washington, DC, where data centers and powering the AI revolution were front and center. As AI capabilities expand, so too does their appetite for electricity. Policymakers who fail to grasp this connection risk watching their countries fall behind in the most consequential technological race of our time.
The Scale of the Challenge
The numbers are staggering. Global data center electricity consumption is projected to more than double by 2030, reaching approximately 945 terawatt-hours annually. In the United States alone, data centers could consume between 4.6% and 9.1% of total electricity by 2030, up from 4.4% in 2023. This represents a global increase of roughly 485 terawatt-hours, one of the most significant surges in electricity demand from any single sector in modern history.
This isn't merely about keeping the lights on in server farms. AI training and inference operations are far more energy-intensive than traditional computing. Training a single large language model can cost millions of dollars in electricity alone. As models grow more sophisticated and AI applications proliferate across industries, this energy hunger will only intensify.
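To make that "millions of dollars in electricity" figure concrete, here is a minimal back-of-envelope sketch. Every input is an illustrative assumption rather than a reported number for any particular model: cluster size, per-GPU power draw, run length, facility overhead (PUE), and electricity price all vary widely in practice.

```python
# Back-of-envelope estimate of the electricity bill for one large training run.
# Every figure below is an illustrative assumption, not a reported number for
# any specific model.

GPUS = 10_000           # accelerators dedicated to the run (assumed)
WATTS_PER_GPU = 700     # board power per accelerator, in watts (assumed)
TRAINING_DAYS = 90      # wall-clock length of the run (assumed)
PUE = 1.2               # power usage effectiveness: cooling and overhead multiplier (assumed)
PRICE_PER_KWH = 0.08    # industrial electricity price, USD per kWh (assumed)

hours = TRAINING_DAYS * 24
facility_kw = GPUS * WATTS_PER_GPU / 1_000 * PUE   # total facility draw in kW
energy_kwh = facility_kw * hours                   # energy consumed over the run
cost_usd = energy_kwh * PRICE_PER_KWH

print(f"Facility draw: {facility_kw:,.0f} kW")
print(f"Energy used:   {energy_kwh / 1e6:,.1f} GWh")
print(f"Electricity:   ${cost_usd / 1e6:,.1f} million")
```

Under these assumptions a single 90-day run draws about 8.4 megawatts and consumes roughly 18 gigawatt-hours, around $1.5 million in electricity; larger clusters, longer runs, or higher power prices push the bill up proportionally.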
Energy as Economic Destiny
History teaches us that industries gravitate toward regions with abundant, affordable energy. For instance, aluminum smelting follows hydroelectric power. Data centers cluster in areas with cheap electricity. The AI industry will be no different—except the stakes are immeasurably higher.
Countries that can provide the cheapest and most reliable energy will enjoy decisive advantages in AI development. They can train larger models more cost-effectively, run more experiments, offer AI services at competitive prices, and attract the brightest minds and biggest investments. Conversely, nations saddled with expensive or unreliable energy will find themselves increasingly marginalized in the AI economy.
Consider the mathematics: if one country can train advanced AI models at half the energy cost of another, it can either achieve the same capabilities for less money or develop superior capabilities for the same investment. Over time, this compounds into an insurmountable competitive advantage.
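Here is a rough sketch of that arithmetic, under the simplifying assumption that electricity is the binding marginal cost of training; the budget, prices, and per-GPU-hour energy figure below are hypothetical.

```python
# Sketch of the cost asymmetry described above, under the simplifying
# assumption that electricity is the binding marginal cost of training.
# The budget, prices, and per-GPU-hour energy figure are hypothetical.

BUDGET_USD = 100e6        # annual electricity budget for AI training (assumed)
PRICE_A = 0.05            # country A electricity price, USD per kWh (assumed)
PRICE_B = 0.10            # country B electricity price, USD per kWh (assumed)
KWH_PER_GPU_HOUR = 0.84   # facility energy per GPU-hour: 700 W at PUE 1.2 (assumed)

def gpu_hours(budget_usd: float, price_per_kwh: float) -> float:
    """GPU-hours of training a given electricity budget buys at a given price."""
    return budget_usd / price_per_kwh / KWH_PER_GPU_HOUR

a = gpu_hours(BUDGET_USD, PRICE_A)
b = gpu_hours(BUDGET_USD, PRICE_B)
print(f"Country A: {a / 1e9:.2f} billion GPU-hours")
print(f"Country B: {b / 1e9:.2f} billion GPU-hours")
print(f"Same spend, {a / b:.1f}x the compute for country A")
```

Halving the electricity price simply doubles the compute the same budget buys, and that gap widens every year the price difference persists.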
The Reliability Imperative
Cost matters, but reliability may be even more critical. AI training runs that experience power interruptions can lose weeks of progress and millions of dollars in computational work. Countries with unstable electrical grids will struggle to attract serious AI development, regardless of their energy prices.
This reliability requirement extends beyond avoiding blackouts. AI operations need consistent, predictable power quality. Grid instability, voltage fluctuations, and frequency variations can damage sensitive computing equipment and corrupt training processes. Nations serious about AI leadership must invest in robust, modern electrical infrastructure.
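To put rough numbers on the interruption risk described above, the sketch below assumes a run that rolls back to its last good checkpoint after an outage. Cluster size, rollback windows, restart overhead, and the all-in GPU-hour rate are all hypothetical.

```python
# Rough sketch of what a grid interruption can cost a large training run,
# assuming the job rolls back to its last good checkpoint. Cluster size,
# rollback windows, restart overhead, and the all-in GPU-hour rate are
# all hypothetical.

GPUS = 10_000           # accelerators in the training cluster (assumed)
RESTART_HOURS = 6       # time to bring the job back up after an outage (assumed)
GPU_HOUR_RATE = 2.00    # all-in cost per GPU-hour: hardware, power, ops (assumed)

def outage_cost(rollback_hours: float) -> float:
    """Dollar value of work lost when the run rolls back by rollback_hours."""
    lost_gpu_hours = GPUS * (rollback_hours + RESTART_HOURS)
    return lost_gpu_hours * GPU_HOUR_RATE

# A clean rollback to a recent checkpoint versus a week of progress lost
# because recent checkpoints were corrupted by the interruption.
print(f"Rollback of 3 hours: ${outage_cost(3):,.0f}")
print(f"Rollback of 1 week:  ${outage_cost(168):,.0f}")
```

Even a clean rollback costs six figures at this scale; an interruption that invalidates recent checkpoints runs well into the millions, which is why serious AI operators treat grid reliability as a siting criterion.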
Strategic Energy Choices
Policymakers face critical decisions about their energy mix. Traditional fossil fuel plants may offer reliability but come with carbon costs and price volatility. Renewable energy sources like solar and wind are increasingly cost-competitive but require sophisticated grid management to handle intermittency.
Nuclear power presents perhaps the most compelling option for AI-focused energy policy. Small modular reactors (SMRs) can be co-located with data centers, providing dedicated, carbon-free baseload power. Companies like Microsoft, Google, and Amazon are already signing deals for nuclear power, recognizing its unique advantages for AI applications. The restart of Three Mile Island's Unit 1 reactor specifically to power Microsoft's operations signals the beginning of this trend.
Years ago, a mentor taught me the phrase: 'The easiest money to earn is the money you save.' In that vein, investing in cutting-edge, high-efficiency cooling technologies—such as direct-to-chip cooling installations and single-phase immersion cooling equipment being pioneered by companies like Chemours—could be a critical stopgap measure.
The Innovation Ecosystem Challenge
While energy costs are becoming increasingly important, they don't operate in isolation. Many years ago, when I worked for Ernst & Young LLP and later with two tech start-ups during the dot-com era, I learned that established technology ecosystems in places like Silicon Valley, London, and Beijing provide crucial advantages that cheap energy alone cannot replicate. These include concentrated technical talent, vocational training institutions, venture capital networks, research institutions, prudent government regulation, and the complex web of supplier relationships that, taken together, accelerate innovation.
However, as energy becomes a larger share of AI development costs, we may see a geographic bifurcation. Research and development activities might remain in established hubs, while actual model training and inference operations migrate to energy-advantaged regions. This creates opportunities for countries and regions with abundant energy resources to capture significant portions of the AI value chain.
For instance, my city of Lancaster, PA, a stone’s throw from the Three Mile Island and Peach Bottom nuclear power plants, could find itself in an “energy advantageous” situation.
Policy Recommendations
First, policymakers must recognize AI energy demand as a national security issue. Countries that cannot power AI development will lose technological sovereignty and economic competitiveness. Energy policy should explicitly account for AI infrastructure needs in capacity planning and grid modernization efforts.
Second, governments should streamline permitting and approval processes for energy projects that serve AI applications. This includes nuclear power, renewable energy installations, and grid infrastructure upgrades. The timeline for AI development far outpaces traditional energy project cycles, requiring accelerated regulatory pathways.
Third, investment in grid modernization and energy storage technologies will be essential. AI workloads create unique demand patterns that require sophisticated grid management capabilities. Smart grid technologies, battery storage systems, and demand response programs will help maximize the efficiency of AI-related energy consumption.
The Window of Opportunity
The AI revolution is still in its early stages, but the window for strategic positioning is narrowing rapidly. Countries that act decisively to secure abundant, reliable, and affordable energy will position themselves as leaders in the AI economy. Those that delay or mismanage their energy policies risk becoming technological colonies of more energy-advantaged nations.
The choice is clear: embrace the reality that energy policy has become AI policy, or watch from the sidelines as other nations shape the future of human technological capability. The age of AI demands nothing less than a fundamental reimagining of how we think about energy as a strategic resource. The time for half-measures and incremental thinking has passed—the future belongs to those bold enough to power it.
Kevin - What I find personally fascinating about this article is that a state whose electeds (don't call them leaders) pride themselves on Silicon Valley, as if they built it, doesn't seem to be aware that the state's global tech leadership could easily evaporate due to its insistence on renewable energy and its shutting down of natural gas and nuclear power plants.
So true! I live in Los Alamos, New Mexico. Yes, the home of the enormous Los Alamos National Laboratory, currently employing roughly 12,000 people on the remote mesas where the atomic bombs were developed during World War II.
Los Alamos is facing multiple energy challenges. We're in the "Modern Megadrought" (perhaps climate change, perhaps the kind of normal drought the Southwest sees so often). This means less hydroelectric production on the Western Area Power Administration grid, including the loss of most or all hydro at Glen Canyon and Hoover Dam, plus reduced output at smaller hydro-producing dams throughout the region, such as northern New Mexico's Abiquiu Dam, which is important to Los Alamos.
Meanwhile, while Trump's slowing of immigration is a huge help, the Southwest's population continues to explode, increasing everyday demand for power. At the same time, AI is on the horizon at the Los Alamos and Sandia national laboratories and elsewhere.
And, wait for it, you'll all be delighted to hear that Los Alamos' mostly woke county council is moving the county fleet to EVs and encouraging residents to buy EVs, absent ANY discussion of these power limitations! But, by gum, the Biden/Harris administration said it was the way to go; who are we to bring the expertise of a major national laboratory to bear on evaluating the wisdom of the move or, duh, on where Los Alamos is to get enough power?
Meanwhile, discussion about water (and its connection to power production), or any possibility of needing to limit growth in a Southwest that, by every definition, will be "ground zero" for climate change, is non-existent. Of course, not one person in a million in the modern Southwest understands water, or that there are astounding limits to water, with or without climate change, in a region straddled by four major deserts. Nor do they grasp that our lifeblood, the Colorado River, is not and will not be sufficient, in drought, to provide water. But "gotta grow, gotta grow; gotta have EVs, gotta have EVs."