
Tech companies creating large AI data centers ought to cover their energy costs.

The rapid expansion of artificial intelligence and the surging power demand of large data centers are presenting significant hurdles for utility systems across the U.S.

Utilities are forecasting a substantial increase in electricity needs over the next ten years, yet the true scale of the challenge remains worryingly underappreciated.

The notion of constructing a data center that consumes over a gigawatt of electricity, enough to supply more than 875,000 homes, used to seem extraordinary. Now it is routine in conversations among data center developers.

When taken as a whole, the obstacles can seem daunting.

A recent report from Wood Mackenzie pointed out that there are currently 64 gigawatts of confirmed power projects related to data centers, with an additional 132 gigawatts potentially in development. To put that in perspective, 64 gigawatts could power about 56 million homes, more households than many major U.S. cities contain combined.
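The homes-per-gigawatt figures above can be checked with simple arithmetic: if one gigawatt serves roughly 875,000 homes, that implies an average household draw of a bit over one kilowatt. A minimal sketch (the per-home average is derived from the article's own figures, not from official utility data):

```python
# Consistency check of the article's figures. The implied average
# household draw is derived from the stated ratio, not measured data.
KW_PER_GW = 1_000_000  # kilowatts in a gigawatt

homes_per_gw = 875_000
avg_kw_per_home = KW_PER_GW / homes_per_gw   # implied average draw per home

confirmed_gw = 64
homes_served = confirmed_gw * homes_per_gw   # homes the confirmed projects' power could supply

print(round(avg_kw_per_home, 2))  # 1.14 kW average per home
print(homes_served)               # 56,000,000 homes
```

The two printed values line up with the article's "enough to supply more than 875,000 homes" and "about 56 million homes" claims.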

Utility companies are finding it challenging to meet the anticipated energy demands from the AI sector. Many lack the financial and organizational capacity needed to expand generation and transmission facilities within the critical timelines required by data center developers.

This raises key questions about who bears the costs and risks associated with these large-scale energy projects.

Consider the AI developers themselves: the combined market value of companies like Amazon and Microsoft is roughly seven times that of the entire S&P 500 utility sector that serves homes and businesses across the country.

Efforts to address these questions are underway in Congress, at the Federal Energy Regulatory Commission and other federal agencies, and before state regulators and the public at both the national and local levels.

Regardless of where solutions are developed, certain principles and goals need to guide public policy in this sphere.

  • Data center developers must be accountable for meeting their extensive power needs (for instance, over 500 MW). Texas recently implemented a law requiring data centers and other large users to cover the infrastructure costs related to their energy requirements. Traditionally, these costs would be spread among all utility customers, but given the exceptional demand from data centers, it’s fair to expect these developers to pay for new facilities. They also have the financial resources to do so, and can recover those costs through the fees they charge for AI services.
  • Significant data center developers should bear the risks associated with new utility generation and transmission, rather than putting those burdens on utilities. For example, the Ohio Public Utilities Commission recently approved a compromise requiring data centers with loads over 25 megawatts to enter into a ten-year electrical service contract and to pay a minimum demand fee based on 85% of contracted capacity. Similar provisions in Texas law compel data center developers to make substantial early payments in the planning stages and to disclose power requests they have made to other utilities, since developers often submit requests to several utilities at once and withdraw the extras once a location is chosen. It’s the developers who can manage these financial risks, not the utilities.
  • Generation facilities located within large data centers need to be properly integrated with local utility grids, with a fair allocation of costs. While some projects have proposed operating as completely independent “islands” from the grid, most prefer to connect for backup purposes. If managed well, this connection can benefit both data centers and utility systems.
  • The government should persist in supporting the development of nuclear technologies, such as small modular reactors. U.S. utilities often lack the capital to take on the risks associated with constructing new nuclear plants. The advent of new customers like data center developers, which are both power-hungry and financially robust, alters this dynamic. Government support for new nuclear technology, already amounting to billions of dollars, should continue, with the aim of driving down costs.
  • There should be ongoing government efforts to enhance energy efficiency in data centers. These facilities consume vast amounts of electricity for a range of functions. The National Renewable Energy Laboratory has produced guidance that data centers can use to minimize energy consumption. Additionally, the market holds great potential for developing ultra-efficient chips that can lower training and operational costs for AI models. The government should actively work to speed up the development of these chips.
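The minimum demand fee in the Ohio compromise described above amounts to a billing floor: a data center pays for at least 85% of its contracted capacity whether or not it uses it. A minimal sketch of that arithmetic (the demand rate and facility size here are hypothetical illustrations, not figures from the ruling):

```python
# Sketch of an Ohio-style minimum demand fee: the customer is billed
# for the greater of its actual peak demand or 85% of contracted
# capacity. The rate and sizes below are hypothetical examples.
MIN_BILLING_FRACTION = 0.85  # floor set by the approved compromise

def monthly_demand_charge(contracted_mw, actual_peak_mw, rate_per_mw_month):
    """Bill the greater of actual peak demand or the 85% capacity floor."""
    billed_mw = max(actual_peak_mw, MIN_BILLING_FRACTION * contracted_mw)
    return billed_mw * rate_per_mw_month

# Hypothetical 100 MW data center paying $10,000 per MW-month:
print(monthly_demand_charge(100, 60, 10_000))  # floor applies, 85 MW billed
print(monthly_demand_charge(100, 95, 10_000))  # actual peak billed, 95 MW
```

The floor is what shifts risk to the developer: even an underused or abandoned facility generates revenue toward the utility's new generation and transmission costs.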

The stakes in this public policy debate surrounding our energy future are quite high. If approached correctly, AI could truly transform the U.S. economy and its energy infrastructure.

However, mishandling this situation could lead to overwhelming demands for new generation and transmission solutions, potentially straining utility companies’ finances and operational capabilities, resulting in steep rate increases for homeowners and businesses, and complicating efforts to reduce fossil fuel dependence while meeting climate goals.
