- AI queries from around the world put pressure directly on the American electricity grid
- Utilities pass grid upgrade costs onto households while data center projects stall
- Federal projections show data centers' share of US electricity consumption tripling by 2028
The accelerating demand for computing power has pushed artificial intelligence into the center of the US energy debate.
Data centers used to support cloud services, streaming platforms, and online storage already consume large amounts of electricity, but the rise of AI tools has magnified those needs.
According to federal projections, the share of national electricity use from data centers could rise from 4% in 2023 to 12% by 2028.
AI’s energy appetite intensifies demand
Since running an AI writer or hosting an LLM is more energy-intensive than typical web activity, the growth curve is steep.
This expansion is not only changing the relationship between technology firms and utilities but also reshaping how electricity costs are distributed across society.
Electricity prices in the US have already climbed more than 30% since 2020, and a Carnegie Mellon–North Carolina State study warns of another 8% nationwide rise by 2030.
In states such as Virginia, the increase could reach 25%. Utilities argue that grid upgrades are essential, but the concern is who will pay for them.
This is only the beginning, because when a French person asks ChatGPT when the next strike is planned, Americans pay more for electricity.
How? When anyone anywhere in the world asks ChatGPT an everyday question, the extra energy consumed by that query is absorbed into U.S. grid demand.
This is because the ChatGPT system runs on US-based servers, hosted in American data centers and powered by the US electricity grid.
If technology firms secure large capacity allocations and delay projects, households and small businesses may be left paying for unused infrastructure.
The case of Unicorn Interests in Virginia, where a delayed facility left nearby customers covering millions in upgrade expenses, underscores this risk.
To counter such problems, American Electric Power in Ohio proposed a rate plan requiring data centers to pay for 85% of the requested capacity regardless of actual use.
The state’s regulators approved the measure despite opposition from cloud service providers, who offered a 75% minimum instead.
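As a back-of-the-envelope illustration (not the actual tariff mechanics, which are more involved), a minimum-take rule like the one AEP proposed amounts to billing the greater of actual usage and a fixed share of the capacity a data center requested. The function and figures below are hypothetical, chosen only to show how the 85% floor and the 75% counter-offer diverge when a facility underuses its allocation:

```python
# Hypothetical sketch of a minimum-take capacity charge; not AEP's actual tariff.
def billed_capacity_mw(requested_mw: float, actual_mw: float, minimum_share: float) -> float:
    """Bill the greater of actual usage and a minimum share of requested capacity."""
    return max(actual_mw, minimum_share * requested_mw)

# Example: a facility requests 100 MW of capacity but draws only 40 MW.
requested, actual = 100.0, 40.0
print(billed_capacity_mw(requested, actual, 0.85))  # 85% floor  -> billed for 85.0 MW
print(billed_capacity_mw(requested, actual, 0.75))  # 75% offer  -> billed for 75.0 MW
```

Under either floor, the facility pays for far more than it uses, which is precisely the point: the shortfall no longer lands on households and small businesses.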
Some companies have sought to bypass traditional utilities by generating their own power.
Amazon, Microsoft, Google, and Meta already operate renewable installations, gas turbines, and diesel backup generators, and some are planning nuclear facilities.
These companies not only produce electricity for their own operations but also sell surplus energy into wholesale markets, creating competition with traditional suppliers.
In recent years, such sales have generated billions, giving major cloud providers influence over both supply and price in certain regions.
The volatile consumption patterns of AI training, which can swing sharply between peaks and lows, pose another challenge.
Even a 10% swing in demand can destabilize networks, forcing operators to fill the troughs with dummy workloads that keep power draw steady.
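The smoothing idea can be sketched in a few lines. This is an illustrative model, not any operator's actual control logic: if the training job's real draw dips more than a tolerated fraction below recent peak load, dummy work is added to bring the draw back up to that floor:

```python
# Hypothetical sketch of load smoothing with dummy workloads.
def padded_draw_mw(actual_mw: float, peak_mw: float, max_dip: float = 0.10) -> float:
    """Return the total power draw after padding, so the grid never sees
    a dip of more than max_dip below the recent peak."""
    floor = (1.0 - max_dip) * peak_mw
    # Any deficit below the floor is filled by running dummy computations.
    return max(actual_mw, floor)

# A training run peaking at 100 MW briefly drops to 50 MW between phases.
print(padded_draw_mw(50.0, 100.0))  # padded up to the 90 MW floor -> 90.0
print(padded_draw_mw(95.0, 100.0))  # already within tolerance     -> 95.0
```

The trade-off is stark: grid stability is bought with deliberately wasted electricity, another cost that ultimately flows through to ratepayers.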
With households already paying more each month in some states, the concern is that consumers will end up covering the cost of keeping LLM hosting and AI writer systems online.
Via Tom's Hardware