Florida State University expert provides analysis on surging energy costs from artificial intelligence

Florida State University Social and Sustainable Enterprises Director Mark McNees suggests that the surge in energy expenses driven by AI could just be getting started.

The global demand for artificial intelligence (AI) has driven billions of dollars into new data centers, sharply increasing energy use and costs for consumers.

Built to handle the massive computational power of AI workloads, these data centers have driven wholesale electricity costs as much as 267% higher than five years ago for customers living in those areas. A single ChatGPT query consumes approximately 10 times as much energy as a traditional Google search.

Florida State University’s Mark McNees is a professor at the Jim Moran College of Entrepreneurship who specializes in social entrepreneurship and innovation. He directs the Social and Sustainable Enterprises program, providing expertise in areas that include organizational transformation, board governance, multi-stakeholder networks and building cultures of innovation.

McNees suggests that the surge in energy expenses driven by AI could just be getting started.

“As AI adoption accelerates and data centers proliferate to support this demand, we’re facing significant upward pressure on electricity prices that most consumers don’t yet realize is coming,” McNees said.

McNees teaches several social entrepreneurship courses while mentoring FSU students in the cultivation, refinement and launch of social enterprises. He has written several opinion columns and editorials on energy sustainability and efficiency, and hosts the long-running InNOLEvation Mindset Podcast that tells the stories of various entrepreneurs.

Professor Mark McNees is available for interviews on the intersection of AI infrastructure and energy grid capacity. He can be reached at mmcnees@jimmorancollege.fsu.edu.

Mark McNees, director, Social and Sustainable Enterprises

Worldwide surging demand for AI has collided with rising energy costs for customers in areas where data centers are operating. How sustainable is this type of dilemma in the long term?
The short answer: It’s not sustainable without fundamental changes to how we finance energy infrastructure. The current system was designed for an era when electricity demand grew only modestly year over year. That world ended when ChatGPT launched.

Here’s the economic reality nobody wants to discuss: Utilities are building expensive infrastructure based on speculative data center demand. Electric utility company AEP Ohio alone has received requests for 30 gigawatts of new connections from data centers, enough to power 24 million homes. But data center developers are shopping for projects across multiple locations before committing. When those projects don’t materialize, who pays for the stranded infrastructure? Ratepayers. As one consultant from the Institute for Energy Economics and Financial Analysis put it, residential customers tend to end up holding the bag for stranded costs.
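The 30-gigawatt figure above can be sanity-checked with a quick calculation. The household-consumption number below is an assumption, not from the article: a typical U.S. home uses roughly 10,800 kWh per year, or about 1.23 kW of average draw.

```python
# Back-of-the-envelope check: does 30 GW really equal ~24 million homes?
# Assumes ~10,800 kWh/year average U.S. household use (an assumption,
# not a figure from the article).

requested_gw = 30
avg_home_kw = 10_800 / 8_760  # kWh per year / hours per year ≈ 1.23 kW

homes_powered = requested_gw * 1_000_000 / avg_home_kw  # GW -> kW, then divide
print(f"~{homes_powered / 1e6:.1f} million homes")  # ≈ 24.3 million
```

The result lands right around the 24 million homes cited, so the quoted equivalence holds under typical household-usage assumptions.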

The sustainability question cuts both ways. Yes, wholesale electricity prices have surged 267% in some areas near data center clusters. But the contrarian view, supported by Lawrence Berkeley National Laboratory research, suggests that states with higher electricity demand growth have experienced smaller retail price increases. The logic is straightforward: when more electricity flows across existing infrastructure, fixed costs are spread over more kilowatt-hours.
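The fixed-cost spreading mechanism can be illustrated with a toy calculation. The dollar and demand figures below are hypothetical, chosen only to show the effect, not taken from the article or from any utility's actual books.

```python
# Hypothetical illustration of fixed-cost spreading: the same annual
# fixed infrastructure cost divided over growing electricity sales
# yields a falling per-kWh contribution.

fixed_costs = 100_000_000  # $/year in fixed grid costs (made-up number)

for demand_twh in (10, 12, 15):  # total annual sales, in terawatt-hours
    kwh_sold = demand_twh * 1e9  # TWh -> kWh
    per_kwh = fixed_costs / kwh_sold
    print(f"{demand_twh} TWh sold -> fixed cost per kWh: ${per_kwh:.4f}")
```

At 10 TWh the fixed-cost share is a penny per kWh; at 15 TWh it falls to about two-thirds of a cent, which is the downward pressure on retail rates the Berkeley Lab finding describes.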

The difference between these outcomes comes down to planning and who bears the costs. In northern Virginia, large data center customers cover roughly 9% of transmission costs, helping keep residential transmission rates below the national average. In Mississippi, data center revenue has funded grid modernization without raising household rates. The model works when implemented thoughtfully. The crisis emerges when it isn’t.

What potential solutions could keep consumers from bearing the cost of rising energy prices driven by these AI data centers?
The good news: Solutions exist, and some tech companies are already implementing them. Microsoft recently announced it will request to pay higher electricity rates in areas where it’s building data centers — specifically to prevent residents from subsidizing its AI infrastructure. The announcement came after President Trump indicated his administration is working with major tech companies to ensure Americans don’t “pick up the tab” for their power consumption.

But voluntary corporate goodwill isn’t a policy framework. Here are the structural solutions that can protect consumers:

Require data centers to build their own generation. Solar and battery storage are now the cheapest and fastest ways to deploy new electricity capacity. When Meta built a data center in Aiken, South Carolina, it partnered with a solar developer to install 100 megawatts of on-site generation. Redwood Materials launched a microgrid combining 12 megawatts of solar with 63 megawatt-hours of storage specifically to power AI data centers. The technology exists. Policy should incentivize — or mandate — its deployment.

Reform interconnection and capacity market rules. Utilities shouldn’t build infrastructure for speculative projects. Stricter requirements — like those AEP Ohio proposed, requiring data centers to post more collateral or commit to specific electricity purchases — can ensure developers have skin in the game before ratepayers assume risk.

Leverage distributed energy resources. A Rewiring America report proposes an elegant inversion: if data center developers invested in residential energy efficiency (heat pumps, rooftop solar, home batteries), they could “unlock the capacity they need” while reducing household bills. In California, a virtual power plant test dispatched 535 megawatts from over 100,000 homes, meeting half of San Francisco’s energy demand. The infrastructure exists in America’s rooftops. We’re just not using it strategically.

Embrace renewable energy as the fastest path to capacity. During a July 2025 Senate hearing, a Vantage Data Centers executive testified that meeting America’s AI power needs requires all energy sources — including storage. His key insight: Renewables paired with batteries provide the “reliable, grid dispatchable” power that data centers need. This isn’t environmental activism. It’s the fastest way to build capacity. Orders for new gas turbines face seven-year delays. Solar installations can be deployed in months.
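The virtual power plant figure cited above, 535 megawatts from over 100,000 homes, implies a plausible per-home contribution, which a one-line calculation confirms. The comparison to a typical home battery's output is a general observation, not a detail from the article.

```python
# Sanity check of the virtual power plant figure: 535 MW dispatched
# from roughly 100,000 participating homes.

dispatched_mw = 535
homes = 100_000

kw_per_home = dispatched_mw * 1_000 / homes  # MW -> kW, then per home
print(f"~{kw_per_home:.2f} kW per participating home")  # ≈ 5.35 kW
```

Around 5 kW per home is in line with what a single residential battery system can discharge continuously, which is why aggregating rooftops into a dispatchable resource is credible at this scale.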