The rapid buildout of data centers to power artificial intelligence is driving an unprecedented surge in electricity demand that threatens to overwhelm power grids and derail climate goals. At the same time, AI itself could help transform energy systems, accelerating the transition toward renewable power.
“We are on the brink of potentially monumental change across the economy,” said William H. Green, director of the MIT Energy Initiative (MITEI) and Hoyt C. Hottel Professor in the MIT Department of Chemical Engineering, at MITEI’s Spring Symposium, “AI and energy: Peril and promise,” held on May 13. The event brought together experts from industry, academia, and government to explore solutions to what Green described as both “local electrical supply issues and achieving our clean energy objectives,” while working to “harness the advantages of AI without incurring some of the detriments.” Addressing the energy needs of data centers and harnessing AI for the energy transition are key research priorities for MITEI.
AI’s astonishing energy consumption
The symposium opened with sobering statistics about AI’s energy demands. After years of flat electricity demand in the United States, data centers now account for roughly 4 percent of the nation’s electricity use. Although projections vary widely, some suggest that share could climb to 12 to 15 percent by 2030, driven largely by artificial intelligence.
Vijay Gadepally, senior scientist at MIT’s Lincoln Laboratory, underscored the scale of AI’s energy use. “The power required to sustain some of these large models is doubling almost every three months,” he said. “A single ChatGPT conversation uses about as much electricity as charging your phone, and generating an image consumes roughly a bottle of water for cooling.”
Data centers drawing 50 to 100 megawatts of power are rapidly springing up across the United States and around the world, spurred by both everyday and enterprise use of large language models such as ChatGPT and Gemini. Gadepally cited congressional testimony by Sam Altman, CEO of OpenAI, on how fundamental this link has become: “The cost of intelligence, the cost of AI, will converge to the cost of energy.”
“The energy requirements of AI present a considerable obstacle, but we have an opportunity to leverage these immense computational capabilities to advance climate change solutions,” said Evelyn Wang, MIT’s vice president for energy and climate and former director of the Advanced Research Projects Agency-Energy (ARPA-E) at the U.S. Department of Energy.
Wang also highlighted that innovations engineered for AI and data centers — such as efficiency improvements, cooling technologies, and clean-power strategies — could be broadly applicable beyond computing facilities themselves.
Approaches for clean energy solutions
The symposium explored several pathways for addressing the AI-energy dilemma. Some panelists presented analyses suggesting that while artificial intelligence may push emissions up in the near term, its optimization capabilities could enable significant emissions reductions after 2030 through more efficient power systems and faster development of clean technologies.
Research shows regional differences in the cost of powering data centers with renewable energy, according to Emre Gençer, co-founder and CEO of Sesame Sustainability and former MITEI principal research scientist. Gençer’s analysis found that the central United States offers substantially lower costs, thanks to complementary solar and wind resources. Even so, achieving zero-emissions electricity would require massive battery deployments — five to ten times more than under moderate carbon scenarios — raising costs two- to threefold.
“If we aim for zero emissions with dependable power, we need technologies beyond renewables and batteries, which alone would be prohibitively expensive,” Gençer cautioned. He pointed to “long-duration storage technologies, small modular reactors, geothermal, or hybrid solutions” as essential complements.
The energy demands of data centers have renewed interest in nuclear power, said Kathryn Biegel, manager of R&D and corporate strategy at Constellation Energy, noting that her company is restarting the reactor at the former Three Mile Island site, now called the “Crane Clean Energy Center,” to help meet this demand. “The data center sector has become a crucial priority for Constellation,” she explained, emphasizing how data centers’ requirements for both reliability and emissions-free electricity are reshaping the energy industry.
Can AI hasten the energy transition?
Artificial intelligence could fundamentally improve how power systems are operated, according to Priya Donti, assistant professor and the Silverman Family Career Development Professor in MIT’s Department of Electrical Engineering and Computer Science and the Laboratory for Information and Decision Systems. She showed how AI can speed up power grid optimization by embedding physics-based constraints into neural networks, potentially solving complex power flow problems at “10 times, or even greater, speed compared to traditional models.”
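As a rough, hypothetical sketch of this general idea (not Donti’s actual method or code), the example below trains a small neural network to propose generator output for a toy grid while penalizing violations of the physical power-balance constraint in the training loss. The architecture, system size, and synthetic data are all invented for illustration.

```python
# Illustrative only: a "physics-informed" training loop for a toy dispatch
# problem. Total generation is nudged to match total load through a soft
# penalty, standing in for the physics-based constraints described above.
import torch
import torch.nn as nn

N_BUSES = 5  # hypothetical toy grid size

model = nn.Sequential(
    nn.Linear(N_BUSES, 64), nn.ReLU(),
    nn.Linear(64, N_BUSES),  # predicted generation at each bus
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(500):
    load = torch.rand(32, N_BUSES)                 # synthetic load profiles
    gen = model(load)
    cost = (gen ** 2).sum(dim=1).mean()            # stand-in generation cost
    imbalance = (gen.sum(dim=1) - load.sum(dim=1)).pow(2).mean()
    loss = cost + 100.0 * imbalance                # physics term as a penalty
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In the broader literature on physics-constrained learning, such constraints can also be enforced exactly, for example by projecting or completing the network’s outputs onto the feasible set, rather than through soft penalties as in this sketch.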
AI is already contributing to carbon emissions reductions, according to examples shared by Antonia Gawel, global director of sustainability and partnerships at Google. Google Maps’ fuel-efficient routing feature has “helped prevent more than 2.9 million metric tons of GHG [greenhouse gas] emissions since its launch, equivalent to removing 650,000 fuel-based vehicles from the road for a year,” she stated. Another Google research initiative employs artificial intelligence to assist pilots in avoiding contrails, which account for about 1 percent of the global warming impact.
AI’s potential to accelerate materials discovery for energy applications was highlighted by Rafael Gómez-Bombarelli, the Paul M. Cook Career Development Associate Professor in the MIT Department of Materials Science and Engineering. “AI-guided models can be trained to go from structure to property,” he noted, enabling the development of materials critical for both computing and energy efficiency.
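To make “structure to property” slightly more concrete, here is a minimal, hypothetical sketch of a surrogate model that maps a few invented numerical descriptors of a material to a synthetic target property. Real structure-to-property models, often graph neural networks trained on atomic structures, are far more sophisticated; everything in this example is a placeholder.

```python
# Illustrative only: fit a surrogate that predicts a "property" from simple
# numerical descriptors. Descriptors, data, and the target are synthetic
# stand-ins, not real materials data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.random((500, 3))  # invented composition/structure descriptors
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0.0, 0.05, 500)  # synthetic property

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:400], y[:400])
print("held-out R^2:", round(model.score(X[400:], y[400:]), 3))
```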
Ensuring growth with sustainability
Throughout the symposium, participants grappled with the tension between rapid AI deployment and its environmental impact. While AI training draws most of the attention, Dustin Demetriou, senior technical staff member in sustainability and data center innovation at IBM, cited a World Economic Forum article suggesting that “80 percent of the environmental footprint is estimated to stem from inferencing.” Demetriou stressed the need for efficiency across all AI applications.
Another consideration is Jevons’ paradox, in which “efficiency improvements tend to increase overall resource consumption rather than diminish it,” warned Emma Strubell, the Raj Reddy Assistant Professor in the Language Technologies Institute at Carnegie Mellon University. Strubell advocated for treating data center electricity as a finite resource that must be allocated thoughtfully across applications.
Several presenters proposed innovative approaches for integrating renewables with existing grid infrastructure, including hybrid solutions that pair clean generation with existing natural gas plants that already have valuable grid connections. These approaches could provide substantial clean capacity across the country at reasonable cost while minimizing reliability risks.
Navigating the AI-energy conundrum
The symposium underscored MIT’s pivotal role in formulating solutions to the AI-energy challenge.
Green described a new MITEI program on data centers, power, and computation that will operate alongside the broad scope of research under the MIT Climate Project. “We aim to tackle a highly complex issue that spans from power sources to the algorithms that provide value to the customers — in a manner that is acceptable to all stakeholders and effectively addresses all requirements,” Green said.
Randall Field, MITEI’s director of research, polled symposium participants on priorities for MIT’s research. The real-time results ranked “data center and grid integration issues” as the top priority, followed by “AI for expedited discovery of advanced materials for energy.”
In addition, the polling showed that most attendees view AI’s potential for energy as a “promise” rather than a “peril,” although a significant number remain uncertain about its ultimate impact. When asked about priorities for powering data centers, half of the respondents ranked carbon intensity as their top concern, with reliability and cost close behind.