Explosive growth in AI-powered computing centers is causing an unprecedented surge in electricity demand, potentially overwhelming power grids and derailing climate targets. At the same time, artificial intelligence could revolutionize energy systems and accelerate the transition to clean energy.

“We are in a period of potentially huge changes throughout the economy,” said William H. Green, director of the MIT Energy Initiative (MITEI) and a professor in the MIT Department of Chemical Engineering, opening MITEI’s Spring Symposium, “AI and energy: Peril and promise,” on May 13. The challenge, he said, is to meet AI’s growing power needs “and achieve our clean energy goals” while finding ways to “get the benefits of AI without some of the harm.” Both the energy demand of data centers and the potential of AI to aid the energy transition are MITEI research priorities.

AI’s startling energy demand

From the outset, the symposium highlighted striking statistics about AI’s appetite for power. After decades of flat electricity demand in the United States, data centers now consume about 4% of the nation’s electricity. Although forecasts carry great uncertainty, some suggest this share could rise to 12-15% by 2030, driven largely by AI applications.

Vijay Gadepally, a senior scientist at MIT Lincoln Laboratory, highlighted the scale of AI’s consumption. “The power required to sustain some of these large models is doubling almost every three months,” he noted. “A single ChatGPT conversation uses as much electricity as charging a phone, and generating an image consumes about a bottle of water for cooling.”
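
To put those figures in context, here is a minimal back-of-the-envelope sketch. The roughly 4,000 TWh figure for total annual U.S. electricity consumption is an outside assumption, not a number cited at the symposium; the shares and doubling rate come from the statistics above.

```python
# Rough sketch of the demand figures quoted above (not part of the symposium analysis).
US_ANNUAL_TWH = 4_000          # assumed total annual U.S. electricity use (outside figure)
current_share = 0.04           # data centers: ~4% of U.S. electricity today
projected_share_2030 = (0.12, 0.15)  # forecast range cited at the symposium

current_twh = US_ANNUAL_TWH * current_share
projected_twh = tuple(US_ANNUAL_TWH * s for s in projected_share_2030)
print(f"Data centers today: ~{current_twh:.0f} TWh/yr")
print(f"Projected 2030 range: {projected_twh[0]:.0f}-{projected_twh[1]:.0f} TWh/yr")

# "Doubling almost every three months" implies roughly 2**4 = 16x annual growth
# in the power drawn by those specific large models.
doublings_per_year = 12 / 3
print(f"Implied annual growth factor for model power: {2 ** doublings_per_year:.0f}x")
```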

Facilities requiring 50 to 100 megawatts of power are emerging rapidly in the U.S. and globally, driven by both casual use and institutional research needs relying on large language models such as ChatGPT and Gemini. Gadepally cited OpenAI CEO Sam Altman’s congressional testimony to underscore this fundamental relationship: “The cost of intelligence, the cost of AI, will converge to the cost of energy.”

“The energy demand of AI is a major challenge, but we also have an opportunity to harness this vast computing power to contribute to climate change solutions,” said Evelyn Wang, MIT’s vice president for energy and climate and former director of the U.S. Department of Energy’s Advanced Research Projects Agency-Energy (ARPA-E).

Wang also noted that innovations developed for AI and data centers, such as efficiency gains, cooling technologies, and clean-power solutions, may have broad applications beyond the computing facilities themselves.

Strategies for clean energy solutions

The symposium explored multiple ways to address AI’s energy challenge. Models presented by some panelists suggested that while AI may increase emissions in the short term, its optimization capabilities could deliver significant emissions reductions after 2030 through more efficient power systems and accelerated development of clean technologies.

Emre Gençer, co-founder and CEO of Sesame Sustainability and a former principal research scientist at MITEI, presented research on how the cost of powering data centers with clean electricity varies by region. His analysis found lower costs in the central U.S., thanks to complementary solar and wind resources. However, achieving zero-emissions power would require battery deployments five to 10 times larger than in a moderate-carbon scenario, driving costs two to three times higher.

“If we want zero-emissions power with reliability, we need technologies other than renewables and batteries, which alone are too expensive,” Gençer said, pointing to necessary complements such as “long-duration storage technologies, small modular reactors, geothermal, or hybrid approaches.”

Energy demand from data centers has also driven renewed interest in nuclear power. “Data center space has become a major focus for Constellation,” said a panelist from the utility Constellation Energy, highlighting how data centers’ demand for reliable, carbon-free electricity is reshaping the power industry.

Can AI accelerate the energy transition?

Priya Donti, assistant professor in the Department of Electrical Engineering and Computer Science and the Silverman Family Career Development Professor, said artificial intelligence can dramatically improve power systems. She showed how embedding physics-based constraints into neural networks can accelerate power-grid optimization, solving complex power-flow problems at “10 times or even greater” speed than traditional models.
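
Below is a minimal sketch of the general idea of physics-informed learning for power flow. It is not Donti’s actual method: it simply trains a small neural network to map nodal power injections to voltage angles while penalizing violations of the DC power-flow equations, using a made-up three-bus susceptance matrix.

```python
# Toy physics-informed training loop for DC power flow (illustrative only).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Reduced susceptance matrix of an invented 3-bus system (slack bus removed),
# so injections P relate to voltage angles theta by P = B @ theta.
B = torch.tensor([[12.0, -4.0],
                  [-4.0, 10.0]])

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

# Random training injections (per-unit); the "label" is the physics itself.
P_train = torch.randn(256, 2)

for step in range(500):
    theta_pred = model(P_train)             # predicted voltage angles
    residual = theta_pred @ B.T - P_train   # power-flow mismatch
    loss = residual.pow(2).mean()           # physics-based penalty
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final power-flow residual (MSE): {loss.item():.2e}")
```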

Antonia Gawel, Google’s director of global sustainability and partnerships, shared examples of how AI is already reducing carbon emissions. Google Maps’ fuel-efficient routing feature “has helped prevent more than 2.9 million tons of greenhouse gas emissions since its launch, which is the equivalent of taking 650,000 fuel-based cars off the road for a year,” she said. Another Google Research project uses artificial intelligence to help pilots avoid creating contrails, which account for roughly 1% of global warming impact.
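
As a quick sanity check on that equivalence (an illustration, not Google’s own calculation), the implied emissions per car-year can be computed directly; the result is close to the roughly 4.6 metric tons of CO2 per typical passenger vehicle per year often cited by the EPA, which is an outside figure rather than one from the symposium.

```python
# Consistency check of the "650,000 cars off the road" equivalence quoted above.
avoided_tons = 2.9e6          # greenhouse gas emissions avoided, metric tons
cars_equivalent = 650_000     # "cars off the road for a year" cited above

tons_per_car = avoided_tons / cars_equivalent
print(f"Implied emissions per car-year: {tons_per_car:.1f} metric tons")  # ~4.5
```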

Rafael Gómez-Bombarelli, the Paul M. Cook Career Development Associate Professor, highlighted AI’s potential to accelerate materials discovery for energy applications. “AI models can be trained to go from structure to property,” he noted, enabling the development of materials critical to both computing and energy efficiency.
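
Below is a minimal sketch of that structure-to-property idea, using entirely synthetic data: a surrogate model learns to predict a target property from numerical structure descriptors, so that large candidate pools can be screened without running an expensive simulation for each one. Real workflows would use descriptors or learned representations from materials databases, not the invented features here.

```python
# Toy surrogate model mapping structure descriptors to a material property.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Fake dataset: 500 candidate materials, 8 structure descriptors each,
# and a made-up target property (e.g., thermal conductivity).
X = rng.normal(size=(500, 8))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.5 * X[:, 5] ** 2 + 0.1 * rng.normal(size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)
print(f"held-out R^2: {surrogate.score(X_test, y_test):.2f}")

# Screening: rank unseen candidates by predicted property instead of
# simulating each one.
candidates = rng.normal(size=(1000, 8))
best = np.argsort(surrogate.predict(candidates))[-5:]
print("top candidate indices:", best)
```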

Ensuring sustainable growth

Throughout the symposium, participants grappled with how to balance rapid AI deployment against its environmental impacts. While AI training draws the most concern, Dustin Demetriou, senior technical staff member for sustainability and data center innovation at IBM, cited a World Economic Forum article estimating that roughly 80% of AI’s environmental footprint comes from inference. Demetriou highlighted the need for efficiency across all AI applications.

Jevons’ paradox, in which “efficiency improvements tend to increase overall resource consumption rather than reduce it,” is another factor to consider, warned Emma Strubell, the Raj Reddy Assistant Professor in the Language Technologies Institute within Carnegie Mellon University’s School of Computer Science. Strubell advocated treating computing-center power as a limited resource requiring thoughtful allocation across applications.
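
A toy calculation illustrating Strubell’s point follows; the numbers are invented purely to show how total consumption can rise even as per-query energy falls.

```python
# Toy Jevons-paradox illustration: efficiency doubles, but usage triples.
energy_per_query_before = 1.0      # arbitrary units
queries_before = 100

efficiency_gain = 2.0              # energy per query halves
usage_growth = 3.0                 # but demand triples

energy_per_query_after = energy_per_query_before / efficiency_gain
queries_after = queries_before * usage_growth

total_before = energy_per_query_before * queries_before
total_after = energy_per_query_after * queries_after
print(f"total energy before: {total_before:.0f}, after: {total_after:.0f}")
# -> total consumption rises by 50% despite the 2x efficiency improvement
```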

Some presenters discussed new ways to integrate renewable resources with existing grid infrastructure, including hybrid solutions that pair new clean generation with existing gas plants that already hold valuable grid connections. These approaches could provide substantial clean capacity across the United States at reasonable cost while minimizing impacts on reliability.
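
Below is a toy dispatch sketch of that hybrid arrangement, with invented numbers rather than anything modeled at the symposium: a clean resource shares an existing gas plant’s interconnection, and the gas unit fills the gap only when clean output falls short of the data-center load.

```python
# Toy hourly dispatch of a hybrid clean + gas plant behind one grid connection.
interconnection_mw = 100                           # existing connection limit (assumed)
hourly_demand_mw = [80, 90, 100, 95, 85, 70]       # data-center load (assumed)
hourly_solar_mw = [0, 40, 95, 110, 60, 5]          # clean output (assumed)

for hour, (demand, solar) in enumerate(zip(hourly_demand_mw, hourly_solar_mw)):
    clean = min(solar, demand, interconnection_mw)          # use clean energy first
    gas = min(demand - clean, interconnection_mw - clean)   # gas fills the remainder
    print(f"hour {hour}: clean={clean:5.1f} MW, gas={gas:5.1f} MW, "
          f"share clean={clean / demand:.0%}")
```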

Navigating the AI-energy paradox

The symposium highlighted MIT’s central role in developing solutions to the AI-electricity challenge.

Green described a new MITEI program on computing centers, power, and computation that will run alongside the broad sweep of research under the MIT Climate Project. “We will try to solve a very complex problem: providing the actual algorithms that deliver value to customers in a way that is acceptable to all stakeholders and truly meets all the needs,” Green said.

Randall Field, director of research at MITEI, surveyed attendees on which research directions MIT should prioritize. The real-time results ranked “data center and grid integration issues” as the top priority, followed by AI-accelerated discovery of energy materials.

The poll also revealed that most participants view AI’s potential as a “promise” rather than a “peril,” although a considerable share remained uncertain about its ultimate impact. When asked which attribute of power for computing facilities matters most, half of respondents chose carbon intensity, followed by reliability and cost.
