
In Part 2 of our two-part series on the environmental impact of generative AI, MIT News explores some of the ways experts are working to reduce the technology's carbon footprint.

The energy demands of generative AI are expected to continue increasing significantly over the next decade.

For example, an April 2025 report from the International Energy Agency predicts that by 2030, the global electricity demand of data centers (the computing infrastructure used to train and deploy AI models) will more than double, to about 945 terawatt-hours. While not all operations performed in a data center are AI-related, this total is slightly more than the entire energy consumption of Japan.

In addition, an August 2025 analysis from Goldman Sachs Research forecasts that about 60 percent of the increased electricity demand from data centers will be met by burning fossil fuels, increasing global carbon emissions by about 220 million tons. By comparison, driving a gasoline-powered car for 5,000 miles produces about 1 ton of carbon dioxide.

These statistics are staggering, but at the same time, scientists and engineers at MIT and around the world are studying innovations and interventions to mitigate AI's ballooning carbon footprint, from boosting the efficiency of algorithms to rethinking the design of data centers.

Considering carbon emissions

Talk of reducing generative AI's carbon footprint typically centers on "operational carbon," the emissions produced by the powerful processors, known as GPUs, inside a data center. It often ignores "embodied carbon," the emissions created by building the data center in the first place, says Vijay Gadepally, a senior scientist at MIT Lincoln Laboratory.

Constructing and retrofitting a data center, built from tons of steel and concrete and filled with air conditioning units, computing hardware, and miles of cable, consumes a huge amount of carbon. In fact, the environmental impact of building data centers is one reason companies like Meta and Google are exploring more sustainable building materials. (Cost is another factor.)

Plus, data centers are enormous buildings. The world's largest, the China Telecom-Inner Mongolia Information Park, covers roughly 10 million square feet, and a data center typically has about 10 to 50 times the energy density of a normal office building, Gadepally adds.

"The operational side is only part of the story. Some things we are working on to reduce operational emissions may lend themselves to reducing embodied carbon, too, but we need to do more on that front in the future," he says.

Reducing operational carbon emissions

When it comes to reducing the operational carbon emissions of AI data centers, there are many parallels with home energy-saving measures. For one, we can simply turn down the lights.

"Even if you have the worst lightbulbs in your house from an efficiency standpoint, turning them off or dimming them will always use less energy than leaving them running at full blast," Gadepally says.

In the same fashion, research from the Lincoln Laboratory Supercomputing Center has shown that "turning down" the GPUs in a data center so they consume about a third less energy has minimal impact on the performance of AI models, while also making the hardware easier to cool.
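The arithmetic behind power capping can be sketched in a few lines. The numbers below (a 400 W baseline, a 250 W cap, an 8 percent slowdown) are hypothetical illustrations, not figures from the Lincoln Laboratory study: the point is simply that a lower power draw can outweigh a modest runtime penalty.

```python
# Toy illustration (hypothetical numbers): total energy of a training
# job under a GPU power cap. Capping trades a small slowdown for a
# lower power draw, which can cut total energy consumed.

def job_energy_kwh(power_watts: float, runtime_hours: float) -> float:
    """Energy consumed = average power x runtime."""
    return power_watts * runtime_hours / 1000.0

# Uncapped: a GPU drawing 400 W finishes the job in 10 hours.
baseline = job_energy_kwh(400, 10)           # 4.0 kWh

# Capped at 250 W, the same job runs, say, 8% longer (hypothetical).
capped = job_energy_kwh(250, 10 * 1.08)      # about 2.7 kWh

savings = 1 - capped / baseline
print(f"energy saved: {savings:.0%}")        # roughly a third
```

In practice, operators apply such caps through vendor tooling rather than application code; the sketch only shows why the trade-off can pay off.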

Another strategy is to use less energy-intensive computing hardware.

Demanding generative AI workloads, such as training a new reasoning model like GPT-5, typically require many GPUs working simultaneously. The Goldman Sachs analysis estimates that a state-of-the-art system could soon involve as many as 576 connected GPUs operating at once.

However, engineers can sometimes achieve similar results by reducing the precision of the computing hardware, perhaps by switching to less powerful processors tuned to a specific AI workload.

There are also measures that boost the efficiency of training power-hungry deep-learning models before they are ever deployed.

Gadepally's team found that about half of the electricity used to train an AI model is spent squeezing out the last 2 or 3 percentage points of accuracy. Stopping the training process early can save a great deal of energy.

"In some cases, 70 percent accuracy is sufficient for a particular application, such as an e-commerce recommendation system," he says.
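A simple early-stopping rule captures this idea: halt training once the per-epoch accuracy gain falls below a threshold. The accuracy curve below is synthetic, chosen only to show the typical pattern of diminishing returns:

```python
# Sketch (synthetic numbers): stop training once accuracy gains per
# epoch fall below a threshold, rather than paying heavily in energy
# for the last few percentage points.

def train_with_early_stop(accuracy_per_epoch, min_gain=0.005):
    """Return (final_accuracy, epochs_run) for a recorded accuracy curve."""
    previous = 0.0
    for epoch, accuracy in enumerate(accuracy_per_epoch, start=1):
        if accuracy - previous < min_gain:
            return previous, epoch - 1   # stop: gain too small to justify
        previous = accuracy
    return previous, len(accuracy_per_epoch)

# A typical diminishing-returns curve (synthetic):
curve = [0.60, 0.72, 0.79, 0.83, 0.85, 0.853, 0.855, 0.856]
accuracy, epochs = train_with_early_stop(curve)
print(accuracy, epochs)   # stops at 0.85 after 5 epochs instead of 8
```

Here the rule gives up less than one percentage point of accuracy while skipping nearly 40 percent of the epochs.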

Researchers can also take advantage of efficiency-boosting measures during model development.

For example, a postdoc at the Supercomputing Center realized that the group might run a thousand simulations during the training process just to select the two or three best AI models for a project.

By building a tool that let them avoid about 80 percent of those wasted computing cycles, they dramatically reduced the energy demands of training, with no loss in model accuracy, Gadepally says.
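The source doesn't describe the tool's internals, but the general pattern, sometimes called early discarding or successive halving, can be sketched as follows: evaluate every candidate cheaply, then spend the full training budget only on the most promising few. The scores below are invented for illustration.

```python
# Sketch (hypothetical scores): instead of training every candidate
# model to completion, rank them with a cheap early evaluation and
# give only the top few the full training budget.

def select_candidates(early_scores, keep=3):
    """Return indices of the top `keep` candidates by early score."""
    ranked = sorted(range(len(early_scores)),
                    key=lambda i: early_scores[i], reverse=True)
    return ranked[:keep]

early_scores = [0.41, 0.78, 0.55, 0.80, 0.62, 0.79]  # after one cheap epoch
survivors = select_candidates(early_scores)
print(survivors)   # [3, 5, 1] -- only these are trained to completion
```

With 1,000 candidates and 3 survivors, almost all of the full-training compute is avoided, which is the spirit of the 80 percent figure above.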

Leveraging efficiency gains

Continuous innovation in computing hardware, such as denser arrays of transistors on semiconductor chips, still enables dramatic improvements in the energy efficiency of AI models.

While energy-efficiency improvements have been slowing for most chips since about 2005, the amount of computation that GPUs can perform per unit of energy has been improving by 50 to 60 percent each year, says Neil Thompson, a research scientist at MIT's Computer Science and Artificial Intelligence Laboratory and a principal investigator at the MIT Initiative on the Digital Economy.

"The Moore's Law trend of getting more and more transistors onto a chip is still crucial for many of these AI systems, since running operations in parallel remains very valuable for improving efficiency," Thompson says.

Even more significant, his group's research indicates that efficiency gains from new model architectures, which can solve complex problems faster while consuming less energy to achieve the same or better results, are doubling every eight or nine months.

Thompson coined the term "negaflop" to describe this effect. Just as a "negawatt" represents electricity saved through energy-conserving measures, a "negaflop" is a computing operation that doesn't need to be performed because of algorithmic improvements.

These could come from "pruning" unnecessary components of a neural network, or from compression techniques that enable users to do more with less computation.
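Magnitude pruning is the simplest version of this idea: zero out a network's smallest weights, since each zeroed weight is a multiply-add that never has to run, i.e. a negaflop. The weights and pruning fraction below are toy values for illustration.

```python
# Sketch: magnitude pruning zeroes out the smallest weights by
# absolute value; every zeroed weight is an operation skipped.

def prune_by_magnitude(weights, fraction=0.5):
    """Zero out the smallest `fraction` of weights by absolute value."""
    cutoff_index = int(len(weights) * fraction)
    threshold = sorted(abs(w) for w in weights)[cutoff_index]
    return [w if abs(w) >= threshold else 0.0 for w in weights]

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.02, 0.3, -0.08]
pruned = prune_by_magnitude(weights)
skipped = sum(1 for w in pruned if w == 0.0)
print(pruned)                                  # small weights zeroed
print(f"operations skipped: {skipped}/{len(weights)}")
```

Real pruning pipelines fine-tune the model afterward to recover accuracy, and rely on sparse kernels to actually skip the zeroed operations; this sketch shows only the selection step.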

"If you need to use a very powerful model today to complete your task, in just a few years you might be able to use a significantly smaller model to do the same thing, which would carry a much smaller environmental burden. Making these models more efficient is the single most important thing you can do to reduce the environmental costs of AI," Thompson says.

Maximizing energy savings

While reducing the overall energy use of AI algorithms and computing hardware cuts greenhouse gas emissions, not all energy is the same, Gadepally adds.

"The amount of carbon emissions in one kilowatt-hour varies significantly, even during the course of a day, as well as over the month and year," he says.

Engineers can take advantage of these variations by leveraging the flexibility of AI workloads and data center operations to minimize emissions. For instance, some generative AI workloads don't need to be performed in their entirety at the same time.

Splitting computing operations so some are performed later, when more of the electricity fed into the grid comes from renewable sources like solar and wind, can go a long way toward reducing a data center's carbon footprint, says Deepjyoti Deka, a research scientist at the MIT Energy Initiative.
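Carbon-aware scheduling of this kind reduces to a small optimization: given a forecast of grid carbon intensity, slide a deferrable job into the contiguous window with the lowest total emissions. The hourly intensity values below are invented for illustration.

```python
# Sketch (made-up intensity forecast): schedule a deferrable AI job
# into the hours when grid carbon intensity is lowest, e.g. midday
# when solar output peaks.

def greenest_window(intensity_by_hour, job_hours):
    """Return the start hour minimizing total carbon for a contiguous run."""
    totals = [sum(intensity_by_hour[h:h + job_hours])
              for h in range(len(intensity_by_hour) - job_hours + 1)]
    return min(range(len(totals)), key=totals.__getitem__)

# gCO2/kWh forecast for a 12-hour window (hypothetical values):
forecast = [520, 500, 480, 350, 240, 180, 170, 260, 330, 460, 510, 530]
start = greenest_window(forecast, job_hours=3)
print(f"run the job starting at hour {start}")   # hour 4: cleanest block
```

Real schedulers must also respect deadlines, capacity, and job dependencies, but the core carbon-minimizing search looks like this.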

Deka and his team are also studying "smarter" data centers in which AI workloads from multiple companies using the same computing equipment are flexibly adjusted to improve energy efficiency.

“By looking at the system as a whole, our hope is to minimize energy use and dependence on fossil fuels, while still maintaining reliability standards for AI companies and users,” Deka said.

He and others at MITEI are building a flexibility model of a data center that considers the differing energy demands of training a deep-learning model versus deploying that model. Their hope is to uncover the best strategies for scheduling and streamlining computing operations to improve energy efficiency.

Researchers are also exploring the use of long-duration energy storage units at data centers, which store excess energy for times when it is needed.

With these systems in place, a data center could use stored energy that was generated by renewable sources during periods of high demand, or avoid the use of diesel backup generators if there are fluctuations in the grid.

"Long-duration energy storage could be a game-changer here, because we can design operations that really change the emissions mix of the system to rely more on renewable energy," Deka says.
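A minimal dispatch sketch shows the mechanism: the storage unit charges when renewable generation exceeds the data center's load and discharges to cover shortfalls, so only the remaining deficit must come from the grid or diesel backup. All numbers below are toy values.

```python
# Sketch (toy numbers): greedy charge/discharge of a storage unit
# co-located with a data center. The returned shortfall is the energy
# that fossil grid power or diesel backup must still supply.

def dispatch(renewable, load, capacity):
    """Greedy dispatch over equal-length periods; returns unmet demand."""
    stored, shortfall = 0.0, 0.0
    for gen, demand in zip(renewable, load):
        surplus = gen - demand
        if surplus >= 0:
            stored = min(capacity, stored + surplus)   # charge with excess
        else:
            draw = min(stored, -surplus)               # discharge to cover
            stored -= draw
            shortfall += -surplus - draw               # still unmet
    return shortfall

renewable = [8, 10, 12, 5, 2, 1]   # MWh per period (hypothetical)
load      = [6,  6,  6, 6, 6, 6]
print(dispatch(renewable, load, capacity=7.0))   # 3.0 MWh still unmet
```

Without any storage, the full 10 MWh evening deficit would fall on the grid or backup generators; a 7 MWh unit shrinks that to 3 MWh in this toy scenario.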

In addition, researchers at MIT and Princeton University are developing a power-sector investment planning tool called GenX, which can be used to help companies determine the ideal places to site data centers to minimize environmental impacts and costs.

Location can have a big impact on reducing a data center's carbon footprint. For instance, Meta operates a data center in Luleå, a city in northern Sweden, where cooler temperatures reduce the amount of electricity needed to cool the computing hardware.
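The siting decision can be framed as scoring candidate locations on factors like the local grid's carbon intensity and the cooling overhead imposed by the climate. The scoring below is a toy weighted sum with invented numbers; it is not the GenX model, which performs full capacity-expansion optimization.

```python
# Sketch (invented numbers): score candidate data center sites on grid
# carbon intensity and cooling overhead; lower scores are better.

def site_score(grid_gco2_per_kwh, cooling_overhead, carbon_weight=0.7):
    """Weighted sum; intensity scaled to roughly [0, 1] by dividing by 1000."""
    return (carbon_weight * grid_gco2_per_kwh / 1000.0
            + (1 - carbon_weight) * cooling_overhead)

# (grid gCO2/kWh, fraction of power spent on cooling) -- hypothetical
sites = {
    "northern_sweden": (45, 0.10),
    "us_southwest":    (380, 0.35),
    "central_europe":  (250, 0.20),
}
best = min(sites, key=lambda s: site_score(*sites[s]))
print(best)   # northern_sweden: clean grid plus cheap cooling
```

Even this crude scoring reflects why cold regions with low-carbon grids are attractive sites.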

Some governments are even thinking outside the box (very far outside it), exploring the construction of data centers on the moon, where they could potentially be operated with nearly all renewable energy.

AI-based solutions

Currently, the expansion of renewable energy generation here on Earth is not keeping pace with the rapid growth of AI, which is one major roadblock to reducing the technology's carbon footprint.

The local, state and federal review process required for new renewable energy projects can take years.

Researchers at MIT and elsewhere are exploring the use of AI to speed up the process of connecting new renewable energy systems to the power grid.

For instance, a generative AI model could streamline the interconnection studies that determine how a new project will affect the power grid, a step that often takes years to complete.

AI could also play an important role in accelerating the development and implementation of clean energy technologies.

"Machine learning is great for solving complex problems, and the electric grid is said to be one of the largest and most complex machines in the world," Turliuk adds.

For example, AI can help optimize solar and wind forecasts, or determine the ideal location for a new facility.

It could also be used for predictive maintenance and fault detection on solar panels and other green energy infrastructure, or to monitor the capacity of transmission lines to maximize efficiency.

By helping researchers collect and analyze huge amounts of data, AI could also inform targeted policy interventions aimed at getting the biggest "bang for the buck" from areas such as renewable energy, Turliuk says.

To help policymakers, scientists, and businesses weigh the multifaceted costs and benefits of AI systems, she and her collaborators developed the Net Climate Impact Score.

The score is a framework that can be used to help determine the net climate impact of AI projects, taking into account emissions and other environmental costs, along with potential environmental benefits, in the future.

Ultimately, the most effective solutions are likely to come from collaborations among companies, regulators, and researchers, with academia leading the way, Turliuk adds.

"Every day counts. We are running out of time to blunt the effects of climate change, and we can't afford to wait until it is too late to act," she says.
