
AI's carbon footprint remains a riddle three years into the genAI revolution, thanks to AI makers' secrecy and the difficulty of accurate measurement at scale.
Why it matters: Chatbot users say they're concerned about AI's climate impact, but with too many complex variables and not enough data, we're still effectively just guessing.
The big picture: AI consumes massive amounts of energy and water both during model training and when the models are responding to users — a phase known as "inference."
- But "massive amounts" are hard to visualize or to compare with the energy we use to play Fortnite, store our text messages in the cloud, mine bitcoin or stream the "White Lotus" finale.
- "People say if you use ChatGPT or Perplexity or whatever you use, you're killing the environment. I say, no, that's not true. Actually, it's no worse [than if] you download and play Roblox," Chris Mattmann, data and AI chief at UCLA, recently told podcaster Shira Lazar.
By the numbers: OpenAI, Anthropic, xAI and Google did not share internal numbers on how much energy it takes to train and run their generative AI models.
- One oft-cited rule of thumb suggested that querying ChatGPT used roughly 10 times more energy than a Google search — 0.3 watt-hours for a traditional Google search compared with 2.9 watt-hours for a ChatGPT query.
- Epoch AI, a research group that publishes third-party estimates of AI energy use, said in February that the 2.9 watt-hour calculation is likely an overestimate since AI models and the hardware running them are both more efficient now.
- But the rise of bigger models, especially deep research and reasoning models, could also be tugging this figure in the opposite direction.
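The gap between those two per-query estimates can be put in rough perspective with a back-of-envelope calculation. The 0.3 and 2.9 watt-hour figures come from the estimates above; the "10 queries a day" usage pattern is purely illustrative:

```python
# Back-of-envelope comparison of the oft-cited per-query estimates.
GOOGLE_WH = 0.3   # estimated Wh per traditional Google search
CHATGPT_WH = 2.9  # estimated (likely high) Wh per ChatGPT query

ratio = CHATGPT_WH / GOOGLE_WH
# Energy for a hypothetical 10 ChatGPT queries a day for a year, in kWh:
annual_kwh = CHATGPT_WH * 10 * 365 / 1000

print(f"ChatGPT-to-search ratio: ~{ratio:.1f}x")
print(f"10 queries/day for a year: ~{annual_kwh:.1f} kWh")
```

At the 2.9 Wh estimate, a year of heavy chatbot use lands around 10 kWh — roughly what a U.S. household draws in a few hours — which is why per-user framings of the problem tend to understate the aggregate picture.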
Zoom in: The carbon footprint of training a model depends on many factors, including the number of parameters in the model, the total hours of training, and the source and efficiency of the power used.
- Carbon emissions from AI training are steadily increasing, according to the eighth edition of Stanford's Artificial Intelligence Index, released last Monday.
- Training AlexNet, an early AI model, emitted 0.01 tons of carbon in 2012. Training GPT-4 in 2023 produced an estimated 5,184 tons, and training Llama 3.1 405B in 2024 produced 8,930 tons, per the report.
- "For perspective," the Stanford report says, "the average American emits 18 tons of carbon per year."
- Meta shares training information in its model cards and also claims effectively zero carbon impact because of the company's net-zero policies and practices.
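The Stanford report's training figures can be restated in the report's own "average American" terms. The tons-of-carbon numbers and the 18-ton per-capita figure come from the report; the conversion is just arithmetic:

```python
# Training-run emissions per the Stanford AI Index, expressed as
# years of an average American's emissions (18 tons CO2 per year).
EMISSIONS_TONS = {
    "AlexNet (2012)": 0.01,
    "GPT-4 (2023)": 5_184,
    "Llama 3.1 405B (2024)": 8_930,
}
US_PER_CAPITA_TONS = 18  # per the report

for model, tons in EMISSIONS_TONS.items():
    print(f"{model}: ~{tons / US_PER_CAPITA_TONS:,.0f} person-years of emissions")
```

By that yardstick, GPT-4's training run equates to roughly 288 person-years of emissions and Llama 3.1 405B's to roughly 496 — large for a single engineering project, but small next to any economy-wide source.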
Between the lines: Beyond carbon emissions and water use, data centers currently draw power from a few concentrated locations, often in marginalized communities.
- "The training for an AI model typically happens in one location, and the size of those power draws has been doubling every year," Thomas Wilson, principal tech executive at the nonprofit energy research institute EPRI, told Axios.
- Discussions of AI's environmental impact "do not give enough attention to environmental equity — the imperative that AI's environmental costs be equitably distributed across different regions and communities," the Harvard Business Review reported last year.
Zoom out: The amount of energy used to train a model previously far exceeded the amount of energy used for inference. But as usage grows, some experts say this is no longer true.
- "Enabling millions to use generative AI in their daily lives, and then fine-tuning the models to improve their performance draws large amounts of energy long after a model has been developed," per MIT News.
What they're saying: Absent clear numbers, there's a lot of back-of-the-napkin math at work on this topic.
- Anthropic pointed Axios to a Substack post called "Using ChatGPT is not bad for the environment," by Andrew Masley, a former physics instructor and current head of D.C.'s Effective Altruism group.
- Masley's calculations show that living car-free, switching to green heating or avoiding one transatlantic flight has far more climate impact than forgoing even millions of ChatGPT prompts.
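A rough scale check shows why comparisons like Masley's come out so lopsided. These inputs are illustrative assumptions, not figures from the article: a grid carbon intensity of about 0.4 kg CO2 per kWh and about 1 ton of CO2 per passenger for a transatlantic round trip, combined with the (likely high) 2.9 Wh-per-query estimate cited earlier:

```python
# Illustrative scale check: how many chatbot queries equal one flight?
# Assumptions (not from the article): ~0.4 kg CO2/kWh grid power,
# ~1 ton CO2 per passenger for a transatlantic round trip.
WH_PER_QUERY = 2.9       # likely-high per-query estimate
KG_CO2_PER_KWH = 0.4     # assumed grid carbon intensity
FLIGHT_TONS = 1.0        # assumed per-passenger round-trip emissions

kg_per_query = WH_PER_QUERY / 1000 * KG_CO2_PER_KWH
queries_per_flight = FLIGHT_TONS * 1000 / kg_per_query
print(f"~{queries_per_flight:,.0f} queries ≈ one transatlantic flight")
```

Under those assumptions, one flight corresponds to on the order of a million queries — the same order of magnitude Masley's argument turns on.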
The amount of energy used by a chatbot isn't the point, Masley argues.
- "People who care about climate change should spend much less time worrying about how to reduce their individual emissions and much more time thinking about how to bring about systematic change to make our energy systems better."
- "Answering the question about environmental impact is almost separate from the amount of power demand," Wilson said. "Regardless of what the power use is, the environmental impact depends on how you generate that power."
Yes, but: AI itself could help develop new approaches to reducing its own climate impact.
- A new report from the International Energy Agency estimates that "broad application of AI-led solutions" could reduce energy-related emissions by around 5%, including 8% in electronics manufacturing.
- Last year the World Economic Forum cited reports that AI could potentially mitigate 5-10% of global greenhouse gas emissions.
What's next: The new administration's singular focus on winning the AI race against China is sidelining environmental concerns, leading to a new energy "pragmatism" or "realism."
- Data center companies have reportedly approached nearly half of the 13 major U.S. electric utilities surveyed by Reuters to request "volumes of power that would exceed their peak demand or existing generation capacity."
The bottom line: Even if some well-meaning users stop using ChatGPT, the AI industry will still demand untold amounts of energy.
- "We need energy in all forms. Renewable, nonrenewable, whatever. It needs to be there and it needs to be there quickly," former Google CEO Eric Schmidt told the House Energy and Commerce Committee Wednesday.