Axios AI+

April 14, 2025
I'm visiting a new town called (checks notes) ... San Francisco. Seems nice. Today's AI+ is 1,257 words, a 4.5-minute read.
Situational awareness: The Federal Trade Commission's antitrust suit against Meta challenging the firm's decade-old acquisitions of Instagram and WhatsApp goes to trial today.
1 big thing: AI's climate impact is still a black box
AI's carbon footprint remains a riddle three years into the genAI revolution, thanks to AI makers' secrecy and the difficulty of accurate measurements at scale.
Why it matters: Chatbot users say they're concerned about AI's climate impact, but with too many complex variables and not enough data, we're still effectively just guessing.
The big picture: AI consumes massive amounts of energy and water both during model training and when the models are responding to users — a phase known as "inference."
- But "massive amounts" are hard to visualize or to compare with the energy we use to play Fortnite, store our text messages in the cloud, mine bitcoin or stream the "White Lotus" finale.
- "People say if you use ChatGPT or Perplexity or whatever you use, you're killing the environment. I say, no, that's not true. Actually, it's no worse [than if] you download and play Roblox," Chris Mattmann, data and AI chief at UCLA, recently told podcaster Shira Lazar.
By the numbers: OpenAI, Anthropic, xAI and Google did not share internal numbers on how much energy it takes to train and run their generative AI models.
- One oft-cited rule of thumb suggested that querying ChatGPT used roughly 10 times more energy than a Google search — 0.3 watt-hours for a traditional Google search compared with 2.9 watt-hours for a ChatGPT query.
- Epoch AI, a research group that publishes third-party estimates of AI energy use, said in February that the 2.9 watt-hour calculation is likely an overestimate since AI models and the hardware running them are both more efficient now.
- But the rise of bigger models, especially deep research and reasoning models, could also be tugging this figure in the opposite direction.
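Taken at face value, those per-query figures make the comparison simple arithmetic. A minimal sketch in Python (the constants are the contested estimates quoted above, not measured values):

```python
# Back-of-envelope energy comparison using the contested estimates above.
GOOGLE_SEARCH_WH = 0.3  # watt-hours per traditional Google search (oft-cited figure)
CHATGPT_QUERY_WH = 2.9  # watt-hours per ChatGPT query (likely an overestimate, per Epoch AI)

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"A ChatGPT query uses ~{ratio:.1f}x the energy of a Google search")

# Even at that rate, 1,000 queries total about 2.9 kWh -- on the order of
# a few hours of running a window air conditioner (assumed ~1 kW draw).
queries = 1_000
total_kwh = queries * CHATGPT_QUERY_WH / 1_000
print(f"{queries:,} queries ≈ {total_kwh:.1f} kWh")
```

Swap in Epoch AI's lower per-query estimate and the ratio shrinks accordingly — which is exactly why the "10x a Google search" rule of thumb is so contested.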
Zoom in: The carbon footprint of training a model depends on many factors, including the number of parameters in the model, the total hours of training, and the type and efficiency of the power usage.
- Carbon emissions from AI training are steadily increasing, according to the eighth edition of Stanford's Artificial Intelligence Index, released last Monday.
- Training AlexNet, an early AI model, emitted 0.01 tons of carbon in 2012. Training GPT-4 in 2023 produced an estimated 5,184 tons, and training Llama 3.1 405B in 2024 produced 8,930 tons, per the report.
- "For perspective," the Stanford report says, "the average American emits 18 tons of carbon per year."
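The report's training figures can be restated in those per-capita terms with a quick division (the numbers below are the Stanford estimates quoted above):

```python
# Restating Stanford AI Index training-emissions estimates in per-capita terms.
TRAINING_EMISSIONS_TONS = {
    "AlexNet (2012)": 0.01,
    "GPT-4 (2023)": 5_184,
    "Llama 3.1 405B (2024)": 8_930,
}
AVG_AMERICAN_TONS_PER_YEAR = 18  # per the Stanford report

for model, tons in TRAINING_EMISSIONS_TONS.items():
    person_years = tons / AVG_AMERICAN_TONS_PER_YEAR
    print(f"{model}: {tons:,} t CO2 ≈ {person_years:,.1f} average-American-years")
```

By that yardstick, one GPT-4 training run is roughly 288 average-American-years of emissions, and Llama 3.1 405B nearly 500.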
Meta shares training information in its model cards and also claims effectively zero carbon impact because of the company's net-zero policies and practices.
Between the lines: Beyond carbon emissions and water use, data centers currently draw power from a few concentrated locations, often in marginalized communities.
- "The training for an AI model typically happens in one location, and the size of those power draws has been doubling every year," Thomas Wilson, principal tech executive at the nonprofit energy research institute EPRI, told Axios.
- Discussions of AI's environmental impact "do not give enough attention to environmental equity — the imperative that AI's environmental costs be equitably distributed across different regions and communities," the Harvard Business Review reported last year.
Zoom out: The amount of energy used to train a model previously far exceeded the amount of energy used for inference. But as usage grows, some experts say this is no longer true.
- "Enabling millions to use generative AI in their daily lives, and then fine-tuning the models to improve their performance draws large amounts of energy long after a model has been developed," per MIT News.
What they're saying: Absent clear numbers, there's a lot of back-of-the-napkin math at work on this topic.
- Anthropic pointed Axios to a Substack post called "Using ChatGPT is not bad for the environment," by Andrew Masley, a former physics instructor and current head of D.C.'s Effective Altruism group.
- Masley's calculations show that living car-free, switching to green heating or avoiding one transatlantic flight has far more impact than reducing ChatGPT prompts by a few million.
The amount of energy used by a chatbot isn't the point, Masley argues.
- "People who care about climate change should spend much less time worrying about how to reduce their individual emissions and much more time thinking about how to bring about systematic change to make our energy systems better."
- "Answering the question about environmental impact is almost separate from the amount of power demand," Wilson said. "Regardless of what the power use is, the environmental impact depends on how you generate that power."
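Masley-style napkin math can be reproduced in a few lines. Only the per-query energy figure comes from the estimates cited earlier; the grid carbon intensity and the flight figure are assumed round numbers for illustration, not values from this newsletter:

```python
# Hypothetical back-of-napkin comparison in the spirit of Masley's argument.
CHATGPT_QUERY_WH = 2.9       # per-query estimate cited earlier (likely high)
GRID_KG_CO2_PER_KWH = 0.4    # ASSUMED average grid carbon intensity
FLIGHT_TONS_CO2 = 1.0        # ASSUMED round figure for one transatlantic flight

kg_per_query = CHATGPT_QUERY_WH / 1_000 * GRID_KG_CO2_PER_KWH
queries_per_flight = FLIGHT_TONS_CO2 * 1_000 / kg_per_query
print(f"~{kg_per_query * 1_000:.2f} g CO2 per query")
print(f"Skipping one flight ≈ skipping {queries_per_flight:,.0f} queries")
```

Under these assumptions, one transatlantic flight outweighs hundreds of thousands of prompts — the order of magnitude behind Masley's conclusion. It also illustrates Wilson's point: change the grid-intensity constant and the per-query footprint moves proportionally.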
Yes, but: AI itself could help develop new approaches to reducing its own climate impact.
- A new report from the International Energy Agency estimates that "broad application of AI-led solutions" could reduce energy-related emissions by around 5%, including 8% in electronics manufacturing.
- Last year the World Economic Forum cited reports that AI could potentially mitigate 5-10% of global greenhouse gas emissions.
What's next: The new administration's singular focus on winning the AI race against China is sidelining environmental concerns, leading to a new energy "pragmatism" or "realism."
- Nearly half of the 13 major U.S. electric utilities surveyed by Reuters say data center companies have requested "volumes of power that would exceed their peak demand or existing generation capacity."
The bottom line: Even if some well-meaning users stop using ChatGPT, the AI industry will still demand untold amounts of energy.
- "We need energy in all forms. Renewable, nonrenewable, whatever. It needs to be there and it needs to be there quickly," Eric Schmidt told the House Energy and Commerce Committee Wednesday.
2. Bill takes on AI data centers' climate impact
Sen. Sheldon Whitehouse (D-R.I.) introduced a bill last week aimed at addressing rising greenhouse gas emissions from AI data centers and crypto operations.
Why it matters: The bill — shared with Axios — represents a serious effort to address the climate impacts of surging data center demand and sets the stage for the larger conversation on AI and power use.
Driving the news: The Clean Cloud Act, with Sen. John Fetterman (D-Pa.) as a co-sponsor, would assess a carbon emissions fee on power consumption by data centers and crypto miners.
- "It's a marker for where this industry needs to be," Whitehouse said during a discussion in his office. "And I think if you're the AI [or] crypto industry coming in, you've got to think long and hard about what you want your public reputation to be if you intend to stick around for a while."
- Tech companies, he said, "have more money than God, so they can readily pay for clean energy or offsets to make up for what they're doing by dropping their massive load onto these grid systems."
3. Training data
- Saturday: President Trump announced computers, smartphones and electronics would be exempt from his newest tariffs. Sunday: Trump's commerce secretary said the exemption was temporary, and Trump said new tariffs on chips were coming within a week. (Axios)
- China has stopped exports of some rare earth minerals that are critical to making the computer chips that power AI. (New York Times)
- OpenAI CEO Sam Altman said in a TED conference interview that ChatGPT now reaches 10% of the world and usage doubled in a few weeks. Go deeper: Watch the full interview. (Forbes, TED)
- Louisiana is giving an algorithm much of the deciding power when it comes to who gets parole. (ProPublica)
4. + This
Hate to break it to you, but the voices of Elon Musk and Mark Zuckerberg coming out of crosswalk buttons in Silicon Valley this weekend weren't real. The crosswalks were hacked and the voices were probably AI clones.
Thanks to Scott Rosenberg and Megan Morrone for editing this newsletter and Matt Piper for copy editing.