December 04, 2023
Hi, it's Ryan. I confess to already breaking out my holiday season sweaters. Today's AI+ is 1,232 words, a 4.5-minute read.
Situational awareness: Spotify is cutting 17% of its staff, or roughly 1,500 people globally, in its latest round of layoffs this year.
1 big thing: AI's energy problem
Running AI is much more energy-intensive than other forms of computing, but as leaders gather for the COP28 global climate summit in Dubai, we know relatively little about AI's net impact on climate change.
Why it matters: Increasing adoption of AI may make it one of the biggest uses of energy globally — putting pressure on AI providers to measure and publish data on energy use and energy sources.
- A growing body of evidence suggests using AI to improve physical systems — from farming processes to supply chains and buildings — will allow us to avoid around 10% of today's emissions.
Driving the news: AI has been a top theme for COP28.
- The United Nations and Microsoft announced on Thursday an AI-powered climate data hub to track progress in reducing emissions because, as Microsoft's Brad Smith said in a statement, "you can't fix what you can't measure."
The context: Other high-growth, high-emission parts of the global economy, including civil aviation and commercial shipping, have bowed to pressure to measure and reduce their carbon emissions — but there's no such plan on the table for AI or data centers more generally.
State of play: Overall data center energy consumption is growing despite impressive efficiency gains, according to the International Energy Agency — everything beyond that is a matter of dispute.
- While cloud computing and other data centers use about 1% of today's electricity, Britain's National Grid predicts data centers will consume just under 6% of U.K. electricity by 2030.
- Microsoft, Google and Nvidia are more optimistic.
- Global data center workloads increased by 9x between 2010 and 2020, according to Microsoft — but meeting that demand required only a 10% increase in electricity use.
- Google researchers say some papers grossly overestimated machine learning energy use, and that energy consumption of machine learning could soon plateau — and then shrink.
- Nvidia, which sells 95% of the graphics processing units (GPUs) used for AI, also notes that GPUs are more energy-efficient than other types of chips. The GPUs it ships in 2023 will collectively consume barely 1/1,000th of the energy the U.S. uses each year.
- AI models such as Mistral 7B and Meta's Llama 2 use as little as one-hundredth the energy of OpenAI's GPT-4, while Google DeepMind launched a product for discovering faster computer algorithms, which the company hopes will cut the amount of energy needed to run AI.
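For scale, Microsoft's figures above imply a roughly eightfold drop in energy used per workload over the decade. A quick back-of-envelope check (taking the 9x workload growth and 10% electricity growth from the bullet above as the only inputs):

```python
# Back-of-envelope on Microsoft's 2010-2020 data center figures.
workload_growth = 9.0   # workloads grew 9x over the decade
energy_growth = 1.10    # electricity use grew only 10%

# Energy consumed per unit of workload in 2020, relative to 2010.
energy_per_workload = energy_growth / workload_growth
print(f"{energy_per_workload:.2f}")  # prints 0.12 — about 88% less energy per workload
```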
Flashback: A Dutch PhD student grabbed headlines in October by concluding in a peer-reviewed paper that AI could use as much electricity as the Netherlands or Sweden by 2027. But even that would be just 0.5% of global energy consumption.
The other side: It's possible to run power-hungry AI systems without wrecking the planet.
- AI has been demonstrated to cut data center cooling costs by 40%, and big tech companies are some of the leading investors in clean energy, with commitments to zero-carbon energy systems between 2030 and 2040.
- One of the early uses of AI in the energy sector has been to improve predictions around weather and energy supply and demand.
- In California, companies with annual revenues over $1 billion will have to report their emissions by 2026.
The bottom line: While companies are getting better at setting climate targets, the lack of transparency around energy use and the absence of standardized reporting requirements from regulators mean we're going to be debating AI's climate impact for years to come.
2. Exclusive: Runway incorporates Getty into its AI
AI startup Runway has reached a deal with Getty Images that will allow the company to offer a more "commercially safe" version of its text-to-video engine, executives from both companies tell Ina Fried.
Why it matters: There's a growing push by AI companies to offer options for businesses with greater legal protections against copyright claims.
- Both Adobe and Getty Images have stressed that their generative AI engines have been trained on licensed content. Getty debuted its text-to-image service in September, while Adobe has added generative AI tools throughout the year, both within existing products and under the Firefly moniker.
- Some engines trained more broadly have also offered to indemnify customers against copyright claims. OpenAI, for example, said at its November DevDay that it will offer legal protections to enterprise and API customers.
Between the lines: Runway's partnership with Getty is designed to target businesses that want to combine their intellectual property with an AI engine without fear of copyright lawsuits.
- Some key details of the Runway-Getty service, such as pricing, have yet to be announced.
- For now, Runway is working directly with enterprise customers and currently has a waiting list, though Runway CEO Cristóbal Valenzuela said there may be a more self-service option down the road.
What they're saying: Valenzuela said the Runway-Getty model provides businesses something akin to a prefabricated house that they can then customize, including with their own style, products and characters.
- "Most of the time it won't do 100 percent of the job," Valenzuela said in an interview. "It does 90 percent and then you change what you want."
Getty chief product officer Grant Farhall tells Axios that Getty's generative AI text-to-image tool has shown it's possible to create "high quality, effective AI generative models from a set of content and data that is responsibly sourced."
- "Now we're gonna prove it again with Runway for video," Farhall said.
3. AI boost for developing new materials
AI tools and sophisticated robotics are speeding the quest to engineer urgently needed new materials, Axios' Allison Snyder reports.
Driving the news: Google DeepMind researchers reported this week that a new AI model discovered more than 2.2 million hypothetical materials, including 381,000 stable new materials that are candidates for scientists to attempt to make and test in a lab.
What's happening: The batteries, solar cells, semiconductor chips and other devices required for the next generations of the electrical grid, computing and other technologies hinge on the development of novel materials.
- AI and materials science top the list of U.S. federal grants to industry over the past six years, according to a report published this week from Georgetown's Center for Security and Emerging Technology.
- China now dominates the field of materials engineering by several key metrics, including publications, employment and degrees awarded.
What's next: Predicting that a possible structure is stable doesn't guarantee it can actually be made.
- In another paper published this week, researchers at the Lawrence Berkeley National Laboratory (LBNL) report results from a lab outfitted with robotics guided by AI to autonomously synthesize crystals.
- The recipes for the materials were suggested by AI models that use natural language processing to analyze existing scientific papers, then optimized as the AI system learned from its failures.
4. Training data
- What happens when a city passes an ordinance written by ChatGPT but no one discloses the fact? (Associated Press)
- The founder of an audio erotica startup says she's being careful about replacing creators with AI. (Axios)
- Worth your time: Elon Musk, Larry Page, and a 2015 fire pit debate that lit the fuse of today's AI competition. (New York Times)
- In 2019, Sam Altman signed a "letter of intent" for OpenAI to spend $51 million on chips from an AI startup called Rain. (Wired)
- ICYMI: Ina scooped the news that OpenAI is pushing back the launch of its GPT store until early 2024. (Axios)
5. + This
Choose your own Threads adventure.
Thanks to Megan Morrone for editing this newsletter.