Axios AI+

May 14, 2024
I'll be covering Google I/O starting at 10am PT for Axios.com, and this afternoon I'm interviewing Google Health executive Karen DeSalvo onstage at Axios BFD.
Today's AI+ is 1,093 words, a 4-minute read.
- Speaking of Axios BFD, tune in virtually today starting at 2pm PT/5pm ET for our signature dealmaking summit happening live from San Francisco! Watch here.
1 big thing: A chatbot you can interrupt
The new voice assistant version of ChatGPT that OpenAI demonstrated yesterday jokes, chides, apologizes, pretends to blush — and knows how to deal with interruptions.
Why it matters: ChatGPT-4o shows off a new level of real-time conversational fluency — including the ability to understand context and shift gears when people talk over it — that AI assistants will need to win over the world's users.
Driving the news: The new ChatGPT-4o (that's a lowercase "o," for "omni") spent 15 minutes of livestreamed demo time with OpenAI leaders in front of an in-person audience of the company's employees.
- The bot spoke in a sprightly female voice, responding far faster to queries than previous generations of voice-bots, with more nuanced human inflection and better mimicked human emotion.
The big picture: The advances shown by OpenAI make the last generation of assistants — including Apple's Siri, Amazon's Alexa and Google Assistant — seem outdated.
- Google and Amazon are both trying to reinvigorate their assistants using generative AI, as Axios previously reported.
- Apple is said to be readying an upgrade for Siri as well, though it's also reportedly been talking with both OpenAI and Google about using their models.
- Google plans to focus on AI advances at its I/O developer conference later today, and has teased it will be improving conversation.
Zoom in: In OpenAI's demos, the company showed the latest ChatGPT reading a bedtime story with increasing levels of dramatic excitement, as well as in a faux robot voice.
- That's a huge improvement from the days, not that long ago, when a robotic voice was all you could get from a computer.
- At one point in the demo, OpenAI head of frontiers research Mark Chen asked ChatGPT for tips to calm his nerves, and the chatbot suggested deep breaths. When Chen responded by hyperventilating, ChatGPT replied, "Whoa, slow down a little bit there, Mark — you're not a vacuum cleaner!"
- The new ChatGPT can also carry a tune: Though its singing chops aren't yet up to "American Idol" level, they far eclipse my own capabilities.
Between the lines: The new chatbot gave off strong "Her" vibes, resembling Scarlett Johansson's AI assistant in the 2013 film, many observers inside and outside OpenAI agreed.
- "Technology aside, OpenAI remains so good at productizing this all in a way that gets people excited and wanting to use it," investor and writer MG Siegler posted on X. "They knew the assignment. The assignment was to create a real-life version of Samantha from 'Her'. And they did it."
Yes, but: Others were quick to point out that "Her" did not have a happy ending.
The intrigue: When a wide swath of mischievous humans starts engaging with ChatGPT-4o, it's sure to take some wild swings, the way the original ChatGPT famously did.
- Even in the tightly controlled environment of a product rollout, as the OpenAI team tried to end the demo, ChatGPT piped up, out of nowhere, with the line, "Wow, that's quite the outfit you have on!"
- OpenAI's developers ignored the come-on, but you can bet plenty of users won't.
Our thought bubble: Making voice assistants friendly makes sense, but OpenAI seems to be deliberately aiming for a level of warmth that could get messy very quickly — for both users and the company.
OpenAI CTO Mira Murati acknowledged that ChatGPT-4o needs more testing and improvements.
- "We've been doing a ton of red teaming," Murati tells Axios. "But of course, you need to broaden up access and see what weird things people do with it."
What's next: OpenAI made the text and image capabilities of ChatGPT-4o available to some customers immediately, while saying an "alpha" version of the improved voice mode will be released to paid ChatGPT Plus subscribers "in the coming weeks."
2. What AI companies can learn from bitcoin miners
As the AI community hunts for solutions to meet its ever-growing energy needs, there are lessons to be learned from a controversial sub-category of the data-center market: bitcoin miners.
Why it matters: Generative AI's demand for computing power is expected to consume ever-growing amounts of electricity; new servers coming online over the next three years alone are set to eat up more power than a small country.
The big picture: Axios spoke to three different miners — Riot, Terawulf and Iren — who illuminated some lessons bitcoiners have learned that could serve AI.
Big tech companies are accustomed to acting like 800-pound gorillas (they sit down wherever they want to). Terawulf co-founder and chief operating officer Nazar Khan tells Axios that's increasingly not going to be an option.
- Favorite places for data centers, such as Northern Virginia, he says, are hitting capacity.
- On the other hand, parts of the country not previously seen as desirable locations look a lot better today. He pointed to Columbus, Ohio, as such a place.
"The map of where you can go is expanding," Khan explains. Terawulf's founders have found that a 100-gigabit fiber connection is feasible in a lot more places than it used to be.
They can take that a step further and consider fundamentally changing the architecture of computing, as miners already have, by cooling computers with liquid rather than air.
- Riot Platforms' head of research, Pierre Rochard, explains to Axios that traditional data centers are far less energy-intensive than bitcoin miners, commanding less than 20% of the power demand per square foot of facility.
- AI facilities, however, will probably be somewhere in the middle, density-wise, of those traditional data centers and bitcoin mining. That could be dense enough to try liquid cooling.
And then there's demand response. Bitcoin miners have found that they can get access to new streams of electricity by agreeing to turn operations off when demand peaks, as Axios has previously reported.
- AI operators are unlikely to be able to switch off that quickly, though one segment of the operation right now — the one that crunches data for "learning" to improve AI models — can probably work more like bitcoin miners.
- Operations supporting "inference" — delivering answers to users — will be much more sensitive about downtime.
The intrigue: Whether AI providers join demand response programs or not, insiders expect AI to make demand response itself work better.
3. Training data
- Bay Area AI companies raised more than $27 billion in 2023, representing more than 50% of global funding. (Crunchbase News)
- Melinda Gates is leaving the Bill & Melinda Gates Foundation with $12.5 billion to focus on her own philanthropic endeavors. (Axios)
- Rumble, the conservative YouTube alternative, is suing Google again. (Axios)
4. + This
Google Gemini's AI-generated meeting note-taker has certainly learned how to say "no."
Thanks to Megan Morrone and Scott Rosenberg for editing this newsletter and to Caitlin Wolper for copy editing it.
Sign up for Axios AI+