
ERICA PANDEY: Hi, I’m Erica Pandey, host of this season of “How It Happened.” If there’s one word that describes how Elon Musk conducts himself as a leader, it’s “risk.” He says it himself.
ELON MUSK ARCHIVAL: Anything, which is significantly innovative, is gonna come with a significant risk of failure.
PANDEY: The kind of groundbreaking, futuristic innovation Musk has accomplished seems to demand an enormous appetite for risk — or at least a very high tolerance for it.
MUSK ARCHIVAL: I mean, if the outcome is exciting enough, then taking a big risk is worthwhile.
PANDEY: But Musk’s evaluation of risk doesn’t just have consequences for him or even for his investors. Because of the sheer scope of his influence in so many consequential spheres — transportation, space, telecommunications — when he takes a risk, he can also drag others along for the ride: his employees, his customers and the people who interact with his products, willingly or not.
Musk’s pattern has been to take the risk, take the leap, try the new thing — and learn from failure to fix it. He’s done this with rocket launches and with the rollout of Tesla’s self-driving software.
In April, Musk agreed to buy Twitter and then tried to back out, setting up a legal battle with the social platform. If he loses and is forced to complete the deal, Musk will take control of the company.
It raises a question: Can his culture of risk be applied to a social media platform where misinformation and disinformation already exist — and where a reluctance to moderate content could potentially worsen the problem?
Musk himself has tweeted about how intense Twitter can be. My colleague, Zach Basu, is reading Musk's tweets.
@elonmusk: On Twitter, likes are rare & criticism is brutal. So hardcore. It’s great.
SpaceX and Tesla
PANDEY: In this episode, we lift the hood on two of Musk’s companies: SpaceX and Tesla. We’ll dig into what Musk’s risk-taking already means for all of us today — and what this philosophy could mean if Musk takes over Twitter.
From Axios, this is “How it Happened: Elon Musk vs. Twitter. Part Two: Empire of Risk.”
SPEAKER ON NBC NEWS: Newly released dashcam video shows another frightening Tesla crash.
SPEAKER ON NBC NEWS: In a first-of-its-kind case, a California man is charged with felony manslaughter for a crash that's been linked to Tesla's popular autopilot function.
SPEAKER ON CBS: A driver using the electric carmaker's autopilot system died after crashing into a highway barrier last month.
PANDEY: Car crashes kill nearly 40,000 people a year in the United States. The auto industry hopes to save lives by one day making self-driving cars that never crash. But starting in 2016, local news reports about Tesla accidents raised concerns among safety experts about whether Tesla's automated technology itself was to blame.
Musk is a big believer in the potential of automated driving. Here’s my colleague Zach Basu reading a tweet from Elon Musk about an early iteration of Tesla’s autonomous features called “autopilot.”
@elonmusk: Essentially, passive autopilot (car intervenes only when crash probability is high) cuts crashes in half. Active autopilot (car is driving itself) cuts crashes in half again. Doesn’t mean there are no crashes, but, on balance, autopilot is unequivocally safer.
PANDEY: When I hear that a car is on quote-unquote “autopilot,” I assume it fully drives itself, whether I’m in the front seat, back seat or asleep. But that's not really what is going on with autopilot or even with Tesla's most advanced technology called “full self-driving” mode. Instead, both are an ambitious patchwork of driver-assistance features.
Many new cars have features like lane-assist technology or automated braking if another car suddenly brakes ahead. What Tesla calls “full self-driving” has all of that, plus driving skills for city streets, although driver supervision is still required.
There's a lot of concern that these driver-assistance features could make drivers complacent – and roads less safe.
The National Highway Traffic Safety Administration, or NHTSA, released a report this summer identifying 392 reported accidents as of May 2022 involving cars operating with assisted driving features. 273, or roughly 70%, involved Teslas. NHTSA is the government body that regulates Tesla and the rest of the auto industry for safety.
NHTSA's investigation into Tesla
In 2021, the agency opened an investigation into Tesla’s autopilot system and then escalated that investigation nearly a year later. It’s digging into whether Tesla’s driver assistance systems increase the risk of car accidents. If NHTSA finds they do, the next step would be a recall to correct the problem.
So how did we get there? It has to do with what was happening inside NHTSA and with Musk’s own tolerance for risk — and how these two factors have come together to deeply, deeply concern safety experts.
JOANN MULLER: For about six years now, NHTSA hasn't even really had an administrator overseeing vehicle safety.
PANDEY: Joann Muller, Axios transportation reporter. The regulatory environment for vehicles over the last few years has been a mess. Technology has been advancing faster than regulators, and that's assuming the agency was fully staffed, starting at the top.
MULLER: President Trump never appointed anyone for his entire presidency. And so there was an acting administrator in the role.
President Biden nominated someone, who stepped into the job about a year into his presidency, but he left like two or three months into the job to take another position. So what you had is a real lack of leadership in the regulatory offices to govern this technology. And at the same time, Congress was kind of stalled on putting in some sort of federal legislation around the safety for self-driving cars.
What you're left with really is a patchwork of state laws. Some states have decided they wanna lead in this technology, but what happens is when you have a patchwork of laws, there's really no clarity on what is safe and what is not. And that's the backdrop to all of this.
PANDEY: We reached out to NHTSA and to Tesla for comment, but NHTSA said they cannot comment on an open investigation and Tesla did not respond. We also reached out to Elon Musk and to his companies for comment for this podcast, and they did not respond.
While all this was going on, Musk and Tesla had been rolling out these increasingly advanced automated driving features. Currently, “full self-driving mode” is in beta, which means Tesla owners can opt in to test-drive the feature if they are deemed eligible by Tesla. The company determines eligibility by collecting owners’ driving data and evaluating their safety record.
Tesla's rollout of “full self-driving” technology
MULLER: But for the rest of us, the people who share the roads with Teslas, who raise our kids in our neighborhoods where Teslas are parked at the end of the day, the people who cross the street and expect drivers to see us and acknowledge us. We didn't agree to be part of this experiment.
Most other companies that are developing self-driving technology, they're testing their cars on public roads too, but they're doing it with trained safety drivers behind the wheel, ready to take over in case there's a problem.
But in the case of Tesla, this is just software that's downloaded into your car overnight and you get no special training. So you don't know when someone is driving a Tesla by you whether that software is engaged or who's driving the car. And so that really puts everyone at risk. We're all guinea pigs in Tesla's rollout of “full self-driving” technology.
PANDEY: Back in 2021, Musk tweeted this message to early testers of FSD beta.
@elonmusk: Running pre production software is both work & fun. Beta list was in stasis, as we had many known issues to fix. Beta 9 addresses most known issues, but there will be unknown issues, so please be paranoid. Safety is always top priority at Tesla.
PANDEY: “But there will be unknown issues, so please be paranoid.” As of early fall 2022, Musk has said “full self-driving” is available to more than 160,000 Tesla owners.
MULLER: I spoke to the head of automotive testing from Consumer Reports about this newest rollout. He said, quote, “FSD beta is like putting a toddler on your lap. Like, look, my kid can drive! It’s not safe.”
PANDEY: So that’s 160,000 people driving in the U.S. and Canada who have the option to deploy this technology, to bring anyone around them into this big experiment.
Musk is selling people on this experiment by touting how cars could be used as robotaxis that would someday drive themselves entirely — and be a source of income for their owners. Here’s a tweet from 2019.
@elonmusk: When the car is FSD without supervision, ie robotaxi, you’ll be able to earn far more than monthly lease/loan cost by allowing others to use it. Managing a small fleet of robotaxis will be a career for many & much better than driving a single car.
Though today he still hasn’t delivered on that promise, Musk has continued to hype the potential to investors, like in a 2021 Q4 investor call.
MUSK ARCHIVAL: You just go from having an asset that is… has utility of perhaps 12 hours a week for a passenger car to maybe around 50, 60 hours a week.
Tesla crash reports
PANDEY: Meanwhile, the steady drip, drip, drip of Tesla crash reports worries one person in particular. His name might be familiar. Once upon a time, renowned consumer advocate, author and presidential candidate Ralph Nader was, like so many Americans, a huge fan of Tesla.
And if you’re wondering why Nader’s views on Tesla matter, his landmark 1965 book, “Unsafe at Any Speed,” led to the creation of NHTSA itself.
We wanted to speak to Nader because of a striking statement he put out in August, in which he called Tesla’s “full self-driving” technology, quote, “One of the most dangerous and irresponsible actions by a car company in decades,” end quote. Before this, though, he was a fan.
RALPH NADER: I rode in a Tesla probably seven, eight years ago. It was a spectacular ride. The early Teslas are really, really very, very well produced and of course, the feel of the rides, terrific. The seats were good on people's backs. And it obviously got great mileage.
PANDEY: His views were widely shared. In 2013, the Tesla Model S earned a score of 99 out of 100 from Consumer Reports, higher than any other car it had road-tested. That same year it earned a five-star safety rating from NHTSA.
NADER: Who was the last person to do that in the auto industry? It's really quite extraordinary what he did. But it's being again and again occluded by what he's been doing in the last few years.
PANDEY: Making a cool electric car was never the end game for Musk, so he didn’t stop there. He wanted to keep innovating, to keep pushing the envelope. As he turned toward automation, he began hyping the features with names that seemed to overstate what they were capable of. It all concerned Nader.
NADER: He’s got a reckless technology. Reckless assurances about his automation, about how safe autonomous cars are gonna be. He, he didn't make the distinction with semi-autonomous, like lane changing and brakes and so forth. OK? No, no, he was talking fully autonomous cars. I would not ride in a modern, recent one. One, the problems they have. … And I wouldn't want to give it any legitimacy.
Tesla's factory problems
PANDEY: Working conditions inside Tesla's factory have also drawn scrutiny — and prompted numerous investigations. And Muller herself actually experienced a jarring incident at Tesla’s Fremont, California, factory in 2016.
MULLER: The first time I visited Tesla's factory in California, I was on an electric golf cart, having a tour with a Tesla representative, when a worker driving another tram crashed into us. It wasn't a huge accident. He just really clipped us while cutting a corner, but I'd never seen that kind of carelessness before.
I visited a lot of factories and most auto plants have a safety culture that includes rules for forklift operators and other drivers inside the factories. You have to make eye contact and you have to use specific hand gestures before proceeding so that everyone knows who has the right of way. To me, this crash, this incident really signaled a lack of discipline at Tesla.
PANDEY: A 2019 Forbes investigation looked at California state Occupational Safety and Health Administration violations, or OSHA violations, at major automakers from 2014 to 2018. And Forbes reported that Tesla had accumulated three times as many violations as its 10 largest competitors combined in that period.
The California OSHA violations were for incidents like major factory floor injuries ranging from severe bone fractures to severed fingertips.
Tesla’s autoworkers are not unionized. Unions in auto plants are historically an important muscle for workers to use to guarantee safety. It’s been widely reported that Tesla has quashed repeated unionization efforts in its plants.
PANDEY: After the break, Axios Space reporter Miriam Kramer will take us inside SpaceX, where Musk’s risk-taking has explosive stakes — literally.
[AD BREAK]
SpaceX rocket explosion
PANDEY: We’re back. Before the break, we looked at the ways that Tesla's new self-driving technology poses new risks to drivers — and potentially to everyone around them. Now, we’ll examine how risk works at SpaceX. A good place to start is rocket explosions.
ARCHIVAL OF A SPACEX EXPLOSION
MIRIAM KRAMER: What you just heard was an early failed test of a Starship prototype.
PANDEY: Miriam Kramer is the space reporter at Axios and has spent years covering SpaceX.
KRAMER: SpaceX actually builds failure and risk into the way it builds its rockets. Right now, the company is working on developing its Starship rocket in Texas and that involves blowing up a lot of hardware. …
PANDEY: Musk tweeted about that failure with a link to a video of the explosion.
@elonmusk: So … how was your night?
PANDEY: He replied to himself with a car joke — what else?
@elonmusk: It’s fine, we’ll just buff it out.
CASEY DREIER: Something that, uh, Elon has done is I think creating the fact that expectations and experimentations can happen again. That failure is OK in this process of rapid iteration, that it's OK to take risks.
PANDEY: That’s Casey Dreier, chief advocate and senior space policy adviser at The Planetary Society. He’s been a close observer of SpaceX for years.
DREIER: And what SpaceX has been able to do is normalize experimentation, normalize failures in a controlled way, right? These are, you know, tests. They're not failing with human lives or fundamentally with most of their payloads. “A test is worth a thousand expert opinions.” SpaceX has been very good at normalizing that process.
PANDEY: Musk is setting a tone for risk, establishing a culture for it … one that emanates out beyond his own ventures.
NASA's Artemis program and Starship
KRAMER: NASA is relying on SpaceX and its culture of risk to get people to the moon. Starship — that same rocket you heard explode earlier — is key to NASA's Artemis program to deliver people to the surface of the moon. The rocket hasn't been to orbit yet, but SpaceX already has a contract with NASA to use Starship as a lunar lander. Industry experts aren't convinced it'll work, particularly before 2025, when the first landing is expected. Because NASA is relying on SpaceX, it's tacitly endorsing its risk-taking behavior in the industry.
PANDEY: No one knows this better than Lori Garver, former deputy NASA administrator who played a role in SpaceX securing early NASA contracts.
LORI GARVER: SpaceX, they're not a, you know, a publicly traded company so they can change their plans. For a long time, for instance, on Starship, there were no government contracts. So you'd see a lot of changes and people would just be surprised. Again, love it, hate it in the government contracting way. One of the reasons it costs so much is you can't just make changes when you know a certain avenue you're going down doesn't work.
KRAMER: When there is a problem at SpaceX, Elon Musk is super hands-on. Elon has even said that any SpaceX employee is allowed to get in touch with him before a rocket launch if they have concerns that there’s a problem. He assesses the risk ultimately, and he wants to be that direct line and that decider.
LORI GARVER: He is hands-on. I mean, I know a lot of people who work there and he's there when there's a problem under the engine with his light on his cellphone, looking to see what's wrong and asking questions and making decisions, you don't get that from aerospace CEOs in general.
PANDEY: Musk does a lot of things that a typical CEO wouldn’t, including sleeping on the factory floor when things get stressful and basically never taking a break. He spoke with Axios CEO Jim VandeHei and co-founder Mike Allen for “Axios on HBO” about a really tough period for Tesla years ago.
MUSK ARCHIVAL: Yeah, absolutely. No one should put this many hours into work. This is not good. And people should not work this hard. I'm not, they should not do this. This is very painful, painful.
MIKE ALLEN ARCHIVAL: Painful In what sense?
MUSK ARCHIVAL: It's because it hurts my brain and my heart.
PANDEY: But in our reporting, we learned that this kind of pushing oneself to the limit isn’t just common at SpaceX, it’s actually essential to the company’s bottom line. Casey Dreier again —
DREIER: If you get people to work voluntarily 60, 80 hours a week and you pay them at the salaried rate of 40 hours, that's how you save money in aerospace. If you can wring every ounce of effort out of that top talent, you can do some pretty amazing things for a relatively minimal cost.
PANDEY: So … we’ve learned that decisions about risk at Tesla have ramifications for everyone on the road, and decisions about risk at SpaceX can impact the entire space industry. We don’t know how Musk’s approach to risk could shake up Twitter. But the consequences could be huge.
DAN PRIMACK: If Twitter is owned by one person, what that one person thinks about the company and how much they care about maintaining it, growing it, improving it, well, that impacts everybody else in the public square.
PANDEY: Dan Primack is a business editor at Axios. He’s covered Musk for decades and has been covering the lead-up to the October trial, which will determine whether Musk is forced to buy Twitter. If Musk controls the company, his views on free speech will become globally influential. And in an appearance on the All-In podcast in May, Musk defined free speech like this.
MUSK ARCHIVAL: Some of the smartest people in history have said, have, have, have thought about it and said like free speech is important for a healthy democracy. It is important and free speech only matters. When does free speech [matter] most? It's when someone you don't like saying something you don't like.
PANDEY: He’s also shared his thoughts about free speech on Twitter itself, of course, in April of this year.
@elonmusk: By "free speech", I simply mean that which matches the law. I am against censorship that goes far beyond the law. If people want less free speech, they will ask government to pass laws to that effect. Therefore, going beyond the law is contrary to the will of the people.
PANDEY: That sounds really thought out, right? Musk has a lot of big ideas on how to fix Twitter, like these pronouncements on free speech. But he doesn’t address how he would make or enforce rules about what crosses the line.
Some of Musk’s peers are optimistic that he will tackle social media the same way he has disrupted space and other industries. Reid Hoffman, LinkedIn co-founder, has known Musk since they overlapped at PayPal in the early 2000s.
REID HOFFMAN: He's gonna drive for innovation and change. I do think he's right about more — if you scale human identity validation, I think that's a good thing. I suspect the Twitter people would even agree with the way of doing that.
My general belief in these things is having a true north defined about what you're doing and why it is. Have a theory about why it's good for society.
It's a different thing to be a voice on it saying the things you want versus responsible to the amazing cacophony of a lot of human voices, including people who do not necessarily have good intent.
Rebuilding Twitter
PANDEY: At the end of September, texts between Elon Musk and others, including Twitter founder Jack Dorsey and Twitter CEO Parag Agrawal, illuminated a key aspect of Musk’s thinking. The texts were revealed in Delaware court filings.
Across several messages, Musk made clear he wanted to reshape Twitter by hiring only engineers and coders. That approach has worked for him at his other companies, like Tesla and SpaceX. But Twitter isn't a hardware company. Its product is human communication itself.
If Musk rebuilds Twitter with an army of software developers alone, he'll be going against everything we now know about how to improve social networks. Coding may be difficult, but managing social behavior is a thousand times harder.
The stakes are high with Twitter. The platform functions as a global public square. Engineering Twitter comes close to engineering society.
So why do the people in charge of such an essential company want to become part of Musk’s empire of risk? That’s next time on … “Elon Musk v. Twitter.”
[CREDITS]
I’m Erica Pandey. Amy Pedulla is reporter-producer. Naomi Shavin is senior producer. This series was reported by the Axios newsroom, including Dan Primack, Miriam Kramer, Joann Muller, Javier E. David, Jonathan Swan, Sara Fischer, Ina Fried, Hope King and me. Fact-checking by Jacob Knutson. Zach Basu is reading Elon Musk’s tweets.
Scott Rosenberg and Alison Snyder are series editors. Sara Kehaulani Goo is the editor-in-chief and executive producer. Mixing and sound design by Ben O'Brien. Music supervision by Alex Sugiura. Theme music and original score by Michael Hanf.
Special thanks to Axios co-founders Mike Allen, Jim VandeHei and Roy Schwartz. And thanks to Lucia Orejarena, Priyanka Vora and Brian Westley. If you’re enjoying the season so far, please take a moment to rate and review the show.
We’ll be back soon. Thanks for listening.