Microsoft CEO Satya Nadella opens up on book tour
Nadella's book "Hit Refresh" goes on sale Tuesday. Photo: Microsoft
If it seems like Microsoft CEO Satya Nadella is all over the media, it's not your imagination. He has a new book out called "Hit Refresh," and the promotional interviews have generated some new nuggets:
Microsoft should not have bought Nokia's phone business. It's long been clear that Nadella didn't like the deal, which he rapidly unwound upon becoming CEO. But in the book, Nadella makes it clear he had voiced his opposition directly to then-CEO Steve Ballmer.
- "I voted no," Nadella writes, referring to a poll Ballmer took of the senior leadership team. "We were chasing our competitors' taillights."
- He notes that months later, Microsoft wrote off the entire value of the deal and laid off thousands of former Nokia workers. "It was heartbreaking."
Nadella once shadowed Reed Hastings. Concerned that he had spent his entire career at one company, he spent a year being mentored by the Netflix CEO, who was a member of Microsoft's board.
- "I had not seen any other large organization or a fast-growing organization from the inside," Nadella said in an interview with the Washington Post. "He let me do that for a little while. That was the kind of thing I sought out."
Thoughts on AI worries: Both Nadella and Microsoft co-founder Bill Gates think Elon Musk's concerns about machines with smarter-than-human intelligence are overblown.
- "The so-called control problem that Elon is worried about isn't something that people should feel is imminent," Gates told WSJ. Magazine as part of a joint interview with Nadella. "This is a case where Elon and I disagree. We shouldn't panic about it. Nor should we blithely ignore the fact that eventually that problem could emerge."
The need to control design and data: Nadella raised what he sees as more pressing concerns, including the near-term issues that arise when machines make decisions based on data fed to them by humans.
- "There are still a lot of design decisions that get made, even in a self-learning system, that humans can be accountable for," he said. "So we can make sure there's no bias or bad data in that system. There's a lot I think we can do to shape our own future instead of thinking, 'This is just going to happen to us.'"