Facebook's new plan to nip product misuse in the bud
Over the last year, Facebook has quietly built up a small "responsible innovation" team with a huge mission: Look at upcoming products and help prevent them from being misused to cause harm.
Why it matters: Facebook has spent much of the last few years dealing with the shortcomings of its existing products, especially with regard to misinformation, data privacy and abuse.
Details: The team is led by Zvika Krieger, who was exploring similar issues as head of a World Economic Forum center in San Francisco before joining Facebook last year.
- Facebook hasn't talked much publicly about Krieger's team or its work since we scooped his hire last year, but he shared some details Tuesday night in a World Economic Forum panel I moderated.
- Krieger said the team, which works directly with those developing new products, brings together people with backgrounds in anthropology, sociology, neuroscience, human rights, diversity and inclusion, civil rights and law.
- Though Facebook has made past efforts around responsible innovation, Krieger's unit gives Facebook a central team that acts as a resource across the company.
How it works: Krieger said his main goal is to engage product teams early, when he can still ask whether a product should be built at all and, if it should, how to identify and mitigate unintended misuses.
- "If you are just a box-checking exercise or if a product is about to launch in a week and (someone) says, hey, can you do an ethics check on this, that’s not a real engagement," Krieger said.
- At the same time, he is trying to send the message to the rest of Facebook that his team isn't just a roadblock. "Considering potential harms doesn't have to slow teams down. In fact, it often saves them time down the road," he said. "The worst-case scenario is not realizing the potential harm until after launch — that is what we are trying to avoid as much as we can."
The big picture: Facebook's next generation of products is likely to be more pervasive and intrusive than today's News Feed and groups, including virtual worlds, augmented reality glasses with facial recognition and other powerful technologies.
- Facebook has already said its Reality Labs unit will begin testing augmented reality glasses later this year in order to learn what people's expectations are around a product that might be able to record the world around it, among other potential areas of concern.
Yes, but: Building a healthy reflex for internal self-criticism can be difficult inside a huge, successful corporation. Look at Google, which created a team devoted to studying the ethics of artificial intelligence, then pushed out some of those same people in an explosive public controversy.
My thought bubble: It’s too bad there weren't enough people asking these questions when Facebook was building its original product — or at any time during its first decade, when so many of its recent problems germinated.