Murder in Cleveland: Facebook's latest fiasco

- Sara Fischer, author of Axios Media Trends
Cleveland police said on Sunday that a manhunt was underway after a man uploaded a video of himself killing someone to Facebook, then went on Facebook Live claiming to have committed multiple other homicides "which are yet to be verified."
Facebook was unable to take the Live broadcast down in real time and failed to remove the video of the killing for several hours after the fact. The suspect, identified by police as Steve Stephens, remains at large.
Why it matters: There have long been questions as to whether Facebook is a media company or a technology company. It may be easier to think of it this way: Is Facebook a publisher that monetizes quality, or a distributor that monetizes quantity? This is a tough question for even Facebook to answer, because its mission reflects the former but its business model reflects the latter. Case in point:
The big picture for Facebook: How it deals with this crisis as it piles on top of others, such as the publisher backlash against Instant Articles and the still-bubbling concerns over its role in the rise of fake news.
The big picture for the media: How Facebook handles its latest video fiasco could set a lot of precedents for everyone in the digital ecosystem. Livestreamed and crowdsourced content involve risk for everyone: platforms, publishers, consumers and advertisers. There's no real regulation around either, forcing everyone to make tough decisions about how to weigh the risks of imperfect technologies:
- Algorithms: Facebook has long used humans to regulate content on its site, but it's impossible for humans to catch everything in real time. The company has tested algorithms and artificial intelligence to prevent this type of content from being uploaded in the first place, but the technology isn't there yet, and miscalculations could block users who want to use the technology safely.
- Human oversight: Facebook has people checking content around the clock to make sure that anything that doesn't adhere to its "community standards" is removed. But interpreting those standards, and determining what should or should not live online, is often left to the judgment of people, not computers. Facebook last week blamed "human error" for its failure to remove child pornography from its site, even though the content violated those standards.
- Mid-roll ads: Monetizing unregulated live or crowdsourced content can be difficult. Facebook says it's testing mid-roll ads with select publishers, with the intention of scaling the technology further so that eventually a wide array of publishers can monetize their content with ads. How wide Facebook lets that publisher set grow will affect what types of content companies' ads are exposed to. (Google's YouTube crisis shows how messy this situation can get when the publisher set becomes effectively unlimited.)