Davos crowd sees power and perils of data
"Data is the new oil" has long been a popular idea in tech. This year's Davos event grappled with the double-edged nature of that saying.
Why it matters: With autonomous cars, satellites and ubiquitous sensors capturing ever more of what happens in our world, data accumulation will only keep accelerating. While these stockpiles can help society, their risks keep growing too.
Blockchain advocates point to a world in which people can share only the specific elements of their data that a given transaction actually requires.
- For example, today, a bar verifying your age checks your driver's license — and gets to see a raft of your personal info. A blockchain-based ID system could limit that to just sharing your photo and age.
- "These new technologies allow you to only convey those facts, and preserve your data," Decentraland cofounder Esteban Ordano told me at Davos.
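The selective-disclosure idea behind that bar example can be illustrated with a toy sketch. This is not code from any real blockchain ID system; the attribute names and hash-commitment scheme here are illustrative assumptions. An issuer commits to each attribute of a credential with a salted hash, and the holder reveals only the fields a verifier needs:

```python
# Toy sketch of selective disclosure (illustration only, not a real ID
# system): an issuer commits to each attribute with a salted hash, and
# the holder reveals only the attributes a verifier needs, plus salts.
import hashlib
import os

def commit(value: str, salt: bytes) -> str:
    """Salted hash commitment to a single attribute value."""
    return hashlib.sha256(salt + value.encode()).hexdigest()

# Issuer: build commitments for every attribute on the credential.
# In a real system the issuer would sign this commitment set.
attributes = {"name": "Alice", "address": "1 Main St",
              "age": "34", "photo": "photo-digest"}
salts = {k: os.urandom(16) for k in attributes}
credential = {k: commit(v, salts[k]) for k, v in attributes.items()}

# Holder: disclose only what the bar needs -- photo and age.
disclosed = {k: (attributes[k], salts[k]) for k in ("photo", "age")}

# Verifier: check revealed values against the commitments, learning
# nothing about the undisclosed fields (name, address).
ok = all(commit(v, s) == credential[k] for k, (v, s) in disclosed.items())
print(ok)  # True when the revealed attributes match the credential
```

Real systems (verifiable credentials, zero-knowledge proofs) are far more involved, but the privacy property is the same: the verifier learns the disclosed facts and nothing else.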
Yes, but: More data means more opportunities for governments to spy on their own citizens.
- To do their jobs, autonomous cars must capture images of everything in their path. That will potentially make vast new libraries of information available to authoritarian governments and overzealous law enforcement authorities.
- As U.S. states criminalize abortion, prosecutors could turn to tech data hoarders for evidence such as the purchase of a pregnancy test online, fertility information from a period-tracking app or location data placing a user near a health clinic.
At Davos, human rights groups called for a moratorium on the sale and use of spyware, such as NSO Group's Pegasus.
- Access Now executive director Brett Solomon told Axios that spyware is only the "pointy end" of a much bigger surveillance-industry spear: "The surveillance sector is a massive multi-billion-dollar market which is currently under-regulated and has insufficient controls within it."
- The rise of biometrics — such as face recognition and eye-scanning — makes the problem "much worse," he added.
Between the lines: Creators and operators of data-gathering tech are beginning to accept that they need to take responsibility for their immensely powerful tools.
- "This idea has got to die that we are a neutral platform," Will Marshall, CEO of satellite imagery company Planet, told Axios.
- Marshall is a firm believer that some technologies have an intrinsic bent toward good and others toward harm. Either way, he says, it's the responsibility of product developers to recognize, interrogate and minimize the potential for misuse.
Be smart: Much of this personal data is being collected without the explicit permission of the people it identifies. And data you share with one purpose in mind can easily be used for a different one you might not want — or that could harm you.
What's next: All this data is being used to train machine-learning algorithms that form the foundations for AI systems making decisions on humans' behalf.