AI segments a 3D eye scan into sections representing different types of tissue. Animation: DeepMind
Doctors at a U.K. eye hospital are getting algorithmic help interpreting the results of 3D eye scans, using a system developed at Google's DeepMind that can identify more than 50 eye problems and recommend a course of action with human expert-level accuracy.
Why it matters: DeepMind's system shows an intermediate step in its work and tells doctors how confident it is in its assessment. That matters because AI systems are often too opaque to explain their reasoning, making them risky to deploy in high-stakes environments like hospitals.
Netflix CFO David Wells plans to step down after 14 years with the streaming giant, the company announced Monday — he'll exit after helping the company choose a successor.
Why it matters: Wells is the second C-suite executive to leave Netflix this summer. In May, Chief Communications Officer Jonathan Friedland exited after making insensitive remarks.
Many Google services on Android devices and iPhones store your location data even if you’ve used privacy settings that say they will prevent them from doing so, AP tech writer Ryan Nakashima reports.
Why it matters: "The finding is the latest instance in which a technology company has violated its own promises to protect user privacy."
In a shift that is roiling typically cocooned computer scientists, some researchers — uneasy in part about the role of technology in the 2016 election — are urging colleagues to determine and mitigate the societal impact of their peer-reviewed work before it's published.
The big picture: The push — meant to shake computer scientists out of their labs and into the public sphere — comes as academics and scientists are suffering the same loss of popular faith as other major institutions.
Sometimes, a computer science researcher produces a paper whose findings, if published, might lead to societal harm. Now, some experts are questioning the default course of action: publishing the paper anyway, potential damage be damned.
Why it matters: The call to suppress some research challenges decades-old principles in computer science and could slow work in a field that drives the economy, helps define the future of work and is the subject of intense global competition.
Heading back to Axios' San Francisco office after a meeting with a Berkeley professor, I nearly collided with an icebox-sized tub with wheels and a flagpole, sporting Cal colors.
What it's doing: The sidewalk robot is one of around two dozen that roam UC Berkeley and nearby parts of town, delivering food to students and residents. Kiwi, the Berkeley-based company behind the bot, has already made more than 10,000 deliveries, TechCrunch reports.
We always assumed technology and the naked transparency of social media would feed people's taste for freedom and thirst for democracy.
The big picture: Right now, that assumption looks flawed: Technology might actually solidify the standing of despots and provide them with a new way to exert their power.