Jun 13, 2018 - Technology

Uber researches AI system to detect users in "unusual state"


Uber is seeking to patent a machine-learning system that can predict when a user is acting "unusually." The patent was filed in December 2016, published last week, and first reported by CNN.

Why it matters: Such a system could help keep vulnerable or intoxicated passengers safe by pairing them with specially trained drivers. It could also be useful for drivers, who would get a heads-up before picking up a passenger who's behaving erratically.

Between the lines: The patent application never says exactly what it means by "users having an unusual state," but drunkenness seems like a good candidate.

How it would work:

  • The system learns how individual users tend to interact with the app by regularly measuring things like their typing accuracy, walking speed, or even the angle they hold their phone.
  • Referring to that baseline, the system can detect when a person is interacting with the Uber app abnormally. Too many typos or slow reactions to in-app prompts might tip off the system that something's off.
  • If the user seems only minimally impaired, the system can let a driver know to expect a tipsy passenger or recommend a well-lit pickup location. If the user seems more than a little intoxicated, Uber might only match them up with a subset of drivers, bar them from shared rides like Uber Pool — or even prevent the user from getting a ride at all.
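The baseline-and-deviation approach described above can be sketched as a simple statistical anomaly check. This is only an illustration of the general idea, not Uber's actual model: the feature names, sample values, and threshold below are all hypothetical.

```python
from statistics import mean, stdev

# Hypothetical per-user baseline of interaction features, built up
# over past sessions (the patent mentions signals like typing
# accuracy, walking speed, and the angle the phone is held at).
BASELINE = {
    "typo_rate":   [0.02, 0.03, 0.02, 0.04, 0.03],  # typos per character
    "walk_speed":  [1.4, 1.5, 1.3, 1.4, 1.5],       # meters per second
    "phone_angle": [35.0, 40.0, 38.0, 36.0, 39.0],  # degrees from vertical
}

def z_score(value, history):
    """How many standard deviations `value` sits from the user's baseline."""
    mu, sigma = mean(history), stdev(history)
    return abs(value - mu) / sigma if sigma else 0.0

def assess_state(current, baseline, threshold=3.0):
    """Flag a session whose average feature deviation exceeds the threshold."""
    scores = {f: z_score(v, baseline[f]) for f, v in current.items()}
    label = "unusual" if mean(scores.values()) > threshold else "normal"
    return label, scores

# A session with many typos, slow walking, and an odd phone angle
# deviates strongly from this user's baseline and gets flagged.
state, scores = assess_state(
    {"typo_rate": 0.15, "walk_speed": 0.7, "phone_angle": 70.0}, BASELINE
)
```

A production system would presumably learn these thresholds per user and combine many more signals, but the core step — compare tonight's behavior against this person's own history, not a global average — is the same.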

What they're saying: An Uber spokesperson said: "We are always exploring ways that our technology can help improve the Uber experience for riders and drivers. We file patent applications on many ideas, but not all of them actually become products or features."

Where it's headed: Machine learning that can detect changes in user behavior has a lot of potential. Similar systems authenticate computer users without making them log in: If a computer detects that its user is typing, scrolling and clicking in an unusual way, it might assume that the wrong person is accessing the machine and can lock itself down.
