Feb 26, 2019 - Energy & Environment
Expert Voices

How sensor data can help cities improve public spaces

Illustration of a city made of data (Rebecca Zisser/Axios)

Urban sensors are often used to monitor illegal activity — from identifying drivers who run red lights to generating predictive crime maps — but they can also collect data for quality-of-life improvements like reducing emissions and preventing car accidents.

The big picture: Municipalities could use anonymized, secure sensor data in combination with advanced computing to better understand how people travel through and use public space, without sacrificing individual privacy.

What’s happening: Academic researchers and urban planners can leverage the latest sensors — including “camera-as-sensor” technologies, which convert a camera’s optical image into an electronic signal — to gather and share insights about roads, intersections and public spaces.

  • Yes, but: Ensuring the data collected is protected and anonymized will be crucial to earning public trust in these efforts.

Details: These projects rely on a combination of video streams, computer vision, AI, and edge computing (computing done near the data source, not in the cloud). A simplified sketch of this kind of pipeline appears after the list below.

  • Metro21: Smart Cities Institute at Carnegie Mellon University collaborated with Urban Data Eye and the Pittsburgh Downtown Partnership to identify pedestrian movement patterns from CCTV footage and live-streamed video. The project found that furniture and tree placement affects where pedestrians congregate, which could inform planning or redesign decisions.
  • Platform Pittsburgh, another Metro21 project, analyzed urban video streams to identify frequent sites of near-miss accidents, which could guide the creation of safer intersections. The project is also tracking emissions using images of trucks' tailpipes.
  • Numina has partnered with city planners, mobility companies, and other stakeholders in Jacksonville, Las Vegas, and St. Louis to collect and analyze data on the walkability and bikeability of urban areas using light-post-mounted sensors and computer vision.
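
To make the pipeline described above concrete, here is a minimal, illustrative sketch of an edge setup: it detects pedestrians in a video stream with OpenCV's built-in HOG person detector and reports only anonymized, aggregate counts, so no imagery or identities ever leave the device. The stream URL, reporting interval, and detector choice are assumptions made for illustration, not details of the Metro21, Platform Pittsburgh, or Numina systems.

    # Illustrative edge-device sketch: count pedestrian detections in a video
    # stream and report only aggregate tallies, never raw frames or identities.
    import time
    import cv2

    STREAM_URL = "rtsp://example.local/intersection-cam"  # hypothetical camera feed
    REPORT_EVERY_SECONDS = 60                             # hypothetical reporting window

    def main():
        # OpenCV's bundled HOG + linear SVM pedestrian detector runs on modest
        # edge hardware, with no cloud round trip.
        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

        capture = cv2.VideoCapture(STREAM_URL)
        window_start = time.time()
        detections = 0

        while capture.isOpened():
            ok, frame = capture.read()
            if not ok:
                break
            frame = cv2.resize(frame, (640, 480))  # smaller frames, faster detection
            boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
            detections += len(boxes)  # tally only; the frame is discarded immediately

            if time.time() - window_start >= REPORT_EVERY_SECONDS:
                # Only this anonymized aggregate leaves the device.
                print({"window_s": REPORT_EVERY_SECONDS, "pedestrian_detections": detections})
                detections = 0
                window_start = time.time()

        capture.release()

    if __name__ == "__main__":
        main()

A production system would typically add tracking so the same person is not counted in consecutive frames, but the design point stands: only the tally, not the footage, needs to leave the device.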

This technology has applications outside urban areas, as well. Project Diversita developed a camera and computing system that can identify over 5,000 different animal species to inform wildlife research and management.
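
As a rough sketch of the species-identification idea (and not Project Diversita's actual system), the example below runs a generic pretrained image classifier over a camera-trap photo; the model, labels, and file path are placeholders.

    # Illustrative sketch of classifying a camera-trap photo with a pretrained
    # model; the ImageNet labels here stand in for a purpose-built species model.
    import torch
    from PIL import Image
    from torchvision import models
    from torchvision.models import ResNet50_Weights

    weights = ResNet50_Weights.DEFAULT
    model = models.resnet50(weights=weights)
    model.eval()
    preprocess = weights.transforms()  # resizing and normalization matched to the model

    image = Image.open("camera_trap_photo.jpg").convert("RGB")  # placeholder path
    batch = preprocess(image).unsqueeze(0)

    with torch.no_grad():
        probabilities = model(batch).softmax(dim=1)[0]

    confidence, index = probabilities.max(dim=0)
    label = weights.meta["categories"][index.item()]
    print(f"{label}: {confidence.item():.1%}")

A real wildlife deployment would swap in a classifier trained on the target species list rather than ImageNet's 1,000 generic categories, but the camera-to-classifier flow is otherwise the same.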

The bottom line: Applying machine learning and edge computing to sensor and camera data could inform future urban planning initiatives, the allocation of public space, and even conservation efforts.

Karen Lightman is executive director of Metro21: Smart Cities Institute at Carnegie Mellon University.
