How AI could help prevent violence against seniors, kids

Illustration: Allie Carl/Axios
Professors at the University of Iowa are developing AI technology that can detect violent behavior, and they hope it can someday even predict violence before it happens.
Why it matters: Researchers argue that this AI and camera technology could provide protective surveillance for vulnerable populations such as seniors and children when they're under care in nursing homes and daycares.
- Iowa is facing a childcare and nursing home staffing crisis that can lead to substandard care and less monitoring. The researchers' new technology doesn't require more staffing but would add a layer of protection.
Driving the news: Aislinn Conrad, a social work professor, and Karim Abdel-Malek, a mechanical engineering professor, are creating an AI camera system that detects violent behaviors, such as kicking and hitting, and notifies caretakers and authorities.
How it started: In 2003, Abdel-Malek developed a "virtual soldier" — a human simulator used by the military that shows how a soldier would perform specific tasks, based on their height, weight and other measurements.
- It allows military officials to test specific scenarios without burdening or accidentally injuring people, including determining how much weight someone can carry in a pack or how additional armor can influence mobility.
- Abdel-Malek and Conrad want to use the program's kinetic motion data to detect violence in real time by analyzing limb movements (see the sketch below).
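The researchers haven't published their detection code, but the core idea of flagging violence from limb movement can be sketched. The example below is hypothetical: it assumes an off-the-shelf pose estimator supplies (x, y) coordinates per joint per frame, and the joint names, threshold, and units are illustrative rather than the Iowa team's.

```python
import math

# Joints whose sudden motion might indicate a kick or a strike.
# All names and numbers here are illustrative assumptions.
LIMB_JOINTS = ["left_wrist", "right_wrist", "left_ankle", "right_ankle"]
SPEED_THRESHOLD = 3.0  # illustrative units: frame-widths per second


def joint_speeds(prev_frame, curr_frame, dt):
    """Per-joint speed between two consecutive frames of pose keypoints."""
    speeds = {}
    for joint in LIMB_JOINTS:
        (x0, y0), (x1, y1) = prev_frame[joint], curr_frame[joint]
        speeds[joint] = math.hypot(x1 - x0, y1 - y0) / dt
    return speeds


def looks_violent(prev_frame, curr_frame, dt=1 / 30):
    """Crude heuristic: flag any limb moving faster than the threshold."""
    return any(s > SPEED_THRESHOLD
               for s in joint_speeds(prev_frame, curr_frame, dt).values())


# Example: a fast swing of the right wrist between two frames at 30 fps.
prev = {"left_wrist": (0.10, 0.50), "right_wrist": (0.90, 0.50),
        "left_ankle": (0.30, 0.95), "right_ankle": (0.70, 0.95)}
curr = dict(prev, right_wrist=(0.60, 0.40))
print(looks_violent(prev, curr))  # True
```

A real system would smooth over many frames and use a learned model rather than a single threshold, but the underlying signal, how fast limbs move, is the same.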
How it works: The technology would ultimately be similar to a nanny cam "on steroids," Conrad says.
- The camera would stay "asleep" until a violent or jarring motion wakes it up. If it detects movements like kicking or hitting, it would start recording and notify someone.
- Someday, Conrad says, they hope the camera will be able to detect violent behavior before it happens, based on cues a person exhibits before acting out.
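Strung together, that wake-record-notify flow is a simple event loop. The sketch below is hypothetical: camera.read(), classify_motion(), start_recording(), and notify_caretaker() are stand-ins for components the team hasn't described publicly.

```python
import time


def monitor(camera, classify_motion, start_recording, notify_caretaker):
    """Idle until motion looks violent, then capture footage and alert."""
    while True:
        frame = camera.read()           # cheap polling while the camera is "asleep"
        label = classify_motion(frame)  # e.g., "kick", "hit", or None
        if label is not None:
            start_recording(camera)     # "wake up" and preserve the evidence
            notify_caretaker(label)     # alert staff and, if warranted, authorities
        time.sleep(1 / 30)              # poll at roughly 30 frames per second
```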
State of play: They're currently working to train the AI program to better recognize physical violence, and they plan to record hundreds of actors attacking dummies to help train it.
- "AI is generative," Conrad says. "As we train it more, it will become more refined and precise in its ability to not only tell when abuse has happened but the type of abuse."
Between the lines: Conrad says they're very aware of the ethical concerns with this kind of technology and are working, for example, to reduce the risk of wrongful accusations.
What's next: Researchers are fundraising to finish the product's development.
- Once they secure the money, they expect development to take 18 months.
