A new AI camera can mimic the signaling patterns of the human brain, Scientific American reports.
The camera's photosensors act like a pair of eyes that "wake up" and start looking around when they detect changes in light and movement. The "eyes" then send electrical signals to the circuits controlling the back end of the camera, which mimic the neuron pathways of the human brain. As the camera keeps working and sending more signals, its internal "neural network" learns to react more quickly to external stimuli, such as traffic lights changing or pedestrians crossing.
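The change-triggered behavior described above resembles event-driven sensing: pixels stay silent until brightness shifts past a threshold. The sketch below is purely illustrative (not the camera's actual firmware), with an assumed threshold and made-up pixel values, to show how a static scene produces no signals at all.

```python
THRESHOLD = 10  # minimum brightness change (assumed units) that makes an "eye" fire

def detect_events(prev_frame, frame, threshold=THRESHOLD):
    """Return (row, col, polarity) events for pixels whose brightness
    changed by at least `threshold` between two frames."""
    events = []
    for r, (prev_row, row) in enumerate(zip(prev_frame, frame)):
        for c, (prev_px, px) in enumerate(zip(prev_row, row)):
            diff = px - prev_px
            if abs(diff) >= threshold:
                events.append((r, c, 1 if diff > 0 else -1))
    return events

static = [[50, 50], [50, 50]]
moved  = [[50, 90], [50, 50]]  # one pixel brightened, e.g. a light turning on

print(detect_events(static, static))  # no change -> no events, nothing to record
print(detect_events(static, moved))   # -> [(0, 1, 1)]
```

Because an unchanging scene yields an empty event list, downstream circuitry (and recording) can stay idle until something actually happens, which is the memory- and power-saving behavior the article describes.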
Why it matters: The camera does not start recording until it detects stimuli, a feature that makes it a more intelligent device and saves memory and power. The technology also shows that AI, which often requires large, expensive hardware, can run on devices small enough to be mounted on a drone.