
Illustration: Lazaro Gamio / Axios
Robots are normally only good at the task they are programmed to do. Now, scientists have programmed robotic controllers to learn to avoid obstacles and cooperate by mimicking Darwinian evolution.
Just like genes, the robots' programming could mutate and be subject to selection. And, as in biological evolution, the robots could co-opt previously evolved parts of their programming to find solutions more quickly. Because the robots were connected online, they could also exchange their evolving robo-genomes when they came close together.
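The loop described above — mutate, select, and occasionally swap genomes with nearby peers — can be sketched in a few lines. This is a minimal illustration, not the paper's actual algorithm: the genome size, mutation step, fitness function, and exchange probability below are all invented for the example.

```python
import random

GENOME_LEN = 8        # number of controller parameters (hypothetical size)
MUTATION_STD = 0.1    # mutation step size (illustrative value)

def random_genome():
    """A controller genome: a list of real-valued parameters."""
    return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LEN)]

def mutate(genome):
    """Gaussian mutation: the robo-genome analogue of genetic mutation."""
    return [g + random.gauss(0.0, MUTATION_STD) for g in genome]

def fitness(genome):
    """Stand-in for a real task score (e.g. distance covered without
    collisions). Here: closeness to an arbitrary target vector."""
    target = [0.5] * GENOME_LEN
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def evolve_online(steps=500, exchange_prob=0.1, peer_genomes=None):
    """Keep the best genome so far; mutate it each step and keep the
    mutant only if it scores better (selection). With some probability,
    try a nearby peer's genome instead, mimicking the exchange between
    robots that come close together."""
    best = random_genome()
    peers = peer_genomes or []
    for _ in range(steps):
        if peers and random.random() < exchange_prob:
            candidate = random.choice(peers)   # genome exchange
        else:
            candidate = mutate(best)           # mutation
        if fitness(candidate) > fitness(best): # selection
            best = candidate
    return best

champion = evolve_online()
print(fitness(champion))
```

In a real robot, `fitness` would be measured by actually running the controller on the task, which is why online evolution was slow until hardware caught up.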
Why it matters: Robots are usually only programmed for one task, and when confronted with an obstacle, they can falter. Online evolution allows them to adapt and learn to find solutions outside of their programming.
"It is not directly analogous to nature, but it is inspired by nature," Luís Correia, a scientist at the University of Lisbon in Portugal and an author of the study, told Axios.
What they did: In the study, published Wednesday in the journal Royal Society Open Science, the scientists gave the robots three tasks: avoiding obstacles while moving as fast and as directly as possible, searching for and finding a small target, and a cooperative task that involved gathering together. Robots that had evolved solutions to the cooperative aggregation task then had faults mimicking hardware problems injected into their programming, to see whether they could evolve work-arounds.
The bottom line: In the past, online evolution in real-world robots wasn't feasible because there wasn't enough processing power, so learning took days. (Imagine a robot running into an obstacle over and over again, damaging itself as it slowly evolves the programming to go around.) But these real-life robots were given an hour to complete their tasks, and appear to be able to adapt to damage acquired while learning. Although the tasks were very simple, this shows that online evolution is another viable type of robot learning and might let them address more complex tasks in the future.