In a turn away from vision, a team at MIT has created a feline robot that attempts to better approximate how humans and animals actually move, navigating stairs and uneven surfaces guided only by sensors on its feet.
Why it matters: Many walking robots rely on substantial recent improvements in computer vision, like advanced cameras and lidar. But robots will be nimbler and interact with humans more practically with the addition of "blind" sensing: the sixth sense of feeling that most living things have for their surroundings.
What's going on: Computer vision alone can result in a robot with slow and inaccurate movements, says MIT's Sangbae Kim, designer of the Cheetah 3.
- "People start adding vision prematurely and they rely on it too much," Kim tells Axios, when it's best suited for big-picture planning, like registering where a stairway begins and knowing when to turn to avoid a wall. So his team built a "blind" version in order to focus on tactile sensing.
How the blind version works: Two algorithms help the Cheetah stay upright when it encounters unexpected obstacles.
- One determines when the bot plants its feet by calculating how far a leg has swung, how much force the leg is feeling, and where the ground is.
- The other governs how much force the robot should apply to each leg to keep its balance, based on the angle of the robot's body relative to the ground.
- The algorithms also let the robot adjust to external forces, like a researcher's friendly kick from the side; both rules are sketched in simplified form below.
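The article describes the two rules only at a high level. The Python sketch below is a loose illustration of how that kind of logic might look, not MIT's actual controller: every threshold, gain, leg name, and sign convention here is a made-up placeholder.

```python
# Illustrative sketch only -- not the Cheetah 3's real controller.
# It mimics the two decisions described above: when to commit a swinging
# leg to the ground, and how hard each stance leg should push to keep
# the body level. All numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class LegState:
    swing_progress: float        # 0.0 = just lifted, 1.0 = expected touchdown
    contact_force: float         # estimated force on the foot, newtons
    height_above_ground: float   # estimated foot clearance, meters

def should_plant(leg: LegState,
                 force_threshold: float = 20.0,
                 clearance_threshold: float = 0.005) -> bool:
    """Decide whether to treat the leg as planted, combining how far it has
    swung, how much force it is already feeling (e.g. it clipped an unseen
    stair edge), and how close the foot appears to be to the ground."""
    late_in_swing = leg.swing_progress > 0.8
    feeling_contact = leg.contact_force > force_threshold
    near_ground = leg.height_above_ground < clearance_threshold
    return feeling_contact or (late_in_swing and near_ground)

def stance_forces(pitch: float, roll: float,
                  base_force: float = 100.0,
                  gain: float = 150.0) -> dict:
    """Very rough balance rule: push harder with the legs on the 'low' side
    of a tilted body. Here pitch > 0 means the nose is tilted down and
    roll > 0 means the right side is tilted down."""
    return {
        "front_left":  base_force + gain * ( pitch - roll),
        "front_right": base_force + gain * ( pitch + roll),
        "rear_left":   base_force + gain * (-pitch - roll),
        "rear_right":  base_force + gain * (-pitch + roll),
    }
```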
The result is a quick, balanced robot: The researchers measure the force on each of the Cheetah's legs straight from the motors that control them, allowing it to move fast — at 3 meters per second, or 6.7 miles an hour — and jump up onto a table from a standstill. These tricks make the 90-pound bot look surprisingly nimble.
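Reading leg forces back from the drive motors, rather than from dedicated sensors in the feet, is a general technique often called proprioceptive sensing. The sketch below shows the basic idea for a simplified two-joint planar leg; the link lengths, joint angles, and torque readings are hypothetical, and this is not the Cheetah 3's actual code.

```python
# Illustrative sketch: estimating the force at the foot from measured
# motor torques, using the leg's Jacobian. A planar two-link leg is
# assumed for simplicity; all values are made up.

import numpy as np

def leg_jacobian(hip_angle: float, knee_angle: float,
                 l1: float = 0.25, l2: float = 0.25) -> np.ndarray:
    """Jacobian of foot position w.r.t. joint angles for a 2-link planar leg."""
    s1, c1 = np.sin(hip_angle), np.cos(hip_angle)
    s12, c12 = np.sin(hip_angle + knee_angle), np.cos(hip_angle + knee_angle)
    return np.array([
        [-l1 * s1 - l2 * s12, -l2 * s12],
        [ l1 * c1 + l2 * c12,  l2 * c12],
    ])

def foot_force_from_torques(joint_torques: np.ndarray,
                            hip_angle: float, knee_angle: float) -> np.ndarray:
    """Estimate the force at the foot from joint torques, using the static
    relationship tau = J^T F, i.e. F = (J^T)^-1 tau."""
    J = leg_jacobian(hip_angle, knee_angle)
    return np.linalg.solve(J.T, joint_torques)

# Example with hypothetical torque readings from the motors of a loaded leg.
tau = np.array([8.0, 4.0])   # newton-meters
force = foot_force_from_torques(tau, hip_angle=0.6, knee_angle=-1.2)
print(force)                 # estimated (x, z) force on the foot, newtons
```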
Cheetah's design emphasizes "sensors that you and I take for granted," said Noah Cowan, director of the LIMBS robotics lab at Johns Hopkins University.
- Humans unconsciously keep track of where their arms and legs are — and the forces acting on them — to help stay balanced and move smoothly. MIT’s Cheetah “feels” its legs in a similar way.
The Cheetah's capabilities resemble some of the robots produced by the ever-secretive Boston Dynamics, which in May released a video of its four-legged SpotMini navigating autonomously through its lab with the help of cameras.
- It's not clear whether Boston Dynamics robots use tactile technology like Kim's, and the company did not respond to an email.
Vision will probably always play a role in walking robots, even if tactile sensing becomes commonplace. Velodyne, the most prominent lidar manufacturer — and a supplier for Boston Dynamics — says its technology can see farther than three football fields, day or night, collecting 8 million data points every second from all directions. "The sensor also scans in a full 360 degrees which is impossible for a human to do," Frank Bertini, Velodyne's UAV and robotics business manager, tells Axios.
- Kim's team plans to add cameras back onto the Cheetah to help it navigate complex environments.
- They also want to add a grasping arm that a human can control from afar.
- The resulting bot could be well suited for rescue operations or dangerous inspections in environments unsafe for humans.