
Robot builds IKEA chair, almost autonomously

Robot assembles IKEA’s Stefan chair. Photo: NTU Singapore

A robot can assemble an IKEA chair in about 20 minutes, researchers report in a study published Wednesday in Science Robotics.

Why it matters: The relatively complex task calls for abilities usually considered human: fine motor movements, hand-eye coordination, and judging how much force to apply when fitting parts together. It brings robots a step closer to fully autonomous assembly that could be applied across multiple industries.

What they did: A team from Nanyang Technological University in Singapore took a Stefan IKEA chair kit and scattered the pieces randomly within the robot's environment. Unlike prior experiments, which relied on specially designed grippers or reflective markers to designate the needed pieces, the team used only commercial off-the-shelf (COTS) hardware: industrial robot arms, parallel grippers, force sensors, and a 3D camera for "vision" (a sketch of the perception step follows the list below).

  • The use of COTS is important, study author Quang-Cuong Pham tells Axios, because
    "human 'hardware' is very generic: the same eyes and hands are used to assemble many different objects. Using COTS hardware forces us to reproduce that genericity into our framework, instead of designing ad-hoc hardware for every new task."
  • The study is building on earlier research by the group — in 2015, they were able to get the robot to insert a pin in a hole.
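
The paper's perception pipeline isn't reproduced here, but the idea of using a generic 3D camera to find parts can be sketched with off-the-shelf tools. Below is a minimal, hypothetical example, assuming a scene scan and a CAD-derived point cloud of one part, that localizes the part via ICP registration using the open-source Open3D library; the file names, matching radius, and identity initialization are illustrative assumptions, not the study's code.

```python
# Hypothetical sketch: estimate the 6-DOF pose of a known chair part in
# a 3D camera scan by registering the part's CAD-derived point cloud
# against the scene. File names and thresholds are illustrative.
import numpy as np
import open3d as o3d

def localize_part(scene_pcd, model_pcd, init_pose=np.eye(4)):
    """Estimate the part's pose in the scene using point-to-plane ICP."""
    result = o3d.pipelines.registration.registration_icp(
        model_pcd, scene_pcd,
        max_correspondence_distance=0.01,  # 1 cm matching radius
        init=init_pose,
        estimation_method=o3d.pipelines.registration
                             .TransformationEstimationPointToPlane(),
    )
    return result.transformation, result.fitness

scene = o3d.io.read_point_cloud("scene_scan.pcd")   # from the 3D camera
model = o3d.io.read_point_cloud("stefan_leg.pcd")   # sampled from CAD
scene.estimate_normals()   # point-to-plane ICP needs surface normals
model.estimate_normals()
pose, fitness = localize_part(scene, model)
print(f"Estimated pose:\n{pose}\nfitness: {fitness:.2f}")
```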

What happened:

  • The robots quickly identified the correct parts, coordinated their actions to build the chair, and gauged the force needed to grip the parts and press the pins fully into the correct holes (one common strategy for that insertion step is sketched below).
  • Overall, the assembly took 20 minutes and 19 seconds, of which almost 9 minutes went to physically executing the build; the rest was spent on planning.
"Key advances are in vision, planning, control, integration of multiple software and hardware components, bimanual manipulation. These advances can be used in multiple manufacturing tasks," Pham tells Axios.

Caveat: The authors say the sequence of the steps was hardcoded "through a considerable engineering effort" and they hope recent advances in AI combined with these findings will lead to fully autonomous robots.

UC Berkeley's Ken Goldberg tells Wired that overcoming the need to pre-program is key to future development:

“The big challenge is to replace such carefully engineered special purpose programming with new approaches that could learn from demonstrations and/or self-learn to perform tasks like this.”

What's next: Pham agrees. In this study, the hardcoded sequence told the system "what to do" and the robot figured out the "how"; he hopes advances in AI will help the system work out the "what to do" part as well. Pham says:

"In this work, we were interested in achieving the low-level capabilities ("how to do"), such as perception, planning, control, rather than in the high-level reasoning ("what to do"). Those low-level capabilities are crucial for other industrial tasks (e.g.: handling, drilling, glue dispensing, assembly, inspection, etc.) We are also planning to integrate AI methods in our future work to automate the high-level reasoning."

Go deeper: Watch the video of the assembly from NTU Singapore.