
Surprise! We can teach 'bi-manual' robots how to stir-fry food to pure bliss

We're not kidding.

3D rendering of a robot hand holding a frying pan. PhonlamaiPhoto/iStock

Robotics is making great strides in a variety of fields, including some quite unusual ones. Researchers at the Idiap Research Institute in Switzerland, the Chinese University of Hong Kong (CUHK) and Wuhan University (WHU) have now engineered a machine learning-based method to teach robots to stir-fry like professional chefs, according to a report TechXplore published on Friday.

Intelligent robots that can prepare food 

"Our recent work is the joint effort of three labs: the Robot Learning & Interaction group led by Dr. Sylvain Calinon at the Idiap Research Institute and the Collaborative and Versatile Robots laboratory led by Prof. Fei Chen Cuhk and the lab led by Prof. Miao Li at WHU," Junjia Liu, one of the researchers who carried out the study, told TechXplore.

"Our three labs have been studying and working together for about ten years. We have a particular interest in making intelligent robots that can prepare food for people."

The new research aims to create a robotic chef, a goal that has thus far proven very difficult to achieve.

"While domestic service robots have been developed considerably in recent years, creating a robot chef in the semi-structured kitchen environment remains a grand challenge," Liu said.

"Food preparation and cooking are two crucial activities in the household, and a robot chef that can follow arbitrary recipes and cook automatically would be practical and bring a new interactive entertainment experience."

To achieve a task as complex as stir-frying, Liu and his team first had to train a bimanual coordination model known as a "structured-transformer," using human demonstrations.

"This mechanism regards coordination as a sequence transduction problem between the movements of both arms and adopts a combined model of transformer and GNN to achieve this," Liu explained.


"Thus, in the online process, the left-arm movement is adjusted according to the visual feedback, and the corresponding right-arm movement is generated by the pre-trained structured-transformer model based on the left-arm movement."

Cooking both at home and in public

Liu now hopes that his new and improved model could one day inform the development of robots that can cook meals both at home and in public. It could also be used to develop robots that perform other tasks involving two arms and hands. One good example is this already-popular pizza-making robot.

"We will now introduce higher dimensional information to learn more humanoid motion in kitchen skills, such as visual and electromyography signals," Liu concluded.

"The estimation of semi-fluid contents in this work was simplified as two-dimensional image segmentation, and we only used the relative displacement as the desired target. Thus, we also plan to propose a more comprehensive framework that consists of both the movements of bimanual manipulators and the state change of the object."


The results of the study were published in the journal IEEE Robotics and Automation Letters.

Abstract:
This letter describes an approach to achieve well-known Chinese cooking art stir-fry on a bimanual robot system. Stir-fry requires a sequence of highly dynamic coordinated movements, which is usually difficult to learn for a chef, let alone transfer to robots. In this letter, we define a canonical stir-fry movement, and then propose a decoupled framework for learning this deformable object manipulation from human demonstration. First, dual arms of the robot are decoupled into different roles (a leader and follower) and learned with classical and neural network based methods separately, then the bimanual task is transformed into a coordination problem. To obtain general bimanual coordination, we secondly propose a Graph and Transformer based model—Structured-Transformer, to capture the spatio-temporal relationship between dual-arm movements. Finally, by adding visual feedback of contents deformation, our framework can adjust the movements automatically to achieve the desired stir-fry effect. We verify the framework by a simulator and deploy it on a real bimanual Panda robot system. The experimental results validate our framework can realize the bimanual robot stir-fry motion and have the potential to extend to other deformable objects with bimanual coordination.
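For readers curious what a Graph-plus-Transformer coordination model of this kind might look like, here is a compact PyTorch sketch: a message-passing layer over the leader arm's joint graph feeds a temporal transformer encoder that predicts the follower arm's trajectory. The layer sizes, adjacency matrix, and single-encoder layout are illustrative assumptions, not the published Structured-Transformer architecture.

import torch
import torch.nn as nn

class SimpleGraphLayer(nn.Module):
    """One round of message passing over a fixed joint-adjacency matrix."""
    def __init__(self, dim, adjacency):
        super().__init__()
        self.register_buffer("adj", adjacency)          # (J, J) normalized adjacency
        self.linear = nn.Linear(dim, dim)

    def forward(self, x):                               # x: (B, T, J, dim)
        neighbors = torch.einsum("ij,btjd->btid", self.adj, x)
        return torch.relu(self.linear(neighbors))

class StructuredTransformerSketch(nn.Module):
    """Leader-to-follower trajectory transduction: graph layer for spatial
    structure, transformer encoder for temporal structure."""
    def __init__(self, num_joints, adjacency, feat_dim=64, nhead=4, layers=2):
        super().__init__()
        d_model = num_joints * feat_dim
        assert d_model % nhead == 0, "d_model must be divisible by nhead"
        self.embed = nn.Linear(1, feat_dim)             # per-joint scalar -> feature vector
        self.graph = SimpleGraphLayer(feat_dim, adjacency)
        encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.temporal = nn.TransformerEncoder(encoder_layer, num_layers=layers)
        self.head = nn.Linear(d_model, num_joints)      # follower joint targets per timestep

    def forward(self, leader):                          # leader: (B, T, J) joint positions
        B, T, J = leader.shape
        x = self.embed(leader.unsqueeze(-1))            # (B, T, J, feat_dim)
        x = self.graph(x).reshape(B, T, -1)             # fuse joints after message passing
        x = self.temporal(x)                            # temporal self-attention over the window
        return self.head(x)                             # (B, T, J) follower trajectory

With a 7-joint arm and the default feature size, this maps a (batch, time, 7) leader trajectory window to a follower window of the same shape, which matches the leader-follower coordination idea described above.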
