As robots make their way into a wide range of real-world environments, roboticists are trying to ensure that they can effectively complete a growing variety of tasks. For robots designed to assist humans in their homes, this includes household chores such as cleaning, tidying up and cooking.
Researchers at the Idiap Research Institute in Switzerland, the Chinese University of Hong Kong (CUHK) and Wuhan University (WHU) have recently developed a machine learning-based technique to teach robots to master stir-fry, a Chinese culinary technique. Their approach, presented in a paper published in IEEE Robotics and Automation Letters, combines a transformer-based model with a graph neural network (GNN).
"Our recent work is the joint effort of three labs: the Robot Learning & Interaction group led by Dr. Sylvain Calinon at the Idiap Research Institute, the Collaborative and Versatile Robots laboratory led by Prof. Fei Chen at CUHK, and the lab led by Prof. Miao Li at WHU," Junjia Liu, one of the researchers who carried out the study, told Tech Xplore. "Our three labs have been studying and working together for about ten years. We have a particular interest in making intelligent robots that can prepare food for people."
Dr. Calinon, Prof. Chen and Prof. Li have been trying to enhance the cooking skills of robots for several years now. In their recent study, they decided to focus on the Chinese culinary arts, specifically stir-fry, a cooking technique that involves frying ingredients over high heat while stirring them, typically in a wok.
"While domestic service robots have been developed considerably in recent years, creating a robot chef in the semi-structured kitchen environment remains a grand challenge," Liu said.
“Food preparation and cooking are two crucial activities in the household, and a robot chef that can follow arbitrary recipes and cook automatically would be practical and bring a new interactive entertainment experience.”
Stir-fry, the cooking style the team focused on in their recent paper, involves complex bimanual skills that are difficult to teach to robots. To tackle this, Liu and his colleagues first trained a bimanual coordination model, known as a "structured-transformer," on human demonstrations.
"This mechanism regards coordination as a sequence transduction problem between the movements of both arms and adopts a combined model of transformer and GNN to achieve this," Liu explained. "Thus, in the online process, the left-arm movement is adjusted according to the visual feedback, and the corresponding right-arm movement is generated by the pre-trained structured-transformer model based on the left-arm movement."
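The leader-follower pipeline Liu describes can be sketched in a few lines. The sketch below is purely illustrative and is not the authors' code: the pre-trained structured-transformer is replaced by a hand-written placeholder, `predict_right()`, and all function names and the mirror-plus-lag behavior are assumptions made for the example. What it shows is the decoupling itself: the left-arm (leader) trajectory is corrected online from visual feedback, and the right-arm (follower) trajectory is then generated from it, sequence to sequence.

```python
import numpy as np

def adjust_left_arm(left_traj, visual_offset):
    """Online step: shift the leader (left-arm) trajectory by a
    displacement estimated from visual feedback."""
    return left_traj + visual_offset

def predict_right(left_traj):
    """Placeholder for the pre-trained coordination model: map the
    leader trajectory to a follower (right-arm) trajectory."""
    mirrored = left_traj.copy()
    mirrored[:, 0] *= -1.0  # toy rule: mirror the x-axis motion
    # toy rule: the follower trails the leader by one timestep
    return np.vstack([mirrored[:1], mirrored[:-1]])

# Toy 2-D trajectory: T timesteps x (x, y) coordinates
T = 5
left = np.stack([np.linspace(0.0, 1.0, T), np.zeros(T)], axis=1)
left = adjust_left_arm(left, visual_offset=np.array([0.0, 0.1]))
right = predict_right(left)
print(left.shape, right.shape)  # (5, 2) (5, 2)
```

In the actual system, `predict_right` would be the learned transformer/GNN model, so the follower motion stays coordinated even when the leader motion is perturbed online, which is the point of treating coordination as sequence transduction.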
The researchers assessed their model's performance both in simulation and on a physical bimanual robotic platform based on the Panda robot. In these tests, their model allowed the robot to successfully and realistically reproduce the motions involved in stir-fry.
"The main contribution of this paper is to consider the coordination mechanism of bimanual robots explicitly in the form of sequence transduction," Liu said. "In contrast with classical learning-from-demonstration methods and deep learning/reinforcement learning based methods, our decoupled framework skillfully combines both these techniques. In fact, it can have both the generalization of the former and the expressivity of the latter."
In the future, the model introduced by this team of researchers could enable the development of robots that cook meals both in home environments and at public venues. In addition, the same approach could be used to train robots on other tasks that involve the use of two arms and hands. In the meantime, Liu and his colleagues plan to continue working on their model to improve its performance and generalizability.
"We will now introduce higher dimensional information to learn more humanoid motion in kitchen skills, such as visual and electromyography signals," Liu added. "The estimation of semi-fluid contents in this work was simplified as two-dimensional image segmentation, and we only used the relative displacement as the desired target. Thus, we also plan to propose a more comprehensive framework that consists of both the movements of bimanual manipulators and the state change of the object."
Junjia Liu et al, Robotic Cooking With Stir-Fry: Bimanual Non-Prehensile Manipulation of Semi-Fluid Objects, IEEE Robotics and Automation Letters (2022). DOI: 10.1109/LRA.2022.3153728
© 2022 Science X Network
A method to teach bimanual robots stir-fry cooking (2022, June 17)
retrieved 17 June 2022
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.