Applying vision-guided graph neural networks for adaptive task planning in dynamic human robot collaborative scenarios

MA, Ruidong, LIU, Yanan, GRAF, Erich W and OYEKAN, John (2024). Applying vision-guided graph neural networks for adaptive task planning in dynamic human robot collaborative scenarios. Advanced Robotics, 1-20. [Article]

Documents
Ma-ApplyingVision-GuidedGraph(VoR).pdf - Published Version
Available under License Creative Commons Attribution.

Abstract
The Assemble-To-Order (ATO) strategy is becoming increasingly prevalent in the manufacturing sector due to strong demand for high-volume personalised and customised goods. Human-Robot Collaborative (HRC) systems are increasingly being investigated as a way to combine the dexterity of human hands with the ability of robots to carry heavy loads. However, current HRC systems struggle to adapt dynamically to varying human actions and cluttered workspaces. In this paper, we propose a novel neural network framework that integrates a Graph Neural Network (GNN) and a Long Short-Term Memory (LSTM) network for adaptive response in HRC scenarios. Our framework enables a robot to interpret human actions and generate detailed action plans while handling objects in a cluttered workspace, thereby addressing the challenges of dynamic human-robot collaboration. Experimental results demonstrate improvements in assembly efficiency and flexibility, making our approach the first integration of iterative grasping and flexible HRC within a unified neural network architecture.
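The paper's actual architecture and trained weights are not reproduced here; the following is a minimal, illustrative NumPy sketch of the general pattern the abstract describes: per-frame scene graphs are encoded by a GNN, pooled into a graph embedding, fed through an LSTM to accumulate temporal context about the human's actions, and read out as scores over candidate robot actions. All dimensions, weight names, and the random inputs are invented for illustration only.

```python
import numpy as np

# Illustrative dimensions (invented): 4 scene-graph nodes per frame,
# 6 features per node, hidden size 8, 3 candidate robot actions, 5 frames.
N_NODES, N_FEAT, HID, N_ACT, T = 4, 6, 8, 3, 5
rng = np.random.default_rng(0)

def gnn_layer(H, A, W):
    """One round of mean-aggregation message passing: each node averages
    its neighbours' features, then applies a shared linear map + ReLU."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                      # avoid division by zero
    return np.maximum(0.0, ((A @ H) / deg) @ W)

def lstm_step(x, h, c, Wx, Wh, b):
    """Plain LSTM cell; gate pre-activations stacked as [i, f, o, g]."""
    z = x @ Wx + h @ Wh + b
    d = h.shape[0]
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    i, f, o = sig(z[:d]), sig(z[d:2*d]), sig(z[2*d:3*d])
    g = np.tanh(z[3*d:])
    c = f * c + i * g                        # update cell state
    h = o * np.tanh(c)                       # update hidden state
    return h, c

# Randomly initialised weights stand in for trained parameters.
Wg = rng.normal(0, 0.5, (N_FEAT, HID))       # GNN node transform
Wx = rng.normal(0, 0.5, (HID, 4 * HID))      # LSTM input weights
Wh = rng.normal(0, 0.5, (HID, 4 * HID))      # LSTM recurrent weights
b  = np.zeros(4 * HID)
Wo = rng.normal(0, 0.5, (HID, N_ACT))        # action read-out

h, c = np.zeros(HID), np.zeros(HID)
for t in range(T):
    X = rng.normal(size=(N_NODES, N_FEAT))                    # node features
    A = (rng.random((N_NODES, N_NODES)) > 0.5).astype(float)  # scene graph
    emb = gnn_layer(X, A, Wg).mean(axis=0)   # pool nodes -> graph embedding
    h, c = lstm_step(emb, h, c, Wx, Wh, b)   # accumulate temporal context

logits = h @ Wo                              # scores over candidate actions
print("next-action index:", int(np.argmax(logits)))
```

The mean-pooled graph embedding per frame keeps the LSTM input size fixed regardless of how many objects clutter the workspace, which is one common way such hybrid GNN-sequence models handle variable scene complexity.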