We are happy to share the recent publication “Human–Robot Interaction and Tracking System Based on Mixed Reality Disassembly Tasks” (Robotics 14(8):106), by Calderón‑Sesmero et al., which presents an innovative approach to human–robot collaboration through a mixed reality (MR) interface. The system allows operators to guide a collaborative robot through multimodal interaction (voice, gesture, and gaze), supported by AI-based object tracking and smart task validation.
The work explores how immersive technologies, when combined with deep learning and robotics, can reduce cognitive load, increase safety, and improve task efficiency in industrial disassembly operations. The proposed solution enables more intuitive interaction between humans and robots, showing promising results in both usability and performance.
This resonates strongly with ARISE’s mission to foster human-centric, adaptive, and accessible robotics in industrial settings. The system’s use of real-time perception, seamless operator input, and robust validation aligns with the principles ARISE promotes for developing safe and effective human–robot collaboration.
If you’re curious to learn more about how mixed reality and AI-driven perception can enhance industrial tasks, you can access the full article here.