Mobile robots are increasingly entering the real and complex world of humans in ways that necessitate a high degree of interaction and cooperation between human and robot. Complex simulation models, expensive hardware setups, and highly controlled environments are often required during the various stages of robot development. Robot developers therefore need a more flexible approach to conducting experiments and a better understanding of how robots perceive the world. Mixed Reality (MR) presents a world in which real and virtual elements co-exist. By merging the real and the virtual into an MR simulation environment, more insight into robot behaviour can be gained: internal robot information can be visualised, and cheaper and safer testing scenarios can be created by making interactions between physical and virtual objects possible. Robot developers are free to introduce virtual objects into an MR simulation environment to evaluate their systems, obtaining a coherent display of visual feedback and realistic simulation results. We illustrate our ideas with an MR simulation tool built on the 3D robot simulator Gazebo.
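The central idea of making a robot perceive virtual objects alongside physical ones can be sketched in a few lines. The snippet below is a minimal illustration, not the implementation of the Gazebo-based tool described above: it merges a virtual circular obstacle into a real laser scan by taking, per beam, the closer of the real hit and the virtual hit. The function name and the scan layout (beam 0 at `angle_min`, uniform angular spacing) are assumptions introduced for this example.

```python
import math

def inject_virtual_obstacle(real_ranges, angle_min, angle_step,
                            obstacle_xy, obstacle_radius):
    """Merge a virtual circular obstacle into a real laser scan.

    For each beam, compute the distance at which the beam would hit the
    virtual obstacle; the robot perceives the closer of the real and
    virtual hits, so physical and virtual objects co-exist in its view.
    """
    ox, oy = obstacle_xy
    merged = []
    for i, real_r in enumerate(real_ranges):
        theta = angle_min + i * angle_step
        dx, dy = math.cos(theta), math.sin(theta)
        # Project the obstacle centre onto the beam direction; the
        # perpendicular offset decides whether the beam intersects it.
        proj = ox * dx + oy * dy
        if proj > 0.0:
            perp2 = (ox * ox + oy * oy) - proj * proj
            if perp2 <= obstacle_radius ** 2:
                virtual_r = proj - math.sqrt(obstacle_radius ** 2 - perp2)
                merged.append(min(real_r, virtual_r))
                continue
        merged.append(real_r)  # beam misses the virtual obstacle
    return merged
```

With a real scan of two beams (ranges 10 m at angles 0 and 90 degrees) and a virtual obstacle of radius 1 m centred 5 m straight ahead, the forward beam is shortened to 4 m while the side beam is untouched, which is exactly the kind of cheap, safe test scenario the abstract argues for.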