Intuitive Robot Integration via
Virtual Reality Workspaces

Minh Q. Tram, Joseph M. Cloud, and William J. Beksi
The University of Texas at Arlington

Abstract

As robots become increasingly prominent in diverse industrial settings, the desire for accessible and reliable systems has correspondingly increased. Yet, meaningfully assessing the feasibility of introducing a new robotic component, or adding more robots to an existing infrastructure, remains a challenge. This is due to both the logistics of acquiring a robot and the need for expert knowledge to set it up. In this project, we address these concerns by developing a virtual reality robotic workspace (VRRW). Our proposed framework enables natural human-robot interaction through a visually immersive representation of the workspace. The main advantages of our approach are the following: (i) independence from a physical system, (ii) flexibility in defining the workspace and robotic tasks, and (iii) intuitive interaction between the operator and the simulated environment. Not only does our system give the operator an enhanced understanding of 3D space, but it also encourages hands-on robot programming. We evaluate the effectiveness of our method on novel automation tasks by training a robot in virtual reality and then executing the tasks on a real robot.


Citation

If you find this project useful, please consider citing our paper.

@inproceedings{tram2023intuitive,
  author={Tram, Minh Q. and Cloud, Joseph M. and Beksi, William J.},
  booktitle={Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
  title={Intuitive Robot Integration via Virtual Reality Workspaces},
  pages={11654--11660},
  year={2023}
}

License

VRRW is licensed under the Apache License, Version 2.0.