Robot Control Through VR

A low-latency VR control system that pipes hand gestures through ROS to drive industrial robotic arms.

This project explores remote robot manipulation through VR and inverse kinematics. It spans two tracks—real-time mimicry control and data-driven motion playback—both of which informed my later work on hand-interaction fidelity in Microsoft Mesh.

Mimicry Control

In the first phase, I built a VR system that lets users drive robotic arms with natural hand and arm gestures. ROS data flows between Unity and the robot over a lightweight network bridge, keeping latency low enough for precise manipulation.
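The bridge protocol itself isn't reproduced here, but a rosbridge-style JSON publish message gives the flavor of what crosses the wire. This is a minimal sketch: the topic name `/vr/target_pose` and the Euler-angle input convention are illustrative assumptions, not the project's actual schema.

```python
import json
import math

def pose_to_bridge_msg(position, rotation_euler_deg):
    """Serialize a VR hand pose into a rosbridge-style publish message.

    `position` is an (x, y, z) tuple in metres; `rotation_euler_deg` is
    (roll, pitch, yaw) in degrees, converted to the quaternion ROS
    pose messages expect. Topic name is a placeholder.
    """
    roll, pitch, yaw = (math.radians(a) for a in rotation_euler_deg)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    quat = {
        "x": sr * cp * cy - cr * sp * sy,
        "y": cr * sp * cy + sr * cp * sy,
        "z": cr * cp * sy - sr * sp * cy,
        "w": cr * cp * cy + sr * sp * sy,
    }
    return json.dumps({
        "op": "publish",
        "topic": "/vr/target_pose",
        "msg": {
            "position": dict(zip("xyz", position)),
            "orientation": quat,
        },
    })
```

Keeping messages this small is what makes the latency budget workable: each frame costs a few hundred bytes rather than a full scene sync.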

Key Details

  • Custom inverse-kinematics solver blends Leap Motion hand poses with robot joint limits to avoid singularities.
  • Safety interlocks monitor joint velocity, collision volumes, and operator intent before commands leave the VR client.
  • Visual overlays show reachable space and predicted joint poses so users can plan motions before committing.
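To make the singularity-avoidance idea concrete, here is a toy closed-form solver for a planar 2-link arm with an elbow-limit clamp. The link lengths and limit range are invented for illustration; the real solver works on full arm chains, but the clamping idea is the same.

```python
import math

def two_link_ik(x, y, l1=0.4, l2=0.3, elbow_limits=(0.1, math.pi - 0.1)):
    """Closed-form IK for a planar 2-link arm with joint-limit clamping.

    Clamping the elbow angle away from 0 and pi keeps the arm off its
    fully-extended and fully-folded singular configurations.
    """
    r2 = x * x + y * y
    cos_elbow = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))  # handle unreachable targets
    elbow = math.acos(cos_elbow)
    # enforce joint limits, which also avoids the singular poses
    elbow = max(elbow_limits[0], min(elbow_limits[1], elbow))
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

For a reachable in-limit target the solution is exact; when the clamp kicks in, the returned pose is the nearest legal configuration rather than a hard failure, which is friendlier for continuous VR tracking.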

Motion Replay

When COVID-19 closed the lab, I pivoted to a data-replay pipeline. The interpreter ingests timestamped joint angles from spreadsheets and generates Unity animation clips that reproduce captured robot motion.
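The core of the interpreter is unglamorous: parse the spreadsheet export, then linearly interpolate joint angles between timestamps, much like a keyframe lerp in a Unity animation clip. A sketch under the assumption of a `t,j0,j1,...` header row (the real export format may differ):

```python
import csv
import io
from bisect import bisect_right

def load_trajectory(csv_text):
    """Parse timestamped joint angles from a spreadsheet export.

    Assumes a header row like `t,j0,j1,...`; returns parallel lists of
    timestamps and per-row joint-angle vectors.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    data = rows[1:]  # skip header
    times = [float(r[0]) for r in data]
    joints = [[float(v) for v in r[1:]] for r in data]
    return times, joints

def sample(times, joints, t):
    """Linearly interpolate joint angles at time t, clamping at the ends."""
    if t <= times[0]:
        return joints[0]
    if t >= times[-1]:
        return joints[-1]
    i = bisect_right(times, t)
    u = (t - times[i - 1]) / (times[i] - times[i - 1])
    return [a + u * (b - a) for a, b in zip(joints[i - 1], joints[i])]
```

Sampling at the renderer's frame rate rather than the capture rate is what lets a sparse spreadsheet drive smooth motion in VR.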

What It Unlocked

  • Researchers could review real-world experiments inside VR without needing lab access, making it easier to annotate mistakes and plan retries.
  • Demonstrated that low-bandwidth data exports are sufficient to recreate believable robot motion inside Unity—an insight that later guided my remote-collaboration prototypes at UW Graphics Group.
