OpenArm VR Teleoperation Overview
This project enables real-time teleoperation of the OpenArm robot driven by streamed VR human body-pose estimation data.
Key Features
- 7-DOF Control: Maps both end-effector position and orientation for natural, intuitive manipulation.
- Gesture-based Gripper Control: Uses the VR-tracked distance between the fingers to open and close the gripper in real time (see the sketch after this list).
- Scalable Mapping: Can incorporate full-arm motion (upper and lower arm) for more human-like movements.
- Real-time Operation: Runs at ~20 FPS over a TCP data stream.
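As a concrete illustration of the streaming and gripper features above, here is a minimal Python sketch of a client that reads newline-delimited JSON packets over TCP at roughly 20 Hz and maps the thumb-index pinch distance to an open/close gripper command. The host, port, packet fields (`wrist_pos`, `wrist_quat`, `pinch_dist`), and thresholds are illustrative assumptions, not the project's actual wire format.

```python
import json
import socket
import time

# Assumed packet format: one JSON object per line, e.g.
# {"wrist_pos": [x, y, z], "wrist_quat": [x, y, z, w], "pinch_dist": 0.03}
# Host, port, field names, and thresholds below are placeholders.
STREAM_HOST = "127.0.0.1"
STREAM_PORT = 9000
TARGET_HZ = 20.0        # matches the ~20 FPS streaming rate
PINCH_CLOSE_M = 0.02    # fingers closer than 2 cm -> close gripper
PINCH_OPEN_M = 0.06     # fingers farther than 6 cm -> open gripper


def pinch_to_gripper(pinch_dist_m: float, current_cmd: float) -> float:
    """Map thumb-index distance to a gripper command in [0, 1].

    Uses hysteresis so the gripper does not chatter around a single
    threshold: 0.0 = fully closed, 1.0 = fully open.
    """
    if pinch_dist_m < PINCH_CLOSE_M:
        return 0.0
    if pinch_dist_m > PINCH_OPEN_M:
        return 1.0
    return current_cmd  # inside the deadband: keep the previous command


def run_client() -> None:
    """Receive VR tracking packets over TCP and emit gripper commands at ~20 Hz."""
    period = 1.0 / TARGET_HZ
    gripper_cmd = 1.0  # start open
    with socket.create_connection((STREAM_HOST, STREAM_PORT)) as sock:
        stream = sock.makefile("r")  # newline-delimited JSON
        for line in stream:
            t_start = time.monotonic()
            packet = json.loads(line)
            gripper_cmd = pinch_to_gripper(packet["pinch_dist"], gripper_cmd)
            # Here the wrist pose would be forwarded to the arm controller.
            print(packet["wrist_pos"], packet["wrist_quat"], gripper_cmd)
            # Sleep off the remainder of the 50 ms cycle to hold ~20 Hz.
            time.sleep(max(0.0, period - (time.monotonic() - t_start)))


if __name__ == "__main__":
    run_client()
```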
VR Teleoperation Simulation
We are currently developing a real-time VR teleoperation system for the OpenArm dual-arm robot. The current prototype supports full teleoperation and control in simulation using NVIDIA Isaac Lab, mapping VR (Meta Quest 3) hand and arm motions directly to the robot's end-effectors for simulated tasks such as picking and placing objects.
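The core of this mapping is re-expressing the Quest wrist pose in the robot's base frame so it can serve as the end-effector target. The sketch below shows one way to do that with a fixed calibration transform; the transform values, function name, and quaternion convention are assumptions for illustration, not the project's actual frame setup.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Fixed calibration transform from the VR tracking frame to the robot base
# frame. These values are placeholders; in practice they would come from a
# one-time calibration step aligning the headset origin with the robot base.
T_BASE_FROM_VR = np.eye(4)
T_BASE_FROM_VR[:3, :3] = R.from_euler("z", 90, degrees=True).as_matrix()
T_BASE_FROM_VR[:3, 3] = [0.30, 0.0, 0.80]  # VR origin expressed in base frame (m)


def vr_wrist_to_ee_target(wrist_pos, wrist_quat_xyzw):
    """Map a VR wrist pose to an end-effector target pose.

    Returns (position [3], quaternion [x, y, z, w]) expressed in the robot
    base frame, i.e. the pose a downstream IK solver would track.
    """
    # Build the wrist pose as a homogeneous transform in the VR frame.
    T_vr_wrist = np.eye(4)
    T_vr_wrist[:3, :3] = R.from_quat(wrist_quat_xyzw).as_matrix()
    T_vr_wrist[:3, 3] = wrist_pos

    # Re-express it in the robot base frame.
    T_base_wrist = T_BASE_FROM_VR @ T_vr_wrist
    pos = T_base_wrist[:3, 3]
    quat = R.from_matrix(T_base_wrist[:3, :3]).as_quat()  # x, y, z, w
    return pos, quat


# Example: identity wrist orientation, 40 cm in front of the VR origin.
pos, quat = vr_wrist_to_ee_target([0.0, 0.0, -0.4], [0.0, 0.0, 0.0, 1.0])
print(pos, quat)
```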
What’s Next?
- Integration with the real OpenArm hardware to enable teleoperation in real-world scenarios.
- Refined motion retargeting for human-like arm movement using advanced IK and VR tracking (a minimal IK step sketch follows this list).
- Improved gripper control and haptic feedback for better interaction with objects.
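For the IK-based retargeting item above, one common building block is a damped least-squares (Levenberg-Marquardt) update that converts an end-effector pose error into a joint step. The sketch below is a generic illustration of that technique under stated assumptions; the function name, damping value, and step limit are placeholders rather than OpenArm's planned implementation.

```python
import numpy as np


def dls_ik_step(jacobian: np.ndarray,
                pose_error: np.ndarray,
                damping: float = 0.05,
                max_step: float = 0.1) -> np.ndarray:
    """One damped-least-squares IK update for a 7-joint arm.

    jacobian:   6x7 end-effector Jacobian at the current joint configuration
    pose_error: 6-vector [position error; orientation error] in the base frame
    Returns a clipped joint step dq for the 7 arm joints.
    """
    J = jacobian
    # Damping regularizes the solve near singular configurations.
    JJt = J @ J.T + (damping ** 2) * np.eye(J.shape[0])
    dq = J.T @ np.linalg.solve(JJt, pose_error)
    # Limit the per-cycle joint step for safety on real hardware.
    return np.clip(dq, -max_step, max_step)


# Example with a random Jacobian and a small pose error.
rng = np.random.default_rng(0)
dq = dls_ik_step(rng.standard_normal((6, 7)), np.array([0.01, 0, 0, 0, 0, 0.02]))
print(dq)
```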
Stay tuned – real-world OpenArm VR teleoperation is coming soon!