Releases: Improbable-AI/VisionProTeleop
VisionProTeleop v2.50: External Network Connection, Isaac Lab Support, etc.
🎉 VisionProTeleop v2.50 Release Notes
🌟 Highlights
This release introduces external network connectivity, native simulation streaming for MuJoCo and Isaac Lab, public dataset sharing, and a companion iOS app, making Apple Vision Pro even more useful for robotics research and development.
🆕 Major New Features
🌐 External Network Mode (Remote Connectivity)
No more same-WiFi requirement! Connect Vision Pro and your Python client from anywhere in the world.
- Room Code System: Vision Pro generates a room code (e.g., "ABC-1234") that you use instead of an IP address
- WebRTC with TURN Relay: Automatic NAT traversal using signaling and TURN servers
- Seamless API: Just change `ip="192.168.1.100"` to `ip="ABC-1234"`; everything else works the same
```python
# External network - use room code instead of IP
s = VisionProStreamer(ip="ABC-1234")
s.configure_video(device="/dev/video0", format="v4l2")
s.start_webrtc()
```

🎮 Native Simulation Streaming
MuJoCo AR Streaming
Stream MuJoCo simulations directly to Vision Pro as native AR objects rendered by RealityKit:
- Automatic MJCF → USD conversion with USDZ caching
- Real-time pose updates via WebRTC (more efficient than video streaming)
- Configurable world-to-AR placement with the `relative_to` parameter
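The exact `relative_to` convention isn't spelled out here; assuming the four values encode a translation in meters plus a yaw in degrees about the vertical axis, the placement could be sketched as a homogeneous transform like this (illustrative only, not the library's actual implementation):

```python
import math

def relative_to_matrix(relative_to):
    """Build a 4x4 homogeneous transform from an assumed
    [x, y, z, yaw_degrees] placement: translation in meters
    plus a rotation about the vertical axis."""
    x, y, z, yaw_deg = relative_to
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return [
        [c,  -s,  0.0, x],
        [s,   c,  0.0, y],
        [0.0, 0.0, 1.0, z],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Same values as the example below: 0.8 m up, rotated 90 degrees
T = relative_to_matrix([0, 0, 0.8, 90])
```

Under this assumption, the simulation's world origin would be placed 0.8 m above the AR anchor and rotated a quarter turn.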
```python
s.configure_mujoco("robot.xml", model, data, relative_to=[0, 0, 0.8, 90])
s.start_webrtc()
while True:
    mujoco.mj_step(model, data)
    s.update_sim()  # Stream poses, not rendered frames
```

Isaac Lab AR Streaming 🆕
Full support for NVIDIA Isaac Lab simulations:
- Stream Isaac Lab scenes with native RealityKit rendering
- Multi-environment support with `env_indices` selection
- Automatic USD export and caching
```python
s.configure_isaac(scene=env.scene, relative_to=[0, 0, 0.8, 90], env_indices=[0])
s.start_webrtc()
```

📱 Companion iOS App: Tracking Manager
A new iOS app for managing your VisionProTeleop workflow:
- Recording Management: Browse, playback, and manage cloud recordings
- 3D Visualization: View synchronized video + skeleton overlays
- Camera Calibration: Perform intrinsic/extrinsic calibration with visual guidance
- Remote Settings: Configure Vision Pro app settings from your iPhone
- Public Sharing: Share recordings with the research community
📊 Public Dataset Sharing
Create and access community datasets of egocentric manipulation videos:
- Share recordings via CloudKit (data stays in your cloud, just made shareable)
- Browse and download others' public recordings
- Full Python API for dataset access
```python
from avp_stream.datasets import list_public_recordings, download_recording

recordings = list_public_recordings()
download_recording(recordings[0], dest_dir="./data")
```

🔧 New Tracking Capabilities
Enhanced Hand Tracking API
New attribute-style access for cleaner code (fully backward compatible):
```python
data = s.get_latest()

# New API (recommended)
data.right.wrist           # (4, 4) transform
data.right.indexTip        # (4, 4) fingertip
data.right.pinch_distance  # float: thumb-index distance
data.right.wrist_roll      # float: axial rotation

# Legacy API (still works)
data["right_wrist"]        # Same data
```

Predictive Hand Tracking 🆕
Compensate for system/network latency with ARKit's predictive tracking:
- Configurable prediction offset (0-100ms)
- Reduces perceived latency for responsive teleoperation
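ARKit performs the prediction on-device, but to build intuition for what a prediction offset buys you, here is a toy constant-velocity extrapolation of a tracked point (an illustration of the idea, not ARKit's algorithm):

```python
def predict_position(p_prev, p_curr, dt, offset_s):
    """Constant-velocity extrapolation: estimate where a tracked point
    will be `offset_s` seconds ahead, given two samples `dt` apart."""
    velocity = [(c - p) / dt for p, c in zip(p_prev, p_curr)]
    return [c + v * offset_s for c, v in zip(p_curr, velocity)]

# Wrist moved 1 cm along x between samples 10 ms apart (1 m/s);
# predicting 50 ms ahead extends that motion by another 5 cm.
pred = predict_position([0.00, 0.2, 1.0], [0.01, 0.2, 1.0], 0.010, 0.050)
```

With the offset matched to the measured system latency, the streamed target leads the raw measurement, which is what makes the teleoperated robot feel more responsive.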
Marker & Image Tracking 🆕
Track ArUco markers and custom reference images:
```python
markers = s.get_markers()
for marker_id, info in markers.items():
    pose = info["pose"]  # (4, 4) transform matrix
```

Stylus Tracking 🆕
Track Logitech Muse stylus (visionOS 26.0+):
```python
stylus = s.get_stylus()
if stylus["tip_pressed"]:
    pressure = stylus["tip_pressure"]  # 0.0-1.0
```

📹 Egocentric Recording Improvements
UVC Camera Support
- Direct USB camera connection via Developer Strap
- CAD models for 3D-printable mounting brackets in `assets/adapters/`
- Full frame access without Apple Enterprise approval
Camera Calibration
- Intrinsic Calibration: Camera parameters via Tracking Manager iOS app
- Extrinsic Calibration: Camera-to-Vision Pro transform
- Visual guidance and validation tools
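To illustrate how an extrinsic calibration is typically used: the calibrated camera-to-headset transform composes with the streamed head pose to place camera frames in world coordinates. A minimal sketch with made-up numbers and plain nested-list 4x4 matrices:

```python
def matmul4(A, B):
    """Multiply two 4x4 homogeneous transforms (row-major nested lists)."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Hypothetical values, identity rotations for clarity:
# head pose 1 m above the world origin, camera mounted 5 cm
# forward of the headset (the extrinsic calibration result).
T_world_head = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 1.0], [0, 0, 0, 1]]
T_head_cam   = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0.05], [0, 0, 0, 1]]

# Camera pose in world coordinates is the composition of the two.
T_world_cam = matmul4(T_world_head, T_head_cam)
```

The extrinsic is computed once and stays fixed as long as the mount doesn't move; only the head pose updates per frame.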
☁️ Cloud Storage Expansion
New Storage Providers
- Google Drive 🆕
- Dropbox 🆕
- iCloud Drive (existing)
Recording Contents
Recordings now include:
- Video (H.264/H.265)
- All tracking data (hands, head, markers, stylus)
- Simulation data (if using MuJoCo/Isaac streaming)
- Metadata and calibration info
📚 New Examples
| Example | Description |
|---|---|
| `00_hand_streaming.py` | Basic hand tracking |
| `09_mujoco_streaming.py` | MuJoCo AR simulation |
| `10_teleop_osc_franka.py` | OSC-based Franka teleoperation |
| `11_diffik_aloha.py` | Differential IK with ALOHA robot |
| `12_diffik_shadow_hand.py` | Shadow Hand teleoperation |
| `13_g1_freefall.py` | Unitree G1 simulation |
| `14_kuka_allegro_streaming.py` | KUKA + Allegro Hand |
| `15_franka_visionpro_teleop.py` | Complete Franka teleop example |
| `16_spot_demo.py` | Boston Dynamics Spot |
| `17_maker_detection_streaming.py` | ArUco marker tracking |
| `18_public_datasets.py` | Public dataset browser |
| `19_logitech_muse_stylus.py` | Stylus tracking demo |
🙏 Acknowledgements
Thanks to all contributors and the robotics research community for feedback and feature requests that shaped this release.
📥 Upgrade Instructions
```shell
# Python package
pip install --upgrade avp_stream

# VisionOS app
# Update from the App Store

# iOS companion app (NEW)
# Install "Tracking Manager" from the App Store
```

Low-Latency Video/Audio Streaming
🎉 VisionProTeleop with Low-Latency Video/Audio Streaming
Excited to release VisionProTeleop v2.0 with full support for real-time, bidirectional video and audio streaming!
✨ What's New
WebRTC Video Streaming
- Stream your robot's camera feed back to Vision Pro with **low round-trip latency**
- Support for both mono and stereo video streams
- Flexible camera input: physical devices, synthetic frames, or custom processing pipelines
- No network configuration required
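Since the streamer accepts synthetic frames as a camera source, a custom source ultimately has to produce raw pixel buffers. A minimal sketch of generating one RGB24 test frame (the buffer format the streamer actually expects is an assumption here):

```python
def make_gradient_frame(width, height):
    """Build one synthetic RGB24 frame as raw bytes:
    a horizontal black-to-white brightness gradient."""
    buf = bytearray(width * height * 3)
    for y in range(height):
        for x in range(width):
            i = (y * width + x) * 3
            v = (x * 255) // (width - 1)  # 0 at left edge, 255 at right
            buf[i:i + 3] = bytes((v, v, v))
    return bytes(buf)

# One 640x480 frame: 640 * 480 * 3 = 921,600 bytes
frame = make_gradient_frame(640, 480)
```

A synthetic source like this is handy for verifying the video path end to end before plugging in a physical camera.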
Performance Benchmarks
- Wireless mode: Stable sub-100ms latency at 720p
- Wired mode: 50ms latency even at 4K stereo resolution
- Full benchmark methodology available in our documentation
🚀 Getting Started
Update to the latest version:
```shell
pip install --upgrade avp_stream
```

Start streaming video with just one additional line:
```python
from avp_stream import VisionProStreamer

s = VisionProStreamer(ip="10.31.181.201")
s.start_streaming(device="/dev/video0", format="v4l2",
                  size="640x480", fps=30, stereo_video=False)
while True:
    r = s.latest
    # Your teleoperation code here
```

📱 App Update Required
Make sure to update the Tracking Streamer app on your Vision Pro from the App Store to access video streaming features.
📚 Documentation
Check out our updated examples folder for complete usage patterns including:
- Frame callback processing
- Stereo depth visualization
- Audio file streaming
- Camera device configuration