🤖 AI Summary
Arvos is an open-source tool that turns an iPhone (12 Pro or newer) into a research-grade sensor hub, streaming LiDAR, RGB camera, IMU, ARKit pose, GPS, and optionally Apple Watch IMU data to a web client over an embedded WebSocket server (port 8765). Developers clone the repo, build the Xcode project, start the app on the phone, and connect via a browser-based Web Studio (real-time 3D point cloud viewer, live camera feed, IMU/GPS charts, diagnostics). Streams carry nanosecond-synchronized timestamps, binary WebSocket payloads (PLY for point clouds, JPEG for images), and intrinsic calibration metadata; recording and export support MCAP and H.264. Typical rates: camera 5–30 FPS (1920×1080), LiDAR 1–5 FPS, IMU 50–200 Hz, ARKit pose 30–60 Hz, Apple Watch 50–100 Hz, GPS 1 Hz.
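As a rough illustration of the transport, here is a minimal sketch of tapping the stream from a laptop with Python's `websockets` package. The LAN address, any subscription handshake, and the exact framing of the binary messages are assumptions, not the documented Arvos protocol, so a real client should follow the repo's docs:

```python
# Minimal sketch: connect to the phone's embedded WebSocket server
# (port 8765) and dump incoming binary frames to disk.
# The IP address and message framing below are assumptions; consult
# the Arvos protocol docs for the real format.
import asyncio
import websockets  # pip install websockets

PHONE_WS_URL = "ws://192.168.1.42:8765"  # hypothetical LAN address

async def main() -> None:
    async with websockets.connect(PHONE_WS_URL, max_size=None) as ws:
        for i in range(10):
            msg = await ws.recv()
            if isinstance(msg, bytes):
                # Binary payloads carry PLY point clouds or JPEG frames;
                # here we just persist the raw bytes for inspection.
                with open(f"frame_{i:03d}.bin", "wb") as f:
                    f.write(msg)
                print(f"frame {i}: {len(msg)} bytes")
            else:
                print("text message:", msg[:120])

asyncio.run(main())
```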
For the AI/ML and robotics community, Arvos lowers the barrier to collecting tightly synchronized multimodal datasets and to prototyping SLAM, visual-inertial odometry, sensor fusion, and 3D reconstruction workflows. Key integrations include a Python SDK (pip install arvos-sdk) for real-time processing and MCAP playback, a Foxglove-style streaming architecture, and configurable sensor modes (Full, RGBD, Visual-Inertial, etc.). Licensed under GPLv3, the project encourages contributions (new sensors, ROS2/RTSP support, optimizations) and is geared toward researchers, developers, and students needing lightweight, reproducible mobile sensor capture.
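Because recordings are standard MCAP, they can also be replayed offline without the SDK. A sketch using the standalone `mcap` package (pip install mcap); the file name is a placeholder and the channel layout Arvos actually writes is an assumption, so inspect a real recording first:

```python
# Sketch of offline MCAP playback with the standalone `mcap` package.
# "session.mcap" is a hypothetical file name; topic names and schemas
# depend on what Arvos actually records.
from mcap.reader import make_reader

with open("session.mcap", "rb") as f:
    reader = make_reader(f)
    for schema, channel, message in reader.iter_messages():
        # message.log_time is in nanoseconds, matching the
        # nanosecond-synchronized timestamps the streams advertise.
        print(channel.topic, message.log_time, len(message.data))
```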