🤖 AI Summary
True3D Labs released a demo called WindowMode that turns your laptop screen into a convincing “window” into a 3D scene using only an ordinary webcam and head tracking, with no specialized hardware. The Next.js demo uses MediaPipe’s FaceLandmarker to detect eye positions and estimates head distance from the webcam using the apparent eye diameter plus the camera’s field of view. This metric head position is used to compute an off‑axis projection matrix (implemented in their spatial-player library), so the scene is re‑rendered from the user’s real viewpoint—creating parallax and depth that make the content appear to sit behind the screen. Landmark processing runs in a small web worker for responsiveness, and the demo renders voxel scenes (.vv) and volumetric videos (.splv) using a splat-based renderer.
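The distance estimate rests on a pinhole-camera similar-triangles argument: a feature of known physical size (the human iris is close to a constant ~11.7 mm across adults) appears smaller in pixels the farther away it is. A minimal sketch, assuming a known horizontal FOV; function names and constants here are illustrative, not the demo's actual code:

```typescript
// Hypothetical eye-based distance estimation (pinhole camera model).
const IRIS_DIAMETER_MM = 11.7; // average adult iris diameter is nearly constant

function focalLengthPx(imageWidthPx: number, horizontalFovDeg: number): number {
  // Focal length in pixels, derived from the camera's horizontal field of view.
  const halfFovRad = (horizontalFovDeg * Math.PI) / 360;
  return imageWidthPx / 2 / Math.tan(halfFovRad);
}

function distanceMm(irisPx: number, imageWidthPx: number, horizontalFovDeg: number): number {
  // Similar triangles: realSize / distance = pixelSize / focalLength.
  const f = focalLengthPx(imageWidthPx, horizontalFovDeg);
  return (IRIS_DIAMETER_MM * f) / irisPx;
}

// Example: a 1280px-wide webcam with a 60° FOV seeing a 30px-wide iris.
const d = distanceMm(30, 1280, 60); // roughly 430 mm
```

The appeal of using the iris rather than, say, interpupillary distance is that iris size varies very little between people, so a single constant gives a usable metric distance without per-user calibration.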
This matters because it provides a practical, software-only way to deliver head‑tracked, gaze‑contingent 3D experiences across standard devices—useful for volumetric video playback, spatial UI experiments, AR/VR-lite interfaces, and more. Developers can try spatial-player from npm and create content with spatialstudio (pip), and the team says both tools will be open‑sourced soon. The approach’s key technical takeaways are reliable eye-based distance estimation, off‑axis projection for portal realism, support for voxel/volumetric formats, and a lightweight architecture suitable for web deployment.
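The off‑axis projection mentioned in the takeaways treats the physical screen as a fixed window in the world and builds an asymmetric view frustum from the tracked eye to the screen's corners. A minimal sketch of that idea, assuming the screen lies in the z=0 plane centered at the origin with the eye at (ex, ey, ez), ez > 0; this is the standard asymmetric-frustum construction, not spatial-player's actual API:

```typescript
// Off-axis ("window") projection: asymmetric frustum from eye to screen edges.
type Vec3 = [number, number, number];

function offAxisProjection(
  eye: Vec3,
  halfW: number, // half the physical screen width, world units
  halfH: number, // half the physical screen height, world units
  near: number,
  far: number
): number[] {
  const [ex, ey, ez] = eye;
  // Project the screen edges onto the near plane as seen from the eye.
  const s = near / ez;
  const left = (-halfW - ex) * s;
  const right = (halfW - ex) * s;
  const bottom = (-halfH - ey) * s;
  const top = (halfH - ey) * s;
  // Standard OpenGL-style asymmetric frustum, flattened row-major 4x4.
  return [
    (2 * near) / (right - left), 0, (right + left) / (right - left), 0,
    0, (2 * near) / (top - bottom), (top + bottom) / (top - bottom), 0,
    0, 0, -(far + near) / (far - near), (-2 * far * near) / (far - near),
    0, 0, -1, 0,
  ];
}

// A centered eye reduces to an ordinary symmetric perspective frustum.
const m = offAxisProjection([0, 0, 600], 170, 110, 1, 1000);
```

As the tracked eye moves, only `left/right/bottom/top` shift while the screen quad stays fixed in the world, which is what produces the parallax that sells the “content behind the glass” illusion.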