The Headspace project demonstrates a "holographic" window illusion using a standard webcam. It uses off-axis 3D projection to update the camera perspective based on the user's head position in real time, creating the illusion that objects are physically present inside the screen.
Conventional 3D rendering (like in video games) assumes the camera is a static "eye" looking through the center of the screen. If you move your head, the image doesn't change, breaking the illusion of depth.
Off-Axis Projection solves this by treating the monitor as a physical window.
- Tracking: We track your head position $(x, y, z)$ relative to the center of the screen.
- Asymmetric Frustum: Instead of a symmetrical pyramid, we skew the camera's viewing frustum. If you move left, the frustum shears right to keep the near plane aligned with the physical screen.
- Result: The 3D scene appears to stay fixed in space behind the screen, just like real objects behind a window.
The math relies on defining the frustum bounds each frame from the tracked head position. With a screen-centered coordinate system, a standard formulation of the off-axis bounds for a head at $(x, y, z)$ is:
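$$
l = \frac{n\,(-w - x)}{z}, \qquad
r = \frac{n\,(w - x)}{z}, \qquad
b = \frac{n\,(-h - y)}{z}, \qquad
t = \frac{n\,(h - y)}{z}
$$

where $w$ and $h$ are the screen's physical half-width and half-height, $n$ is the near-plane distance, and $z$ is the head's distance from the screen plane.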
This ensures that the projection plane always matches the physical dimensions of your monitor.
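In code, those bounds feed an asymmetric projection matrix directly. Below is a minimal sketch with Three.js; the helper name and parameters are illustrative rather than the project's actual API (`Matrix4.makePerspective` accepts explicit frustum bounds, which is what makes the off-axis skew possible):

```ts
import * as THREE from 'three';

// Hypothetical helper: skew the camera's frustum for a head at (x, y, z),
// measured in metres relative to the screen center, with +z toward the viewer.
function applyOffAxis(
  camera: THREE.PerspectiveCamera,
  head: { x: number; y: number; z: number },
  halfW: number, // physical half-width of the screen, metres
  halfH: number, // physical half-height of the screen, metres
  near = 0.1,
  far = 100,
): void {
  const s = near / head.z; // scale screen-plane bounds back to the near plane
  camera.projectionMatrix.makePerspective(
    (-halfW - head.x) * s, // left
    (halfW - head.x) * s,  // right
    (halfH - head.y) * s,  // top
    (-halfH - head.y) * s, // bottom
    near,
    far,
  );
  // Keep the inverse in sync, since we bypass updateProjectionMatrix().
  camera.projectionMatrixInverse.copy(camera.projectionMatrix).invert();
}
```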
- Real-time Head Tracking: Uses Google MediaPipe FaceLandmarker for robust, low-latency tracking directly in the browser.
- Iris Depth Sensing: Estimates true physical distance by tracking iris separation, allowing natural "lean-in" zoom interactions (sketched after this list).
- Off-Axis Projection: Calculates the correct skewed frustum to maintain a perfect window illusion from any angle.
- Time-Based Smoothing: Implements frame-rate independent LERP filters to ensure buttery smooth camera movement on any device (30-120 FPS); a sketch follows this list.
- Configurable: Easy tuning of sensitivity, smoothing, and scene dimensions via `src/lib/config.ts`.
- Modern Stack: Built with React, Vite, TailwindCSS, and Three.js (React Three Fiber).
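The iris-based depth estimate can be illustrated with a pinhole-camera model. The sketch below is hypothetical (the helper name, landmark format, and `focalLengthPx` parameter are illustrative, not the project's actual API); it leans on the near-constant human iris diameter of roughly 11.7 mm:

```ts
// The physical iris diameter is nearly constant across adults (~11.7 mm),
// which makes it usable as a known-size reference for depth.
const IRIS_DIAMETER_M = 0.0117;

// Hypothetical helper: estimate head distance from the pixel width of the iris.
function estimateDistanceMeters(
  irisLeftPx: { x: number; y: number },  // one edge of the iris, in pixels
  irisRightPx: { x: number; y: number }, // opposite edge of the iris, in pixels
  focalLengthPx: number,                 // camera focal length, in pixels
): number {
  const irisWidthPx = Math.hypot(
    irisRightPx.x - irisLeftPx.x,
    irisRightPx.y - irisLeftPx.y,
  );
  // Pinhole model: pixelSize = focalPx * realSize / distance, solved for distance.
  return (focalLengthPx * IRIS_DIAMETER_M) / irisWidthPx;
}
```

The time-based smoothing can be sketched the same way. A fixed per-frame `lerp(current, target, 0.1)` smooths more at 30 FPS than at 120 FPS; deriving the blend factor from elapsed time removes that dependency. Names here are illustrative, not the project's actual config keys:

```ts
// Frame-rate independent exponential smoothing: after `halfLifeSeconds`,
// half of the remaining distance to the target has been covered,
// regardless of how many frames were rendered in that time.
function smoothTowards(
  current: number,
  target: number,
  halfLifeSeconds: number, // time for the remaining error to halve
  dtSeconds: number,       // time elapsed since the last frame
): number {
  const alpha = 1 - Math.pow(0.5, dtSeconds / halfLifeSeconds);
  return current + (target - current) * alpha;
}
```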
- Clone the repository:

  ```bash
  git clone https://github.com/raztronaut/headspace.git
  cd headspace
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Start the development server:

  ```bash
  npm run dev
  ```

- Open your browser to the local URL (usually `http://localhost:5173`).
- Allow Camera Access: The app requires webcam access to track your head position. All processing is done locally on your device; no video is ever sent to a server.
- Calibrate: Sit comfortably in front of your screen and click "Calibrate Center" to set the zero point.
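Under the hood, "Calibrate Center" amounts to storing an offset. A minimal sketch of the idea, with hypothetical names rather than the project's actual store:

```ts
type Vec3 = { x: number; y: number; z: number };

// Hypothetical calibration state: the head position captured at calibration time.
let origin: Vec3 = { x: 0, y: 0, z: 0 };

function calibrate(currentHead: Vec3): void {
  origin = { ...currentHead };
}

// Later readings are reported relative to the calibrated origin, so sitting
// at the calibrated position reads as (0, 0, 0).
function relativeHead(head: Vec3): Vec3 {
  return { x: head.x - origin.x, y: head.y - origin.y, z: head.z - origin.z };
}
```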
Run the unit test suite to verify the math and store logic:

```bash
npm test
```

This project is optimized for deployment on Vercel or any static hosting provider.
- Push your code to a GitHub repository.
- Import the project in Vercel.
- Vite defaults will just work (Framework Preset: Vite).
- The included `vercel.json` handles SPA routing and security headers automatically.
To build for production locally:

```bash
npm run build
```

The output will be in the `dist/` directory. You can preview the production build locally with:

```bash
npm run preview
```

Razi Syed
- Twitter: @raztronaut
- GitHub: @raztronaut
Licensed under the MIT License.

