
Headspace: Holographic Head Tracking Demo

The Headspace project demonstrates a "holographic" window illusion using a standard webcam. It uses off-axis 3D projection to update the camera perspective based on the user's head position in real-time, creating the illusion that objects are physically present inside the screen.

Holographic Illusion Demo

How it Works

Conventional 3D rendering (as in most video games) assumes the camera is a static "eye" looking through the center of the screen. If you move your head, the image on screen doesn't change, which breaks the illusion of depth.

Off-Axis Projection solves this by treating the monitor as a physical window.

Off-Axis Projection Diagram

  1. Tracking: We track your head position $(x, y, z)$ relative to the center of the screen.
  2. Asymmetric Frustum: Instead of a symmetrical pyramid, we skew the camera's viewing frustum. If you move left, the frustum shears right to keep the near plane aligned with the physical screen.
  3. Result: The 3D scene appears to stay fixed in space behind the screen, just like real objects behind a window.

The math relies on defining the frustum bounds $(\mathit{left}, \mathit{right}, \mathit{top}, \mathit{bottom})$ dynamically. For example, the left bound is:

$$ \mathit{left} = \left(-\frac{\mathit{width}}{2} - \mathit{headX}\right) \cdot \frac{\mathit{near}}{\mathit{headZ}} $$

This ensures that the projection plane always matches the physical dimensions of your monitor.
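The computation above can be sketched in TypeScript as follows. This is a minimal illustration, not the repo's actual API: the function and parameter names are assumptions, and the same formula is applied symmetrically to all four bounds.

```typescript
// Illustrative off-axis frustum computation. All distances share one physical
// unit (e.g. meters), with the origin at the center of the screen.
interface HeadPosition {
  x: number; // lateral offset from screen center
  y: number; // vertical offset from screen center
  z: number; // distance from the screen plane (must be > 0)
}

function offAxisFrustum(
  head: HeadPosition,
  screenWidth: number,
  screenHeight: number,
  near: number,
) {
  // Project the physical screen edges onto the near plane as seen from the head.
  const scale = near / head.z;
  return {
    left: (-screenWidth / 2 - head.x) * scale,
    right: (screenWidth / 2 - head.x) * scale,
    bottom: (-screenHeight / 2 - head.y) * scale,
    top: (screenHeight / 2 - head.y) * scale,
  };
}
```

With the head centered, this degenerates to an ordinary symmetric frustum; as the head moves, the bounds shift in the opposite direction, producing the shear described above. In Three.js, such bounds can be applied via `Matrix4.makePerspective(left, right, top, bottom, near, far)`.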

Features

  • Real-time Head Tracking: Uses Google MediaPipe FaceLandmarker for robust, low-latency tracking directly in the browser.
  • Iris Depth Sensing: Estimates true physical distance by tracking iris separation, allowing natural "lean-in" zoom interactions.
  • Off-Axis Projection: Calculates the correct skewed frustum to maintain a perfect window illusion from any angle.
  • Time-Based Smoothing: Implements frame-rate independent LERP filters to ensure buttery smooth camera movement on any device (30-120 FPS).
  • Configurable: Easy tuning of sensitivity, smoothing, and scene dimensions via src/lib/config.ts.
  • Modern Stack: Built with React, Vite, TailwindCSS, and Three.js (React Three Fiber).
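To illustrate the time-based smoothing idea: a fixed per-frame LERP factor like `lerp(a, b, 0.1)` smooths four times as aggressively at 120 FPS as at 30 FPS, whereas an exponential decay keyed to elapsed time converges along the same curve at any frame rate. A hypothetical sketch (the repo's actual filter may differ):

```typescript
// Frame-rate-independent smoothing: `rate` controls responsiveness
// (higher = snappier), `dt` is the elapsed time since the last frame.
function expSmooth(current: number, target: number, rate: number, dt: number): number {
  // 1 - exp(-rate * dt) plays the role of the LERP factor, but scales with
  // elapsed time, so short and long frames follow the same decay curve.
  const alpha = 1 - Math.exp(-rate * dt);
  return current + (target - current) * alpha;
}
```

Two 8 ms steps land exactly where one 16 ms step does, which a fixed per-frame factor cannot guarantee.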

Installation

  1. Clone the repository:

    git clone https://github.com/raztronaut/headspace.git
    cd headspace
  2. Install dependencies:

    npm install

Usage

  1. Start the development server:

    npm run dev
  2. Open your browser to the local URL (usually http://localhost:5173).

  3. Allow Camera Access: The app requires webcam access to track your head position. All processing is done locally on your device; no video is ever sent to a server.

  4. Calibrate: Sit comfortably in front of your screen and click "Calibrate Center" to set the zero point.
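Conceptually, calibration amounts to storing the head position at the moment you click and reporting later readings relative to it. A hypothetical sketch (names are illustrative; the repo's store logic may differ):

```typescript
// Record the head position at calibration time and report subsequent
// positions relative to it; depth stays absolute so lean-in zoom still works.
type Vec3 = { x: number; y: number; z: number };

let center: Vec3 = { x: 0, y: 0, z: 0.5 };

function calibrate(current: Vec3): void {
  center = { ...current };
}

function relativeToCenter(p: Vec3): Vec3 {
  return { x: p.x - center.x, y: p.y - center.y, z: p.z };
}
```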

Testing

Run the unit test suite to verify the math and store logic:

npm test

Deployment

This project is optimized for deployment on Vercel or any static hosting provider.

Vercel (Recommended)

  1. Push your code to a GitHub repository.
  2. Import the project in Vercel.
  3. The Vite defaults work out of the box (Framework Preset: Vite).
  4. The included vercel.json handles SPA routing and security headers automatically.

Manual Build

To build for production locally:

npm run build

The output will be in the dist/ directory. You can preview the production build locally with:

npm run preview

Author

Razi Syed

License

MIT
