Mirage

Python · Computer Vision · MediaPipe · 3D UI · Gesture Recognition

About this project

A personal engineering project to design and build a real-time interactive holographic display system inspired by the gesture-driven interfaces seen in Iron Man. It combines computer vision, real-time hand tracking, 3D UI rendering, and optional physical projection hardware into a cohesive system.

Key Features

Hand Tracking

Real-time hand landmark detection via MediaPipe, tracking 21 points per hand through a standard webcam with no special hardware required.
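The webcam pipeline can be sketched roughly as below. This is a minimal illustration assuming the `mediapipe` and `opencv-python` packages; the window name, confidence thresholds, and the `landmark_to_pixel` helper are illustrative choices, not part of the project's actual code.

```python
def landmark_to_pixel(x_norm, y_norm, frame_w, frame_h):
    """Convert a normalized MediaPipe landmark (0..1) to pixel coordinates."""
    return int(x_norm * frame_w), int(y_norm * frame_h)

if __name__ == "__main__":
    # Sketch only: requires `mediapipe` and `opencv-python` installed.
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(
        max_num_hands=2,
        min_detection_confidence=0.7,  # tunable guesses
        min_tracking_confidence=0.5,
    )
    cap = cv2.VideoCapture(0)  # standard webcam, no special hardware
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # 21 landmarks per hand; mark the index fingertip (index 8).
                tip = hand.landmark[8]
                px = landmark_to_pixel(tip.x, tip.y,
                                       frame.shape[1], frame.shape[0])
                cv2.circle(frame, px, 8, (0, 255, 255), -1)
        cv2.imshow("Mirage hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()
```

MediaPipe reports landmarks in normalized image coordinates, so the conversion helper is all that is needed to draw on the raw frame.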

Gesture-to-UI Mapping

Physical hand gestures — pinch to select, swipe to navigate, rotate to scroll — are translated into interactive 3D UI controls in real time.
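A pinch, for example, can be detected by thresholding the distance between the thumb tip and index fingertip in MediaPipe's normalized coordinate space. A small sketch; the threshold value and function names are illustrative assumptions:

```python
import math

# MediaPipe hand landmark indices for the two fingertips involved in a pinch.
THUMB_TIP = 4
INDEX_FINGER_TIP = 8

def landmark_distance(a, b):
    """Euclidean distance between two (x, y, z) normalized landmarks."""
    return math.dist(a, b)

def is_pinch(landmarks, threshold=0.05):
    """Report a pinch when thumb tip and index tip are closer than
    `threshold` in normalized coordinates. 0.05 is a tunable guess
    that would need calibration per camera and distance."""
    return landmark_distance(landmarks[THUMB_TIP],
                             landmarks[INDEX_FINGER_TIP]) < threshold
```

Swipe and rotate follow the same pattern, but compare landmark positions across frames (velocity and angular change) rather than within a single frame.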

Holographic Visual Layer

A stylized UI with scanlines, glow effects, and parallax depth designed to look holographic — built for eventual projection onto a Pepper's Ghost display.
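The scanline and tint effects can be prototyped directly on camera frames before moving to the WebGL layer. A minimal NumPy sketch; the period, darkness, and tint values are illustrative, not the project's actual shader parameters:

```python
import numpy as np

def apply_scanlines(frame, period=4, darkness=0.6):
    """Darken every `period`-th pixel row to mimic a holographic scanline.
    `frame` is an HxWx3 uint8 image; 4 and 0.6 are illustrative values."""
    out = frame.astype(np.float32)
    out[::period] *= darkness
    return out.astype(np.uint8)

def holographic_tint(frame, tint=(255, 255, 0), alpha=0.25):
    """Blend the frame toward a flat tint color (given in the same
    channel order as the frame, e.g. BGR cyan for OpenCV images)."""
    out = (frame.astype(np.float32) * (1.0 - alpha)
           + np.array(tint, np.float32) * alpha)
    return np.clip(out, 0, 255).astype(np.uint8)
```

In the real project this styling lives in the WebGL/Three.js layer as shaders; the NumPy version is just a quick way to iterate on the look.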

Leap Motion Integration

Planned upgrade from the webcam to a Leap Motion controller for far more precise finger tracking, with sub-millimeter accuracy in hand detection.

Physical Projection

End goal: a Pepper's Ghost setup using holographic film or a monitor-based pyramid, with the Leap Motion mounted below for a full hands-free holographic interface.

Tech Stack

Python · MediaPipe · OpenCV · WebGL · Three.js

Development Roadmap

MediaPipe Hand Tracking Setup
Gesture Detection (Pinch, Swipe, Rotate)
3D UI Scene (Web or Unity)
Gesture-to-UI Control Mapping
Holographic UI Styling
Leap Motion Upgrade
Demo Video Recording
Pepper's Ghost Physical Build