Ballshooter

Large Format Multitouch Displays & Tangible Space Invaders

Laser Light Plane multitouch sensing architecture
Schematic of the Laser Light Plane (LLP) multitouch sensing architecture, illustrating the optical path from infrared laser through a line-generating lens across the acrylic surface, with an infrared camera positioned below to capture contact-point scattering when a finger disrupts the planar beam.

Through IIT Delhi and the Google Summer of Code 2009 program, I began exploring the hardware, form, and interface sides of multitouch gestural interfaces.

Engineering

We use LLP to build a large projection and gesture-tracking interface. Infrared light from one or more lasers shines just above the surface. The laser light plane is about 1 mm thick and positioned as close to the touch surface as possible.

When an IR pen, finger, or object breaks the light plane, the scattered light appears as blobs to an infrared camera below the surface, triggering a TUIO event. Server-side, these TUIO events are consumed by several interactive applications: gestures are parsed and events interpreted in openFrameworks or Flash AS3.
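The application side of this pipeline boils down to turning TUIO "set" and "alive" messages into add/update/remove touch events. The sketch below illustrates that lifecycle in plain Python; it is not the TUIO reference library, and names like CursorTracker are hypothetical:

```python
# Illustrative sketch of the TUIO cursor lifecycle (not the reference TUIO
# library). A 'set' message reports one blob's position; an 'alive' message
# lists all live session IDs, so anything missing from it has been lifted.

class CursorTracker:
    """Tracks active touch points from TUIO-style 'set' and 'alive' messages."""

    def __init__(self):
        self.cursors = {}   # session_id -> (x, y)
        self.events = []    # ("add" | "update" | "remove", session_id)

    def on_set(self, session_id, x, y):
        # First 'set' for a session is a new touch; later ones are moves.
        kind = "update" if session_id in self.cursors else "add"
        self.cursors[session_id] = (x, y)
        self.events.append((kind, session_id))

    def on_alive(self, alive_ids):
        # Sessions absent from the 'alive' list have ended.
        for sid in list(self.cursors):
            if sid not in alive_ids:
                del self.cursors[sid]
                self.events.append(("remove", sid))

tracker = CursorTracker()
tracker.on_set(1, 0.25, 0.50)   # finger touches down
tracker.on_set(1, 0.30, 0.52)   # finger drags
tracker.on_alive([])            # finger lifts
print(tracker.events)           # → [('add', 1), ('update', 1), ('remove', 1)]
```

A gesture recognizer (in openFrameworks or AS3) would sit on top of exactly this event stream.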

Explored various hardware modifications, scales, gestures, and interactions with object tracking, fiducial markers, and application use-cases. Code on GitHub.

Infrared camera multitouch tracking
Composite view of the physical multitouch apparatus: a ceiling-mounted infrared camera for touch detection (upper left), an overhead projector on an articulated arm for rear-projection display (upper right), a marking-menu image-editing application demonstrating multitouch manipulation (lower left), and a finger-drawn painting application rendered on the projected surface (lower right).
Multitouch display engineering diagram
Engineering drawing detailing the projector and camera alignment geometry, including the hot-mirror optical fold path, FireFly MV varifocal camera mount, Toshiba TDP-EW25 short-throw projector placement, and dimensioned frame assemblies for the projector housing and mirror bracket (all units in centimeters).
Blob tracking on multitouch surface
Screenshot of the Big Blobby blob-tracking application operating at 59 fps on a 320x240 infrared source feed: the left panel displays raw IR contact points, while the right panel shows numbered blob identifiers (IDs 44--48) output via the TUIO protocol, with the image-processing filter chain -- background subtraction, smoothing, highpass, and amplification -- visible in the lower panel.
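The filter chain named in the caption can be sketched on a single row of 8-bit IR pixel values. This is an illustration of the technique, not Big Blobby's actual code; the kernel sizes and gain below are assumptions:

```python
# Sketch of an IR blob-isolation filter chain: background subtraction,
# smoothing, highpass, amplification. Kernel sizes and gain are assumed
# values, not the Big Blobby defaults.

def subtract_background(row, background):
    # Remove the static IR scene so only new light (touches) remains.
    return [max(p - b, 0) for p, b in zip(row, background)]

def smooth(row):
    # 3-tap moving average to suppress sensor noise (edges replicated).
    padded = [row[0]] + row + [row[-1]]
    return [(padded[i] + padded[i + 1] + padded[i + 2]) // 3
            for i in range(len(row))]

def highpass(row):
    # Subtract a heavily blurred copy so only sharp contact blobs survive.
    blurred = smooth(smooth(row))
    return [max(p - b, 0) for p, b in zip(row, blurred)]

def amplify(row, gain=4):
    # Boost the surviving blobs toward full intensity, clamped to 8 bits.
    return [min(p * gain, 255) for p in row]

background = [10] * 8
frame = [12, 11, 14, 90, 95, 13, 12, 11]   # bright blob where a finger touches

out = amplify(highpass(smooth(subtract_background(frame, background))))
print(out)   # blob pixels dominate; the rest of the row is near zero
```

The output of a chain like this is what the blob tracker thresholds and labels before emitting TUIO IDs.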

See-through transparent displays and interaction. Developed at the MIT Media Lab (prism-based projection screen) with my UROP student Meghana Bhat.

Developed during internship at IIT-Delhi for Design Degree Show 2009.

Space Invaders in the physical world, built in a team with Rahul Motiyar, Rajdeep, and Aman at IIT Bombay.

Transparent LCD with the backlight removed, paired with Leap Motion hand tracking.