Archive for the ‘Projector-Based Tracking’ Category


Red Chair by Teemu Määttänen

March 23, 2009


Re-implementation of Projector Calibration with Embedded Light Sensors

February 26, 2009

Using projectors as light sources, we can transform the surface properties of arbitrarily complex objects. The color of a surface is normally defined by the amount of light reflected or absorbed by the particular pigment or material of the object. In spatial augmented reality we lift those properties into the light source: instead of illuminating a red object with white light, we can illuminate a white object with red light.

We are developing a toolkit for building projection mapping/spatial augmented reality applications. Our focus is on projection mapping for theater and new media installation pieces, using tracking with embedded light sensors.

Currently, we are researching human-computer interaction issues in spatial augmented content authoring, with a focus on improving the content pipeline (which is largely non-existent). Traditional video editing and computer graphics workflows are not designed for animated texturing, nor for easily matching physical and digital models.

In this video we calibrate a digital model of the box to the physical box using embedded light sensors, re-implementing Johnny Chung Lee's projector calibration with embedded light sensors. Essentially, the projector displays a pattern which photosensors embedded in the box receive and decode. With this information, we can determine the positions of the sensors in the projector's frame of reference. With enough sensors (at least six) we can determine the parameters of the projector (the lens etc.) along with the position and orientation of the box. Then, using a game engine, we render a 3D model of the box from the projector's frame of reference. Finally, applying an animated texture to the 3D model of the box yields the demonstration above. This technique is explained extensively by Johnny Chung Lee.
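For a sense of the solving step, here is a minimal sketch in Python with OpenCV. It treats the projector as an inverse camera; all sensor coordinates, pixel readings, and the projector resolution are made-up illustrative values, not measurements from our setup:

    import numpy as np
    import cv2

    # Hypothetical sensor layout: 3D positions of six photosensors in
    # the box's model coordinates (metres). Non-coplanar points let us
    # recover the projector parameters from a single "view".
    object_points = np.array([
        [0.00, 0.00, 0.00],
        [0.20, 0.00, 0.00],
        [0.20, 0.15, 0.00],
        [0.00, 0.15, 0.00],
        [0.00, 0.00, 0.10],
        [0.20, 0.15, 0.10],
    ], np.float32)

    # Pixel position each sensor decoded from the projected pattern
    # (illustrative numbers).
    image_points = np.array([
        [212.0, 481.0],
        [611.0, 472.0],
        [604.0, 181.0],
        [219.0, 192.0],
        [261.0, 429.0],
        [558.0, 231.0],
    ], np.float32)

    # Treat the projector as an inverse camera: calibrateCamera solves
    # for the projector intrinsics (focal length, principal point) and
    # the box's pose (rotation, translation) in the projector's frame.
    # A single non-planar view requires an initial intrinsic guess.
    initial_K = np.array([[1400.0, 0.0, 512.0],
                          [0.0, 1400.0, 384.0],
                          [0.0, 0.0, 1.0]])
    flags = (cv2.CALIB_USE_INTRINSIC_GUESS
             | cv2.CALIB_ZERO_TANGENT_DIST
             | cv2.CALIB_FIX_K1 | cv2.CALIB_FIX_K2 | cv2.CALIB_FIX_K3)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        [object_points], [image_points], (1024, 768),
        initial_K, None, flags=flags)

    print("reprojection error (px):", rms)
    print("projector intrinsics:\n", K)
    print("box rotation (Rodrigues):", rvecs[0].ravel())
    print("box translation:", tvecs[0].ravel())

The recovered intrinsics and pose are exactly what the game engine needs in order to render the box model from the projector's point of view.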


TI’s BeagleBoard and DLP Pico Projector

January 14, 2009

There was a lovely little article on MAKE Magazine describing how to pair a TI BeagleBoard with the DLP Pico Projector Development Kit. For about $500 in total you can have a Linux-based processor with a pico projector.

This combination might have a number of uses in SAR applications.

MAKE: Blog: TI’s BeagleBoard and DLP Pico Projector == Linux everywhere.


Christie Matrix StIM – Simultaneous Visible and Infrared Illumination

January 6, 2009

Christie has recently announced a projector capable of simultaneous visible and infrared illumination: the Christie Matrix StIM. The projector will be in production soon and is geared towards military simulation and training with night vision goggles. This may enable simultaneous projection-based tracking and display, as described by Johnny Chung Lee.


Johnny Chung Lee

December 27, 2008

Johnny Chung Lee is the master of the YouTube instructional video. In fact, Johnny’s “Head Tracking for VR using the Wii Remote” was nominated for best instructional video of 2007.

Johnny has tons of awesome videos about using the Wii Remote for motion tracking. There are videos and source code for head, finger, and pen tracking, enabling VR and multi-touch surfaces for the hobbyist. He has inspired hundreds of home-made VR projects and even has a forum. His blog also provides up-to-date information on his tinkerings.

Johnny’s lesser-known work involves automated projector calibration using embedded light sensors and actually comprises his doctoral thesis at CMU. Instead of using high-cost camera-based tracking systems, Johnny presents a method for using low-cost projectors for simultaneous tracking and content display. Essentially, light sensors on the tracked object deduce their own positions from a projected pattern.

Black-and-white Gray code projections are used to determine the pixel position of each sensor in reference to the projector. When this information is paired with a digital model of the object and the sensor positions, a system of equations solves for the parameters of the projector and the object. The solution yields the intrinsic parameters of the projector lens and the 3D position and orientation of the object.
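As a concrete illustration of the decoding step, here is a small Python sketch; the bit sequence and projector width are hypothetical:

    def gray_to_binary(bits):
        """Convert a Gray-code bit sequence (most significant bit
        first), as read by one photosensor over successive pattern
        frames, into the pixel coordinate it encodes."""
        value, prev = 0, 0
        for g in bits:
            prev ^= g              # binary[i] = gray[i] XOR binary[i-1]
            value = (value << 1) | prev
        return value

    # A 1024-pixel-wide projector needs 10 vertical-stripe patterns.
    # Suppose a sensor read these frames as bright (1) / dark (0):
    column = gray_to_binary([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])
    print(column)  # 882: the sensor sits in projector column 882

Gray codes are preferred over plain binary because neighboring columns differ in only one pattern frame, so a sensor straddling a stripe boundary is off by at most one pixel. The subsequent solving step is essentially what the calibration sketch in the post above illustrates.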

Johnny demonstrated what interactive projector-based tracking would look like in the video below (and in a corresponding UIST 2008 paper). Note that the hybrid infrared/visible light projection in this video was faked using the Wii Remote (and thus can only track four points).

I will be posting an in-depth explanation of this process soon. Hopefully with some nice videos.