Re-implementation of Projector Calibration with Embedded Light Sensors

February 26, 2009
Using projectors as light sources, we can transform the surface properties of arbitrarily complex objects. The color of a surface is normally defined by the amount of light reflected or absorbed by the particular pigment or material of the object. In spatial augmented reality, we lift the properties of the surface into the light source: instead of illuminating a red object with white light, we can illuminate a white object with red light.
We are in the process of developing a toolkit for the development of projection mapping/spatial augmented reality applications. Our focus is on projection mapping for theater and new media installation pieces using tracking with embedded light sensors.
Currently, we are researching human-computer interaction issues in spatial augmented reality content authoring. We are focusing on improving the content pipeline (which is largely non-existent): traditional video editing and computer graphics workflows are not designed for animated texturing, nor for easily matching physical and digital models.
In this video we calibrate a digital model of the box to the physical box using embedded light sensors, re-implementing Johnny Chung Lee's projector calibration technique. Essentially, the projector displays a sequence of patterns which photosensors embedded in the box receive and decode. With this information, we can determine the positions of the sensors in the projector's frame of reference. With enough sensors (at least six), we can determine the parameters of the projector (the lens, etc.) along with the position and orientation of the box. Then, using a game engine, we render a 3D model of the box from the projector's frame of reference. Finally, applying an animated texture to the 3D model of the box yields the demonstration above. This technique is explained extensively by Johnny Chung Lee.
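To make the pattern-decoding step concrete, here is a minimal sketch of how a sensor might recover its projector-pixel coordinates, assuming the projector flashes one Gray-coded binary pattern per frame (separately for columns and rows, as in Lee's structured-light approach) and each sensor records a bright/dark bit per frame. The function names are our own for illustration.

```python
def binary_to_gray(n):
    """Encode an integer as its Gray-code equivalent (projector side)."""
    return n ^ (n >> 1)

def gray_to_binary(bits):
    """Decode a Gray-code bit sequence (MSB first) back to an integer.
    Each binary bit is the XOR of all Gray bits seen so far."""
    b = bits[0]
    out = b
    for g in bits[1:]:
        b ^= g
        out = (out << 1) | b
    return out

def decode_sensor(readings_x, readings_y):
    """Given the bright/dark bits a sensor saw during the column and
    row pattern sequences (MSB first), return its (x, y) pixel
    position in the projector's frame of reference."""
    return gray_to_binary(readings_x), gray_to_binary(readings_y)
```

Gray codes are preferred over plain binary here because adjacent pixel columns differ in only one bit, so a sensor sitting on a pattern boundary is off by at most one pixel rather than by an arbitrary amount.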
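The "at least six sensors" requirement comes from estimating the projector's full projection matrix, which couples the lens parameters with the box's pose. A standard way to recover it from 3D sensor positions (known from the digital model) and their decoded 2D projector-pixel positions is the Direct Linear Transform. The sketch below assumes this approach; a real calibration would further decompose the matrix into intrinsics (lens) and extrinsics (rotation and translation), and the function name is hypothetical.

```python
import numpy as np

def dlt_projection_matrix(model_pts, pixel_pts):
    """Estimate a 3x4 projection matrix P from >= 6 correspondences
    between 3D sensor positions on the model and their decoded 2D
    projector-pixel positions, via the Direct Linear Transform.

    Each correspondence contributes two linear constraints on the 12
    entries of P; the solution is the null-space vector of the
    stacked constraint matrix, found with an SVD."""
    A = []
    for (X, Y, Z), (u, v) in zip(model_pts, pixel_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)  # smallest singular vector = flattened P
```

Once estimated, the same matrix can be handed to the game engine as the virtual camera so the rendered 3D model lines up with the physical box.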