Paper, acrylic gesso, PVA, foamcore, two video projectors, two computers, two IR web cams, custom software in Processing. Dimensions variable. Exhibited in Kerfuffle at Bumbershoot 2009, Seattle. Curated by Lele Barnett and Chris Weber.
Crystalline Chlorophyll is an interactive digital sculpture. Its surface begins as an icy, crystalline shimmer and, over the course of the exhibition, is slowly overtaken by a verdant green growth of texture driven by the movement of the visitors observing it. By tracking motion in the room, the piece randomly adds a bit of green to the surface using generative, pseudo-organic image algorithms. When the sculpture is left alone for a while, the mossy layer decays, revealing the crystal-like surface again as it waits for the next group of visitors.
The physical sculpture is built from a 3D virtual model using cutouts of its flattened triangles printed on 11×17″ card stock. The surface of the sculpture is painted with white gesso. The original object was designed in Blender, and its facets were unfolded using the “unfold” Python script. The unfolded meshes are then exported as .svg UV maps. More information on the technique of going from 3D model to papercraft can be found in this tutorial.
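The heart of the unfolding step is laying each mesh triangle flat while preserving its edge lengths, so the printed cutouts fold back into the original shape. The piece used Blender's “unfold” script, not this code; purely as an illustration of the geometry, a minimal Python sketch of flattening a single 3D triangle into the plane:

```python
import math

def unfold_triangle(p0, p1, p2):
    """Place a 3D triangle flat in 2D, preserving its edge lengths.

    Returns three 2D points: p0 at the origin, p1 on the +x axis,
    and p2 above the axis.
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    a = dist(p0, p1)  # baseline edge, laid along the x axis
    b = dist(p0, p2)
    c = dist(p1, p2)
    # Locate the third vertex via the law of cosines.
    x = (a * a + b * b - c * c) / (2 * a)
    y = math.sqrt(max(b * b - x * x, 0.0))
    return (0.0, 0.0), (a, 0.0), (x, y)
```

Repeating this per face (and sharing edges between neighboring faces) yields the flattened patterns that get exported as .svg UV maps.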
Two ceiling-mounted video projectors project from opposite sides onto the sculptural surface. The images are corrected for distortion by mapping textures onto a 3D model of the real-world object, using techniques also found in video game technology.
The work’s software was written in the Processing programming language. The software loads a 3D model of the sculpture, made in Blender, and allows manual rotation, scaling, and positioning of the virtual object so that it can be aligned with the real-world object. Once the two are aligned, the physical sculpture has a projection-mapped surface that can be treated as though it were a 3D software texture map. The software generates a real-time texture on the object’s surface, using camera tracking to seed random image generation. Virtual lighting completes the effect.
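The alignment step boils down to applying a manual model transform (scale, rotate, translate) followed by a perspective projection, so each virtual vertex lands on the projector pixel that hits the matching point of the physical surface. The actual piece does this in Processing with its 3D camera; as a hedged sketch of the idea only, with all parameter names and defaults hypothetical:

```python
import math

def project_vertex(v, scale=1.0, rot_y=0.0, translate=(0.0, 0.0, 0.0),
                   focal=800.0, center=(400.0, 300.0)):
    """Map a 3D model vertex to projector pixel coordinates.

    The manual scale / rotate / translate controls mimic nudging the
    virtual model until it lines up with the physical sculpture; the
    pinhole projection stands in for the projector's view.
    """
    x, y, z = (c * scale for c in v)
    # Rotate about the vertical (Y) axis.
    ca, sa = math.cos(rot_y), math.sin(rot_y)
    x, z = x * ca + z * sa, -x * sa + z * ca
    # Translate into the projector's coordinate frame.
    x, y, z = x + translate[0], y + translate[1], z + translate[2]
    # Perspective divide onto the projector's image plane.
    return (center[0] + focal * x / z, center[1] - focal * y / z)
```

With two projectors, the same model is projected twice, once per projector pose, which is how the opposite-side coverage described above is achieved.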
Camera tracking is done with a simple difference algorithm. Each channel (RGB) of each pixel of the camera image is tracked for the amount of change from the previous frame. If that change is above a certain threshold, a random number is generated to determine whether the associated pixel in the real-time texture map should change. A height map generated in Blender partially determines the probability of a pixel changing from the static “ice-crystal” texture to the “mossy” one. Other influences are the values of neighboring pixels and the amount of change in the camera pixel.
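The update logic described above can be sketched compactly. The original is written in Processing, and the specific threshold, weights, growth rate, and the one-dimensional neighbor scheme below are all illustrative assumptions, not the work's actual constants:

```python
THRESHOLD = 30  # assumed per-pixel change needed to count as motion

def grow_step(prev_frame, cur_frame, height_map, moss, rng):
    """One moss-texture update from camera motion.

    prev_frame / cur_frame: lists of (r, g, b) camera pixels.
    height_map: per-pixel 0..1 weights (standing in for the Blender
    height map). moss: per-pixel 0..1 moss amounts, updated in place.
    rng: any object with a random() -> [0, 1) method.
    """
    for i, ((r0, g0, b0), (r1, g1, b1)) in enumerate(zip(prev_frame, cur_frame)):
        change = abs(r1 - r0) + abs(g1 - g0) + abs(b1 - b0)
        if change > THRESHOLD:
            # Neighbor influence: average moss of adjacent pixels.
            left = moss[i - 1] if i > 0 else 0.0
            right = moss[i + 1] if i < len(moss) - 1 else 0.0
            neighbor = (left + right) / 2.0
            # Probability of turning mossy scales with the height map,
            # the amount of camera change, and mossy neighbors.
            p = height_map[i] * min(change / 255.0, 1.0) + 0.3 * neighbor
            if rng.random() < p:
                moss[i] = min(moss[i] + 0.2, 1.0)
        else:
            moss[i] = max(moss[i] - 0.01, 0.0)  # slow decay when idle
    return moss
```

Run once per camera frame, this produces the behavior described: motion in front of the camera seeds green growth weighted by the height map and mossy neighbors, while stillness lets the texture decay back toward the ice-crystal state.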