While much research has focused on tangible lighting interfaces embedded in physical objects and on smartphones as remote controls, little attention has been paid to how the expressivity of bodily movement can inform the design of interactions with home lighting. We therefore investigate interaction with lighting technology beyond smartphones and physical controllers, and examine the potential of the emerging in-air gestural interaction style for lighting control.
Tangible Lights is a novel interface for in-air interaction with lighting that draws on existing knowledge from the tangible world. The system implements our visionary concept, in which users interact with everyday lighting through in-air gestures in a meaningful way.
Tor Sørensen (http://torsorensen.dk)
Oskar D. Andersen (http://oskarandersen.dk/)
Tim Merritt (http://ixd.net/)
Tangible Lights lets the user customize the light setting at the table with precise control over several individual illuminated regions. Each illuminated region can be manipulated freely in the space above the tabletop: its position and size can be changed as desired through a set of interconnected in-air gestures.
The name Tangible Lights stems from our intention of creating an interface where the user feels as if she is "holding onto the lights," controlling them at her fingertips. The designed interactions form a set of interconnected actions for manipulating the light setting: spawning, selecting, deselecting, moving, scaling, and removing lights. To keep the interactions simple, we draw inspiration from familiar everyday actions such as grabbing and holding onto a plate or cup.
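To give a feel for how this interaction set fits together, here is a minimal sketch in Python of a scene model supporting the six actions. The `Region`/`LightScene` names, the circular region shape, and the metre-based coordinates are assumptions for illustration, not the actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """One illuminated region on the tabletop (coordinates in metres)."""
    x: float
    y: float
    radius: float
    selected: bool = False

class LightScene:
    """Holds all regions and applies the gesture actions to them."""
    def __init__(self):
        self.regions = []

    def spawn(self, x, y, radius=0.10):
        """Spawn a new illuminated region at the hand position."""
        region = Region(x, y, radius)
        self.regions.append(region)
        return region

    def select(self, x, y):
        """Select the region under the hand, if any (a 'grab')."""
        for r in self.regions:
            if (r.x - x) ** 2 + (r.y - y) ** 2 <= r.radius ** 2:
                r.selected = True
                return r
        return None

    def deselect_all(self):
        """Release whatever is currently held."""
        for r in self.regions:
            r.selected = False

    def move(self, dx, dy):
        """Translate the held region(s) with the hand."""
        for r in self.regions:
            if r.selected:
                r.x += dx
                r.y += dy

    def scale(self, factor):
        """Grow or shrink the held region(s)."""
        for r in self.regions:
            if r.selected:
                r.radius *= factor

    def remove_selected(self):
        """Remove the held region(s) from the scene."""
        self.regions = [r for r in self.regions if not r.selected]
```

A typical gesture sequence would then be: spawn a region, grab it, drag it across the table, scale it up, and release it.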
Technically, the platform consists of a short-throw projector and a Microsoft Kinect sensor (see first image above). The short-throw projector serves as the light source as it provides an easy and dynamic way to position an arbitrary amount of illuminated regions on the tabletop. The Kinect sensor continuously streams depth maps to the gesture recognition software at 30 frames per second.
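The per-frame step of such a pipeline, taking a depth map and estimating where the hand is above the table, could be sketched as below. The fingertip heuristic (nearest valid depth pixel), the depth thresholds, and the uncalibrated pixel-to-table mapping are simplifying assumptions, not the actual recognition software:

```python
TABLE_W, TABLE_H = 1.2, 0.8   # tabletop size in metres (assumed)
FRAME_W, FRAME_H = 512, 424   # Kinect v2 depth-frame resolution

def find_fingertip(depth, min_mm=400, max_mm=1200):
    """Return (col, row) of the nearest in-range depth pixel, or None.

    `depth` is a 2-D list of depth readings in millimetres; the nearest
    point above the table is taken as the fingertip.
    """
    best, best_d = None, max_mm
    for row, line in enumerate(depth):
        for col, d in enumerate(line):
            if min_mm <= d < best_d:
                best, best_d = (col, row), d
    return best

def to_table_coords(pixel):
    """Map a depth-image pixel to metres on the tabletop.

    Assumes the sensor view is axis-aligned with the table; a real
    setup would use a calibrated homography instead.
    """
    col, row = pixel
    return (col / FRAME_W * TABLE_W, row / FRAME_H * TABLE_H)
```

At 30 frames per second, each incoming depth map would be passed through `find_fingertip`, the result mapped to table coordinates, and the projector output re-rendered with the updated regions.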
Originally, Tangible Lights was one of eight prototypes developed for my Master's thesis at Aarhus University (the thesis is described briefly on the next page).