Experiential Lighting

The vision of pervasive computing is now mainstream: connected devices permeate every aspect of our lives, yet we remain tethered to arcane user interfaces. Unlike consumer devices, building appliances and utilities perpetuate this outdated paradigm, and lighting control is a prime example. Here, we show how a data-driven methodology, combining human input and sensor measurement, enables an entirely new method of lighting control.

We are evaluating new methods of interacting with and controlling solid-state lighting, based on our findings of how participants experience and perceive architectural lighting in our new lighting laboratory (E14-548S). This work, aptly named "Experiential Lighting," reduces the complexity of modern lighting controls (intensity, color, and space) to a simple mapping, aided by both human input and sensor measurement. We believe our approach extends beyond general lighting control and applies to situations where human rankings and preferences are critical requirements for control and actuation. We expect our foundational studies to guide future camera-based systems that will inevitably incorporate context into their operation (e.g., Google Glass).
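To make the idea of such a mapping concrete, the sketch below shows one simple way a single perceptual control axis could drive many per-fixture settings at once, by blending between preset scenes. The fixture names, preset values, and linear blend are illustrative assumptions for this sketch, not the method or data from the actual study.

```python
# Hypothetical sketch: collapsing per-fixture dimming levels onto one
# perceptual control axis by interpolating between preset scenes.
# Fixture names and preset values below are invented for illustration.

def interpolate_scene(scene_a, scene_b, t):
    """Linearly blend two lighting scenes; t=0 yields scene_a, t=1 yields scene_b."""
    return {fixture: (1 - t) * scene_a[fixture] + t * scene_b[fixture]
            for fixture in scene_a}

# Two example presets: per-fixture dimming levels in [0, 1].
FOCUS = {"overhead": 1.0, "wall_wash": 0.2, "desk": 0.9}
RELAX = {"overhead": 0.3, "wall_wash": 0.8, "desk": 0.1}

# One slider position (0 = "focus", 1 = "relax") sets every fixture at once.
blended = interpolate_scene(FOCUS, RELAX, 0.5)
```

In practice, the blend weights would come from human preference data and sensor measurements rather than a hand-set slider, but the dimensionality reduction, from many fixture parameters down to a few perceptually meaningful ones, is the same.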


Matt Aldrich, "Experiential Lighting: Development and Validation of Perception-based Lighting Controls," PhD thesis (2014).

Principal Investigator: Joseph Paradiso

Research Group: Responsive Environments, MIT Media Lab

Research Assistants: Matt Aldrich, Nan Zhao

Collaborators: Susanne Seitinger from Philips Lighting