The Exemplar project was an interesting intersection of my work as an undergraduate researcher at the MIT Media Lab, where I built devices like the smart helmet, and the Stanford HCI group's work on toolkits that let designers easily prototype physical computing devices.
I spent a considerable part of the summer of 2005 building, fixing, and debugging a smart helmet. The smart helmet was a platform to simplify a person's interaction with the world around them as they rode their bicycle. For example, if you tilted your head to one side, the blinkers on that side would switch on.
The folks at Stanford HCI were building toolkits like d.tools to help designers attach hardware sensors and actuators (and LEDs) to computers in order to prototype sensor interactions. However, doing so still required considerable programming knowledge and reading the datasheets for the corresponding sensors.
In Exemplar, we developed a system where you could program sensor interactions by example: you performed the action while holding down the record button, and the system would gradually learn what you meant. The sensor data was visualized, and the user could interact with the visualization to simplify the machine learning involved.
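To give a flavor of programming by demonstration, here is a minimal sketch of one simple case: learning a threshold from a recorded demonstration, then using it to detect the same action in a live sensor stream. This is my own illustrative code, not Exemplar's implementation; the function names, the `margin` parameter, and the sample values are all invented for the example, and Exemplar's actual recognition was richer (it also supported pattern matching on the signal shape, not just thresholds).

```python
import statistics

def learn_threshold(recorded_samples, margin=0.1):
    """Learn an event threshold from one recorded demonstration.

    `recorded_samples` stands in for the raw sensor stream captured
    while the record button was held down.
    """
    peak = max(recorded_samples)
    baseline = statistics.mean(recorded_samples)
    # Place the threshold between baseline and peak, backed off by a
    # margin so slightly weaker repetitions of the action still trigger.
    return baseline + (1 - margin) * (peak - baseline)

def detect(live_samples, threshold):
    """Yield indices where the live stream crosses the threshold upward."""
    below = True
    for i, value in enumerate(live_samples):
        if below and value >= threshold:
            yield i          # rising edge: the action was recognized
            below = False
        elif value < threshold:
            below = True     # re-arm once the signal drops back down

# Demonstration: a tilt-sensor reading with one strong peak,
# recorded while performing the head-tilt once.
demo = [0.1, 0.1, 0.2, 0.9, 0.8, 0.2, 0.1]
threshold = learn_threshold(demo)

# Live stream containing two repetitions of the tilt.
live = [0.1, 0.2, 0.85, 0.1, 0.1, 0.9, 0.2]
events = list(detect(live, threshold))  # indices of detected tilts
```

The point of the visualization in Exemplar was exactly to let the designer see and adjust decisions like where that threshold sits, instead of hand-tuning numbers in code.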