The "Digital Baton", which we built at the Media Lab, is a handheld device that incorporates several sensing modes. It tracks the horizontal/vertical position of an IR LED at the baton tip for precise pointing (using a synchronously demodulated PSD photosensor to avoid problems with latency and background light), and it uses a 3-axis, 5 g accelerometer array for fast detection of directional beats and large gestures. It also features five force-sensitive resistors potted in the urethane baton skin for measuring distributed finger and hand pressure. This multiplicity of sensors enables highly expressive control over electronic music: the performer can "conduct" the music at a high level, or descend into a "virtuoso" mode, directly controlling the details of particular sounds.

We have used this baton for several projects, including hundreds of Brain Opera performances at venues worldwide. The Digital Baton was developed by a large team at the MIT Media Lab, including Teresa Marrin (seen playing the instrument in the photo above), Chris Verplaetse, Maggie Orth, and Joe Paradiso.

This video clip (4.6 MB MPEG) shows MIT electrical engineering undergraduate Kai-yuh Hsiao performing a piece he wrote for the baton. A basic musical sequence is modified in several ways by the baton data. Beats add accents; a forward beat also changes the tonality, with the new tonal map selected as a function of the baton tip position. Finger pressure does several things, including adjusting the speed of the sequence and selecting modes, for example, mapping a guitar sound onto the tracker data so that vertical tip position controls the guitar pitch and horizontal position controls timbre. This video clip (7.3 MB QuickTime) shows Teresa Marrin demonstrating and performing Brain Opera mappings with the baton.
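To make the mapping idea concrete, here is a minimal sketch of one mapping in the spirit of those described above: vertical tip position selects a guitar pitch, horizontal position sets a timbre controller, and finger pressure scales the sequence tempo. The function name, value ranges, and scaling are illustrative assumptions, not the actual Brain Opera code.

```python
def map_baton(x, y, pressure, base_tempo=120.0):
    """Map normalized baton data (each in [0.0, 1.0]) to musical parameters.

    x, y      -- horizontal/vertical tip position from the PSD tracker
    pressure  -- aggregate finger pressure from the FSRs

    Ranges below are hypothetical, chosen only for illustration.
    """
    # Vertical position -> pitch over a two-octave MIDI range (48..72)
    pitch = 48 + round(y * 24)
    # Horizontal position -> timbre controller value (MIDI CC range 0..127)
    timbre = round(x * 127)
    # Finger pressure -> tempo, from half speed up to double speed
    tempo = base_tempo * (0.5 + 1.5 * pressure)
    return pitch, timbre, tempo
```

In a real instrument these parameters would be streamed to a synthesizer (e.g., as MIDI note and control-change messages) on every tracker update.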