N E G R O P O N T E

Message: 34
Date: 4.1.96
From: <nicholas@media.mit.edu>
To: <lr@wired.com>
Subject: Affective Computing

Roz Picard, a professor at MIT, believes that computers should understand and exhibit emotion. Absurd? Not really. Without the ability to recognize a person's emotional state, computers will remain at the most trivial levels of endeavor. Think about it. What you remember most about an influential teacher is her compassion and enthusiasm, not the rigors of grammar or science.

Consider one of the simplest forms of affect: attention. Isn't it irritating to talk to someone as his attention drifts off? Yet all computer programs ignore such matters. They babble on as if the user were in a magical state of attentiveness.

Raising the interest rates
When kids do poorly in school, it is often because they are learning things outside the curriculum, which may include how to fight or how to market sex appeal. Regardless, they are learning. Certain subjects naturally engage them. With a topic like mathematics, the problem is often not an inability to learn but a low interest rate (alas, not the financial kind). If the material is too simple, the student may be bored. If it's too difficult, the student may become frustrated. In either case, an opportunity for learning is missed - an opportunity for the "aha" that comes with discovery, coupled with a neuropeptide rush that beats anything found on the playground. The instantaneous delight on the student's face says it all - "I want to learn more!"

When a child is working with a good personal tutor, that child's affective state is a key communicator. When the child gets frustrated, the tutor adjusts her approach. When the child shows increased interest, the tutor might suggest new challenging roads to explore. The tutor both recognizes and expresses emotion. In short, to be effective she must be affective.

Emotional communication usually relies on tone of voice, facial expression, and body language. How many hours (days?) have you lost trying to straighten out a miscommunication that occurred via email? Of course, you didn't mean it the way it "sounded." Your tone was misunderstood. We might say email is affect-limited. Emoticons, such as ;-), are a weak substitute. Affect is important; when it's missing, people tend to fill it in, often wrongly.

A truly personal computer
Classic theories of emotion are inconsistent because there is no common affective response; people differ from one another. Even the physiological signs of an emotion vary from person to person. Lie detectors can be fooled, yet a friend can usually catch you in a lie.

Recognizing and understanding emotion is both possible and meaningful when the process relies on knowledge of a particular person. One friend may flush red when upset; another may breathe more rapidly. Sensors exist that recognize changes in facial expression, heart rate, blood pressure, and more. Add pattern recognition and learning, and a computer could begin to understand a particular person's affective state. The results will be personal, not universal, and that is the point.
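
As a rough sketch of what such person-specific recognition might look like in software (not part of the original column; the sensor features, labels, and numbers below are illustrative assumptions), a computer could summarize one user's own labeled readings and match new readings against them:

# A minimal, hypothetical per-person affect recognizer (nearest centroid).
# Feature vectors: (heart rate in bpm, skin conductance, jaw-EMG level).
# All labels and numbers are invented for illustration.

from math import dist
from statistics import mean

# Readings gathered from ONE person, labeled by that person's self-report.
samples = {
    "calm":       [(62, 0.8, 0.1), (65, 0.9, 0.1), (60, 0.7, 0.2)],
    "frustrated": [(78, 2.1, 0.9), (82, 2.4, 1.1), (80, 2.0, 0.8)],
    "interested": [(70, 1.4, 0.3), (72, 1.5, 0.4), (69, 1.3, 0.3)],
}

# Summarize each state by the centroid of this person's own readings.
centroids = {
    label: tuple(mean(r[i] for r in readings) for i in range(3))
    for label, readings in samples.items()
}

def guess_state(reading):
    """Return the state whose centroid lies nearest the new reading."""
    return min(centroids, key=lambda label: dist(centroids[label], reading))

print(guess_state((79, 2.2, 1.0)))  # prints "frustrated" for this profile

The centroids belong to this one user; another person would produce a different table, which is precisely the point about personal rather than universal recognition.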

Wearable computers (see Wired 3.12, page 256) are part of the solution, especially when they can be built into basic, everyday items. They will not be restricted to perceiving only the visible and vocal forms of affective expression but will have the capacity to get to know you. If you wish, your wearable computer could whisper in your ear, perhaps after playing for a few too many hours with a few too many kids, "Patience, the birthday party is almost over." Interactive games might detect your level of fear and give bonus points for courage.

While taking measurements of an MIT student playing Doom, we expected electromyogram (jaw-clenching) responses to peak during high-action events. Instead, the biggest peak, significantly higher than the others, occurred when the student had trouble configuring the software. What if Microsoft could access a database of affective information from people interacting with its software and modify the parts that annoy people the most?
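
A back-of-the-envelope version of that analysis (hypothetical numbers and event log; not the actual study code) would simply locate the largest peak in the EMG trace and report which logged event it falls nearest:

# Hypothetical jaw-EMG trace sampled once per second, plus an event log.
emg = [0.2, 0.3, 0.2, 0.9, 0.4, 0.3, 2.7, 2.5, 0.4, 0.2]   # invented values
events = {3: "monster ambush", 6: "trouble configuring the software"}

peak_time = max(range(len(emg)), key=lambda t: emg[t])     # index of largest reading
nearest = min(events, key=lambda t: abs(t - peak_time))    # closest logged event

print(f"Largest clench at t={peak_time}s, during: {events[nearest]}")
# -> the configuration trouble, not the high-action moment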

Emotional intelligence
Unless it is used like film or music - essentially as a vehicle for human expression - affective computing may strike you as over the edge. After all, isn't freedom from emotional vagaries one of the advantages of a computer? You certainly don't want to wait for your computer to become interested in what you have to say before it will listen. Should a computer be limited to recognizing emotions and yet be prohibited from having them?

Too much emotion is clearly undesirable; we all know it wreaks havoc on reasoning. However, consider recent scientific findings regarding people who are essentially emotionally impaired (suffering from a tragic kind of brain injury). These people do not merely miss out on a luxurious range of feelings; they also lack basic rational decision-making abilities. The conclusion is that not enough emotion also impairs reasoning. Similarly, after decades of artificial intelligence efforts, unemotional, rule-based computers remain unable to think and make decisions. Endowing computers with the ability to recognize and express emotion is the first challenge; on its heels is a greater one - emotional intelligence.

For example, an affective steering wheel might sense you're angry (anger is a leading cause of automobile accidents). But what should it do? Prohibit you from driving while you, with escalating anger, rip out its sensors? Of course not.

Emotional intelligence is a question of balance - a tutor reading emotional states and knowing when to encourage and when to let it rest. Until recently, computers have had no balance at all. It's time to recognize affect as a facet of intelligence and build truly affective computers.

This column was co-authored with Rosalind W. Picard (rwpicard@media.mit.edu), NEC Career Development Professor of Computers and Communications at the MIT Media Lab.

Next Issue: Caught Browsing Again


[Copyright 1996, WIRED Ventures Ltd. All Rights Reserved. Issue 4.04 April 1996.]