By Ahmed Banafa
What if your computer could empathize with you? The evolving field known as affective computing is likely to make that happen soon. Scientists and engineers are developing systems and devices that can recognize, interpret, process, and simulate human affects, or emotions. It is an interdisciplinary field spanning computer science, psychology, and cognitive science. While its origins can be traced to longstanding philosophical inquiries into emotion, a 1995 paper on affective computing by Rosalind Picard catalyzed modern progress.
The more computers we have in our lives, the more we will want them to behave politely and be socially smart. We don’t want them to bother us with unimportant information. That kind of common-sense reasoning requires an understanding of our emotional state. We’re starting to see such systems perform specific, predefined functions, like adapting in real time how quiz questions are presented, or recommending videos in an educational program to suit students’ changing moods.
How can we make a computer that responds appropriately to your emotional state? Researchers are combining sensors, microphones, and cameras with software logic. A device able to detect and respond appropriately to a user’s emotions could gather cues from a variety of sources: facial expressions, posture, gestures, speech, the force or rhythm of keystrokes, and even the temperature changes of a hand on a mouse can all signal emotional changes that a computer can detect and interpret. A built-in camera, for example, may capture images of a user, and speech, gesture, and facial recognition technologies are all being explored for affective computing applications.
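To make the camera-based route concrete, here is a minimal sketch in Python of the first step in facial-expression analysis: grabbing a single webcam frame and locating a face with the open-source OpenCV library’s bundled detector. The emotion classifier that would consume the detected face region is left out; this is an illustration, not any particular product’s pipeline.

```python
# A sketch of the first step in camera-based emotion sensing: grab one
# webcam frame and locate faces with OpenCV's bundled Haar cascade.
import cv2

# Frontal-face detector that ships with the opencv-python package.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)  # default built-in camera
ok, frame = capture.read()
capture.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        face_region = gray[y:y + h, x:x + w]
        # An emotion classifier (not shown) would analyze face_region here.
        print(f"face found at ({x}, {y}), size {w}x{h}")
```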
Looking at speech alone, a computer can observe innumerable variables that may indicate emotional reaction and variation. Among these are a person’s rate of speaking, accent, pitch, pitch range, final lowering, stress frequency, breathiness, brilliance, loudness, and discontinuities in the pattern of pauses or pitch.
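Several of those vocal cues are straightforward to compute. The sketch below, which assumes a hypothetical recording named speech.wav, uses the open-source librosa library to estimate three of them: pitch, pitch range, and loudness.

```python
# A sketch using librosa to estimate pitch, pitch range, and loudness
# from a hypothetical recording, speech.wav.
import numpy as np
import librosa

audio, sample_rate = librosa.load("speech.wav")

# Per-frame fundamental frequency (pitch) via probabilistic YIN.
f0, voiced_flag, _ = librosa.pyin(
    audio,
    fmin=librosa.note_to_hz("C2"),  # ~65 Hz, low end of speech
    fmax=librosa.note_to_hz("C6"),  # ~1047 Hz, high end
)
voiced_f0 = f0[voiced_flag]  # keep only frames where speech is voiced

# Root-mean-square energy per frame as a rough loudness measure.
loudness = librosa.feature.rms(y=audio)[0]

print(f"mean pitch:  {np.nanmean(voiced_f0):.1f} Hz")
print(f"pitch range: {np.nanmax(voiced_f0) - np.nanmin(voiced_f0):.1f} Hz")
print(f"mean loudness (RMS): {loudness.mean():.4f}")
```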
Gestures can also be used to detect emotional states, especially when used in conjunction with speech and face recognition. Such gestures might include simple reflexive responses, like lifting your shoulders when you don’t know the answer to a question. Or they could be complex and meaningful, as when communicating with sign language.
A third approach is to monitor physiological signs. These might include pulse and heart rate or minute contractions of facial muscles. Pulses in blood volume can be monitored, as can what’s known as galvanic skin response, a change in the skin’s electrical conductance associated with arousal. This area of research is still in its relative infancy, but it is gaining momentum, and we are starting to see real products that implement these techniques.
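As a hedged illustration of how galvanic skin response might be monitored, the sketch below smooths conductance readings and flags sustained rises above a baseline. The read_skin_conductance function is a hypothetical stand-in for a real sensor driver (simulated here with noise), and the baseline and threshold values are arbitrary assumptions.

```python
# A sketch of arousal detection from galvanic skin response (GSR).
import random
import time

def read_skin_conductance() -> float:
    """Hypothetical sensor read, in microsiemens (simulated here)."""
    return 5.0 + random.gauss(0.0, 1.0)

def monitor_arousal(baseline: float = 5.0, ratio: float = 1.5,
                    alpha: float = 0.1, samples: int = 20) -> None:
    """Flag arousal when smoothed conductance rises well above baseline."""
    smoothed = baseline
    for _ in range(samples):
        reading = read_skin_conductance()
        # Exponential moving average damps momentary sensor noise.
        smoothed = alpha * reading + (1 - alpha) * smoothed
        if smoothed > baseline * ratio:
            print("Elevated skin conductance: possible arousal or stress")
        time.sleep(0.1)  # assumed sampling interval

monitor_arousal()
```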
Recognizing emotional information requires the extraction of meaningful patterns from the gathered data. Some researchers are using machine learning techniques to detect such patterns.
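A minimal sketch of that pattern-detection step, assuming features such as pitch statistics, loudness, and facial measurements have already been extracted into a numeric matrix with one emotion label per row: here scikit-learn’s random-forest classifier stands in for whatever model a researcher might actually choose, and the data are synthetic.

```python
# A sketch of the pattern-detection step with scikit-learn. X and y are
# synthetic stand-ins for real extracted features and emotion labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))  # 200 samples, 12 features (synthetic)
y = rng.choice(["neutral", "happy", "frustrated"], size=200)  # synthetic labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
# On real data this score would reflect how well the chosen features
# separate emotional states; on random data it hovers near chance.
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```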
Detecting emotion in people is one thing. But work is also going into computers that themselves show what appear to be emotions. Already in use are systems that simulate emotions in automated telephone and online conversation agents to facilitate interactivity between human and machine.
There are many applications for affective computing. One is in education. Such systems can help address a major drawback of online learning versus classroom learning: without seeing their students, online teachers find it difficult to adapt pedagogy to the emotional state of the class. In e-learning applications, affective computing can adjust the presentation style of a computerized tutor when a learner is bored, interested, frustrated, or pleased. Psychological health services can also benefit from affective computing applications that help determine a client’s emotional state.
Robotic systems capable of processing affective information can offer more functionality alongside human workers in uncertain or complex environments. Companion devices, such as digital pets, can use affective computing abilities to enhance realism and display a higher degree of autonomy.
Other potential applications can be found in social monitoring. A car, for example, might monitor the emotions of all its occupants and invoke additional safety measures, potentially alerting other vehicles if it detects that the driver is angry. Affective computing also has potential in human-computer interaction, such as affective “mirrors” that let users see how they come across; warning signals that tell drivers when they are sleepy or going too fast or too slow; or a system that calls relatives if the driver is sick or drunk (though one can imagine mixed reactions from drivers to such developments). Emotion-monitoring agents might issue a warning before one sends an angry email, and a music player could select tracks based on your mood. Companies may even be able to use affective computing to infer whether a product will be well received by the market, by detecting facial or vocal changes in potential customers as they read an ad or first use the product. Affective computing is also starting to be applied to the development of communicative technologies for use by people with autism.
The MIT Media Lab is one place that has done extensive work on affective computing. Its projects include a device called the galvactivator, a glove-like wearable that senses the wearer’s skin conductivity and maps its value to the brightness of an LED display. Increases in skin conductivity across the palm tend to indicate physiological arousal, so the display glows more brightly as arousal rises. The device has many potentially useful applications, including self-feedback for stress management, facilitating conversation between two people, and visualizing aspects of attention while learning.
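The galvactivator’s core mapping is easy to express in code. The sketch below converts a conductance reading to an 8-bit LED brightness; the 1-20 microsiemens range and the linear mapping are assumptions for illustration, not the Media Lab’s actual calibration.

```python
# A sketch of the galvactivator's core mapping: skin conductance in,
# LED brightness out. Range and linearity are assumed for illustration.
def conductance_to_brightness(microsiemens: float,
                              low: float = 1.0,
                              high: float = 20.0) -> int:
    """Map skin conductance to an 8-bit LED duty cycle (0-255)."""
    clamped = min(max(microsiemens, low), high)
    fraction = (clamped - low) / (high - low)
    return round(fraction * 255)

print(conductance_to_brightness(3.5))   # calm: dim glow
print(conductance_to_brightness(15.0))  # aroused: bright glow
```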
Recently, a London design firm called This Place released MindRDR, an app that pairs Google’s wearable Glass headset with an EEG biosensor to detect changes in the electrical signals emanating from your brain. Concentrate hard enough, and Glass will take a picture of whatever it is you’re looking at. Concentrate even harder, and it will post the photo to Twitter or Facebook. This is a prime example of how close we are to using affective computing in everyday life. Along with the revolution in wearable computing technology, affective computing is poised to become more widely accepted, with applications in many aspects of life.
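The control logic such an app implies can be sketched simply: consumer EEG headsets typically report an attention score from 0 to 100, and successive thresholds trigger successive actions. The threshold values and function name below are hypothetical, not MindRDR’s actual implementation.

```python
# Hypothetical MindRDR-style control logic: an attention score from a
# consumer EEG headset crosses successive thresholds to trigger actions.
CAPTURE_THRESHOLD = 70  # assumed: "concentrate hard enough"
SHARE_THRESHOLD = 90    # assumed: "concentrate even harder"

def handle_attention(attention: int) -> str:
    """Map an attention reading (0-100) to a device action."""
    if attention >= SHARE_THRESHOLD:
        return "capture photo and post it"
    if attention >= CAPTURE_THRESHOLD:
        return "capture photo"
    return "idle"

for level in (40, 75, 95):
    print(f"attention {level}: {handle_attention(level)}")
```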
Ahmed Banafa teaches in the School of Information Technology at Kaplan University. The views expressed in this article are solely those of the author and do not represent the views of Kaplan University.