Can you imagine a sensing glove monitoring a person’s physiological reactions to adapt desktop applications, mobile devices, or ‘intelligent’ environments to the person’s current needs? That’s exactly the device (the EREC emotion glove) shown in the picture on the left, developed by the Fraunhofer Institute.
In the following, I discuss computers measuring our emotions with three researchers working in the United States, Germany, and Canada, respectively: N. Sadat Shami (Cornell University), Christian Peter (Fraunhofer Institute for Computer Graphics), and Regan Mandryk (University of Saskatchewan). At CHI 2008 they are organizing the workshop Measuring Affect in HCI: Going Beyond the Individual, which is taking place today.
What exactly do you mean by "Measuring Affect", and why is it an important field of research?
N. Sadat Shami: “For those of us interested in ‘affective computing’ – the broad research area that investigates the role of emotions or affect in technology – ‘measuring affect’ is a contentious topic. To some, measuring affect entails wiring up people and gauging physiological responses such as heart rate, perspiration and pupil dilation. To others with more of a social science bent, measuring affect involves administering questionnaires that attempt to capture a user’s affective state. Both physiological measures and self-reported measures have their individual strengths and weaknesses. As we go from the individual context to the group or social context where multiple users are interacting with technology, measuring affect becomes even more complicated.
The increasing popularity of technologies such as iPods, iPhones, and virtual worlds may have something to do with the emotional responses these technologies generate. Providing an engaging user experience may depend on evoking positive emotions. Being able to correctly detect, sense, or recognize the emotional responses associated with technology use thus becomes essential in evaluating the success of the design of our technologies.”
Christian Peter: “Measuring affect means accessing the physiological changes in a person caused by emotions. Each emotion has a specific "pattern" of physiological changes. For instance, fear has the pattern "cold, sweaty hands, increased heart rate, tensed muscles", while joy has the pattern "warm hands, increased heart rate, grinning face, spontaneous gestures". Such signs can be detected and analysed by computers.
Why measure it? First, it’s an emotional thing 🙂. With humans, somebody who ignores the feelings of others is not liked as much as somebody who shows some sort of emotional feedback, or empathy. Why should it be different with computers?”
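To make Peter’s "pattern" idea concrete, here is a minimal sketch of a pattern-matching classifier. Everything in it — the feature names, the normalized value ranges, and the two patterns — is an illustrative assumption built from his examples, not code from the EREC project.

```python
# Hypothetical sketch: matching normalized physiological readings against
# simple emotion "patterns" like those Peter describes. Features, scales,
# and patterns are illustrative assumptions, not a real sensing system.

from dataclasses import dataclass

@dataclass
class Reading:
    hand_temperature: float   # normalized 0 (cold) .. 1 (warm)
    skin_conductance: float   # normalized 0 (dry)  .. 1 (sweaty)
    heart_rate: float         # normalized 0 (resting) .. 1 (elevated)
    muscle_tension: float     # normalized 0 (relaxed) .. 1 (tense)

# Each pattern encodes the expected readings for one emotion.
PATTERNS = {
    "fear": Reading(hand_temperature=0.1, skin_conductance=0.9,
                    heart_rate=0.9, muscle_tension=0.9),
    "joy":  Reading(hand_temperature=0.9, skin_conductance=0.4,
                    heart_rate=0.8, muscle_tension=0.3),
}

def classify(sample: Reading) -> str:
    """Return the emotion whose pattern is nearest to the sample (L1 distance)."""
    def distance(a: Reading, b: Reading) -> float:
        return (abs(a.hand_temperature - b.hand_temperature)
                + abs(a.skin_conductance - b.skin_conductance)
                + abs(a.heart_rate - b.heart_rate)
                + abs(a.muscle_tension - b.muscle_tension))
    return min(PATTERNS, key=lambda name: distance(sample, PATTERNS[name]))

print(classify(Reading(0.15, 0.85, 0.9, 0.8)))  # -> "fear"
```

In a real system the thresholds would come from per-user calibration, and the match would run over windows of sensor data rather than single readings.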
Regan Mandryk: “There are many situations where we might want our computer systems to ‘know’ how a user is feeling. For example, we now use our computers for applications that aren’t productivity-related, like organizing and editing digital photos and communicating with loved ones. Emotion is highly relevant to these types of applications, and being able to express ourselves emotionally in these types of interfaces would be very beneficial.”
How is human-computer interaction changed by introducing affect measurement?
N. Sadat Shami: “If computers can correctly detect the affective state of humans, they can ‘intervene’ intelligently. Imagine how nice it would be if computers could detect that an individual is sad and play music to cheer him up. Or detect anger or boredom while driving and recommend actions that might ameliorate that state. Or detect anxiety or frustration in an emergency services
worker and alert his supervisors. Or even technology that senses the affective tone in a face-to-face or online meeting and can suggest to an individual participant that she might be appearing too confrontational or dominating to others. There are a variety of such situations one can think
of where humans will benefit from the ability of computers to correctly detect our emotions.”
Christian Peter: “Computers which say "sorry" when they can’t find the information, or which share the user’s joy over a longed-for email, will be liked much more than computers are nowadays. People will establish an emotional bond with their computers, just as they do with their fellow humans. Actually, most of us already have an emotional attitude toward computers, though mainly a negative one.
Second, for office workers, for example, applications that pay attention to their user’s emotional and mental state and adapt to their current needs can increase productivity, help prevent errors caused by distracted or absentminded employees, and make staff more satisfied at work.
Third, products need to be liked to be successful on the market. Just think of a popular music player. It’s certainly not the functionality it offers that makes it so attractive; rather, its appealing design and the feeling of using it are unique, and the reason people prefer it over other products. Giving a product the ability to assess whether its current action was liked, or whether its owner could do with some help right now, or would be grateful for a relaxing game or some good news from a friend, would make it an appreciated companion and superior to emotionally "ignorant" competitors, as Don Norman explains in Emotional Design.”
Could you give me a few examples of particularly interesting work that is being presented today at your workshop?
N. Sadat Shami: “The papers submitted to our workshop fall into roughly four categories. There are papers that describe models of affect measurement, papers that discuss measuring affect through the language humans use, papers that explain the role of specific physiological signals in measuring affect, and papers that deal with measuring affect in the group context.
Katherine Isbister and her colleagues have re-appropriated the Nintendo Wii controller in their ‘Wriggle!’ project to collect data about how humans collectively express emotion through gesture. They intend to gain an understanding of shared emotional dynamics during multi-player gaming experiences.
Fatma Nasoz and Christine Lisetti will present a system that receives physiological input from technologies such as non-invasive wearable computers, the BodyMedia SenseWear Armband, and the Polar chest strap. The system can also receive input from language tools. It then systematically analyses that input using neural networks and other machine-learning algorithms to classify it into specific emotions.”
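As a rough illustration of such a pipeline — sensor features in, emotion labels out — here is a minimal sketch using a small neural network. The feature layout, training data, and labels are invented for the example; it does not reproduce Nasoz and Lisetti’s actual sensors, features, or models.

```python
# Illustrative sketch only: a tiny neural-network classifier in the spirit
# of the pipeline described above. All data below is synthetic.

import numpy as np
from sklearn.neural_network import MLPClassifier

# Each row: [heart_rate, skin_conductance, skin_temperature], pre-normalized.
X_train = np.array([
    [0.90, 0.90, 0.20],   # fear-like readings
    [0.85, 0.80, 0.15],
    [0.80, 0.40, 0.90],   # joy-like readings
    [0.75, 0.35, 0.85],
    [0.30, 0.20, 0.50],   # neutral readings
    [0.25, 0.15, 0.55],
])
y_train = ["fear", "fear", "joy", "joy", "neutral", "neutral"]

model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# Classify a fresh window of (synthetic) sensor readings.
sample = np.array([[0.88, 0.82, 0.18]])
print(model.predict(sample))  # expected: ['fear']
```

A production version would add the signal-processing steps the sketch skips: windowing the raw sensor streams, extracting features per window, and calibrating per user before classification.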
What sorts of new applications do you imagine for the future, thanks to
the research that is being discussed at your workshop?
Christian Peter: “First, personalized assistive applications that pay attention not only to the task to be accomplished, but also to the current capabilities and needs of the user.
Second, smart homes, particularly for the elderly, that support everyday life, maintain and establish social inclusion, support therapies, and provide tailored support for independent living.”
Regan Mandryk: “Computer gaming is an interesting application area. Until very recently, most advances in gaming were on the hardware side, or in creating more photorealism in the graphics. There has been recent improvement in gaming interfaces (e.g. the Wii, Rock Band), and the next advance could be in user context awareness. You can imagine that adding emotion to computer game
play could create very magical and fantastic gameplay experiences.”
© 2008, Il Sole 24 Ore. Web report from CHI 2008.