Emotion Reading Technology May Soon Become Big Business

So you want to know if that girl you just met is actually into you, what your boss is thinking but isn't saying about your work, or if your friend is really still mad at you for forgetting their birthday? Well, good luck, because interpreting the emotions of others is incredibly hard. But for those among us for whom the visual expression of emotion is a complex, confusing mystery, the burgeoning field of emotion-reading technology may be able to help.

Emotion readers come in many forms. Some focus on facial expressions, some zero in on body positioning and movement, and others take a more physiological approach, measuring body temperature and heart rate. But can all this data actually help people improve their social interactions? It can, it already has, and several companies are looking to cash in by taking emotion-reading technology mainstream. But is society really ready for more transparency?

Emotions are the black box of the human mind, and we’ve been trying to break that box open for decades. Back in the 1970s, US psychologist Paul Ekman identified what he believed to be the seven core emotions: happiness, sadness, fear, anger, disgust, contempt, and surprise. Modern research has revised Ekman’s seven emotions into a set that is more easily identified from visual cues. Research by Rana el Kaliouby and Simon Baron-Cohen at Cambridge University settled on six emotion-based facial states: thinking, agreeing, concentrating, interested, confused, and disagreeing. Armed with this more tech-savvy emotional lexicon, the researchers set out to develop a system that could identify emotional states.

The researchers hired actors to mime the expressions and asked volunteers to describe what each expression meant, using the majority answer as the correct one to build a visual library of what certain cues mean. When we are engaged in conversation we pantomime our emotions, unconsciously sending signals that show whether or not we are following someone else’s train of thought. But these signals often fail us, not because they don’t show what we are thinking but because they are misinterpreted, or simply go unnoticed.
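The labeling step is simple to picture in code. Here is a minimal sketch of the idea, with invented names and data (this is not the Cambridge team's actual tooling): collect the volunteers' descriptions of one acted clip and keep the most common answer as that clip's label.

```python
from collections import Counter

def majority_label(answers):
    # Volunteers' free-form descriptions of one acted expression; the
    # most common answer becomes the clip's "correct" label in the
    # visual library.
    label, votes = Counter(answers).most_common(1)[0]
    return label

# e.g. three volunteers describe the same acted clip:
majority_label(["confused", "thinking", "confused"])  # -> "confused"
```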

El Kaliouby teamed up with Rosalind Picard from MIT’s Media Lab to develop glasses that can read visual emotional cues and give the wearer tips about what the other person is feeling. A camera positioned inside the glasses is connected by a wire to an attached computer. The camera identifies 24 feature points on a subject’s face, and software developed by Picard analyzes the resulting micro-expressions, determining how often they appear and for how long. The computer compares that data against its bank of known expressions to figure out what is going on, then communicates a summary to the wearer through an earphone and a light on the lens of the glasses: a good social interaction gets a green light, and when things are heading downhill a red light appears.
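The pipeline the glasses implement (track micro-expressions, tally how often they appear, match the tally against a bank of known states, then show a green or red light) can be sketched roughly in Python. Everything below is illustrative: the state profiles, expression names, and scoring rule are invented stand-ins for the actual software.

```python
from collections import defaultdict

# Invented profiles: how often each micro-expression is expected to
# appear during a given mental state. The real system tracks 24 facial
# feature points from video and also weighs how long each lasts.
STATE_PROFILES = {
    "agreeing":    {"nod": 3, "smile": 1},
    "disagreeing": {"head_shake": 2, "frown": 1},
    "confused":    {"frown": 2, "brow_raise": 2},
}

POSITIVE_STATES = {"agreeing", "interested", "thinking"}

def classify(observations):
    # Tally how often each micro-expression appeared in the window.
    counts = defaultdict(int)
    for expression in observations:
        counts[expression] += 1
    # Pick the profile whose expected expressions best match the tally.
    def score(profile):
        return sum(min(counts[e], n) for e, n in profile.items())
    state = max(STATE_PROFILES, key=lambda s: score(STATE_PROFILES[s]))
    return state, ("green" if state in POSITIVE_STATES else "red")
```

A run of nods and a smile would come back as `("agreeing", "green")`, while frowns and raised brows would trip the red light.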

While developing the prototype of their glasses, the researchers found that the average person interprets expressions correctly only 54 percent of the time. This told them that, aside from people with social disorders like autism, there is a mass market of people out there who are not very good at reading others. The software correctly identifies expressions 64 percent of the time, and the researchers are working on a new algorithm that will be able to pick up even more nuanced emotions. They now have a company, Affectiva, that sells their expression recognition software.

In addition to facial expressions, there are other involuntary “honest signals,” identified by Alex Pentland of the MIT Media Lab, such as gesture mirroring and fluctuations in the tone and pitch of the voice. To capture these signals and depict them visually, Pentland developed an electronic badge that hangs around the neck. The badge has audio sensors that record how aggressive the wearer sounds, along with the pitch, volume, and clip of their voice, among other factors. The team called the device their jerk-o-meter. The information it gathers can be sent wirelessly to a smartphone, or to any other device that can display it graphically, to tell users what the other person is conveying.
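The badge's vocal measurements rest on standard audio signal processing. As a hedged sketch (these are textbook proxies, not Pentland's actual algorithm), root-mean-square energy can stand in for volume, and the zero-crossing rate of the waveform gives a crude pitch estimate:

```python
import math

def audio_features(samples, sample_rate):
    """Crude stand-ins for a badge's vocal signals: RMS energy as a
    volume proxy, zero-crossing rate as a rough pitch estimate in Hz.
    Illustrative only; not the sociometric badge's real feature set."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # A periodic waveform crosses zero twice per cycle, so counting
    # sign changes over the clip approximates the fundamental frequency.
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a >= 0) != (b >= 0)
    )
    pitch_hz = crossings * sample_rate / (2 * len(samples))
    return rms, pitch_hz
```

Fed one second of a pure 440 Hz tone sampled at 8 kHz, the sketch reports a pitch close to 440 Hz; real speech is far messier, which is why production systems use more robust pitch trackers.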

The MIT researchers realized the monetary value of an emotion reader, and upgraded their more appropriately named sociometric badges to also analyze speech patterns and identify units of speech that make a person sound persuasive. The technology is being marketed to companies to teach employees how to sound more persuasive when talking to customers. The team claims that their device can improve telephone sales by 20 percent. Other members of Pentland’s team at MIT used their doctoral research to found a start-up called Sociometric Solutions that already has several customers, including Bank of America.

The other way to read emotion is through the body’s physiological responses to conversation, like temperature and skin conductance. Picard (who developed the emotion-reading glasses) developed a glove-like device called the QSensor that picks up signals such as skin becoming clammy or increasing in conductance. Picard has also shown that it is possible to measure heart rate without any surface contact with the body: software linked to a webcam can read information about heart rate, blood pressure, and skin temperature based on color changes in the subject’s face.

Researchers working in the field of emotion reading have already started branching out, founding companies to market and sell their products. It seems highly likely that at some point in the future we’ll have a multi-feature emotion reader that combines the facial cues, body positioning, and physiological changes that tell observers what their conversation partner is really thinking.

But just because you might soon be able to interpret the emotional cues of others doesn’t mean that people will be any better at handling interpersonal interactions. Knowing what is going on in a situation isn’t the same thing as having empathy and being able to respond appropriately to someone else’s feelings. Researchers like Picard and Pentland have already expressed concern that emotion reading devices should only be used with consent. But if the devices really do hit the mass market, it seems unlikely that the researchers’ hopes for open communication about who is using an emotion reader will be upheld. While society might need a way to help people understand each other better, whether it is ready for the implications of covert use of technological emotion readers remains to be seen.

(via New Scientist)


© 2014 The Mary Sue
