The Emoticon on Your Face

By Courtney Humphries | The Boston Globe | February 21, 2012

What’s in a face? We generally see it as a window into our inner lives — so much so that we assume our emotions can be read straight from our facial expressions. And in recent decades, we have become enchanted by the notion that, with a little specialized knowledge, anyone can read these feelings very, very accurately. A program launched at Logan Airport last year has trained security personnel to converse with passengers while scanning their facial movements for suspicious emotions. Companies like Affectiva, a spinoff of MIT’s Media Lab, are developing ways to automatically judge a person’s mood in part by observing the movements of facial muscles. And the recent TV show “Lie to Me” was built on the premise that its main character could read hidden meanings in facial expressions and body language.

The premise that one needs only the proper training to read emotion on a face rests on a long history of research into the origins and function of facial expressions. Based on numerous studies, specialists have come to believe that these expressions of feeling are basic, automatic, and universal. Sadness, happiness, anger, and disgust are common to all humans, they have posited, and are expressed and recognized in a common way. Call it the emoticon theory of facial expression: Surely anyone can tell a 🙂 from a 🙁.

But a number of psychologists are now arguing that this established view oversimplifies how people express and perceive emotions. They say that scientists have ignored equally compelling research over the years showing that context and culture affect how we interpret facial expressions, and that we don’t produce them in clearly readable ways. In sum, these researchers are suggesting that happy and sad expressions are not basic, evolutionary responses that take the same form all over the world, but cultural categories that we create from a much more complex emotional reservoir.

Lisa Feldman Barrett, a psychologist at Northeastern University, is one of this growing group of scientists calling for a more nuanced view of the relationship between face and feelings. She argues that while we may carry an archetype of each expression in our minds, in reality we make facial expressions less consistently, and our ability to recognize emotions is not as simple as just reading faces. “It’s not like there’s one signal for anger,” she says. “Yet we feel it and perceive it in others. How can that be?” She believes that recent research shows the answer depends, much more than we’ve come to believe, on context, culture, and the details of the situation. And in trying to force facial expressions into a series of archetypes, science may be missing the full picture: an emotional spectrum that isn’t easily parsed into emoticons.

THE FACE IS an astounding canvas of emotion — look at an actor’s ability to convey emotional depths with a raised eyebrow, a tight smile, or a drop-mouthed stare. Scientists have long speculated about the purpose of such expressions as frowning or smiling. In his 1872 work “The Expression of the Emotions in Man and Animals,” Charles Darwin proposed that facial expressions were universally recognized and that each evolved separately — perhaps originally for some purpose such as avoiding a predator, and eventually simply becoming a habit associated with emotion.

Then, in the 1960s and ’70s, psychologists including Paul Ekman (who is now widely renowned for his work in the field) launched a new wave of interest in the evolutionary origins of facial expressions. When given a set of options, Ekman found, people from different places and cultures, including the isolated people of Papua New Guinea, could correctly name the emotions conveyed in a set of posed facial expressions. This suggested that, in contrast to earlier findings that people’s interpretations of faces depend on context and culture, facial expressions signal a distinct set of basic emotions that are recognizable across cultures. It suggested that these expressions were probably more biological than we had realized, and that emotions could be spotted on the face via certain objective markers: a “Duchenne smile,” in which the muscles around the eyes crinkle, for instance, signals genuine enjoyment and marks a real smile as distinct from a fake one.

This idea made its way into psychology and neuroscience. A complicated facial-coding system was developed to measure the expressions of subjects in psychology studies. The posed, static images used in those seminal studies became tools, a shortcut to understanding emotion. Neuroscientists, for instance, have shown these universal angry or sad faces to subjects in brain scanners to map how emotions are recognized and processed in the brain. Increasingly, expressions were stripped of context — without background, bodies, hair, or obvious gender.

The strength and details of Ekman’s findings have been questioned over the years, most notably by James Russell, a psychologist at Boston College, in the mid-1990s. But the disagreement flared up again in a recent issue of the journal Current Directions in Psychological Science, which highlighted a larger debate about the implications of his work and the scientific paradigm it launched. In that issue, psychologists Azim Shariff and Jessica Tracy detail accumulated evidence that, they argue, makes the case for an evolutionary view of emotional expressions. Some expressions, they say, may have evolved for a physiological purpose — widening the eyes with fright, for instance, to expand our peripheral vision. Others may have evolved as social signals. Meanwhile, in a commentary, Barrett lays out a point-by-point counterargument. While humans evolved to express and interpret emotions, she contends, specific facial expressions are culturally learned.

Barrett believes that the universality of recognizing facial expressions is “an effect that can be easily deconstructed” if, for instance, subjects are asked to give their own label to faces instead of choosing from a set of words. In another recent paper in the same journal, she argues that a growing body of research shows our perception of facial expressions is highly dependent on context: People interpret facial expressions differently depending on the situation, body language, familiarity with a person, and surrounding visual cues. Barrett’s own research has shown that language and vocabulary influence people’s perception of emotions. Others have found cultural differences in how people interpret the facial expressions of others — one study found, for instance, that Japanese people rely more than North Americans do on the expressions of surrounding people to interpret a person’s emotional state.

Takahiko Masuda, a psychologist at the University of Alberta who led that study, says that science needs to move beyond a reductive focus on facial expressions. “Now we really have to pose the question, how do human beings actually understand another person’s emotion, and what information do they use?” To these researchers, context is not something to be stripped away but a critical part of how we interpret emotions. “Clearly humans are smart, and they figure out what other people are feeling based on cues,” says Russell. “But it’s not an automatic signal; it’s a kind of puzzle.” In a movie, for instance, we may feel we can read an actor’s relatively impassive face, but all along we’re helped by the scene, lighting, and music.

Barrett also points to research showing that people don’t consistently produce basic facial expressions in real life. She believes they serve more as symbols and communication tools within cultures and communities. Research shows, for instance, that people are more likely to smile when watching a funny video with another person than when watching it alone, which suggests that smiles are not just spontaneous expressions but social cues.

Linda Camras, an experimental psychologist at DePaul University, has been studying infants to see whether they produce automatic, distinct facial expressions before learning cultural norms. So far, she has found the opposite. While babies have clear positive and negative states — as all parents know, they tend to cry when anything is wrong — they don’t produce distinct expressions for emotions that have been posited to be basic, like anger, sadness, or fear.

THERE IS GREAT practical appeal to the notion that facial expressions objectively correspond to basic emotions: If that’s true, then humans and perhaps even automated systems can be usefully trained to read someone’s inner state. That’s the goal of researchers like Jeffrey Cohn, a psychologist at the University of Pittsburgh who is developing ways to automatically detect facial expressions. His hope is that nuances of expression might help clinicians diagnose depression or measure pain — the scales for which currently rely solely on a patient’s self-assessment.

But it was in looking for just such a tool that Barrett began to doubt that facial expressions are really so distinct and objectively recognizable. Earlier in her career, while studying depression and anxiety, she asked research subjects to write down their feelings throughout the day. Some people could express different emotions clearly, but others reported a jumble: sadness, anger, fear, anxiety. She has since looked for more objective ways to measure emotions — facial expressions, body language, physiological measures, brain imaging — with little success thus far. “If you remove the perceiver and just measure objectively what the person is doing, you don’t find signatures of anger or sadness or fear,” she says. “It turns out there is no objective criterion for any emotion.”

The larger problem, she believes, is that emotions aren’t wired into the categories we give them; they’re more fluid and complex. We may experience emotions the way we perceive color — as a continuous spectrum that we divide into arbitrary categories, shaped by language and culture. Facial expressions reflect this complexity. Most of us don’t cycle between emoticons; our faces contain a shifting portrait of feeling. If we’ve evolved anything, she believes, it’s the ability to make sense of our nuanced emotional states by putting them into categories. And rather than giving us the tools to automatically scan each other’s faces for happiness or pain, the role of science, she says, is to help us discover when such simple emotional categories may mislead us.


Read the original article in The Boston Globe.