Mission

Lawyers, judges, policymakers, and the general public need to understand how rapidly advancing neuroscience bears on ethical and just decisions in the legal system. The Massachusetts General Hospital Center for Law, Brain, and Behavior is an academic and professional resource for education, research, and understanding at the intersection of neuroscience and the law.

When a Gun Is Not a Gun

By Lisa Feldman Barrett and Jolie Wormwood | The New York Times Sunday Review | April 17, 2015

The Justice Department recently analyzed eight years of shootings by Philadelphia police officers. Its report contained two sobering statistics: Fifteen percent of those shot were unarmed, and in half of these cases, an officer reportedly misidentified a “nonthreatening object (e.g., a cellphone) or movement (e.g., tugging at the waistband)” as a weapon.

Many factors presumably contribute to such shootings, ranging from carelessness to unconscious bias to explicit racism, all of which have received considerable attention of late, and deservedly so.

But there is a lesser-known psychological phenomenon that might also explain some of these shootings. It’s called “affective realism”: the tendency of your feelings to influence what you see — not what you think you see, but the actual content of your perceptual experience.

Affective realism illustrates a common misconception about the working of the human brain. In everyday life, your brain seems to be a reactive organ. You stroll past a round red object in the produce section of a supermarket and react by reaching for an apple. A police officer sees a weapon and reacts by raising his gun. Stimulus is followed by response.

But the brain doesn’t really work this way. The brain is a predictive organ. A majority of your brain activity consists of predictions about the world — thousands of them at a time — based on your past experience. These predictions are not deliberate prognostications like “the Red Sox will win the World Series,” but unconscious anticipations of every sight, sound and other sensation you might encounter in every instant. These neural “guesses” largely shape what you see, hear and otherwise perceive.

In every moment, your brain consults its vast stores of knowledge and asks, “The last time I was in a similar situation, what sensations did I encounter and how did I act?” If you’re in a produce section, your brain is already predicting that an apple is nearby. If you are in a part of town with a high crime rate, your brain may well predict a weapon. Only after the fact does your brain check the world to see if its prediction was right.

This process might seem backward compared with your daily experience (or even common sense). But it’s the way the brain is wired to work.

In a forthcoming paper in the journal Nature Reviews Neuroscience, one of us (Professor Barrett), in collaboration with the neuroscientist Kyle Simmons, demonstrates that these predictions originate in networks of neurons that are important for experiencing feelings, or affect (hence “affective realism”). These networks drive sensory neurons to fire before sights, sounds and other sensory information arrive from the world.

In addition, our lab at Northeastern University has conducted experiments to document affective realism. For example, in one study we showed an affectively neutral face to our test subjects, and using special equipment, we secretly accompanied it with a smiling or scowling face that the subjects could not consciously see. (The technique is called “continuous flash suppression.”) We found that the unseen faces influenced the subjects’ bodily activity (e.g., how fast their hearts beat) and their feelings. These in turn influenced their perceptions: In the presence of an unseen scowling face, our subjects felt unpleasant and perceived the neutral face as less likable, less trustworthy, less competent, less attractive and more likely to commit a crime than when we paired it with an unseen smiling face.

These weren’t just impressions; they were actual visual changes. The test subjects saw the neutral faces as having a more furrowed brow, a more surly mouth and so on. (Some of these findings were published in Emotion in 2012.)

In a dangerous, high-pressure situation such as a possible crime scene, it’s conceivable that some police shooters actually see a weapon when none is present. Other research from our lab (done by Professor Wormwood in collaboration with the psychologist David DeSteno) supports this possibility. In a set of five studies published in 2010 in the Journal of Personality and Social Psychology, we asked test subjects to identify whether individuals were holding guns or harmless objects like wallets and cellphones. Our subjects reported more weapons when they were feeling angry, regardless of what the individuals were actually holding.

Let us reiterate: We are not claiming that affective realism is the preferred explanation for police shootings that involve the misidentification of weapons. Nor are we claiming that racial bias has had nothing to do with such shootings. Indeed, affective realism may be one pernicious way in which racial bias expresses itself.

What we do know is that the brain is wired for prediction, and you predict most of the sights, sounds and other sensations in your life. You are, in large measure, the architect of your own experience.


Lisa Feldman Barrett is Professor of Psychology at Northeastern University and a CLBB Faculty member. Jolie Wormwood is a postdoctoral research associate in Lisa Feldman Barrett’s lab.