


Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do

April 02, 2019


Jennifer L. Eberhardt reminds us that "racial bias is a human problem—one all people can play a role in solving," even and especially those who design business and social media platforms.

What if one of the keys to overcoming—or at least managing—the unconscious bias that lurks within us is to just slow down? In their pioneering work on heuristics and bias, Daniel Kahneman and Amos Tversky found that it is our fast, instinctive, and emotional modes of thought (what Kahneman termed our "System 1" thinking in Thinking, Fast and Slow) that are most prone to error. 

Professor Jennifer Eberhardt is a pioneering academic in her own right, but as the cofounder and co-director of SPARQ (Social Psychological Answers to Real-World Questions), she is most interested in bringing the understanding she and others have uncovered on prejudice and racial bias to bear on addressing social issues.

In the Conclusion to her new book, Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do, Eberhardt writes about how a change of policy in the Oakland Police Department gave officers more time and space to make logical, conscious decisions rather than impulsive, reactionary ones, lowering both the rate of officer-involved shootings and the rate of officer injuries. She spoke about the change in a recent interview on NPR:

 

They decided to change their foot-pursuit policies. Instead of chasing someone into a sort of a dark backyard or into a place where you couldn't see where they were and you know it was hard to get out—they were told not to do that—instead, to step back and set up a perimeter and call for backup.

So here's a situation, right, where you're giving officers more distance on the situation. You're giving them more time, options. You're allowing them to work through what the strategy should be and so forth. … Oakland used to have eight to nine officer-involved shootings every single year, but after they adopted this new foot-pursuit policy, they had eight officer-involved shootings across five years. …

Before they changed the foot-pursuit policy, you're placing officers in a situation where bias is more likely to take hold and to affect their decision-making, and you're putting them instead in a situation where they can slow it down and think it through and have the resources available to them to deal with it, and so they are less likely to have bias affect what they do.

 

Most of us are unlikely to face such life-and-death situations at work or in our daily lives, but that doesn't mean we are free from bias, or from the real-world consequences of allowing the systems we create (ever faster, more automatic, and algorithmic) to perpetuate it.

Another real-world example Eberhardt discusses in the book is that of Nextdoor, "an online social networking service that serves as a sort of giant chat room for individual neighborhoods." The company was having a problem with the “crime and safety” category on the site: posts labeling blacks and Latinos "suspicious" when their behavior was routine, treating them as threatening just for existing in the neighborhood. "Instead of bringing neighbors closer together," writes Eberhardt, "the platform exposed raw racial dynamics that generated hurt feelings, sparked hostilities, and fueled fierce online arguments."

She went to the company's San Francisco campus to meet Sarah Leary, one of the founders of Nextdoor, to discuss the problem and possible solutions. Our friends at Viking were kind enough to share an excerpt from the book about what happened next.

 

◊◊◊◊◊

 

Nextdoor needed to find a way to dial back the hair-trigger impulse that makes skin color alone arouse suspicion. Her team wanted to educate, not shame or alienate, users who’d stumbled into trouble with awkward or insensitive postings. She found possible solutions in studies showing that bias is most likely to surface in situations where we’re fearful and moving fast. I visited her office to share my expertise on the subject.

Speed is the holy grail of technology. Most tech products are created with the aim of reducing friction and guiding us through a process rapidly and intuitively. But the very thing that makes technology so convenient also makes it perilous where race and safety are concerned. The goal is to create an online experience for users that’s easy, quick, and fluid, allowing them to express themselves instantly. Yet these are exactly the kinds of conditions that lead us to rely on subconscious bias.

To curb racial profiling on the platform, they had to contemplate slowing people down. That meant adding steps to the process of posting about “suspicious people” but not making things so cumbersome that users dropped out. They needed something that would force people to look past the broad category of race and think about specific characteristics. So they developed a checklist of reminders that people have to click through before they can post under the banner of “suspicious person”:

Focus on behavior. What was the person doing that concerned you, and how does it relate to a possible crime?

Give a full description, including clothing, to distinguish between similar people. Consider unintended consequences if the description is so vague that an innocent person could be targeted.

Don’t assume criminality based on someone’s race or ethnicity. Racial profiling is expressly prohibited.

Research supports the notion that raising the issue of race and discrimination explicitly can lead people to be more open-minded and act more fairly, particularly when they have time to reflect on their choices.

The posting process was changed to require users to home in on behavior, pushing them past the “If you see something, say something” mind-set and forcing them to think more critically: if you see something suspicious, say something specific.

Adding friction to the process slowed things down a bit, but it did not lead to the huge drop-off in users that industry observers had predicted. What it did do was reduce the incidence of racial profiling: Nextdoor’s tracking suggests it is down by more than 75 percent. They’ve even adapted the process for international use, with customized filters for European countries, based on their mix of ethnic, racial, and religious tensions.

The approach offers benefits beyond reducing neighborhood animosity. That friction and the awareness it generates may make people more willing and better equipped to think and talk frankly about race. Conversations about racial issues in interracial spaces can be uncomfortable. It’s no wonder people tend to avoid them. Integration is hard work, and threat looms over the process. White people don’t want to have to worry that something they say will come out wrong and they’ll be accused of being racist. And minorities, on the other side of the divide, don’t want to have to wonder if they’re going to be insulted by some tone-deaf remark. The interaction required to move past stereotypes takes energy, commitment, and a willingness to let big uncomfortable issues intrude on intimate spaces—your home and your neighborhood.

Research shows that talking about racial issues with people of other races is particularly stressful for whites, who may feel they have to work harder to sidestep the minefields. Their physical signs of distress are measurable: heart rates go up, blood vessels constrict, and their bodies respond as if they were preparing for a threat. They demonstrate signs of cognitive depletion, struggling with simple things like word-recognition tasks.

Even thinking about talking about race can be emotionally demanding. In a study of how white people arranged the physical space when they knew they’d be in conversation with blacks, the arrangements varied based on the subject of those chats. When the study participants were told they’d be talking in small groups about love and relationships, they set the chairs close to one another. When they were told the topic was racial profiling, they put the chairs much farther apart.

Nextdoor can’t make the angst go away. But benefits accrue from nudging people to talk about race and consider the harm a thoughtless judgment can do. “What I have found is that this can be a personal journey,” Sarah said. “When you raise the issue with people, at first there might be a little bit of ‘Oh, come on.’ And then you explain and you get ‘Oh yeah, that makes sense.’ I think right now most people believe ‘I can only screw this up, so maybe I shouldn’t have that conversation.’ But if people believed that having the conversations actually led to better understanding, they’d be more willing.”    

She saw that happen in Oakland, when people came together to talk about their distress over racially biased posts. “I think people just get closed off, and they try to simplify the world with simple assumptions to get through their day,” she said. “But there’s a whole canopy of examples of people’s lives that are maybe more similar to yours than you assume. When you have direct connections with people who are different from you, then you develop an ability to recognize that.” So the scary black teenager in the hoodie in the dark turns out to be Jake from down the block, walking home from swim team practice.

The beauty of Nextdoor’s template is that it catches people before they’ve done anything wrong. “We try and be very mindful of going through the process of assuming good intent,” Sarah explained. “I think where it actually gets embarrassing for people is when they had good intentions and they put something out there, and they thought they were helping the neighborhood, and someone comes back and is like, ‘You’re a racist.’”

The tool gets users to stop and think before they post something that will land them in heated arguments with neighbors. Because once the comment is out there, it’s hard to dial things back. It’s not a productive conversation if one person is outraged over being labeled a racist and the other is feeling aggrieved about always having to be the person waving the flag and saying, “Do you realize what you just did?” When there’s more thoughtfulness and less defensiveness, honest conversations about race are possible.

Ultimately, we see our neighborhoods as an extension of our homes. And home is the place where you let your guard down; where you expect to feel loved, safe, and comfortable. But living with diversity means getting comfortable with people who might not always think like you, people who don’t have the same experience or perspectives. That process can be challenging. But it might also be an opportunity to expand your horizons and examine your own buried bias.

 

Excerpted from Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do by Jennifer L. Eberhardt.
Published by Viking, an imprint of Penguin Random House LLC.
Copyright 2019 by Jennifer L. Eberhardt.
All rights reserved.

 

ABOUT THE AUTHOR

Dr. Jennifer Eberhardt is a professor of psychology at Stanford and a recipient of a 2014 MacArthur “genius” grant. She has been elected to the National Academy of Sciences, the American Academy of Arts and Sciences, and was named one of Foreign Policy’s 100 Leading Global Thinkers. She is co-founder and co-director of SPARQ (Social Psychological Answers to Real-World Questions), a Stanford Center that brings together researchers and practitioners to address significant social problems.
