Posted on April 11, 2019

Can Algorithms Reduce Racial Bias Rather Than Embed It?

Steve Dubb, Nonprofit Quarterly, April 2, 2019

In an excerpt published in Next City from her book BIASED: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do, Stanford social psychologist (and MacArthur Fellow) Jennifer Eberhardt delves into the role of implicit bias in perpetuating segregation and racial discrimination. More than half of whites, Eberhardt explains, say they would not move to an area that is more than 30 percent Black, because they “believe that the housing stock would not be well maintained and crime would be high.”

More broadly, Eberhardt writes, studies by sociologists Lincoln Quillian and the late Devah Pager show that “the more Blacks there are in a community, the higher people imagine the crime rate to be — regardless of whether statistics bear that out.” {snip} Eberhardt adds that “Black people are just as likely as whites to expect signs of disorder in heavily Black neighborhoods.”

As NPQ has noted, technology often embeds these biases in algorithms. {snip}

That said, as NPQ’s Jeanne Allen has pointed out, algorithms can be used to mitigate biases if there is “an intentional focus when developing, buying, or adapting data systems.”

In her excerpt, Eberhardt profiles one such example from Nextdoor, a social network that aims to help people “feel comfortable connecting with neighbors they’ve never met.” Often the network is used for everyday needs, such as finding a lost dog or getting babysitter recommendations. But the firm found that the software was also being used to “warn” neighbors about “a stranger who seems out of sync with the prevailing demographic.”

The firm therefore changed its posting process to discourage such use of the platform. As Eberhardt explains, “The posting process was changed to require users to home in on behavior, pushing them past the ‘If you see something, say something’ mindset and forcing them to think more critically: if you see something suspicious, say something specific.”

To curb racial profiling, the firm, Eberhardt explains, developed a checklist of reminders that users have to click through before they can post under the banner of “suspicious person.” {snip}
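Eberhardt describes the checklist only at the product level. As a purely hypothetical sketch of the kind of gating logic involved (none of the names, reminder texts, or thresholds below come from Nextdoor), the flow might look something like this in Python:

```python
# Hypothetical sketch of checklist gating; not Nextdoor's actual code.
# Assumed rule: a "suspicious person" post goes through only after every
# reminder is acknowledged and the user describes specific behavior.

REMINDERS = [
    "Focus on behavior: what was the person doing?",
    "Racial profiling is not allowed.",
    "Describe actions, not appearance alone.",
]

def can_submit(acknowledged: set[int], description: str) -> bool:
    """Return True only if all reminders were clicked through and the
    description is specific enough (a crude word-count proxy here)."""
    all_clicked = acknowledged == set(range(len(REMINDERS)))
    is_specific = len(description.split()) >= 5
    return all_clicked and is_specific

# A vague report is blocked; a specific behavioral report is allowed.
assert not can_submit({0, 1, 2}, "stranger on our street")
assert can_submit({0, 1, 2}, "person trying car door handles on Elm Street")
```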

{snip}

The idea behind the prompts is to slow down the rash, sometimes unconscious thinking that leads to racial profiling. The prompts did not eliminate profiling, but the firm says it fell by more than 75 percent. Eberhardt adds that “They’ve even adapted the process for international use, with customized filters for European countries, based on their mix of ethnic, racial, and religious tensions.”

{snip}