Posted on September 7, 2021

Facebook Apologizes After A.I. Puts ‘Primates’ Label on Video of Black Men

Ryan Mac, New York Times, September 3, 2021

Facebook users who recently watched a video from a British tabloid featuring Black men saw an automated prompt from the social network asking if they would like to “keep seeing videos about Primates,” prompting the company to investigate and disable the artificial-intelligence-powered feature that pushed the message.

On Friday, Facebook apologized for what it called “an unacceptable error” and said it was looking into the recommendation feature to “prevent this from happening again.”

The video, dated June 27, 2020, was posted by The Daily Mail and featured clips of Black men in altercations with white civilians and police officers. It had no connection to monkeys or primates.

Darci Groves, a former content design manager at Facebook, said a friend had recently sent her a screenshot of the prompt. She then posted it to a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called it “unacceptable” and said the company was “looking into the root cause.”

{snip}

Google, Amazon and other technology companies have been under scrutiny for years for biases within their artificial intelligence systems, particularly around issues of race. Studies have shown that facial recognition technology is biased against people of color and has more trouble identifying them {snip}

In one example in 2015, Google Photos mistakenly labeled pictures of Black people as “gorillas,” for which Google said it was “genuinely sorry” and would work to fix the issue immediately. More than two years later, Wired found that Google’s solution was to censor the word “gorilla” from searches, while also blocking “chimp,” “chimpanzee” and “monkey.”

{snip}