BBC News, February 4, 2013
A study of Google searches has found “significant discrimination” in advert results depending on the perceived race of names searched for.
Harvard professor Latanya Sweeney said names typically associated with black people were more likely to produce ads related to criminal activity.
In her paper, Prof Sweeney suggested that Google searches may expose “racial bias in society”.
Google has said it “does not conduct any racial profiling”.
In a statement to the BBC, the company said: “We also have an ‘anti’ and violence policy which states that we will not allow ads that advocate against an organisation, person or group of people.”
When placing ads with Google, companies are able to specify which keywords they would like to target.
“It is up to individual advertisers to decide which keywords they want to choose to trigger their ads,” the search giant said.
The study analysed the type of advertisements that appeared on Google when certain names were searched for.
It looked at Google.com’s core search engine, as well as the search function of Reuters.com — which also displays Google’s advertising.
Prof Sweeney’s investigation suggests that names linked with black people — as defined by a previous study into racial discrimination in the workplace — were 25% more likely to yield results prompting the searcher to click through to a criminal-record search.
She found that names like Leroy, Kareem and Keisha would yield advertisements that read “Arrested?”, with a link to a website which could perform criminal record checks.
Searches for names such as Brad, Luke and Katie would not; they were instead more likely to offer websites providing general contact details.
“There is discrimination in the delivery of these ads,” concluded Prof Sweeney, adding that there was less than a 1% probability that the pattern was due to chance.
“Alongside news stories about high school athletes and children can be ads bearing the child’s name and suggesting arrest. This seems concerning on many levels.”
However, she was reluctant to pinpoint a cause for the discrepancies, saying that to do so required “further information about the inner workings of Google AdSense”.
She noted that one possible cause may be Google’s “smart” algorithms — technology which automatically adapts advertising placement based on mass-user habits.
In other words, it may be that the search engines are reflecting society’s own prejudices — as the advertising results Google serves up are often based on the most popular links previous users have clicked on.
“Over time, as people tend to click one version of ad text over others, the weights change,” Prof Sweeney explained.
“So the ad text getting the most clicks eventually displays more frequently.”
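The feedback loop Prof Sweeney describes can be illustrated with a minimal sketch. This is not Google's actual algorithm, whose inner workings the article notes are undisclosed; it is a hypothetical proportional click-weighting scheme, with made-up ad texts and weights, showing how early clicks on one variant compound into that variant dominating future impressions.

```python
import random

def choose_ad(weights):
    """Pick an ad variant with probability proportional to its weight."""
    total = sum(weights.values())
    r = random.uniform(0, total)
    cumulative = 0.0
    for ad, weight in weights.items():
        cumulative += weight
        if r <= cumulative:
            return ad
    return ad  # fallback for floating-point edge cases

def record_click(weights, ad, boost=1.0):
    """Increase the clicked variant's weight, so it is shown more often."""
    weights[ad] += boost

# Two hypothetical ad texts competing for the same name keyword,
# starting with equal weights.
weights = {"Arrested? Check records": 1.0, "Find contact details": 1.0}

# If users consistently click one version, its share of impressions grows:
for _ in range(100):
    record_click(weights, "Arrested? Check records")

share = weights["Arrested? Check records"] / sum(weights.values())
```

After 100 clicks on one variant, its selection probability rises from 50% to roughly 99%, which is the "weights change" dynamic Prof Sweeney describes: the system amplifies whatever past users preferred, with no judgement about why they preferred it.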
She argued that technology should be used to counteract this effect.
“In the broader picture, technology can do more to thwart discriminatory effects and harmonise with societal norms.
“Ads responding to name searches appear in a specific information context and technology controls that context.”