Posted on November 25, 2019

How Google Interferes With Its Search Algorithms and Changes Your Results

Wall Street Journal, Kirsten Grind, Sam Schechner, Robert McMillan, and John West, November 15, 2019

Google CEO Sundar Pichai (Credit Image: © Imago via ZUMA Press)

Every minute, an estimated 3.8 million queries are typed into Google, prompting its algorithms to spit out results for hotel rates or breast-cancer treatments or the latest news about President Trump.

They are arguably the most powerful lines of computer code in the global economy, controlling how much of the world accesses information found on the internet and serving as the starting point for billions of dollars of commerce.

Twenty years ago, Google’s founders began building a goliath on the premise that its search algorithms could do a better job combing the web for useful information than humans could. Google executives have said repeatedly—in private meetings with outside groups and in congressional testimony—that the algorithms are objective and essentially autonomous, unsullied by human biases or business considerations.

The company states in a Google blog, “We do not use human curation to collect or arrange the results on a page.” It says it can’t divulge details about how the algorithms work because the company is involved in a long-running and high-stakes battle with those who want to profit by gaming the system.

But that message often clashes with what happens behind the scenes. Over time, Google has increasingly re-engineered and interfered with search results to a far greater degree than the company and its executives have acknowledged, a Wall Street Journal investigation has found.

Those actions often come in response to pressure from businesses, outside interest groups and governments around the world. They have increased sharply since the 2016 election and the rise of online misinformation, the Journal found.

Google’s evolving approach marks a shift from its founding philosophy of “organizing the world’s information,” to one that is far more active in deciding how that information should appear.

More than 100 interviews and the Journal’s own testing of Google’s search results reveal:

• Google made algorithmic changes to its search results that favor big businesses over smaller ones, and in at least one case made changes on behalf of a major advertiser, eBay Inc., contrary to its public position that it never takes that type of action. The company also boosts some major websites, such as Amazon.com Inc. and Facebook Inc., according to people familiar with the matter.

• Google engineers regularly make behind-the-scenes adjustments to other information the company is increasingly layering on top of its basic search results. These features include auto-complete suggestions, boxes called “knowledge panels” and “featured snippets,” and news results, which aren’t subject to the same company policies limiting what engineers can remove or change.

• Despite publicly denying doing so, Google keeps blacklists to remove certain sites or prevent others from surfacing in certain types of results. These moves are separate from those that block sites as required by U.S. or foreign law, such as those featuring child abuse or with copyright infringement, and from changes designed to demote spam sites, which attempt to game the system to appear higher in results.

• In auto-complete, the feature that predicts search terms as the user types a query, Google’s engineers have created algorithms and blacklists to weed out more-incendiary suggestions for controversial subjects, such as abortion or immigration, in effect filtering out inflammatory results on high-profile topics.

• Google employees and executives, including co-founders Larry Page and Sergey Brin, have disagreed over whether and how much to intervene in search results. Employees can push for revisions in specific search results, including on topics such as vaccinations and autism.

• To evaluate its search results, Google employs thousands of low-paid contractors whose purpose, the company says, is to assess the quality of the algorithms’ rankings. Even so, Google gave these workers feedback conveying what it considered to be the correct ranking of results, and they revised their assessments accordingly, according to contractors interviewed by the Journal. The contractors’ collective evaluations are then used to adjust algorithms.

The Journal’s findings undercut one of Google’s core defenses against global regulators worried about how it wields its immense power—that the company doesn’t exert editorial control over what it shows users. Regulators’ areas of concern include anticompetitive practices, political bias and online misinformation.

Far from being autonomous computer programs oblivious to outside pressure, Google’s algorithms are subject to regular tinkering from executives and engineers who are trying to deliver relevant search results while also pleasing a wide variety of powerful interests and driving its parent company’s more than $30 billion in annual profit. Google is now the most highly trafficked website in the world, and it holds more than 90% of the market share among search engines. The market capitalization of its parent, Alphabet Inc., is more than $900 billion.

Google made more than 3,200 changes to its algorithms in 2018, up from more than 2,400 in 2017 and from about 500 in 2010, according to Google and a person familiar with the matter. Google said 15% of queries today are for words, or combinations of words, that the company has never seen before, putting more demands on engineers to make sure the algorithms deliver useful results.

A Google spokeswoman disputed the Journal’s conclusions, saying, “We do today what we have done all along, provide relevant results from the most reliable sources available.”

Lara Levin, the spokeswoman, said the company is transparent in its guidelines for evaluators and in what it designs the algorithms to do.

As part of its examination, the Journal tested Google’s search results over several weeks this summer and compared them with results from two competing search engines, Microsoft Corp.’s Bing and DuckDuckGo, a privacy-focused company that builds its results from syndicated feeds from other companies, including Verizon Communications Inc.’s Yahoo search engine.

The testing showed wide discrepancies in how Google handled auto-complete queries and some of what Google calls organic search results—the list of websites that Google says are algorithmically sorted by relevance in response to a user’s query. (Read about the methodology for the Journal’s analysis.)

Ms. Levin, the Google spokeswoman, declined to comment on specific results of the Journal’s testing. In general, she said, “Our systems aim to provide relevant results from authoritative sources,” adding that organic search results alone “are not representative of the information made accessible via search.”

The Journal tested the auto-complete feature, which Google says draws from its vast database of search information to predict what a user intends to type, as well as data such as a user’s location and search history. The testing showed the extent to which Google withholds suggestions that other search engines offer.
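Predictions of this kind are generally built by matching the typed prefix against a log of past queries and ranking the candidates by frequency. Below is a minimal sketch of that idea in Python; the queries and counts are invented for illustration, and real systems also weigh signals such as location and a user’s search history, as noted above.

```python
# A toy prefix-based predictor over a query log -- the broad idea behind
# auto-complete. The queries and counts are invented for illustration.
QUERY_LOG = {
    "weather today": 1200,
    "weather tomorrow": 800,
    "weather radar": 500,
}

def autocomplete(prefix: str, k: int = 3) -> list[str]:
    """Return the k most frequent logged queries that start with the prefix."""
    matches = [q for q in QUERY_LOG if q.startswith(prefix)]
    return sorted(matches, key=QUERY_LOG.get, reverse=True)[:k]

print(autocomplete("weather"))
# -> ['weather today', 'weather tomorrow', 'weather radar']
```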

When the Journal typed “Joe Biden is” or “Donald Trump is” into the search box, Google’s auto-complete offered predicted language that was more innocuous than what the other search engines suggested. Similar differences appeared for other presidential candidates tested by the Journal.

The Journal also tested several search terms in auto-complete such as “immigrants are” and “abortion is.” Google’s predicted searches were less inflammatory than those of the other engines.

Gabriel Weinberg, DuckDuckGo’s chief executive, said that for certain words or phrases entered into the search box, such as ones that might be offensive, DuckDuckGo has decided to block all of its auto-complete suggestions, which it licenses from Yahoo. He said that type of block wasn’t triggered in the Journal’s searches for Donald Trump or Joe Biden.

A spokeswoman for Yahoo operator Verizon Media said, “We are committed to delivering a safe and trustworthy search experience to our users and partners, and we work diligently to ensure that search suggestions within Yahoo Search reflect that commitment.”

{snip}

In other areas of the Journal analysis, Google’s results in organic search and news for a number of hot-button terms and politicians’ names showed prominent representation of both conservative and liberal news outlets.

Algorithms are effectively recipes in code form, providing step-by-step instructions for how computers should solve certain problems. They drive not just the internet, but the apps that populate phones and tablets.

Algorithms determine which friends show up in a Facebook user’s news feed, which Twitter posts are most likely to go viral and how much an Uber ride should cost during rush hour as opposed to the middle of the night. They are used by banks to screen loan applications, businesses to look for the best job applicants and insurers to determine a person’s expected lifespan.
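To make the “recipe” framing concrete, here is a toy version of the rush-hour pricing rule mentioned above, written in Python. The hours and multiplier are invented for this sketch and are not any company’s actual logic.

```python
# Illustrative only: a step-by-step pricing "recipe" of the kind the
# article describes. The thresholds and multiplier are invented.
def ride_price(base_fare: float, hour_of_day: int) -> float:
    """Charge a surge multiplier during rush hour, the base fare otherwise."""
    rush_hours = set(range(7, 10)) | set(range(16, 19))  # 7-9am, 4-6pm
    multiplier = 1.5 if hour_of_day in rush_hours else 1.0
    return round(base_fare * multiplier, 2)

print(ride_price(10.00, 8))  # rush hour -> 15.0
print(ride_price(10.00, 2))  # middle of the night -> 10.0
```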

In the beginning, their power was rarely questioned. Google in particular ranked web content with innovative algorithms that were groundbreaking, and hugely lucrative. The company aimed to make the web useful while relying on the assumption that code alone could do the heavy lifting of figuring out how to rank information.

But bad actors are increasingly trying to manipulate search results, businesses are trying to game the system and misinformation is rampant across tech platforms. Google found itself facing a version of the pressures on Facebook, which long said it was just connecting people but has been forced to more aggressively police content on its platform.

A 2016 internal investigation at Google showed that between a tenth of a percent and a quarter of a percent of search queries were returning misinformation of some kind, according to one Google executive who works on search. That is a small share in percentage terms, but given the huge volume of Google searches, it would amount to nearly two billion searches a year.
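The two-billion figure squares with the query volume cited at the top of this article: at roughly 3.8 million queries a minute, even the low end of that range works out to about two billion searches a year.

```python
# Back-of-the-envelope check of the "nearly two billion" figure, using the
# 3.8 million queries per minute cited at the top of the article.
queries_per_minute = 3.8e6
queries_per_year = queries_per_minute * 60 * 24 * 365  # ~2.0 trillion
low_end = queries_per_year * 0.001                     # a tenth of a percent
print(f"{queries_per_year:,.0f} queries per year")     # 1,997,280,000,000
print(f"{low_end:,.0f} affected at the low end")       # 1,997,280,000 -- ~2 billion
```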

{snip}

Google’s Ms. Levin said the number includes not just misinformation but also a “wide range of other content defined as lowest quality.” She disputed the Journal’s estimate of the number of searches that were affected. The company doesn’t disclose metrics on Google searches.

Google assembled a small SWAT team to work on the problem that became known internally as “Project Owl.” Borrowing from the strategy used earlier to fight spam, engineers worked to emphasize factors on a page that are proxies for “authoritativeness,” effectively pushing down pages that don’t display those attributes.
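Google hasn’t disclosed which signals Project Owl used, but the general technique described here, boosting pages that display proxies for authoritativeness so that pages lacking them sink, can be sketched roughly as follows. The signal names and weights below are assumptions invented for illustration, not Google’s actual factors.

```python
# A hypothetical sketch of authoritativeness-weighted ranking; the signals
# and weights are invented and are not Google's actual factors.
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    relevance: float           # base topical relevance score
    cites_sources: bool        # invented proxies for "authoritativeness"
    established_domain: bool

def adjusted_score(p: Page) -> float:
    score = p.relevance
    if p.cites_sources:
        score *= 1.2           # boost pages showing the proxy...
    if p.established_domain:
        score *= 1.2           # ...which effectively demotes pages without it
    return score

pages = [
    Page("fringe.example", relevance=0.9, cites_sources=False, established_domain=False),
    Page("encyclopedia.example", relevance=0.8, cites_sources=True, established_domain=True),
]
for p in sorted(pages, key=adjusted_score, reverse=True):
    print(p.url, round(adjusted_score(p), 3))
# encyclopedia.example (1.152) now outranks the nominally more relevant fringe.example (0.9)
```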

{snip}

One Google search executive described the problem of defining misinformation as incredibly hard, and said the company didn’t want to go down the path of figuring it out.

Around the time Google started addressing issues such as misinformation, it started fielding even more complaints, to the point where human interference became more routine, according to people familiar with the matter, putting it in the position of arbitrating some of society’s most complicated issues. Some changes to search results might be considered reasonable—boosting trusted websites like the National Suicide Prevention Lifeline, for example—but Google has made little disclosure about when changes are made, or why.

{snip}

The U.S. Justice Department earlier this year opened an antitrust probe, in which Google’s search policies and practices are expected to be areas of focus. Google executives have twice been called to testify before Congress in the past year over concerns about political bias. In the European Union, Google has been fined more than $9 billion in the past three years for anticompetitive practices, including allegedly using its search engine to favor its own products.

In response, Google has said it faces tough competition in a dynamic tech sector, and that its behavior is aimed at helping create choice for consumers, not hurting rivals. The company is currently appealing the decisions against it in the EU, and it has denied claims of political bias.

Google rarely releases detailed information on algorithm changes, and its moves have bedeviled companies and interest groups, who feel they are operating at the tech giant’s whim.

In one change hotly contested within Google, engineers opted to tilt results to favor prominent businesses over smaller ones, based on the argument that customers were more likely to get what they wanted at larger outlets. One effect of the change was a boost to Amazon’s products, even if the items had been discontinued, according to people familiar with the matter.

{snip}

Google engineers said it is widely acknowledged within the company that search is a zero-sum game: A change that helps lift one result inevitably pushes down another, often with considerable impact on the businesses involved.

Ms. Levin said there is no guidance in Google’s rater guidelines that suggests big sites are inherently more authoritative than small sites. “It’s inaccurate to suggest we did not address issues like discontinued products appearing high up in results,” she added.

Many of the changes within Google have coincided with its gradual evolution from a company with an engineering-focused, almost academic culture into an advertising behemoth and one of the most profitable companies in the world. Advertising revenue—which includes ads on search as well as on other products such as maps and YouTube—was $116.3 billion last year.

Some very big advertisers received direct advice on how to improve their organic search results, a perk not available to businesses with no contacts at Google, according to people familiar with the matter. In some cases, that help included sending in search engineers to explain a problem, they said.

{snip}

Ms. Levin, the Google spokeswoman, said the search team’s practice is to not provide specialized guidance to website owners. She also said that faster indexing of a site isn’t a guarantee that it will rank higher. “We prioritize issues based on impact, not any commercial relationships,” she said.

{snip}

(The Wall Street Journal is owned by News Corp, which has complained publicly about Google’s moves to play down news sites that charge for subscriptions. Google ended the policy after intensive lobbying by News Corp and other paywalled publishers. More recently, News Corp has called for an “algorithm review board” to oversee Google, Facebook and other tech giants. News Corp has a commercial agreement to supply news through Facebook, and Dow Jones & Co., publisher of The Wall Street Journal, has a commercial agreement to supply news through Apple services. Google’s Ms. Levin and News Corp declined to comment.)

{snip}

Google’s Ms. Levin said “extreme transparency has historically proven to empower bad actors in a way that hurts our users and website owners who play by the rules.”

“Building a service like this means making tens of thousands of really, really complicated human decisions, and that’s not what people think,” said John Bowers, a research associate at the Berkman Klein Center.

At one extreme, those decisions at Google are made by the world’s most accomplished and highest-paid engineers, whose job is to turn the dials within millions of lines of complex code. At the other is an army of more than 10,000 contract workers, who work from home and get paid by the hour to evaluate search results.

{snip}

One of those evaluators was Zack Langley, now a 27-year-old logistics manager at a tour company in New Orleans. Mr. Langley got a one-year contract in the spring of 2016 evaluating Google’s search results through Lionbridge Technologies Inc., one of several companies Google and other tech platforms use for contract work.

{snip}

He said contractors would get notes from Lionbridge that he believed came from Google telling them the “correct” results on other searches.

He said that in late 2016, as the election approached, Google officials got more involved in dictating the best results, although not necessarily on issues related to the campaign. “They used to have a hands-off approach, and then it seemed to change,” he said.

{snip}

One of the first hot-button issues surfaced in 2015, according to people familiar with the matter, when some employees complained that a search for “how do vaccines cause autism” delivered misinformation through sites that oppose vaccinations.

At least one employee defended the result, writing that Google should “let the algorithms decide” what shows up, according to one person familiar with the matter. Instead, the people said, Google made a change so that the first result is a site called howdovaccinescauseautism.com—which states on its home page in large black letters, “They f—ing don’t.” (The phrase has become a meme within Google.)

Google’s Ms. Levin declined to comment.

{snip}

Google has guidelines for changing its ranking algorithms, a grueling process called the “launch committee.” Google executives have pointed to this process in a general way in congressional testimony when asked about algorithm changes.

{snip}

Google’s Ms. Levin said not every algorithm change is discussed in a meeting, but “there are other processes for reviewing more straightforward launches at different levels of the organization,” such as an email review.

{snip}

When Pinterest Inc. filed to go public earlier this year, it said that “search engines, such as Google, may modify their algorithms and policies or enforce those policies in ways that are detrimental to us.” It added: “Our ability to appeal these actions is limited.”

{snip}

In April, the conservative Heritage Foundation called Google to complain that a coming movie called “Unplanned” had been labeled in a knowledge panel as “propaganda,” according to a person familiar with the matter. The film is about a former Planned Parenthood director who had a change of heart and became pro-life.

After the Heritage Foundation complained to a contact at Google, the company apologized and removed “propaganda” from the description, that person said.

Google’s Ms. Levin said the change “was not the result of pressure from an outside group, it was a violation of the feature’s policy.”

On the auto-complete feature, Google reached a confidential settlement in France in 2012 with several outside groups that had complained it was anti-Semitic that Google was suggesting the French word for “Jew” when searchers typed in the name of several prominent politicians. Google agreed to “algorithmically mitigate” such suggestions as part of a pact that barred the parties from disclosing its terms, according to people familiar with the matter.

{snip}

Google still maintains lists of phrases and terms that are manually blacklisted from auto-complete, according to people familiar with the matter.

The company internally has a “clearly articulated set of policies” about what terms or phrases might be blacklisted in auto-complete, and it follows those rules, according to a person familiar with the matter.

Blacklists also affect the results in organic search and Google News, as well as other search products, such as Web answers and knowledge panels, according to people familiar with the matter.
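None of those lists or their matching rules are public, but the mechanism described, screening candidate suggestions or results against a manually maintained list, can be sketched roughly as follows. The phrases and matching logic are invented for illustration.

```python
# A hypothetical sketch of applying a manual blacklist to auto-complete
# candidates; the phrases and substring rule are invented.
BLACKLISTED_PHRASES = {"blocked phrase one", "blocked phrase two"}

def filter_suggestions(candidates: list[str]) -> list[str]:
    """Drop any predicted completion that contains a blacklisted phrase."""
    return [
        s for s in candidates
        if not any(phrase in s.lower() for phrase in BLACKLISTED_PHRASES)
    ]

print(filter_suggestions(["harmless query", "query with blocked phrase one"]))
# -> ['harmless query']
```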

Google has said in congressional testimony it doesn’t use blacklists. Asked in a 2018 hearing whether Google had ever blacklisted a “company, group, individual or outlet…for political reasons,” Karan Bhatia, Google’s vice president of public policy, responded: “No, ma’am, we don’t use blacklists/whitelists to influence our search results,” according to the transcript.

Ms. Levin said those statements were related to blacklists targeting political groups, which she said the company doesn’t keep.

{snip}

The Journal reviewed a draft policy document from August 2018 that outlines how Google employees should implement an anti-misinformation blacklist aimed at blocking certain publishers from appearing in Google News and other search products. The document says engineers should focus on “a publisher misrepresenting their ownership or web properties” and having “deceptive content”—that is, sites that actively aim to mislead—as opposed to those that have inaccurate content.

{snip}

Some individuals and companies said changes made by the company seem ad hoc, or inconsistent. People familiar with the matter said Google increasingly will make manual or algorithmic changes that aren’t acknowledged publicly in order to maintain that it isn’t affected by outside pressure.

“It’s very convenient for us to say that the algorithms make all the decisions,” said one former Google executive.

In March 2017, Google updated the guidelines it gives contractors who evaluate search results, instructing them for the first time to give low-quality ratings to sites “created with the sole purpose of promoting hate or violence against a group of people”—something that would help adjust Google algorithms to lower those sites in search.

The next year, the company broadened the guidance to any pages that promote such hate or violence, even if it isn’t the page’s sole purpose and even if it is “expressed in polite or even academic-sounding language.”

Google has resisted entirely removing some content that outsiders complained should be blocked. In May 2018, Ignacio Wenley Palacios, a Spain-based lawyer working for the Lawfare Project, a nonprofit that funds litigation to protect Jewish people, asked Google to remove an anti-Semitic article lauding a German Holocaust denier posted on a Spanish-language neo-Nazi blog.

The company declined. In an email to Mr. Wenley Palacios, lawyers for Google contended that “while such content is detestable” it isn’t “manifestly illegal” in Spain.

Mr. Wenley Palacios then filed a lawsuit, but in the spring of this year, before the suit could be heard, he said, Google lawyers told him the company was changing its policy on such removals in Spain.

According to Mr. Wenley Palacios, the lawyers said the company would now remove from searches conducted in Spain any links to Holocaust denial and other content that could hurt vulnerable minorities, once they are pointed out to the company. The results would still be accessible outside of Spain. He said both sides agreed to dismiss the case.

Google’s Ms. Levin described the action as a “legal removal” in accordance with local law. Holocaust denial isn’t illegal in Spain, but if it is coupled with an intent to spread hate, it can fall under Spanish criminal law banning certain forms of hate speech.

“Google used to say, ‘We don’t approve of the content, but that’s what it is,’ ” Mr. Wenley Palacios said. “That has changed dramatically.”

{snip}

Criticism alleging political bias in Google’s search results has sharpened since the 2016 election.

Interest groups from the right and left have besieged Google with questions about content displayed in search results and about why the company’s algorithms returned certain information over others.

{snip}

Over the past year, abortion-rights groups have complained about search results that turned up the websites of what are known as “crisis pregnancy centers,” organizations that counsel women against having abortions, according to people familiar with the matter.

One of the complaining organizations was Naral Pro-Choice America, which tracks the activities of anti-abortion groups through its opposition research department, said spokeswoman Kristin Ford.

{snip}

In June, Google updated its advertising policies related to abortion, saying that advertisers must state whether they provide abortions or not, according to its website. Ms. Ford said Naral wasn’t told in advance of the policy change.

Ms. Levin said Google didn’t implement any changes with regard to how crisis pregnancy centers rank for abortion queries.

The Journal tested the term “abortion” in organic search results over 17 days in July and August. Thirty-nine percent of all results on the first page had the hostname www.plannedparenthood.org, the site of Planned Parenthood Federation of America, the nonprofit, abortion-rights organization.

By comparison, 14% of Bing’s first page of search results and 16% of DuckDuckGo’s first page of results were from Planned Parenthood.
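Percentages like these reflect a straightforward tally: the share of first-page result URLs whose hostname matches a given site, aggregated across the days tested. A minimal sketch of that computation, with invented URLs:

```python
# Count what share of first-page result URLs belong to a given hostname.
# The sample URLs are invented for illustration.
from urllib.parse import urlparse

def hostname_share(result_urls: list[str], hostname: str) -> float:
    hits = sum(1 for u in result_urls if urlparse(u).hostname == hostname)
    return hits / len(result_urls)

first_page = [
    "https://www.plannedparenthood.org/learn",
    "https://en.wikipedia.org/wiki/Abortion",
    "https://www.plannedparenthood.org/abortion",
    "https://www.who.int/health-topics/abortion",
]
print(hostname_share(first_page, "www.plannedparenthood.org"))  # 0.5
```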

Ms. Levin said Google doesn’t have any particular ranking implementations aimed at promoting Planned Parenthood.

The practice of creating blacklists for certain types of sites or searches has fueled cries of political bias from some Google engineers and right-wing publications that said they have viewed portions of the blacklists. Some of the websites Google appears to have targeted in Google News were conservative sites and blogs, according to documents reviewed by the Journal. In one partial blacklist reviewed by the Journal, some conservative and right-wing websites, including The Gateway Pundit and The United West, were included on a list of hundreds of websites that wouldn’t appear in news or featured products, although they could appear in organic search results.

{snip}

Demands from governments for changes have grown rapidly since 2016.

From 2010 to 2018, Google fielded such requests from countries including the U.S. to remove 685,000 links from what Google calls web search. The requests came from courts or other authorities that said the links broke local laws or should be removed for other reasons.

Nearly 78% of those removal requests have come since the beginning of 2016, according to reports that Google publishes on its website. Google’s ultimate actions on those requests weren’t disclosed.

{snip}