Posted on October 26, 2021

Facebook’s Internal Chat Boards Show Politics Often at Center of Decision Making

Keach Hagey and Jeff Horwitz, Wall Street Journal, October 24, 2021

In June 2020, when America was rocked by protests over the death of George Floyd at the hands of a Minneapolis police officer, a Facebook employee posted a message on the company’s racial-justice chat board: “Get Breitbart out of News Tab.”

News Tab is a feature that aggregates and promotes articles from various publishers, chosen by Facebook. The employee’s message included screenshots of headlines on Breitbart’s website, such as “Minneapolis Mayhem: Riots in Masks,” “Massive Looting, Buildings in Flames, Bonfires!” and “BLM Protesters Pummel Police Cars on 101.”

The employee said they were “emblematic of a concerted effort at Breitbart and similarly hyperpartisan sources (none of which belong in News Tab) to paint Black Americans and Black-led movements in a very negative way,” according to written conversations on Facebook’s office communication system reviewed by The Wall Street Journal. Many other employees chimed in to agree.

In the same chat, a company researcher said any steps aimed at removing Breitbart—a right-wing publisher popular with supporters of former President Donald Trump—could face roadblocks internally because of the potential political blowback. “At best, it would be a very difficult policy discussion,” the researcher said.

Facebook chose to keep Breitbart on News Tab. A spokeswoman for the tech giant said the company makes a judgment based on the specific content published on Facebook, not the entire Breitbart site, and that the Facebook material met its requirements, including the need to abide by its rules against misinformation and hate speech.

Many Republicans, from Mr. Trump down, say Facebook discriminates against conservatives. The documents reviewed by the Journal didn’t render a verdict on whether bias influences its decisions overall. They do show that employees and their bosses have hotly debated whether and how to restrain right-wing publishers, with more-senior employees often providing a check on agitation from the rank and file. The documents viewed by the Journal, which don’t capture all of the employee messaging, didn’t mention equivalent debates over left-wing publications.

{snip}

Facebook employees, as seen in a large quantity of internal message-board conversations, have agitated consistently for the company to act against far-right sites. In many cases, they have framed their arguments around Facebook’s enforcement of its own rules, alleging that Facebook is giving the right-wing publishers a pass to avoid PR blowback. As one employee put it in an internal communication: “We’re scared of political backlash if we enforce our policies without exemptions.”

Facebook employees focused special attention on Breitbart, the documents show, criticizing Facebook for showcasing the site’s content in News Tab and for helping it to sell ads. They also alleged Facebook gave special treatment to Breitbart and other conservative publishers, helping them skirt penalties for circulating misinformation or hate speech.

Right-wing sites are consistently among the best-performing publishers on the platform in terms of engagement, according to data from research firm NewsWhip. That is one reason Facebook also is criticized by people on the left, who say Facebook’s algorithms reward far-right content.

{snip}

In May 2016, the tech blog Gizmodo reported that Facebook’s “Trending Topics” list routinely suppressed conservative news. Facebook denied the allegations, but the ensuing controversy prompted claims of bias from Republicans that haven’t let up.

Some internal documents show employee antipathy toward conservative media. In 2018, an engineer who had claimed on a message board that Facebook was intolerant of conservatives left the company. When he took his critique to Tucker Carlson’s Fox News show, some Facebook employees criticized him for going on a network “so infamous and biased it can’t even call itself a news channel,” records from the message boards show. Various employees called Mr. Carlson a “white nationalist” and “partisan hack” who “looks as though he’s a Golden Retriever who has been consistently cheated out of a cache of treats.”

“Any dog comparison is a compliment as far as I’m concerned,” Mr. Carlson said in an interview.

{snip}

In a farewell memo to colleagues in late 2020, a staffer on Facebook’s integrity team, which seeks to mitigate harmful behavior on the platform, said Breitbart was undermining the company’s efforts to fight hate speech.

{snip}

As the May 25, 2020, killing of Mr. Floyd inflamed political tensions across the country, one staffer wrote in the racial-justice chat that he understood “factual progressive and conservative leaning news organizations” both needed to be represented, but that could be done without including Breitbart.

A senior researcher wrote in the chat that it would be a problem for Facebook to remove Breitbart from News Tab for the way it framed news events, such as the protests after Mr. Floyd’s death, because “news framing is not a standard by which we approach journalistic integrity.”

He said if the company removed publishers whose trust and quality scores were going down, Breitbart might be caught in that net. But he questioned whether the company would do that for all publishers whose scores had fallen. “I can also tell you that we saw drops in trust in CNN 2 years ago: would we take the same approach for them too?” he wrote.

He wrote that Breitbart had been hurt by algorithm changes that favored content considered trustworthy, and that those changes were defensible within Facebook because they applied to all publishers and could be tied to a clear goal of improving user experience.

{snip}

Facebook’s relationship with Breitbart has also come under fire from advertisers and the employees who work on ad sales. In 2018, one employee working on the Facebook Audience Network, a group of third-party publishers for whom Facebook sells advertising, argued that Facebook should drop Breitbart from the network.

“My argument is that allowing Breitbart to monetize through us is, in fact, a political statement,” the person wrote in an internal memo. “It’s an acceptance of extreme, hateful and often false news used to propagate fear, racism and bigotry.”

After the 2016 election, advertisers started looking to avoid Breitbart, which delighted in provoking the left with anti-PC rhetoric and nationalism that critics called racist. In the automated ad system, even if an advertiser didn’t specifically seek to advertise on Breitbart, its ads could appear there.

Many advertisers sought to ensure their ads didn’t appear on Breitbart by taking advantage of a Facebook Audience Network feature that allowed them to block specific websites, the employee wrote, but the tactic wasn’t proving effective.

{snip}

Facebook took steps to damp the spread of what it deemed misinformation in users’ feeds after the 2016 election. That included a tool called “Sparing Sharing,” which targeted “hyperposters,” or accounts that post very frequently. It reduced the reach of their posts, since data had shown these users disproportionately shared false and incendiary information.

Facebook implemented the change even though Joel Kaplan, Facebook’s global head of public policy and a former deputy chief of staff to former President George W. Bush, had argued against rolling out the initiative too aggressively. Facebook Chief Executive Mark Zuckerberg approved the change but ordered that its effects be weakened.

Another tool, called “Informed Engagement,” reduced the reach of posts that people were more likely to share if they hadn’t read them.

The two tweaks successfully shifted the news stories users were likely to see toward a more mainstream, less volatile mix.

In 2019, Facebook data scientists studied the impact of the two tools on dozens of publishers based on their ideologies, according to the documents reviewed by the Journal.

The study, dubbed a “political ideology analysis,” suggested the company had been suppressing the traffic of major far-right publishers, even though that wasn’t its intent, according to the documents. “Very conservative” sites, it found, would benefit the most if the tools were removed, with Breitbart’s traffic increasing by an estimated 20%, Washington Times’ by 18%, Western Journal’s by 16% and Epoch Times’ by 11%, according to the documents.

{snip}

The company stopped the Informed Engagement program but kept Sparing Sharing.

{snip}