Posted on March 1, 2019

Facebook Insider, Formerly Responsible for Content Review in Facebook’s Intellectual Property Dept Speaks Out, Loses Job

Project Veritas, February 27, 2019

View the documents here.

Project Veritas has obtained and published documents and presentation materials from a former Facebook insider. This information describes how Facebook engineers plan and go about policing political speech. Screenshots from a Facebook workstation show the specific technical actions taken against political figures, as well as “[e]xisting strategies” taken to combat political speech.

{snip}

To gain a better understanding of the documents, Project Veritas spoke with the Facebook insider in an interview. The insider separated from Facebook in 2018 and was later hired by Project Veritas.

{snip}

According to the insider, the documents revealed a routine suppression of the distribution of conservative Facebook pages. The technical action she repeatedly saw, and for which Project Veritas was provided documentation, was labeled ActionDeboostLiveDistribution. Said the insider, “I would see [this term] appear on several different conservative pages. I first noticed it with an account that I can’t remember, but I remember once I started looking at it, I also saw it on Mike Cernovich’s page, saw it on Steven Crowder’s page, as well as the Daily Caller’s page.”

{snip}

A screenshot of an action log on Mike Cernovich’s Facebook page, provided by the insider, shows the tag. The insider believes that the “deboost” code suppresses the distribution of livestream videos on Facebook. Project Veritas spoke to a current Facebook employee off the record who said that the code could limit a video’s visibility in news feeds, remove sharing features, and disable interactive notifications.

When approached for comment, author and filmmaker Mike Cernovich said the troubling issue is that Facebook could just “make stuff up” about people through these systems. “Facebook, or an individual at Facebook, has the unilateral power to create false allegations against someone he or she doesn’t like. The person accused not only can’t do anything about the allegation, they don’t even have an idea the allegation was made,” said Cernovich.

The insider says that unlike many actions that Facebook content moderators can take against pages, the “deboost” action, which appears to occur algorithmically, does not notify the page’s owner. {snip}

Upon further review, the insider says she did not notice the tag on any left-wing pages. “I looked at the Young Turks’ page, I looked at Colin Kaepernick’s page, none of them had received the same deboost comment.”

{snip}

“They’re shifting the goal post”

Also in the documents was a presentation, authored by Facebook engineers Seiji Yamamoto and Eduardo Arino de la Rubia, titled “Coordinating Trolling on FB.” Yamamoto is a Data Science Manager, and de la Rubia is a Chief Data Scientist at Facebook. The presentation appears to describe the current actions, as well as potential future actions, that Facebook can take to combat alleged abusive behavior on the platform.

Yamamoto, who is responsible for “News Feed Reduction Strategy,” also authored a post where he said Facebook should address “…quite a bit of content near the perimeter of hate speech.” Said the Facebook insider, the “perimeter of hate speech” means “things that aren’t actually hate speech but that might offend somebody. Anything that is perceived as hateful but no court would define it as hate speech.”

The insider believes Yamamoto’s plans appear to be political in nature rather than a response to abusive behaviors: “[i]t was clearly kind of designed… aimed to be the right wing meme culture that’s become extremely prevalent in the past few years. And some of the words that appeared on there were, using words like SJW, MSM… the New York Times doesn’t talk about the MSM. The independent conservative outlets are using that language.”

Also in Yamamoto’s report was a line appearing to say that online Facebook trolls are involved in “destructive behaviors” such as “[r]ed-pilling normies to convert them to their worldview.”

In online circles, the term “red-pilling” refers to bluntly showing the truth, and “normies” refers generally to apolitical or uninformed people. Directly below the line in the document is a hyperlink labeled “example video.”

The video linked in the presentation was made by Lauren Chen, a conservative commentator who now hosts a program on BlazeTV. “If you actually watch the video you can see that it clearly isn’t abusive or promoting harassment, the video was a criticism of social justice,” said Chen when asked for comment on this story. She added that “the video actually promotes equality and individualism.”

On a page from the presentation titled “Strategies we use today,” Yamamoto and de la Rubia list “demote bad content.” They add, “… we should still of course delete and demote, but we can do even more…”

Other content that could be interpreted as “bad” includes posts containing words such as “zucced,” “REEE,” and “normie.” Said the insider, Facebook is “shifting the goal post. It’s one thing, if you’re dropping the n-word, or things like that, using some kind of homophobic or racial slur, by all means that’s something that a platform should not want on it. But now you’re moving it to things like, jokes that conservatives tend to make.”

{snip}

Two of the “tactics” outlined in the presentation that the Facebook engineers propose for dealing with “troll operations” involve the introduction of a “Troll Twilight Zone.”

Yamamoto and de la Rubia’s presentation says that “troll accounts” can have their internet bandwidth limited and experience forced glitches like frequent “auto-logout[s]” and the failed upload of comments. These “special features” would be triggered “leading up to important elections.”

Facebook could identify trolls by their vocabulary, friend network, and behavior, according to the presentation. “Facebook has what’s called a Fake Account Index,” explained the insider, “where they assign a score which helps them determine whether the account is a real person or just a dummy spam account. {snip}”

{snip}

Another proposed tactic in the presentation would apparently alert a “troll’s” friends list when they have been banned. {snip}

{snip}

The presentation says that notifying a “troll’s” friend list would “strike fear in the hearts of trolls…” and “[n]otified users who accidentally befriended the offender might be more mindful of suspicious accounts, increasing overall herd immunity.”

The insider now works for Project Veritas.

{snip}

{snip} When asked if supplying documents and testimony to Project Veritas was “worth it,” the insider said “Yes… I knew what I had seen… this is something they were trying to keep in the shadows, that they did not want the public to know and yet the public has a right to know.”

Project Veritas founder James O’Keefe believes that “our collective future depends on those who are willing to give up everything for what they believe.” He believes that if more insiders from large companies step forward and expose similar dishonesty and wrongdoing, the country will be better educated. O’Keefe said:

“What are you willing to give up? How many of you will step forward? While they may be able to stop one of us, they won’t be able to stop an army. Be brave. Do something.”