Amanda Seitz, Associated Press, June 10, 2022
The social media posts are of a distinct type. They hint darkly that the CIA or the FBI is behind mass shootings. They traffic in racist, sexist and homophobic tropes. They revel in the prospect of a “white boy summer.”
White nationalists and supremacists, on accounts often run by young men, are building thriving, macho communities across social media platforms like Instagram, Telegram and TikTok, evading detection with coded hashtags and innuendo.
Their snarky memes and trendy videos are riling up thousands of followers on divisive issues including abortion, guns, immigration and LGBTQ rights. The Department of Homeland Security warned Tuesday that such skewed framing of the subjects could drive extremists to violently attack public places across the U.S. in the coming months.
These types of threats and racist ideology have become so commonplace on social media that it’s nearly impossible for law enforcement to separate internet ramblings from dangerous, potentially violent people, Michael German, who infiltrated white supremacy groups as an FBI agent, told the Senate Judiciary Committee on Tuesday.
DHS and the FBI are also working with state and local agencies to raise awareness about the increased threat around the U.S. in the coming months.
References to hate-filled ideologies are harder to spot across mainstream platforms like Twitter, Instagram, TikTok and Telegram. To avoid detection by artificial intelligence-powered moderation, users steer clear of obvious terms like “white genocide” or “white power” in conversation.
They signal their beliefs in other ways: a Christian cross emoji in their profile or words like “anglo” or “pilled,” a term embraced by far-right chatrooms, in usernames. Most recently, some of these accounts have borrowed the pop song “White Boy Summer” to cheer on the leaked Supreme Court draft opinion on Roe v. Wade, according to an analysis by Zignal Labs, a social media intelligence firm.
Facebook and Instagram owner Meta banned praise and support for white nationalist and separatist movements on company platforms in 2019, but the social media shift to subtlety makes it difficult to moderate the posts. Meta says it has more than 350 experts, with backgrounds ranging from national security to radicalization research, dedicated to ridding the site of such hateful speech.
A closer look reveals hundreds of posts steeped in sexist, antisemitic, racist and homophobic content.
U.S. extremists are mimicking the social media strategy used by the Islamic State group, which turned to subtle language and images across Telegram, Facebook and YouTube a decade ago to evade the industry-wide crackdown on the terrorist group’s online presence, said Mia Bloom, a communications professor at Georgia State University.
“They’re trying to recruit,” said Bloom, who has researched social media use by both Islamic State terrorists and far-right extremists. “We’re starting to see some of the same patterns with ISIS and the far-right. The coded speech, the ways to evade AI. The groups were appealing to a younger and younger crowd.”
For example, on Instagram, one of the most popular apps for teens and young adults, white supremacists amplify each other’s content daily and point their followers to new accounts.