Racist and Violent Ideas Jump From Web’s Fringes to Mainstream Sites

On March 30, the young man accused of a racist shooting at a Tops grocery store in Buffalo clicked through a smorgasbord of racist and antisemitic websites. On BitChute, a video-sharing site known for hosting right-wing extremism, he listened to a lecture on the decline of the American middle class by a Finnish extremist. On YouTube he found a lurid video of a car driving through Black neighborhoods in Detroit.

Over the course of the week that followed, his online writing shows, he lingered in furtive chat rooms on Reddit and 4chan but also read articles on HuffPost and Medium. He watched local television news reports of gruesome crimes. He toggled between “documentaries” on extremist websites and gun tutorials on YouTube.

The young man, who was indicted by a grand jury last week, has been portrayed by the authorities and some media outlets as a troubled outcast who acted alone when he killed 10 Black people at a grocery store and wounded three others. In fact, he dwelled in numerous online communities where he and others consumed and shared racist and violent content.

As the number of mass shootings escalates, experts say many of the disturbing ideas that fuel atrocities are no longer relegated to a handful of hard-to-find dark corners of the web. More and more outlets, both fringe and mainstream, host bigoted content, often in the name of free speech. And the inability – or unwillingness – of online services to contain violent content threatens to draw more people toward hateful postings.

Much of the imagery and text that the young man included in his extensive writings, including a diary and a 180-page “manifesto,” has circulated online for years. Often, it has slipped onto some of the world’s most popular sites, such as Reddit and Twitter. His path to radicalization, illustrated in these documents, reveals the limits of efforts by companies like Twitter and Google to moderate posts, images and videos that promote extremism and violence. Enough of that content remains openly accessible to serve as a pipeline for users, with more extreme websites just a click or two away.

“It’s quite prolific on the internet,” said Eric K. Ward, a senior fellow at the Southern Poverty Law Center who is also the executive director of the Western States Center, a nonprofit research organization. “It’s not just going to fall in your lap; you have to start looking for it. But once you start looking for it, the problem is that it starts to rain down on a person in abundance.”

The Buffalo attack has renewed focus on the role that social media and other websites continue to play in violent extremism, with criticism coming from the public as well as government officials.

“The fact that this act of barbarism, the execution of innocent human beings, could be livestreamed on social media platforms and not taken down within a second tells me that there is a responsibility out there,” Gov. Kathy Hochul of New York said after the shooting in Buffalo. Four days later, the state’s attorney general, Letitia James, announced that she had begun an investigation into the role the platforms played.

Facebook pointed to its rules and policies prohibiting hateful content. In a statement, a spokeswoman said the platform detects over 96 percent of content tied to hate organizations before it is reported. Twitter declined to comment. Some of the posts on Facebook, Twitter and Reddit that The New York Times identified through reverse image searches were deleted; some of the accounts that shared the images were suspended.

The man charged in the killings, Payton Gendron, 18, detailed his attack on Discord, a chat app that emerged from the video game world in 2015, and streamed the shooting live on Twitch, which Amazon owns. Twitch managed to take down his video within two minutes, but many of the sources of disinformation he cited remain online even now.

His paper trail provides a chilling glimpse into how he prepared the deadly assault online, culling tips on weaponry and tactics and finding inspiration in fellow racists and in previous attacks that he largely mimicked. Altogether, the content formed a twisted and racist view of reality, one the gunman considered an alternative to mainstream views.

“How does one prevent a shooter like me, you ask?” he wrote on Discord in April, more than a month before the shooting. “The only way is to prevent them from learning the truth.”

His writings map in detail the websites that motivated him. Much of the information he cobbled together appears in his writings as links or as images he had cherry-picked to match his racist views, reflecting the kind of online life he lived.

By his own account, the young man’s radicalization began not long after the Covid-19 pandemic started, when he was largely confined to his home like millions of other Americans. He described getting his news from Reddit before joining 4chan, the online message board. He followed boards devoted to topics like guns and the outdoors before finding another devoted to politics, finally settling on one that allowed a toxic mélange of racist and extremist disinformation.

Although he frequented fringe sites like 4chan, he also spent time on mainstream ones, according to his own record, especially YouTube, where he found graphic scenes from police cameras and videos offering gun tips and tricks. As the day of the attack neared, the gunman watched more YouTube videos about mass shootings and police officers engaged in gunfights.

YouTube said it had reviewed all the videos that appeared in the diary. Three videos were removed because they linked to websites that violated YouTube’s firearms policy, which “prohibits content intended to instruct viewers to make firearms, manufacture accessories that convert a firearm to automatic fire, or livestreamed content that shows someone operating a firearm,” according to Jack Malon, a YouTube spokesman.

At the center of the attack, like others before it, was the false conviction that an international Jewish conspiracy intends to supplant white voters with immigrants who will eventually take political power in America.

The conspiracy, known as the “great replacement theory,” has roots reaching at least as far back as the czarist-era Russian antisemitic hoax “The Protocols of the Elders of Zion,” which purported to reveal a Jewish plot to take over Christian Europe.

It resurfaced more recently in the works of two French writers, Jean Raspail and Renaud Camus, who, four decades apart, imagined waves of immigrants taking power in France. It was Mr. Camus, a socialist turned far-right populist, who popularized the term “le grand remplacement” in a book by that name in 2011.

Mr. Gendron, according to the documents he posted, seems to have read none of those; instead, he attributed the “great replacement” idea to writings posted online by the gunman who murdered 51 Muslims at two mosques in Christchurch, New Zealand, in 2019.

After that attack, New Zealand’s prime minister, Jacinda Ardern, spearheaded an international pact, called the Christchurch Call, in which governments and major tech companies committed to eliminating terrorist and extremist content online. Although the agreement carried no legal penalties, the Trump administration declined to sign it, citing the principle of free speech.

Mr. Gendron’s experience online shows that writings and video clips associated with the Christchurch shooting remain available to inspire other acts of racially motivated violence. He referred to both repeatedly.

The Anti-Defamation League warned last year that the “great replacement” had moved from the fringes of white supremacist belief to the mainstream, pointing to the chants of protesters at the 2017 “Unite the Right” rally in Charlottesville, Va., which erupted in violence, and to the commentaries of Tucker Carlson on Fox News.

“Most of us don’t know the origin story,” said Mr. Ward of the Southern Poverty Law Center. “What we know is the narrative, and the narrative of great replacement theory has been given credence by elected officials and personalities to such an extent that the origins of the story no longer need to be told. People are just beginning to understand it as if it were conventional wisdom. And that’s what’s frightening.”

For all the efforts some major social media platforms have made to moderate content online, the algorithms they use – often meant to show users posts that they will read, watch and click – can accelerate the spread of disinformation and other harmful content.

Media Matters for America, a liberal-leaning nonprofit, said last month that its researchers found at least 50 ads on Facebook over the last two years promoting aspects of the “great replacement” and related themes. Many of the ads came from candidates for political office, even though the company, now known as Meta, announced in 2019 that it would bar white nationalist and white separatist content from Facebook and Instagram.

The organization’s researchers also found that 907 posts on right-wing sites on the same themes drew more than 1.5 million engagements, far more than posts intended to debunk them.

Though Mr. Gendron’s video of the shooting was removed from Twitch, it resurfaced on 4chan, even while he was still at the scene of the crime. The video has since spread to other fringe platforms like Gab and eventually mainstream platforms like Twitter, Reddit and Facebook.

The advent of social media has, in a fairly short period of time, enabled nefarious ideas and conspiracy theories that once simmered in relative isolation to proliferate through society, bringing together people animated by hate, said Angelo Carusone, the president of Media Matters for America.

“They’re not isolated anymore,” he said. “They’ve been connected.”
