Perhaps you have an image in your mind of people who get brainwashed by YouTube.
You might picture your cousin who loves to watch videos of cuddly animals. Then out of the blue, YouTube’s algorithm plops a terrorist recruitment video at the top of the app and continues to suggest ever more extreme videos until he’s persuaded to take up arms.
A new analysis adds nuance to our understanding of YouTube’s role in spreading beliefs that are far beyond the mainstream.
A group of academics found that YouTube rarely suggests videos that might feature conspiracy theories, extreme bigotry or quack science to people who have little interest in such material. And those people are unlikely to follow such computerized recommendations when they are offered. The kittens-to-terrorist pipeline is extremely uncommon.
That doesn’t mean YouTube is not a force in radicalization. The paper also found that research volunteers who already held bigoted views or followed YouTube channels that frequently feature fringe beliefs were far more likely to seek out or recommend more videos along the same lines.
The findings suggest that policymakers, internet executives and the public should focus less on the potential risk of an unwitting person being led into extremist ideology on YouTube, and more on the ways that YouTube may help validate and harden the views of people already inclined to such beliefs.
“We understand the way that social media facilitates demand meeting supply of extreme viewpoints,” said Brendan Nyhan, one of the paper’s co-authors and a Dartmouth College professor who studies misperceptions about politics and health care. “Even a few people with extreme views can create grave harm in the world.”
People watch more than a billion hours of YouTube videos daily. There are perennial concerns that the Google-owned site may amplify extremist voices, silence expression or both, similar to the worries that surround Facebook.
This is just one piece of research, and I mention below some limitations of the analysis. But what’s intriguing is that the research challenges the binary notion that either YouTube’s algorithm risks turning any of us into monsters or that kooky things on the internet do little harm. Neither may be true.
(You can read the research paper here. A version of it was also published earlier by the Anti-Defamation League.)
Digging into the details, about 0.6 percent of research participants were responsible for about 80 percent of the total watch time for YouTube channels that were classified as “extremist,” such as the far-right figures David Duke and Mike Cernovich. (YouTube banned Duke’s channel in 2020.)
Mostly, those people found the videos not by accident but by following web links, clicking on videos from YouTube channels that they subscribed to, or following YouTube’s recommendations. About one in four videos that YouTube recommended to people watching an extreme YouTube channel was another video like it.
Only 108 times during the research – about 0.02 percent of all video visits the researchers observed – did someone watching a relatively conventional YouTube channel follow a computerized suggestion to an outside-the-mainstream channel when they were not already subscribed.
The analysis suggests that most of the audience for YouTube videos promoting fringe beliefs are people who want to watch them, and then YouTube feeds them more of the same. The researchers found that viewership was far more likely among volunteers who displayed high levels of gender or racial resentment, as measured by their responses to surveys.
“Our results make it clear that YouTube continues to provide a platform for alternative and extreme content to be distributed to vulnerable audiences,” the researchers wrote.
Like all research, this analysis has caveats. The study was conducted in 2020, after YouTube made significant changes to curtail recommending videos that misinform people in a harmful way. That makes it difficult to know whether the patterns that the researchers found in YouTube recommendations would have been different in prior years.
Independent experts have also not yet rigorously reviewed the data and analysis, and the research did not examine in detail the relationship between watching YouTubers such as Laura Loomer and Candace Owens, some of whom the researchers named and described as having “alternative” channels, and viewership of extreme videos.
More studies are needed, but these findings suggest two things. First, YouTube deserves credit for the changes it made to reduce the ways that the site pushed people toward videos outside the mainstream that they weren’t intentionally seeking.
Second, there needs to be more conversation about how much further YouTube should go to reduce the exposure of potentially extreme or dangerous ideas to people who are inclined to believe them. Even a small minority of YouTube’s audience that might regularly watch extreme videos amounts to many millions of people.
Should YouTube make it more difficult, for example, for people to link to fringe videos – something it has considered? Should the site make it harder for people to subscribe to extremist channels or watch those videos or similar ones? Or is the status quo fine?
This research reminds us to continually wrestle with the complicated ways that social media can both mirror a nastiness in our world and reinforce it, and resist easy explanations. There are none.
Tip of the Week
The usual human guide to digital privacy
Brian X. Chen, the Consumer Tech columnist for The New York Times, is here to break down what you need to know about tracking online.
Last week, listeners asked the KQED Forum radio program questions about internet privacy. Our conversation illuminated just how many people were concerned about their digital activity being monitored and how confused they were about what they could do.
Here’s a rundown that I hope will help On Tech readers.
There are two broad types of digital tracking. “Third-party” tracking is what we often find creepy. If you visit a shoe website and it logs what you looked at, you might then keep seeing ads for those shoes everywhere else online. Repeated across many websites and apps, this lets marketers compile a record of your activity to target ads at you.
If you are concerned about this, you can try a web browser such as Firefox or Brave that blocks this type of tracking. Google says its Chrome web browser will do the same in 2023. Last year, Apple gave iPhone owners the option to say no to online surveillance of this type, and Android phone owners will have a similar option at some point.
If you want to go the extra mile, you can download tracker blockers, like uBlock Origin or an app called 1Blocker.
The squeeze on third-party tracking has shifted the focus to “first-party” data collection, which is what a website or app monitors about you when you use its product.
If you search for directions to a Chinese restaurant in a mapping app, this app may assume that you like Chinese food and allow other Chinese restaurants to advertise to you. Many people consider this less creepy and potentially useful.
You don’t have much choice if you want to avoid first-party tracking, other than not using a website or app at all. You could also use the app or website without logging in to minimize the information that is collected, though that may limit what you can do there.
Before we go…
Barack Obama crusades against disinformation: The former president is spreading a message about the risks of online falsehoods. He’s wading into a “fierce but inconclusive debate over how best to restore trust online,” reported my colleagues Steven Lee Myers and Cecilia Kang.
Elon Musk’s funding is apparently secured: The chief executive of Tesla and SpaceX detailed roughly $46.5 billion in loans and other financing commitments for his offer to buy Twitter. Twitter’s board must decide whether to accept it, and Musk has suggested that he might instead let Twitter shareholders decide for themselves.
Three ways to cut your tech spending: Brian Chen has tips on how to identify which online subscriptions you may want to trim, save money on your cellphone bill and decide when you may (and may not) need a new phone.
Hugs to this
Welcome to a penguin chick’s first swim.
We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at firstname.lastname@example.org.
If you don’t already get this newsletter in your inbox, please sign up here. You can also read past On Tech columns.