YouTube Rabbit Hole Is Different

Perhaps you have an image in mind of people brainwashed by YouTube.

You might picture your cousin who loves to watch videos of cute animals. Then, out of nowhere, YouTube’s algorithm drops a terrorist-recruitment video at the top of his feed and keeps recommending ever more extreme videos until he is persuaded to take up arms.

A new analysis adds nuance to our understanding of YouTube’s role in spreading beliefs that sit far outside the mainstream.

A group of academics found that YouTube rarely suggests videos featuring conspiracy theories, extreme bigotry or pseudoscience to people who have shown little interest in such material. And those people are unlikely to follow such computerized recommendations when they are offered. The kittens-to-terrorist pipeline is extremely rare.

That is not to say YouTube plays no role in radicalization. The paper also found that research volunteers who already held bigoted views, or who frequently visited YouTube channels featuring such beliefs, were far more likely to seek out, or be steered toward, more videos along the same lines.

The findings suggest that policymakers, internet executives and the public should focus less on the potential risk of an unwitting person being led into extremist ideology on YouTube, and more on the ways that YouTube may help validate and harden the views of people already inclined toward such beliefs.

“We have underestimated the way that social media facilitates the supply of extremist views meeting demand,” said Brendan Nyhan, one of the paper’s co-authors and a Dartmouth College professor who studies misperceptions about politics and health care. “Even a handful of extremists can do great harm in the world.”

People watch more than one billion hours of YouTube videos daily. There are persistent concerns that the Google-owned site may amplify extremist voices, silence legitimate expression, or both, similar to the worries that surround Facebook.

This is just one piece of research, and I mention some limitations of the analysis below. But what’s intriguing is that the research challenges the binary notion that either YouTube’s algorithm risks turning any of us into monsters, or that weird things on the internet do little harm. Neither may be right.

(You can read the research paper here. A version of it was previously published by the Anti-Defamation League.)

Digging into the details: About 0.6 percent of research participants were responsible for about 80 percent of the total watch time for YouTube channels classified as “extremist,” such as those of the far-right figures David Duke and Mike Cernovich. (YouTube banned Duke’s channel in 2020.)

Most of these people found the videos not by accident but by following web links, clicking on videos from YouTube channels they subscribed to, or following YouTube’s recommendations. About one in four videos that YouTube recommended to people watching an extreme channel was another video like it.

Only 108 times during the research — about 0.02 percent of all video visits the researchers observed — did someone watching a relatively conventional YouTube channel follow a computerized recommendation to an outside-the-mainstream channel to which they did not already subscribe.

The analysis suggests that most viewers of YouTube videos promoting fringe beliefs are people who want to watch them, and that YouTube then feeds them more of the same. The researchers found that volunteers with high levels of gender or racial resentment, as measured by their responses to surveys, were much more likely to be such viewers.

“Our results make clear that YouTube continues to provide a platform for alternative and extreme content to be distributed to vulnerable audiences,” the researchers wrote.

Like all research, this analysis has caveats. The study was conducted after YouTube made significant changes to curtail recommending videos that misinform people in harmful ways. That makes it difficult to know whether the patterns the researchers found in YouTube’s recommendations would have been different in prior years.

Independent experts also have not yet rigorously reviewed the data and analysis, and the research did not examine in detail the relationship between watching what some have described as “alternative” channels, such as those of Laura Loomer and Candace Owens, and watching extreme videos.

More studies are needed, but these findings suggest two things. First, YouTube may deserve credit for the changes it has made to reduce the ways the site pushed people toward views outside the mainstream that they weren’t intentionally seeking out.

Second, there needs to be more conversation about how much further YouTube should go to reduce the exposure of potentially extreme or dangerous ideas to people who are inclined to believe them. Even a small minority of YouTube’s audience that regularly watches extreme videos amounts to millions of people.

Should YouTube, for example, make it more difficult for people to share links to fringe videos? Should the site make it harder for people who subscribe to extremist channels to automatically see those videos or be suggested similar ones? Or is the status quo fine?

This research reminds us to continually grapple with the complicated ways that social media can both mirror and amplify the ugliness in our world, and to resist easy explanations. There are none.


Tip of the Week

Brian X. Chen, The New York Times’ consumer technology columnist, is here to explain what you need to know about online tracking.

Last week, listeners of the KQED Forum radio show asked me questions about internet privacy. Our conversation highlighted how worried many people are about their digital activity being monitored, and how confused they are about what they can do.

Here’s a rundown that I hope will help On Tech readers.

There are two broad types of digital tracking. “Third-party” tracking is the kind we usually find creepy. If you visit a shoe website and it logs what you look at, you may then see ads for those shoes everywhere else online. Marketers compile a log of your activity across many websites and apps in order to target ads at you.

If this worries you, you can try a web browser such as Firefox or Brave, which automatically blocks this type of tracking. Google says its Chrome browser will do the same in 2023. Last year, Apple gave iPhone owners the option to say no to this type of online tracking in apps, and Android phone owners will have a similar option at some point.

If you want to go the extra mile, you can download tracker blockers such as uBlock Origin or an app called 1Blocker.

As the pressure mounts on third-party tracking, attention is shifting to “first-party” data collection, which is what a website or app tracks when you use its product.

If you search for directions to a Chinese restaurant in a mapping app, the app might assume that you like Chinese food and allow other Chinese restaurants to advertise to you. Many people find this less creepy, and potentially useful.

You don’t have much choice if you want to avoid first-party tracking other than not using a website or app. You can also use the app or website without logging in to minimize the information collected, but this may limit what you can do there.

  • Barack Obama takes on disinformation: The former president is beginning to spread a message about the risks of online falsehoods, wading into what my colleagues Steven Lee Myers and Cecilia Kang describe as “a fierce but inconclusive debate over how best to restore trust online.”

  • Elon Musk’s funding, apparently secured: The chief executive of Tesla and SpaceX detailed the loans and other financing commitments for his roughly $46.5 billion offer to buy Twitter. Twitter’s board must decide whether to accept it, and Musk has suggested that he instead wants to let Twitter’s shareholders decide for themselves.

  • Three ways to cut your tech spending: Brian Chen has tips on identifying which online subscriptions you might want to trim, saving money on your cellphone bill, and deciding when you might (and might not) need a new phone.

Welcome a penguin chick on its first swim.


We want to hear from you. Let us know what you think of this newsletter and what else you want us to discover. You can reach us at ontech@nytimes.com.

If you don’t already get this newsletter in your inbox, please sign up here. You can also read past On Tech columns.
