Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site’s crackdown on radicalization seems to have worked. But the world will never know what was happening before that.

  • masterofn001@lemmy.ca · 10 months ago

    Recently watched a documentary called ‘The YouTube Effect’ by Alex Winter (Bill of Bill & Ted) that goes into how YouTube was essential to the current global state of radicalized individuals.

    In the earlyish days of the internet (late 1990s / early 00s), I fell deep down the rabbit hole of right-wing hate and conspiracy theories…

    One subject of the doc explains his descent, and it is almost exactly mine. Only these days it is hyper-stimulated, laser-targeted, data-driven psychological warfare, wrapped in polished, billionaire-backed campaigns.

    It comes at you from wherever you are.

    Crypto bros. Health/hydro bros. Incel bros. Christian bros. Muslim bros. Rogan bros. Peterson bros. Elon bros. Tech bros. Anon bros. Etc.

    By the time a lot of people realize what’s happened, if ever, they’re already in too deep.

    • mdm_@lemmy.ca · 10 months ago

      Crypto bros. Health/hydro bros. Incel bros. Christian bros. Muslim bros. Rogan bros. Peterson bros. Elon bros. Tech bros. Anon bros. Etc.

      Hmm, I’m sensing a theme here…

      • kambusha@feddit.ch · 10 months ago

        Not OP, but I’m guessing the recommendation algorithms have pushed you down one direction, so you’re in an echo chamber where it seems like that is all there is. You’d never hear a counter-argument; only ever one side of it.
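        Something like this toy ranker, if I had to guess: score candidates purely by overlap with what you’ve already watched, and the counter-argument never surfaces. Purely illustrative; the tags and titles are made up, and this is not YouTube’s actual system.

        ```python
        # Toy "filter bubble" recommender: similarity-only scoring.
        watch_history_tags = {"topic_x_pro"}  # everything watched so far leans one way

        candidates = [
            {"title": "Why X is right", "tags": {"topic_x_pro"}},
            {"title": "Why X is wrong", "tags": {"topic_x_anti"}},
            {"title": "More reasons X is right", "tags": {"topic_x_pro", "deep_dive"}},
        ]

        def score(video: dict) -> int:
            # A counter-argument shares no tags with the watch history, so it
            # scores 0 and is never shown; the bubble only tightens.
            return len(video["tags"] & watch_history_tags)

        feed = [v["title"] for v in sorted(candidates, key=score, reverse=True) if score(v) > 0]
        print(feed)  # ['Why X is right', 'More reasons X is right']
        ```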

      • masterofn001@lemmy.ca · 10 months ago

        Whatever your interest or hobby, there is a psyop devoted to it. Wherever you are, whatever you do, whatever you’re curious about, you will find targeted propaganda.

        Because of the methods used, the conspiracies, wrapped in a cozy blanket of semi-truth and emotional manipulation, are easy to fall prey to.

        If you’re angry, it will make you angrier. Violent, even. If you’re happy, it can make you hate with the loving joy of false religious zeal. If you are confused and uncertain, it will provide the esoteric truths you seek, with the absolute certainty of a “final solution.”

        Etc

        And it’s difficult to unwind.

  • dylanTheDeveloper@lemmy.world · 10 months ago

    I keep getting ‘rescue’ animal videos in which people purposely put puppies and kittens in distressing situations so they can ‘save’ them. It’s sick, and no matter how often I block and report those videos, they reappear the next month. I also get a lot of ‘police shooting people’ videos, which I also try to block.

    • regbin_@lemmy.world · 10 months ago

      I think it’s just a matter of fine-tuning your preferences. I haven’t had an irrelevant video recommended to me within the last few years. All the recommendations have been great: retro hardware reviews, video game gameplay guides, science videos, and other informational/engineering stuff.

  • afraid_of_zombies@lemmy.world · 10 months ago

    All my YouTube recommendations went downhill about 3 years ago. I am bombarded by right-wing Christian stuff no matter how many times I flag and complain.

    • megalodon@lemmy.world · 10 months ago

      I’m bombarded by Joe Rogan stuff. I keep blocking the channels, but there is an endless stream of them.

      • Wolpertinger@sh.itjust.works · 10 months ago

        I always downvote, then block the channel whenever I get those. However, I think the mere act of going to the button to block the channel, instead of just scrolling on immediately, tells the algorithm that I want more of that kind of video.

        Watching a bit of BreadTube stuff, I feel like the algorithm can’t tell whether a video is for or against something like that, so I get recommended videos for whatever I don’t like rather than against it.

        • invisinak@lemmy.dbzer0.com · 10 months ago

          That might actually be the issue. YouTube doesn’t really differentiate between upvotes and downvotes. To the algorithm, you’re engaging with the content either way, so it serves you more to keep you engaged.

        • 5BC2E7@lemmy.world · 10 months ago

          I also suspect that downvoting or blocking is somehow interpreted as “engaged with the content, so let’s shove more of it at them.”
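          If that guess is right, the failure mode would look something like this minimal sketch (topic names and actions invented for illustration; obviously not YouTube’s real ranking code):

          ```python
          # Toy model of the suspicion above: "engagement" counts ANY
          # interaction, so disliking or blocking still boosts the topic.
          from collections import Counter

          engagement = Counter()

          def record_interaction(topic: str, action: str) -> None:
              # One engagement point per interaction, positive or negative.
              if action in {"watch", "like", "dislike", "block", "report"}:
                  engagement[topic] += 1

          def recommend(n: int = 3) -> list:
              # Most "engaged-with" topics are served first, so repeatedly
              # blocking a topic can rank it higher, not lower.
              return [topic for topic, _ in engagement.most_common(n)]

          record_interaction("podcast_politics", "dislike")
          record_interaction("podcast_politics", "block")
          record_interaction("retro_hardware", "watch")
          print(recommend())  # ['podcast_politics', 'retro_hardware']
          ```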

  • there1snospoon@ttrpg.network · 10 months ago

    The article below:

    Around the time of the 2016 election, YouTube became known as a home to the rising alt-right and to massively popular conspiracy theorists. The Google-owned site had more than 1 billion users and was playing host to charismatic personalities who had developed intimate relationships with their audiences, potentially making it a powerful vector for political influence. At the time, Alex Jones’s channel, Infowars, had more than 2 million subscribers. And YouTube’s recommendation algorithm, which accounted for the majority of what people watched on the platform, looked to be pulling people deeper and deeper into dangerous delusions.

    The process of “falling down the rabbit hole” was memorably illustrated by personal accounts of people who had ended up on strange paths into the dark heart of the platform, where they were intrigued and then convinced by extremist rhetoric—an interest in critiques of feminism could lead to men’s rights and then white supremacy and then calls for violence. Most troubling is that a person who was not necessarily looking for extreme content could end up watching it because the algorithm noticed a whisper of something in their previous choices. It could exacerbate a person’s worst impulses and take them to a place they wouldn’t have chosen, but would have trouble getting out of.

    Just how big a rabbit-hole problem YouTube had wasn’t quite clear, and the company denied it had one at all even as it was making changes to address the criticisms. In early 2019, YouTube announced tweaks to its recommendation system with the goal of dramatically reducing the promotion of “harmful misinformation” and “borderline content” (the kinds of videos that were almost extreme enough to remove, but not quite). At the same time, it also went on a demonetizing spree, blocking shared-ad-revenue programs for YouTube creators who disobeyed its policies about hate speech. Whatever else YouTube continued to allow on its site, the idea was that the rabbit hole would be filled in.

    A new peer-reviewed study, published today in Science Advances, suggests that YouTube’s 2019 update worked. The research team was led by Brendan Nyhan, a government professor at Dartmouth who studies polarization in the context of the internet. Nyhan and his co-authors surveyed 1,181 people about their existing political attitudes and then used a custom browser extension to monitor all of their YouTube activity and recommendations for a period of several months at the end of 2020. It found that extremist videos were watched by only 6 percent of participants. Of those people, the majority had deliberately subscribed to at least one extremist channel, meaning that they hadn’t been pushed there by the algorithm. Further, these people were often coming to extremist videos from external links instead of from within YouTube.

    These viewing patterns showed no evidence of a rabbit-hole process as it’s typically imagined: Rather than naive users suddenly and unwittingly finding themselves funneled toward hateful content, “we see people with very high levels of gender and racial resentment seeking this content out,” Nyhan told me. That people are primarily viewing extremist content through subscriptions and external links is something “only [this team has] been able to capture, because of the method,” says Manoel Horta Ribeiro, a researcher at the Swiss Federal Institute of Technology Lausanne who wasn’t involved in the study. Whereas many previous studies of the YouTube rabbit hole have had to use bots to simulate the experience of navigating YouTube’s recommendations—by clicking mindlessly on the next suggested video over and over and over—this is the first that obtained such granular data on real, human behavior.

    The study does have an unavoidable flaw: It cannot account for anything that happened on YouTube before the data were collected, in 2020. “It may be the case that the susceptible population was already radicalized during YouTube’s pre-2019 era,” as Nyhan and his co-authors explain in the paper. Extremist content does still exist on YouTube, after all, and some people do still watch it. So there’s a chicken-and-egg dilemma: Which came first, the extremist who watches videos on YouTube, or the YouTuber who encounters extremist content there?

    Examining today’s YouTube to try to understand the YouTube of several years ago is, to deploy another metaphor, “a little bit ‘apples and oranges,’” Jonas Kaiser, a researcher at Harvard’s Berkman Klein Center for Internet and Society who wasn’t involved in the study, told me. Though he considers it a solid study, he said he also recognizes the difficulty of learning much about a platform’s past by looking at one sample of users from its present. This was also a significant issue with a collection of new studies about Facebook’s role in political polarization, which were published last month (Nyhan worked on one of them). Those studies demonstrated that, although echo chambers on Facebook do exist, they don’t have major effects on people’s political attitudes today. But they couldn’t demonstrate whether the echo chambers had already had those effects long before the study.

    The new research is still important, in part because it proposes a specific, technical definition of rabbit hole. The term has been used in different ways in common speech and even in academic research. Nyhan’s team defined a “rabbit hole event” as one in which a person follows a recommendation to get to a more extreme type of video than they were previously watching. They can’t have been subscribing to the channel they end up on, or to similarly extreme channels, before the recommendation pushed them. This mechanism wasn’t common in their findings at all. They saw it act on only 1 percent of participants, accounting for only 0.002 percent of all views of extremist-channel videos.

    This is great to know. But, again, it doesn’t mean that rabbit holes, as the team defined them, weren’t at one point a bigger problem. It’s just a good indication that they seem to be rare right now. Why did it take so long to go looking for the rabbit holes? “It’s a shame we didn’t catch them on both sides of the change,” Nyhan acknowledged. “That would have been ideal.” But it took time to build the browser extension (which is now open source, so it can be used by other researchers), and it also took time to come up with a whole bunch of money. Nyhan estimated that the study received about $100,000 in funding, but an additional National Science Foundation grant that went to a separate team that built the browser extension was huge—almost $500,000.

    Nyhan was careful not to say that this paper represents a total exoneration of YouTube. The platform hasn’t stopped letting its subscription feature drive traffic to extremists. It also continues to allow users to publish extremist videos. And learning that only a tiny percentage of users stumble across extremist content isn’t the same as learning that no one does; a tiny percentage of a gargantuan user base still represents a large number of people.

    This speaks to the broader problem with last month’s new Facebook research as well: Americans want to understand why the country is so dramatically polarized, and people have seen the huge changes in our technology use and information consumption in the years when that polarization became most obvious. But the web changes every day. Things that YouTube no longer wants to host could still find huge audiences, instead, on platforms such as Rumble; most young people now use TikTok, a platform that barely existed when we started talking about the effects of social media. As soon as we start to unravel one mystery about how the internet affects us, another one takes its place.
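    To make the paper’s “rabbit hole event” definition above concrete, here is a rough sketch of its three conditions in code. The data schema is invented for illustration; the study’s actual pipeline is not shown in the article.

    ```python
    from dataclasses import dataclass

    @dataclass
    class View:
        channel: str              # channel of the video that was viewed
        source: str               # "recommendation", "external_link", "subscription", ...
        prior_max_extremity: int  # most extreme tier the user had watched before

    def is_rabbit_hole_event(view: View, subscriptions: list, extremity: dict) -> bool:
        """Mirrors the paper's definition: (1) reached via an on-platform
        recommendation, (2) more extreme than anything the user previously
        watched, (3) not already subscribed to that channel or to a
        similarly extreme one."""
        if view.source != "recommendation":
            return False
        if extremity[view.channel] <= view.prior_max_extremity:
            return False
        if any(extremity.get(c, 0) >= extremity[view.channel] for c in subscriptions):
            return False
        return True

    # Example: a user with no extremist subscriptions is recommended a
    # channel more extreme than anything they had watched before.
    extremity = {"mainstream_news": 0, "borderline_channel": 1, "extremist_channel": 2}
    view = View("extremist_channel", "recommendation", prior_max_extremity=1)
    print(is_rabbit_hole_event(view, ["mainstream_news"], extremity))  # True
    ```

    By the paper’s numbers, events like this fired for only 1 percent of participants and accounted for only 0.002 percent of views of extremist-channel videos.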

    • intensely_human@lemm.ee · 10 months ago

      Another way to put that study’s weakness, in scientific terms, is that there’s no control group against which the studied group is being compared. There’s zero indication that the 2019 changes had any effect at all, without some data from before those changes.

    • gothicdecadence@lemm.ee · 10 months ago

      I've never heard of Rumble before. Apparently it's a video platform associated with Truth Social, so it's very popular with the far right.

    • Cosmic Cleric@lemmy.world · 10 months ago

      The article below:

      Honestly don’t mean this as an attack, but couldn’t people just have clicked the link if they really wanted to read the article?

  • skymtf@lemmy.blahaj.zone · 10 months ago

    Who did these stats? I’m getting more right-wing propaganda than ever. Also, Facebook is just as bad as ever. I really like stuff like the fediverse, since I can control my feed.

    • Wolpertinger@sh.itjust.works · 10 months ago

      Me too. I’m always recommended Joe Rogan or Jordan Peterson videos, with a sprinkling of Ben Shapiro. I even got someone claiming the Holocaust was overblown (I reported them). All within the past few months.

      I don’t get recommended regular videos like that, but YouTube Shorts are full of that garbage. I suspect it’s a blind spot.

  • scarabic@lemmy.world · 10 months ago

    If it’s true that they have closed the radicalization rabbit hole, then that is a huge achievement and very, very good news.

    • Edgelord_Of_Tomorrow@lemmy.world · 10 months ago

      Now that they’ve entrenched an entire alternative universe in an election-winning proportion of the population, they don’t need it anymore.

      Unless YouTube is going to deliberately direct people to deprogramming content, it’s too late.

      • scarabic@lemmy.world · 10 months ago

        A lot of damage is done, certainly, but I think any success they have will depend on keeping up this bullshit. New voters are growing up all the time. The less chance for them to fall into the QAnon conspiracy when they just wanted to find some video game guide content, the better.

  • qwamqwamqwam@sh.itjust.works · 10 months ago

    Wait, what? Maybe I’m misunderstanding, but this is what I got out of the article:

    “We had anecdotes and preliminary evidence of a phenomenon. A robust scientific study showed no evidence of said phenomenon. Therefore, the phenomenon was previously real but has now stopped.”

    That seems like really, really bad science. Or at least, really, really bad science reporting. Like, if anecdotes are all it takes, here’s one from just a few weeks ago:

    I left some Andrew Tate-esque stuff running overnight by accident and ended up having to delete my watch history to get my homepage back to how it was before.

    • TimewornTraveler@lemm.ee · 10 months ago

      From the quoted bit, it sounds like there was credible science that found nothing. That doesn’t mean there is nothing, just that they found nothing.

  • uriel238@lemmy.blahaj.zone · 10 months ago

    Weirdly, YouTube’s algo propelled me down the Pinko-commie anarcho-socialist boy-we-suck-at-democracy rabbit hole. I was an avid BreadTuber long before I ever heard the name BreadTube.

    • 31337@sh.itjust.works · 10 months ago

      Yeah, it started for me during Covid, when I felt like I needed long-form podcasts/streamers in the background for noise while working from home. I think my progression was The Worst Year Ever -> Chapo Trap House -> It Could Happen Here -> PhilosophyTube -> ContraPoints -> Vaush. TBF, I was a leftist before YouTube existed, probably starting with Chomsky, Einstein’s article, and random pirated documentaries.

    • ram@lemmy.ca · 10 months ago

      You can always go to one of the many instances that have defederated them. It’s not like there are account-wide upvote points to lose or anything. (Genuine suggestion.)

    • doggle@lemmy.dbzer0.com · 10 months ago

      There are many instances that have defederated them that you could join. Or, if you’re really serious, you could host your own.

    • Franzia@lemmy.blahaj.zone · 10 months ago

      We are not on a centralized website, so we have a better solution than the top-down approach that YouTube uses.

  • intensely_human@lemm.ee · 10 months ago

    Why do we need to know what happened before? A record of the past is just material radicals can use to radicalize others.