enow.com Web Search

Search results

  1. Results from the WOW.Com Content Network
  2. Algorithmic radicalization - Wikipedia

    en.wikipedia.org/wiki/Algorithmic_radicalization

YouTube's algorithm is responsible for roughly 70% of users' recommended videos and is what drives people to watch certain content. [20] According to a 2022 study by the Mozilla Foundation, users have little power to keep unsolicited videos out of their suggested recommendations, including videos containing hate speech, livestreams, etc. [21] [20]

  3. Rabbit Hole (podcast) - Wikipedia

    en.wikipedia.org/wiki/Rabbit_Hole_(podcast)

YouTube's content recommendation algorithm is designed to keep the user engaged for as long as possible, which Roose calls the "rabbit hole effect". [5] The podcast features interviews with a variety of people involved with YouTube and the "rabbit hole effect". [6] For instance, in episode four Roose interviews Susan Wojcicki, the CEO of YouTube. [2]

  4. YouTube's algorithm pushes right-wing, explicit videos ... - AOL

    www.aol.com/news/youtubes-algorithm-pushes-wing...

YouTube's algorithm recommends right-wing, extremist videos to users, even if they haven't interacted with that content before.

  5. YouTube's algorithm more likely to recommend users ... - AOL

    www.aol.com/news/youtube-algorithm-more-likely...

    The study noted that YouTube’s recommendation algorithm “drives 70% of all video views.” ... The researchers also found that YouTube recommended videos including sexually explicit content to ...

  6. Alt-right pipeline - Wikipedia

    en.wikipedia.org/wiki/Alt-right_pipeline

    The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics.

  7. YouTube Algorithm Steers People Away From Radical Content - AOL

    www.aol.com/news/youtube-algorithm-steers-people...


  8. Social impact of YouTube - Wikipedia

    en.wikipedia.org/wiki/Social_impact_of_YouTube

A 2022 study published by the City University of New York found that "despite widespread concerns that YouTube's algorithms send people down 'rabbit holes' with recommendations to extremist videos, little systematic evidence exists to support this conjecture", and that "exposure to alternative and extremist channel videos on YouTube is heavily ...

  9. YouTube algorithms push eating disorder content to teen girls ...

    www.aol.com/news/youtube-algorithms-push-eating...

    The report, titled "YouTube's Anorexia Algorithm," examines the first 1,000 videos that a teen girl would receive in the "Up Next" panel when watching videos about weight loss, diet or exercise ...