enow.com Web Search

Search results

  1. Alt-right pipeline - Wikipedia

    en.wikipedia.org/wiki/Alt-right_pipeline

    The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics.

  2. YouTube's algorithm more likely to recommend users ... - AOL

    www.aol.com/news/youtube-algorithm-more-likely...

    This isn’t the first time YouTube has faced scrutiny for its algorithm. Researchers have repeatedly found that YouTube has recommended extremist and conspiracy theory videos to users.

  3. Algorithmic radicalization - Wikipedia

    en.wikipedia.org/wiki/Algorithmic_radicalization

    Algorithmic radicalization is the concept that recommender algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading them to develop radicalized extremist political views. Algorithms record user interactions, from likes/dislikes to amount of time spent on ...

  4. Rabbit Hole (podcast) - Wikipedia

    en.wikipedia.org/wiki/Rabbit_Hole_(podcast)

    YouTube's content recommendation algorithm is designed to keep the user engaged as long as possible, which Roose calls the "rabbit hole effect". [5] The podcast features interviews with a variety of people involved with YouTube and the "rabbit hole effect". [6] For instance, in episode four Roose interviews Susan Wojcicki—the CEO of YouTube. [2]

  5. YouTube's algorithm pushes right-wing, explicit videos ... - AOL

    www.aol.com/news/youtubes-algorithm-pushes-wing...

    YouTube's algorithm recommends right-wing, extremist videos to users — even if they haven't interacted with that content before.

  6. YouTube algorithms push eating disorder content to teen girls ...

    www.aol.com/youtube-algorithms-push-eating...

    The report, titled "YouTube's Anorexia Algorithm," examines the first 1,000 videos that a teen girl would receive in the "Up Next" panel when watching videos about weight loss, diet or exercise ...

  7. YouTube moderation - Wikipedia

    en.wikipedia.org/wiki/YouTube_moderation

    A 2019 BBC investigation of YouTube searches in ten different languages found that YouTube's algorithm promoted health misinformation, including fake cancer cures. [56] In Brazil, YouTube has been linked to pushing pseudoscientific misinformation on health matters, as well as elevated far-right fringe discourse and conspiracy theories. [57]

  8. Elsagate - Wikipedia

    en.wikipedia.org/wiki/Elsagate

    On November 4, The New York Times published an article about the "startling" videos slipping past YouTube's filters and disturbing children, "either by mistake or because bad actors have found ways to fool the YouTube Kids' algorithms". [3]
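The feedback loop the "Algorithmic radicalization" entry describes — a recommender records engagement signals and keeps serving slightly more provocative material because it holds attention — can be illustrated with a toy sketch. Everything below is hypothetical: the catalog, the "intensity" scores, and the +0.2 step size are invented for illustration and bear no relation to any real platform's system.

```python
# Toy sketch of engagement-driven recommendation drift.
# Each candidate video carries a made-up "intensity" score
# (0 = mild, 1 = extreme); the recommender nudges the user
# one step past whatever they last watched.

videos = [
    {"id": "a", "intensity": 0.1},  # mild commentary
    {"id": "b", "intensity": 0.4},
    {"id": "c", "intensity": 0.7},
    {"id": "d", "intensity": 0.9},  # most extreme
]

def recommend(history):
    """Return the video closest to one step beyond the last watched item.

    The fixed +0.2 step stands in for the engagement incentive:
    slightly more provocative content tends to hold attention longer.
    """
    if not history:
        return videos[0]
    target = min(history[-1]["intensity"] + 0.2, 1.0)
    return min(videos, key=lambda v: abs(v["intensity"] - target))

watched = []
for _ in range(4):
    watched.append(recommend(watched))

print([v["id"] for v in watched])  # drifts from "a" toward "d"
```

Even this crude loop ratchets monotonically toward the most extreme item, which is the core of the "rabbit hole" claim; real systems are vastly more complex, but the incentive structure being criticized is the same.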