enow.com Web Search

Search results

  1. YouTube's algorithm more likely to recommend users ... - AOL

    www.aol.com/news/youtube-algorithm-more-likely...

    YouTube's algorithm more likely to recommend users right-wing and religious content, research finds ... “YouTube’s recommendation system is trained to raise high-quality content on the home ...

  2. YouTube moderation - Wikipedia

    en.wikipedia.org/wiki/YouTube_moderation

    YouTube has suggested potential plans to remove all videos featuring children from the main YouTube site and transfer them to the YouTube Kids site, where stronger controls over the recommendation system would apply, along with other major changes to the recommended feature and auto-play system on the main YouTube site. [128]

  3. Research finds pattern of YouTube recommending right ... - AOL

    www.aol.com/research-finds-pattern-youtube...

    YouTube’s algorithm frequently recommends right-leaning and Christian videos to users who have not previously shown interest in those topics, according to new research released Tuesday. The ...

  4. Algorithmic radicalization - Wikipedia

    en.wikipedia.org/wiki/Algorithmic_radicalization

    Algorithmic radicalization is the concept that recommender algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading them to develop radicalized extremist political views. Algorithms record user interactions, from likes/dislikes to amount of time spent on ... (a toy simulation of this feedback loop appears after the results below)

  5. Rabbit Hole (podcast) - Wikipedia

    en.wikipedia.org/wiki/Rabbit_Hole_(podcast)

    YouTube's content recommendation algorithm is designed to keep the user engaged as long as possible, which Roose calls the "rabbit hole effect". [5] The podcast features interviews with a variety of people involved with YouTube and the "rabbit hole effect". [6] For instance, in episode four Roose interviews Susan Wojcicki—the CEO of YouTube. [2]

  6. YouTube's algorithm pushes right-wing, explicit videos ... - AOL

    www.aol.com/news/youtubes-algorithm-pushes-wing...

    YouTube's algorithm recommends right-wing, extremist videos to users, even if they haven't interacted with that content before.

  7. Alt-right pipeline - Wikipedia

    en.wikipedia.org/wiki/Alt-right_pipeline

    In 2019, YouTube announced a change to its recommendation algorithm to reduce conspiracy theory related content. [12] [18] Some extreme content, such as explicit depictions of violence, is typically removed on most social media platforms. On YouTube, content that expresses support of extremism may have monetization features removed, may be ...

  8. YouTube's recommender AI still a horror show, finds major ...

    www.aol.com/news/youtubes-recommender-ai-still...

    For years YouTube's video-recommending algorithm has stood accused of fuelling a grab bag of societal ills by feeding users an AI-amplified diet of hate speech, political extremism and/or ...
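
Several of these results (notably the entries on algorithmic radicalization, the alt-right pipeline, and the Rabbit Hole podcast) describe the same underlying mechanism: a recommender that optimizes an engagement signal such as watch time can drift a user's feed toward progressively more extreme content. The Python sketch below is a toy simulation of that feedback loop. Everything in it is an invented assumption for illustration: the catalog, the "extremity" scores, the predicted-watch-time model, and the 0.05 drift offset have nothing to do with YouTube's actual system.

```python
# A minimal, hypothetical sketch of an engagement-driven recommender loop.
# Nothing here reflects YouTube's real system; the catalog, the engagement
# model, and the 0.05 "slightly more extreme" offset are invented assumptions.

# Toy catalog: 100 videos with an "extremity" score from 0.0 to 1.0.
CATALOG = [{"id": i, "extremity": i / 99} for i in range(100)]

def predicted_watch_time(video, user_baseline):
    """Assumed engagement model: watch time peaks when a video is slightly
    more extreme (by 0.05) than what the user is already accustomed to."""
    gap = video["extremity"] - user_baseline
    return max(0.0, 1.0 - 10 * abs(gap - 0.05))

def recommend(history):
    """Recommend the video with the highest predicted watch time,
    i.e. optimize the engagement signal rather than content quality."""
    baseline = sum(v["extremity"] for v in history) / len(history)
    return max(CATALOG, key=lambda v: predicted_watch_time(v, baseline))

# Feedback loop: each watched recommendation raises the user's baseline,
# so the next recommendation drifts slightly more extreme again.
history = [CATALOG[10]]  # start from mild content (extremity ~0.10)
for step in range(10):
    pick = recommend(history)
    history.append(pick)
    print(f"step {step}: recommended extremity = {pick['extremity']:.2f}")
```

Running the loop prints a recommended extremity that creeps upward at every step even though the simulated user never asks for more extreme content: the "rabbit hole effect" the podcast result describes, reduced to its simplest form.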