The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics.
This isn’t the first time YouTube has faced scrutiny for its algorithm. Researchers have repeatedly found that YouTube has recommended extremist and conspiracy theory videos to users.
Algorithmic radicalization is the concept that recommender algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading them to develop increasingly extreme political views. Algorithms record user interactions, from likes/dislikes to amount of time spent on ...
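The feedback loop described above can be illustrated with a toy simulation. This is a hypothetical sketch, not YouTube's actual system: it assumes a ranker that maximizes predicted engagement, and that content slightly more provocative than a user's current taste yields the most watch time. All names, numbers, and the `recommend`/`simulate` functions are illustrative assumptions.

```python
def recommend(history, catalog):
    """Toy stand-in for an engagement-maximizing ranker: pick the
    catalog item closest to slightly above the user's current taste.

    Assumption (not from the source): content a bit more provocative
    than the user's average taste maximizes predicted watch time.
    """
    taste = sum(history) / len(history)
    target = taste + 0.1  # nudge toward more provocative content
    return min(catalog, key=lambda item: abs(item - target))


def simulate(steps=20):
    """Run the recommend-watch loop for a number of steps."""
    # Items rated 0.0 (mainstream) to 1.0 (most extreme), 0.05 apart.
    catalog = [i / 20 for i in range(21)]
    history = [0.0]  # user starts on mainstream content
    for _ in range(steps):
        history.append(recommend(history, catalog))
    return history
```

Because each recommendation sits above the running average, the average drifts upward, so the sequence of recommended items never moves back toward the mainstream; each watched item makes the next recommendation at least as extreme.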
YouTube's content recommendation algorithm is designed to keep the user engaged as long as possible, which Roose calls the "rabbit hole effect". [5] The podcast features interviews with a variety of people involved with YouTube and the "rabbit hole effect". [6] For instance, in episode four Roose interviews Susan Wojcicki—the CEO of YouTube. [2]
YouTube's algorithm recommends right-wing, extremist videos to users — even if they haven't interacted with that content before.
The report, titled "YouTube's Anorexia Algorithm," examines the first 1,000 videos that a teen girl would receive in the "Up Next" panel when watching videos about weight loss, diet or exercise ...
A 2019 BBC investigation of YouTube searches in ten different languages found that YouTube's algorithm promoted health misinformation, including fake cancer cures. [56] In Brazil, YouTube has been linked to pushing pseudoscientific misinformation on health matters, as well as elevated far-right fringe discourse and conspiracy theories. [57]
On November 4, The New York Times published an article about the "startling" videos slipping past YouTube's filters and disturbing children, "either by mistake or because bad actors have found ways to fool the YouTube Kids' algorithms". [3]