Algorithms of Oppression: How Search Engines Reinforce Racism is a 2018 book by Safiya Umoja Noble in the fields of information science, machine learning, and human-computer interaction. [1] [2] [3] [4]
Noble's first book, Algorithms of Oppression, was published by NYU Press in 2018; it has been reviewed in publications such as the Los Angeles Review of Books and was featured on the New York Public Library's 2018 Best Books for Adults list. [32] [33] [34] It considers how bias against people of color is embedded in supposedly neutral search engines. [34]
Although search queries are "completed" automatically for users, Google has failed to remove sexist and racist autocompletion text. For example, in Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Noble notes that a search for "black girls" was reported to return pornographic images. Google claimed it was ...
Other journalists and researchers have expressed concerns similar to Epstein's. Safiya Noble cited Epstein's research on search engine bias in her 2018 book Algorithms of Oppression, [99] though she has expressed doubt that search engines ought to counterbalance the content of large, well-resourced, and highly trained newsrooms with what ...
Safiya Umoja Noble publishes Algorithms of Oppression: How Search Engines Reinforce Racism, arguing that search algorithms are racist and perpetuate societal problems. [169]
Joy Buolamwini publishes Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, exposing biases in facial recognition systems. [170]
Weapons of Math Destruction is a 2016 American book about the societal impact of algorithms, written by Cathy O'Neil. It explores how some big data algorithms are increasingly used in ways that reinforce preexisting inequality. It was longlisted for the 2016 National Book Award for Nonfiction but did not advance to the shortlist.
A filter bubble or ideological frame is a state of intellectual isolation [1] that can result from personalized searches, recommendation systems, and algorithmic curation. According to internet activist Eli Pariser, social media inadvertently isolates users into their own ideological filter bubbles.
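To make that narrowing mechanism concrete, the following is a minimal toy sketch in Python, not drawn from Pariser's or Noble's work: the topic labels, the invented catalog, and the recommend scoring rule are all assumptions made purely for illustration of how a recommender that favors previously clicked topics can shrink what a user is shown.

import random

# Toy sketch only: topics, catalog, and scoring rule are invented for illustration.
TOPICS = ["politics-left", "politics-right", "sports", "science", "music"]

def recommend(click_history, catalog, k=3):
    # Score each topic by how often it appears in the click history, then rank
    # unseen items by their topic's score: the more a topic was clicked, the
    # more of it is shown (a crude personalization feedback loop).
    counts = {topic: 0 for topic in TOPICS}
    for item in click_history:
        counts[item["topic"]] += 1
    seen = {item["id"] for item in click_history}
    candidates = [item for item in catalog if item["id"] not in seen]
    return sorted(candidates, key=lambda item: counts[item["topic"]], reverse=True)[:k]

random.seed(0)
catalog = [{"id": i, "topic": random.choice(TOPICS)} for i in range(100)]
history = [{"id": -1, "topic": "politics-left"}]  # one initial click
for _ in range(10):
    history.append(recommend(history, catalog)[0])  # user always clicks the top pick

print([item["topic"] for item in history])
# After a few rounds the history collapses onto the initial topic: a toy
# stand-in for the kind of narrowing the filter-bubble concept describes.

Running the sketch prints a click history dominated by the starting topic, which is the feedback-loop effect, personalization reinforcing itself, that the passage above attributes to personalized searches, recommendation systems, and algorithmic curation.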