Digital tools are often hailed as transparent and democratising “disrupters”. Two new books argue that this optimistic vision is mistaken and that algorithms, as currently deployed, pose a major threat to the human rights of marginalised groups.
By way of introduction to her topic, Safiya Umoja Noble, an assistant professor of information studies at the University of California, Los Angeles, recounts the experience that led her to write Algorithms of Oppression. In September 2011, looking for ideas on how to entertain her pre-teen stepdaughter and visiting nieces, Noble searched for “black girls” on Google. The query was meant to surface content of interest to girls their age; instead, it produced a page of results awash with pornography. As Noble drolly observes: “This best information, as listed by rank in the search results, was certainly not the best information for me or for the children I love.”
Noble acknowledges that any set of results is a snapshot that quickly becomes historical: repeating the search a year later, she found that the pornography had been suppressed. But she argues that such episodes should be seen as systemic rather than as one-off “glitches”. Though Noble doesn’t charge Google with racist or sexist intent, she challenges the assumption that the output of Google Search merely reflects the democratic, if regrettable, inclinations of its users, and questions Google’s abdication of responsibility for the results its algorithms produce.