Thoughts on “Algorithms of Oppression: How Search Engines Reinforce Racism”

There was a lot of information and many ideas covered in Safiya Umoja Noble’s book, “Algorithms of Oppression: How Search Engines Reinforce Racism,” so I’ll do my best to keep this post short while pointing out some parts that stuck out to me and my thoughts on them.

One of the first points I found worth mentioning was made by Noble when she states, “Not only do algorithms reinforce racism, sexism, and oppression overall, but they can spread false associations.” (I’m not sure if this is a direct quote; unfortunately, I lost all my highlighted material up to page 44.) This was made clear when she searched “black girls” on Google and the first hit was a pornographic site. The quote also reminded me of a Fortune article titled “Lack of Women in Stem Could Lead to AI Sexism,” which points out that several virtual assistants have female names or voices (e.g., Alexa), which “perpetuates the stereotypes about women as the chipper, helpful assistant.” It takes something as unobtrusive as a female AI voice, or as appalling as porn being the first search result for “black girls,” to alter or reinforce our perception of a population.

Another point I wanted to address was in the section “Gaming the System: Optimizing and Co-optimizing Results in Search Engines” on page 47, where Noble describes “Hit and Run” activity as “activity that can deliberately co-opt terms and identities on the web for political, ideological, and satirical purposes.” This made me think of the 2016 presidential election, when controversy arose over Facebook being many people’s primary news source and why that is problematic. Those reasons were made clear both in “Algorithms of Oppression” and in Part 2 of Nick Seaver’s “Knowing Algorithms,” where he describes Eli Pariser’s experience with algorithmic filtering on Facebook. Our knowledge of algorithms, or lack thereof, can cause real damage to our understanding of all sides of a story.

Thankfully, as Noble mentions, some things have changed since her initial Google search. Pornography is no longer the first result to pop up, and what I found most exciting during my own search is that “Black Girls Code” is actually the first result. As the general public becomes more aware of algorithms and the impact they can have on our research, those algorithms will hopefully become less likely to skew public perceptions, or at the very least, we’ll question the information we’re being fed.

7 thoughts on “Thoughts on “Algorithms of Oppression: How Search Engines Reinforce Racism””

  1. Carolyn A. McDonough

    Quinn–Thanks for getting the discussion on “Algorithms of Oppression” started. I’ve been close reading Noble’s Introduction and I’m perceiving it as a “Call to Arms” re Google.

    As it’s the “day of” class, I will keep this brief and just wanted to add these quotes from the Intro, which I found particularly terse (in the best of senses), as “talking points”:

    -“Google Search is an advertising company, not a reliable info company…Is this the best information? For whom?…queries in Google and other digital media platforms…are the beginning of a much-needed reassessment of information as a public good.” This last sentence reminds me of Sean’s astute observation last week about Google as a “utility” of sorts (which I agree with).

    Most succinctly, Noble asserts: “We need a full-on reevaluation of the implications of our information resources being governed by corporate controlled advertising companies.”

    I have often wondered why anti-trust laws don’t apply, or have not yet been applied, to Google.

    I have also often wondered why citizens aren’t “up in arms” about Google’s profiteering from sexism and racism — and, I would add, actually trafficking in porn by pornifying search terms such as “black girls,” the example Noble writes about, and elevating porn to #1 yield status. As far as I always knew, such activity shows intent to profit and would/could/should be considered ILLEGAL.

    I find a parallel to this in a conversation Rob and I were having recently after class: what happened, pray tell, to Trademark law in the wake of every broadcaster screaming “like us on Facebook” many times per hour? Back in the day, Facebook would have had to PAY the broadcaster for such promotion.

    I hope people start to become more informed media consumers about the true nature of Google.

    To be cont’d in class…

  2. Nancy Foasberg

    Like Caroline, I should apologize for responding at the last minute!

    I really loved how the readings this week connected to last week’s readings on materiality; last week, we looked at infrastructure as the things taken for granted that underlie ubiquitous technology, and this week we looked at that too, but in a very different way! I like how you bring out Noble’s argument that flattened representations of marginalized groups of people have real effects in the real world, and I suspect that those effects are heightened when the way in which the information is produced/delivered/presented is obscured and becomes “natural” or “obvious.” I love the work that Noble does to point out other possibilities, including her idea for a color-coded search engine.

    As Noble points out throughout her book, Google has been let off the hook by pretty much everybody — they have no legal responsibility for anything that shows up on their platforms, and anyone who finds objectionable materials by searching Google tends to take that for granted. She writes about how we’re all inured to this sort of thing. I really want to draw a connection here to Eubanks, who is very concerned with how our digital architectures, like physical architectures, are often constructed to isolate people and flatten them into caricatures belonging to groups that are seen as in need of some management. It’s a different kind of flattening, but in both cases there’s a narrative about marginalized or vulnerable people that aggressively excludes their voices.

    But really, I’m just very impressed with her identifying search results as a kind of representation. I hadn’t thought of it that way before, and it’s different from a lot of other kinds of representations in that it isn’t created by a person to serve a particular purpose–but it still pushes a narrative under a certain ideology.

  3. Carolyn A. McDonough

    Nancy — I admire how Noble holds Google accountable in a strong and undeniable way by crafting its very own, biased search engine results into her scholarly argument — slam dunk.

    Carolyn 🙂
    (p.s. not Caroline)

    1. Carolyn A. McDonough

      Awww l’il sis, sweet 🙂 and nice name, too — no worries, Nancy, I truly cringe in correcting people, but someone told me it’s important for the universe to “hear” our given names. I have to say there’s something to this, because it’s been a good year + I LOVE that kind of stuff — do you?
