Hello folks! Matt asked Anthony and me to put together a Doodle poll for our two upcoming sessions of student presentations on 12/4 and 12/11. Please sign up for one of these dates, as we must condense the presentations so that everyone has a chance to share their work. As Matt mentioned in class, each proposal should be no longer than 6-8 minutes unless you are collaborating, which allows for an extended presentation time.
Excavating the Slave Experience: An Interactive Pedagogical Mapping Tool
Hello all,
I just wanted to pass along the link to Excavating the Slave Experience, created collaboratively by MALS students Monika Wright, Iris Finkel, and Tristen Goodwin. The project explores how runaway slave advertisements could be digitally archived and recorded as a pedagogical tool to inform people about the historically violent process of tracking black bodies as a source of free labor, a process also discussed in the Simone Browne piece. In thinking about Browne's and Nakamura's readings, I'd love to further explore how digital tools can be used to combat systems of infrastructural and digital inequality, specifically in Digital Humanities knowledge production, and how interactive mediums can possess the capacity to counter the misconstrued identities of marginalized people as we continue to think of our own role as "DH'ers." As Nakamura points out through her account of the exploitation of Navajo women in the pursuit of capitalistic, consumption-based technology, it is the failure to acknowledge and address historical inequality that continues to perpetuate cycles of misrepresentative narratives about marginalized people and their labor in the digital world.

Indigenous games scholar Elizabeth LaPensée talks about the process of "Digital Preservation" in her work creating graphic novels and video games that re-appropriate classic technological forms, such as the Atari game Space Invaders, and re-platform narratives in a way that re-centers Indigenous perspectives on colonialism and stories of European conquest. You can get a taste of this in her game Invaders, in which you play as a Native person combating "foreign" forces embodied as space invaders (colonialists) with single-arrow bursts, which translates the disorienting experience of making sense of "alien" technology and the history of domination and conquest.
These tools of digital preservation may not be complete solutions for undoing systemic inequality, but they can perhaps spark the kind of inquiry that leads individuals to delve further into issues outside of their own experience.
Biometric Stereotyping: A Tool of Oppression
Upon reading both of the Zach Blas articles and viewing the video "Facial Weaponization Communiqué: Fag Face (2012)," I can honestly say I am quite horrified by the discoveries these researchers have made. I had never really thought about the concept of biometrics and facial recognition, and even when I did think about these topics, it was in the realm of individual identification (such as with the most recent iPhone series), not as a tool of categorization and oppression. Although it is true that there are differences in human body types based on genetics, that does not mean these technologies should be built in ways that ignore the features of diverse bodies.
Blas addresses specific failures of biometric technology to account for diverse features when he explains: "Asian women's hands fail to be legible to fingerprint devices; eyes with cataracts hinder iris scans; dark skin continues to be undetectable; and non-normative formations of age, gender, and race frequently fail successful detection" (Blas). When I was getting ready to start working in schools during my undergraduate degree, I was required to be fingerprinted so that this personal biometric information would be in the system. It never occurred to me that the size of the scanner might not be practical for someone with smaller hands than mine. These devices are used every single day, around the whole world, yet they are not built to accommodate the real world, the people we see walking past us on the streets of our neighborhoods.
Iris scanners falling short when it comes to cataracts directly oppresses Black and Hispanic individuals, since they make up a majority of those who develop cataracts at some point in their lives. And we (hopefully) do not even need to discuss how facial recognition's failure to identify dark skin is oppressive. Blas also discusses claims that these facial scanners can determine characteristics below the surface of a person's skin, such as gender and sexuality. I would like to point out how wrong and dehumanizing it is to sort physical human features into such general groups. As a biracial person of color, I carry features of both my Dominican father and my English and Irish mother. So put me in front of a facial recognition scanner: what exactly will come up? Or will it just get confused? Encoding the device in this manner feels almost colonial. It reads faces and other bodily features and produces information that attempts to keep human beings separated as if they were members of different species.
I used the term "colonial" because of the way this mode of thinking resonates with segregation and slavery. Blas also writes about how, once the facial biometric diagrams were fabricated in metal, they resembled face cages akin to handcuffs, prison bars, and the torture devices used during medieval times and during slavery in the United States (Blas). The images of the face cages are incredibly discomforting but undeniably powerful. This is where Blas intersects with a point made by Simone Browne in her essay "Race and Surveillance." Dating all the way back to slavery and the transatlantic slave trade in the USA, enslaved people were cataloged as a form of "disciplinary power" (Browne 73). Browne goes on to explain how this disciplinary power is exercised through its invisibility, while simultaneously imposing a compulsory visibility on its targets. This links to the idea that these oppressive systems are invisible to those of us who are unaffected by their tyranny, even as the same systems exist as daily, active concerns in the lives of those who are affected. I think it's important not to be indifferent to these oppressive technologies simply because they do not concern you directly. If they negatively affect people of any demographic, it should become a priority to fight for positive change with the goal of reaching true equality.
Machine Bias
https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
I just wanted to bring the above article into our discourse. I apologize for adding it so late in the week.
It covers a set of algorithms designed to estimate the recidivism risk of convicted criminals for use in sentencing and parole decisions. Even without knowing a subject's race, and without asking questions to determine it, the algorithm has been shown to significantly overestimate recidivism risk for black defendants while significantly underestimating it for white defendants.
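To make that disparity concrete, here is a minimal sketch (in Python, using invented numbers, not ProPublica's actual dataset or the COMPAS tool itself) of the kind of error-rate comparison their analysis performed: a higher false positive rate means the tool overestimates risk for that group, while a higher false negative rate means it underestimates risk.

```python
# Hypothetical illustration of an error-rate audit by group.
# Each record is (predicted_high_risk, actually_reoffended).

def error_rates(records):
    """Return (false positive rate, false negative rate) for a list of records."""
    fp = sum(1 for pred, actual in records if pred and not actual)
    fn = sum(1 for pred, actual in records if not pred and actual)
    negatives = sum(1 for _, actual in records if not actual)  # did not reoffend
    positives = sum(1 for _, actual in records if actual)      # did reoffend
    return fp / negatives, fn / positives

# Invented outcomes, grouped by race, purely for illustration.
groups = {
    "black defendants": [(True, False), (True, True), (True, False), (False, True)],
    "white defendants": [(False, True), (False, False), (True, True), (False, True)],
}

for name, records in groups.items():
    fpr, fnr = error_rates(records)
    print(f"{name}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```

In this toy example the first group is flagged high-risk far more often when they do not reoffend, and the second group is flagged low-risk far more often when they do, which is the shape of the bias the article reports.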
The fact that the algorithms were designed and are owned by a private company brings to light much of what we have been reading and thinking about regarding algorithmic transparency, the economic motivations to resist that transparency, and how bias can be perpetuated more easily without it.
I have a big problem with marginalizing the plight of the addict with a phenomenon I don’t see as real
The purpose of this short blog post is to get feedback from any of you who have read the piece and may be able to shed more light on what she is saying, so I can come to tomorrow's discussion better informed. I realize I am posting late, so maybe I won't get much response, but here goes… I am referring to the meth mouth section and her implication that white addicts are merely a marker of "good" or "bad" in terms of their whiteness, and not of their mental illness, addiction. I don't think addiction is a good thing; I think it is a really tragic, devastating thing. And I think her connecting the plight of the white addict, as a mere societal "inconvenience" or "stain," to their overall whiteness is a very toxic and problematic claim. I just feel like racializing addiction in this way, for white people only, discounts the lived experience of countless people who suffer from such a devastating illness. Can someone who agreed with what she said please explain what I am missing? I understand addiction can be looked at differently when it comes to race, but I don't see her claims as anything but insulting to addicted people who happen to be white. I think there were many more responsible avenues for the discussion of surveillance, racism, and addiction.
Protected: On Gang Databases
Thoughts on "Algorithms of Oppression: How Search Engines Reinforce Racism"
There was a lot of information and many ideas covered in Safiya Umoja Noble's book, "Algorithms of Oppression: How Search Engines Reinforce Racism," so I'll do my best to keep this post short while pointing out some parts that stuck out to me and my thoughts on them.
One of the first points I found worth mentioning was made by Noble, who states: "Not only do algorithms reinforce racism, sexism, and oppression overall, but they can spread false associations." (I'm not sure if this is a direct quote; unfortunately, I lost all my highlighted material to page 44.) This was made clear when she searched "black girls" on Google and the first hit was a pornographic site. The quote also reminded me of a Fortune article titled "Lack of Women in Stem Could Lead to AI Sexism," where several virtual assistants are pointed out as having female names or voices (e.g., Alexa), which "perpetuates the stereotypes about women as the chipper, helpful assistant." It takes something as unobtrusive as a female AI voice, or as appalling as porn being the first search result for "black girls," to alter or reinforce our perception of a population.
Another point I wanted to address comes from the section "Gaming the System: Optimizing and Co-optimizing Results in Search Engines" on page 47, where Noble describes "Hit and Run" activity as "activity that can deliberately co-opt terms and identities on the web for political, ideological, and satirical purposes." This made me think of the 2016 presidential election, when controversy arose over Facebook being many people's primary news source and why that is problematic. The reasons were made clear in "Algorithms of Oppression" and in Part 2 of Nick Seaver's "Knowing Algorithms," where he describes Eli Pariser's experience with algorithmic filtering on Facebook. Our knowledge of algorithms, or lack thereof, can cause real damage to our understanding of all sides of a story.
Thankfully, as Noble mentions, some things have changed since her initial Google search. Pornography is no longer the first result to pop up, and what I found most exciting is that when I ran the search myself, "Black Girls Code" was actually the first result. As the general public becomes more aware of algorithms and the impact they can have on our research, those algorithms will hopefully become less likely to skew public perceptions, or at the very least, we'll question the information we're being fed.
Concerning adjuncts
Issues around adjuncts came up in the small group discussion I was in last week.
I saw this article, and thought it should be shared.
While I have not seen all the things listed in this article, I have seen quite a few of them. Course cancellations come down literally a few days before classes start. I’ve seen adjuncts get screwed over by that more than once.
Once, it almost happened to me, except one of our full-timers decided to up and resign, so I got one of her classes.
Full disclosure: I have two different positions at LaGuardia: I am a Sr. College Lab Tech and an adjunct lecturer.
In fact, in the Spring semester, the course I usually teach, Voice and Diction, is taught as a hybrid class. I took the seminar on how to teach in that environment and altered the course to work as a hybrid, partly because I would then be the only person on staff who could both teach V&D and teach it as a hybrid.
Still, I’m lucky. My supervisor does his best to take care of all of us, and, since I’m already on campus, I’m almost always going to get a class. It might start at 8:00 pm or 7:30 am, but I’ll get one.
So, on top of the pay issue, adjuncts have to worry about the things discussed in this article.
I’m not saying don’t be an adjunct. I love teaching. I’m saying go in with your eyes open.
Where have all the women gone?
In "Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture," Lisa Nakamura analyzes the Fairchild Corporation's use of Native American women to manufacture integrated circuits from the mid-1960s to the mid-1970s. She makes the point that the labor of women of color was essential to the founding of physical digital culture (circuits and chips), but that the patriarchal culture dominated their contribution and then erased it from the narrative. One way Nakamura supports her thesis is by examining a 1969 sales brochure depicting Fairchild's "docile" Navajo women technology workers as culturally suited for tedious, detailed work. The brochure visually draws together the digital and the Native by juxtaposing a rug pattern that women on the reservation would have produced with the same pattern on an integrated circuit. These women are the central feature of the company's account of the integrity of Fairchild's manufacturing quality. Nakamura also examines how Fairchild's narrative was aimed at the US government, which was not only heavily invested in using the company's circuits but also interested in ending services to the reservations (925). However, in retellings of the history of Silicon Valley, the contributions of these women are all but expunged except for a "footnote" (921). Her article is an essential step toward a fully inclusive narrative of the DNA of digital culture, and she gets closer to the heart of the problem of colonialism and patriarchy in this history, but she leaves out the most crucial part of the narrative: the Native women's voices. The problem is that when she retells the exploitation of these tribal women in Shiprock, New Mexico, we never hear from the women themselves.
I found it fascinating that though she directly quotes Charlie Sporck of Fairchild, she gives the Native women no voice of their own, except in an indirect mention of the Navajo tribe's general feelings about the Fairchild plant's closure (936). If there were no accounts from any of the women workers at the time, why doesn't she acknowledge that gap? Is there an understanding that no one went back to talk to the women? Is it implicit? Holes in the narrative are something we've spoken about this semester, such as the ghost of Sally Hemings's brother James, who is all but left out of the archive. I have the same feeling about these women, who are mentioned but not heard. It's like talking about someone you pretend isn't in the room but who is right in front of you. Nakamura accounts for the holes but does not do much to fill them with the actual participants. To tell this story without the voices of the women is to continue (albeit in a lesser way) what everyone else has done and objectify the erased people. I think she comes close to giving agency to the oppressed and has written an essential piece in the history of digital culture, but this story needs to go further and be more inclusive, even while continuing to analyze the narrative.
But how do we get away from objectifying the subjects of our inquiries? It might be messy to include the voices of these women: more time-consuming, and liable to disturb the narrative that they were exploited without their participation. I don't know, but we need to start thinking about these things when we are writing about the marginalized. The last word on the Navajo women's disposition was Charlie Sporck's claim that the plant was a "failure." He said, "the women made the money, and the men drank it up" (936). Why does he get the last word? When talking about the DNA of digital culture and how the exploitation of women of color figures into it, it is probably best to include those women's voices and finally give that space to the exploited workers. I think that we as academics need to look closely at the holes in the narrative, the spaces where the voices of the exploited should be, and make sure to give space to them.
Reflections on Race and Surveillance
We want you to know that we know who you are.
Mankind has always had the unfortunate and vile need to make distinctions between "desired" and "undesired" people, a deep-seated impulse that has brought destruction upon many lives throughout history. This need to define ourselves not by what we are but by what we are not has sometimes been given physical expression in the form of visual identifying markers, from the yellow Star of David that Jews were required to wear during the Second World War to the identity cards stating Hutu or Tutsi ethnicity in the lead-up to the 1994 genocide in Rwanda. Arizona's 2010 "Support Our Law Enforcement and Safe Neighborhoods Act," which required immigrants to carry identity documents at all times, is a more recent example of a state's attempt to single out a specific group, in this case the Hispanic population. The law was partially struck down by the Supreme Court but still gives law enforcement officers permission to check a motorist's immigration status, opening up the argument that the law can be used as a tool for racial profiling.
Simone Browne's writing on how racializing surveillance can function as a tool to exercise social control over targeted populations addresses both the history and the practical workings of such discriminatory practices, tracing them back to the era of American slavery. By describing different mechanisms of surveillance, the author details how an enslaved person's physical movement was controlled through the "slave pass," which had to be produced upon request when moving outside the plantation. Other measures existed in the form of slave patrols and publicized posters of runaway slaves, which encouraged the vigilance of the everyday man and acted as an invitation for communities to get involved as enforcers of social control. Efforts to control "undesired" people are still in place today: as the author describes, census questions, racial box-ticking, and biometrics are all among the techniques used to get a grasp on who's out there.
On another note… I recently moved. I have not notified anyone nor renewed my driver's license, yet the other day I received a request to fill in a juror qualification questionnaire. How'd they know where to find me?!