The title of this post was originally going to be “Computers as… objective machines?” but since Stephen Ramsay’s Reading Machines: Toward an Algorithmic Criticism focuses on algorithms, I had to change up the noun just a bit. I wouldn’t consider algorithms to be machines, though after typing out the title of the reading, perhaps that’s an interesting thought in itself.
Ramsay opens up the text, specifically in his “Preconditions,” by setting the scene with the idea of computers as objective machines:
“Against this view stand those who regard computers in the humanities as providing a welcome relief from the radical skepticism of contemporary humanistic thought. Here, after all, is a machine that not only gives answers but demands them—a device that is wholly intolerant toward equivocation and uncertainty. In this view, the computer represents an emancipation from the ironic imprisonments of postmodern excess. Even without supposing that computation leads toward (or even begins with) objectivity, some see it as a way to get beyond the beached solipsism that characterizes modern discourse and toward its right and proper end in reason.” (page ix)
What interested me about this excerpt was not that the idea was new to me; rather, I realized I had never given much thought to how some people and institutions came to believe that algorithms, too, could be objective entities. It all began with computers themselves, which is an intriguing idea in its own right. What exactly does it mean for computers to be objective machines? Does that include a computer’s pre-installed applications and the intricacies of those applications themselves? I actually can’t remember the last time I used a computer solely for its pre-installed applications, because for as long as I can remember, a computer and the Internet have gone hand in hand, even when the connection was as slow as a snail. What I’m trying to get at is that I can’t quite put my finger on what exactly “the computer” encompasses in this idea that computers are objective machines.
Next, while Ramsay discusses algorithms in the context of text analysis, I want to bring up the damage algorithms have done in other contexts, especially in education systems. I suppose this is my “algorithmic criticism” of education systems (thanks, Ramsay!). Weapons of Math Destruction by Cathy O’Neil, a very good book I read last semester, discusses the shortcomings of algorithms in many different contexts. In one chapter, O’Neil writes about how many college admissions offices use algorithms to rank students and predict their behavior, an idea that shouldn’t be new to anyone. You can read the relevant excerpts from Weapons of Math Destruction in this article, but some of the quotes are below:
“Enter the age of big data. Recently, college admissions offices have begun to use algorithms that work on an individual-student basis to profile and predict their behavior. They use social media data, as well as the data supplied by the applications, to compute the likelihood a given student will enroll if accepted, the extent of financial aid needed by the student—or needed to seduce a relatively well-off student—and the chances that student will graduate. It’s the big data version of the exact same game, with the exact same goal: to increase the college’s ranking.”
“What about poor kids? There’s an algorithm for that. The College Board website has a matching algorithm to pair high school students with suitable colleges, and it’s free. This could be a useful tool for many. But the college readiness advisors I interviewed said their inner-city students are almost entirely paired with expensive for-profit universities, the diplomas of which have been shown to be no more useful in landing jobs than high-school diplomas.”
“The college admissions process has become a minefield, and the current algorithms are the mines. If we are to regain control over our education system, we need to do better, and that means a better definition of quality education, with an eye on containing costs. We can start by demanding college rankings, for example, that are tailored to our needs and that take into account cost and future debt loads. The U.S. Department of Education’s College Scorecard is a great start. Big data can help but only if we scrutinize the algorithms instead of unquestionably following them.”
I want to end with a quote from Ramsay’s Reading Machines, since I feel it encompasses all the shortcomings of algorithms regardless of context:
“If algorithmic criticism is to have a central hermeneutical tenet, it is this: that the narrowing constraints of computational logic—the irreducible tendency of the computer toward enumeration, measurement, and verification—is fully compatible with the goals of criticism set forth above.” (page 16)
Similarly, as we saw last week when reflecting on our text mining praxis assignments, many of us ended up doing more of a close reading, even though the goal (and expectation) was distant reading, because it was nearly (if not completely) impossible to determine the context of the words without inserting ourselves into the text mining process. Inevitably, the human factor cannot (and should not) be completely eliminated.