The Algorithm Rejected Me

David Trend

School is where most kids first become aware of what I call the “update imperative.” After all, education is a process of continual improvement, a step-by-step progression of knowledge acquisition and socialization. In this sense schooling represents much more than the beginning of education. For many kids it’s a time of moving from the familiarity of home into the larger world of other people, comparative judgment, and a system of tasks and rewards. Along the way, a package of attitudes and beliefs is silently conditioned: conformity to norms, obedience to authority, and the cost of failure. All of this is presented with a gradually intensifying pressure to succeed, rationalized as a rehearsal for adult life. Rarely are the ideological parameters of this “hidden curriculum” ever challenged, or even recognized. Much like work, American K-12 schools are driven largely by mandates of individual achievement and material accumulation.

By the time college applications are due, levels of anxiety can run out of control, given the role of degrees in long-term earnings. Many students start the admissions Hunger Games as early as middle school, plotting their chances, polishing their transcripts, and doing anything they can to get good grades. Application volumes have ballooned in an age in which students apply to an average of ten schools each. Unsurprisingly perhaps, overall applications have increased by 22 percent in the past year alone.[i] And while the applicant side of this equation has been much publicized, what happens in the admissions office remains shrouded in mystery. Largely unknown are the secret criteria, driven by algorithms, used to determine things like likelihood to enroll or willingness to pay. Even less known are the kinds of AI analytics used to monitor and grade students, sometimes making prejudicial judgments along the way.
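To make the mechanics concrete, consider a minimal sketch of how such a predictive score might work. Everything here is hypothetical: the feature names, weights, and numbers are invented for illustration, since actual enrollment-management models are proprietary.

```python
import math

# Hypothetical weights for a "likelihood to enroll" score.
# Real enrollment-management models are proprietary; these feature
# names and numbers are invented purely to illustrate the mechanism.
WEIGHTS = {
    "campus_visit": 1.2,           # applicant toured the campus
    "early_application": 0.8,      # applied in an early round
    "needs_financial_aid": -0.9,   # a crude proxy for "willingness to pay"
    "distance_miles": -0.002,      # far-away applicants enroll less often
}
BIAS = -0.5

def enroll_probability(applicant: dict) -> float:
    """Logistic score: turn weighted features into a 0-1 probability."""
    z = BIAS + sum(w * applicant.get(name, 0) for name, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))

applicant = {"campus_visit": 1, "early_application": 0,
             "needs_financial_aid": 1, "distance_miles": 450}
print(f"Predicted likelihood to enroll: {enroll_probability(applicant):.2f}")
```

Notice how a negative weight on financial need quietly penalizes poorer applicants. Nothing in the output announces that bias; it is simply priced into the score.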

“Imagine being rejected from a university or advised out of your major because you’re Black, or a woman, or a first-generation college student. Imagine learning that these decisions were made by predictive analytics software that you can’t object to or opt out of.” With these words Shea Swauger began a recent piece in the Washington Post, adding the following sober rejoinder: “Just over a decade ago, this seemed unlikely. Now it seems difficult to stop.”[ii] Swauger’s article proceeded to detail the rapid pace at which algorithms and AI are becoming the new normal at colleges and universities. Much of this is driven by the squeeze on higher education to deliver more while spending less, even as many institutions struggle financially. Behind this growing institutional panic are cash-strapped state governments that can’t contribute as much as they once did, and the political nightmare of the student debt crisis. As larger and larger chunks of instruction are delivered by minimally qualified part-time teachers or graduate students, institutions are seeking further savings by automating work as much as possible.

Adopted as an emergency measure in the early 2020s, online instruction offered cost savings simply too attractive for many schools to give up. Crises often get exploited to make changes like this, as many sociologists have observed, using terms like “shock doctrine,” “disaster capitalism,” or “creative destruction.” Historically speaking, most distance-teaching innovations became commonplace in just this way. Correspondence schooling grew in the 1940s to serve returning World War II vets, telecourses boomed in the 1960s when the baby-boom generation went to school, and today the once-marginal practice of remote teaching has gone mainstream. “Online Learning is Here to Stay” read a feature in the New York Times, citing research from the RAND Corporation finding that 20 percent of schools were keeping many of their online offerings. “Families have come to prefer stand-alone virtual schools and districts are rushing to accommodate, but questions still linger.”[iii]

Questions indeed. No less a source than the U.S. Department of Education issued a report, entitled “The Disparate Impacts of COVID-19 on America’s Students,” documenting the deepening inequities resulting from online learning.[iv] While the shift to distance learning seemed to affect everyone in much the same way, already vulnerable students and schools had less margin for adaptation. Those affected included students without home internet or with poor connections, and those in crowded households or with work obligations. Students with underlying health conditions or disabilities found themselves without the support schools might have offered. In large numbers, students with learning differences had difficulty completing coursework (76 percent) or feeling connected to school (57 percent). Most startling in the report were data showing online course attendance of Black and Latinx students running 22 percent below that of white students.[v] Overall decisions to attend college dropped by seven percent, with declines among students from high-poverty high schools running at nearly double that rate.[vi] To its credit, the report went beyond simply cataloguing these problems, characterizing what happened as nothing less than a civil rights issue.[vii]

As these equity issues got more attention, education researchers began looking at why they were happening. They discovered the problems went well beyond online learning per se. The real issue lay in the methods by which teaching was done, the attitudes professors brought to the classroom, and forms of stratification so ingrained in teaching practices that no one ever questioned their fairness. Take “rigor,” for instance, a standard professors use to reward excellence over mediocrity. This typically means making it difficult to do well, with arduous assignments, tough tests, and strict rules –– practices that research now shows favor well-resourced students from privileged academic backgrounds. Researchers found that when such barriers were modified or removed, many more students could reach the same standards. Along similar lines, the shift to online learning raised new concerns about academic integrity, with many professors adopting AI-driven surveillance systems like Examity or Respondus to video-record students as they took tests, and software like Turnitin or Grammarly to check for plagiarism. Aside from heightening anxiety for vulnerable students, the surveillance software later proved highly problematic, incorrectly flagging students of color, those with disabilities, and others with busy home environments. Here again, certain cohorts of students were unfairly advantaged and disadvantaged in the process.
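A toy example suggests how such flagging goes wrong. Suppose, hypothetically, a proctoring system flags any session in which a single face is not detected in most sampled video frames. The rule below is invented for illustration; commercial products do not disclose their actual criteria.

```python
def flag_session(face_counts: list[int], threshold: float = 0.8) -> bool:
    """Flag a test session as "suspicious" if fewer than `threshold`
    of the sampled frames contain exactly one detected face.
    A hypothetical rule, for illustration only."""
    single_face = sum(1 for count in face_counts if count == 1)
    return single_face / len(face_counts) < threshold

# A student in a busy household: a sibling passes through some frames
# (2 faces), and the detector misses the student entirely in others
# (0 faces), a failure documented more often for darker skin tones.
print(flag_session([1, 1, 0, 2, 1, 0, 1, 0]))  # True: flagged
```

The rule converts detector error and household circumstance directly into “suspicion,” which is precisely the pattern the research identified.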

This kind of surveillance is common in the workplace, of course, where monitoring is an accepted method of job supervision and performance review, as well as a means of incentivizing productivity and maintaining quality control. No surprise, then, that algorithmic systems now populate workplaces all over America, much as they do schools and colleges. Uber and Lyft are famous for this, given the complexity of locating drivers, allocating rides, monitoring driver efficiency, and rewarding tasks. In the past, Uber drivers have blamed algorithms for reducing their work and affecting their incomes, but workers have trouble proving any of it. Labor specialists say this new kind of “algorithmic management” upsets the power balance between management and employees, allowing companies to take advantage. And given the long historical association of surveillance with worker exploitation and mistreatment, the issue is becoming a growing worry in workplaces worldwide. Concerns are so intense in the U.K. that government officials recently issued a report warning of potential problems. The New Frontier: Artificial Intelligence at Work listed the hazards of what labor specialists term algorithmic management.[viii] These include the use of software to track remote workers, algorithmic monitoring of employees at home, automated decision-making in hiring and firing, as well as a troubling lack of transparency and oversight of the algorithms themselves.

In a recent article entitled “The Algorithm Fired Me,” reporter Margaret Roosevelt detailed abuses by Amazon in California.[ix] Well known for its next-day deliveries, Amazon has been accused of setting speed quotas for package deliveries and warehouse order fulfillment. Employees say that penalties for “time off task” force many to skip bathroom breaks and overlook safety protocols. Although workers can check their own work speed throughout the day, many have no idea how their daily “scores” are calculated. One frantic employee reported spending 10 hours a day bending, twisting, scanning, and wrapping in a blind attempt to process 200 items each hour. Aside from injuries from speed-induced mishaps, this kind of work commonly results in repetitive strain injuries. Last year the California legislature found that Amazon workers got hurt on the job at double the industry standard. In response, lawmakers passed a bill, AB 701, forbidding productivity monitoring of the kind seen in Amazon warehouses. Proponents of the measure argued that the law went far beyond Amazon per se, since a growing number of companies like Walmart have recently begun similar algorithmic management. Equity issues are part of the discussion as well, with workplace statistics showing that Black and Latinx employees work in such warehouses at nearly double their percentage in the U.S. labor force.[x]
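The arithmetic behind such scores is simple, which is part of what makes them so blunt. Below is a minimal sketch of a rate and “time off task” calculation; the gap threshold and quota are invented stand-ins, since the actual formulas are not public.

```python
def productivity_report(scan_times: list[float], quota_per_hour: int = 200,
                        gap_minutes: float = 5.0) -> dict:
    """Compute an hourly rate and "time off task" from item-scan
    timestamps (in minutes). Any gap between scans longer than
    `gap_minutes` counts as time off task. All thresholds here are
    hypothetical stand-ins for an undisclosed formula."""
    hours = (scan_times[-1] - scan_times[0]) / 60
    rate = (len(scan_times) - 1) / hours
    gaps = (b - a for a, b in zip(scan_times, scan_times[1:]))
    off_task = sum(g for g in gaps if g > gap_minutes)
    return {"rate_per_hour": round(rate, 1),
            "time_off_task_min": round(off_task, 1),
            "below_quota": rate < quota_per_hour}

scans = [i * 0.3 for i in range(100)]  # on pace: one item every 18 seconds
scans.append(scans[-1] + 6.0)          # a single six-minute bathroom break
print(productivity_report(scans))
# {'rate_per_hour': 168.1, 'time_off_task_min': 6.0, 'below_quota': True}
```

One six-minute break both logs “time off task” and drags the hourly rate below quota, and no field anywhere records why the gap occurred.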

Bad as algorithms seem for workers and students, it’s important to remember who the real culprits are. Software is just a tool, after all, built to serve a human agenda. What’s really taking place at Amazon and in the admissions office is the automated pursuit of a common agenda of economy. Many technology writers have written about the perils of converting human beings into data for such purposes. When information loses its body it becomes untethered from context, vulnerable to misinterpretation, and potentially harmful.[xi] Workers and school applicants can become little more than numbers, even as those numbers result in life-changing events for individuals.

Truth be told, there is nothing especially new about this in either employment or education. The very field of modern statistics grew in large part during the eighteenth century as a tool of public administration, for purposes like taxation and population health. Over time this representation of populations as data became increasingly harmful –– as statistics were adopted as a money-making tool. From the 1950s onward, proponents of “neoliberalism” pushed for defining all of society in economic terms, hence conflating efficiency with profitability. This attitude discounted the human element across many kinds of activity, often creating disadvantage and harm for vulnerable populations. With today’s automation of statistical decision-making, this reduction of workers, students, customers, and citizens to data points has become simultaneously more prevalent and more invisible, often reinforcing agendas, advantages, and inequities outside the conscious awareness of those involved.

[i] Scott Jaschik, “An Admissions Recovery?” Inside Higher Ed (Nov. 29, 2021) https://www.insidehighered.com/admissions/article/2021/11/29/early-signs-are-positive-admissions-2021-22 (accessed Feb. 20, 2022).

[ii] Shea Swauger, “The Next Normal: Algorithms Will Take Over College, From Admissions to Advising,” Washington Post (Nov. 12, 2021) https://www.washingtonpost.com/outlook/next-normal-algorithms-college/2021/11/12/366fe8dc-4264-11ec-a3aa-0255edc02eb7_story.html (accessed Feb. 20, 2022).

[iii] Natasha Singer, “Online Learning is Here to Stay,” New York Times (Apr. 14, 2021) https://www.nytimes.com/2021/04/11/technology/remote-learning-online-school.html (accessed Jul. 18, 2021).

[iv] Education in a Pandemic: The Disparate Impacts of COVID-19 on America’s Students, U.S. Department of Education, Office for Civil Rights (Washington, D.C.: 2021).

[v] Education in a Pandemic p. 12.

[vi] Education in a Pandemic p. 32.

[vii] Education in a Pandemic p. ii.

[viii] Owen Hughes, “Workplace Monitoring Is Everywhere: Here’s How To Stop Algorithms Ruling Your Office,” ZDNet (Nov. 16, 2021) https://www.zdnet.com/article/workplace-monitoring-is-everywhere-heres-how-to-stop-algorithms-ruling-your-office/ (accessed Feb. 21, 2022).

[ix] Margaret Roosevelt, “The Algorithm Fired Me: California Bill Takes On Amazon’s Notorious Work Culture,” Los Angeles Times (Aug. 31, 2021) https://www.latimes.com/business/story/2021-08-31/la-fi-amazon-warehouse-injuries-ab701-bill-calosha (accessed Feb. 20, 2022).

[x] Ibid.

[xi] N. Katherine Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (Chicago: University of Chicago Press, 1999).
