Applying an important lesson from Dr. Ruha Benjamin's book Race After Technology: there may be a difficult truth beneath the glitch.
If you’ve seen The Matrix, you likely remember the déjà vu scene, in which Neo notices a black cat walk by twice:
Even watching the animated GIF can induce some disturbing chills. And that sense of disturbance is no coincidence: as Trinity quickly explains to Neo, this minor “glitch” involving the black cat is actually an important sign. It indicates that the agents of the Matrix have changed something in the program, rearranging the reality that Neo, Trinity, Morpheus, and others must face.
As Dr. Ruha Benjamin explains in her book Race After Technology, this scene from The Matrix provides an instructive depiction of a glitch as an important sign to pay attention to, rather than a trivial problem to ignore.
As most of us have spent more time in the digital world this year, we have experienced more and more “glitches”: the WiFi unexpectedly cuts out, websites crash, Zoom and other programs freeze, and so on. While many glitches seem minor, others are quite damaging.
Some glitches represent what Dr. Ruha Benjamin calls the New Jim Code, which she defines as:
“the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.”
Here are three examples of the New Jim Code, each involving a seemingly minor “glitch” that actually surfaces a much deeper systemic issue. The first comes directly from Dr. Benjamin’s book.
1. Google Maps Pronounces Malcolm X as “Malcolm Ten”
2. Google Photos Tags Black People as Gorillas
3. Google Translate Thinks John Lewis Spoke Portuguese
Each of these three examples features an impressive innovation alongside a problematic oversight:
- Google Maps features an AI that can speak street names aloud and expand Roman numerals into words, yet it fails to recognize the name of the famous human rights activist Malcolm X, reading it as “Malcolm Ten” (a plausible mechanism is sketched after this list)
- Google Photos features an AI that lets users search their images using words, yet it fails to recognize Black people as humans
- Google’s YouTube features an AI that automatically converts audio into text, yet it fails on a speech John Lewis gave at the March on Washington, the speech that immediately preceded Dr. King’s famous “I Have a Dream” speech
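We don’t know exactly what Google Maps’ text-to-speech pipeline looked like, but a plausible, purely hypothetical mechanism for the first glitch is a text-normalization pass that expands anything shaped like a Roman numeral before the string is spoken aloud, with no awareness of proper names. Here is a minimal Python sketch; the function name and word list are invented for illustration:

```python
import re

# Hypothetical text-normalization pass, invented for illustration --
# NOT Google's actual pipeline. It expands anything that merely looks
# like a Roman numeral, with no awareness of proper names.
ROMAN_WORDS = {
    "I": "one", "II": "two", "III": "three", "IV": "four",
    "V": "five", "IX": "nine", "X": "ten", "L": "fifty",
}

def normalize_for_tts(text: str) -> str:
    """Expand standalone Roman numerals into number words."""
    # Longest-first alternation so "III" wins over "II" and "I".
    alternation = "|".join(sorted(ROMAN_WORDS, key=len, reverse=True))
    pattern = re.compile(r"\b(?:" + alternation + r")\b")
    return pattern.sub(lambda m: ROMAN_WORDS[m.group(0)], text)

print(normalize_for_tts("Henry V Street"))  # "Henry five Street" -- intended
print(normalize_for_tts("Malcolm X Blvd"))  # "Malcolm ten Blvd"  -- the glitch
```

The heuristic is “correct” for street names like “Henry V Street,” which is presumably why something like it shipped; the failure only appears when the system meets a name its makers never tested it against.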
The contrast between innovation and oversight is key: alongside these feats of AI innovation, the racist glitches are relatively straightforward to “solve” from a technical standpoint. In other words, racist glitches do not result from what is possible, but from what is prioritized. These three examples show that Google did not prioritize racial justice and inclusion, and instead played a part in the New Jim Code. Each glitch serves to “reflect and reproduce existing inequities” under the “objective or progressive” guise of artificial intelligence.
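To make “straightforward” concrete, here is an equally hypothetical sketch of how small the fix could be: consult a curated list of proper names before expanding numerals. A production fix would presumably use a named-entity recognizer and a pronunciation lexicon rather than a hard-coded set; every name and function here is an assumption for illustration.

```python
import re

# Hypothetical fix, invented for illustration: check a curated list of
# proper names before expanding Roman numerals. A real system would use
# a named-entity recognizer and pronunciation lexicon, not a literal set.
PROTECTED_NAMES = {"Malcolm X"}

NUMERAL = re.compile(r"\b(?:III|II|IV|IX|I|V|X|L)\b")
NUMBER_WORDS = {"I": "one", "II": "two", "III": "three", "IV": "four",
                "V": "five", "IX": "nine", "X": "ten", "L": "fifty"}

def speak_street_name(text: str) -> str:
    """Expand numerals only when no protected proper name is present."""
    if any(name in text for name in PROTECTED_NAMES):
        return text                         # "Malcolm X Blvd" stays intact
    return NUMERAL.sub(lambda m: NUMBER_WORDS[m.group(0)], text)

print(speak_street_name("Malcolm X Blvd"))  # "Malcolm X Blvd" -- fixed
print(speak_street_name("Henry V Street"))  # "Henry five Street" -- still works
```

A dozen lines, in other words. The point of the sketch is not that this is how Google should have shipped the fix, but that nothing about the problem is technically hard; what was missing was the decision to treat it as worth solving.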
Since technology always reflects the values of its creators, these glitches can be partially understood by looking at representation disparities in Google’s tech workforce. When Allie Bland noticed the “Malcolm Ten” glitch and guessed that “there were no black engineers working [at Google],” she was not far off. As of 2020, only 2.4% of Google’s tech workforce was considered “Black+,” compared to the 13.4% of the U.S. population that is Black or African American. This is part of a much broader representation problem in computing, but Google has the resources to prioritize it and take meaningful action whenever it chooses to.
What should we do about it?
I have heard this question a lot recently, including from my own mouth, and I have come to realize that it is often misguided. The urge to take action is commendable, but there is no set of actions we can simply “check off” to do our part. Sure, we can avoid using Google by switching our search engine to DuckDuckGo or StartPage, but that is not a solution to systemic injustice. In fact, there is no such thing as a quick fix for systemic injustice.
A more helpful notion here is “staying with the trouble.” The concept comes from Donna Haraway, and it suggests that we keep going when we encounter difficult truths about injustice and inequity. Instead of responding by completing a checklist, we can respond by following a map, one that guides our future decisions and actions toward justice.
As Catherine D’Ignazio and Lauren Klein put it in Data Feminism, “staying with the trouble” means “having the courage to keep going when the work is difficult and fuzzy and you and your people and your institutions are a major part of the problem” (emphasis added). So if you feel disturbed and want to do something about it, an important step is simply to keep reading, keep thinking, keep having conversations, and stay with the trouble.
And when you notice a glitch, ask yourself if there may be a difficult truth underneath it.