Race After Technology

“Remember to imagine and craft the worlds you cannot live without, just as you dismantle the ones you cannot live within.”

RUHA BENJAMIN

Ruha Benjamin, an Associate Professor of African American Studies at Princeton University, founder of the JUST DATA Lab, and author of two books, People’s Science and Race After Technology, gave an eye-opening talk on racist practices in technology. She began with a trailer of her latest book, Race After Technology, then moved on to real-life examples of how racism in technology is exercised. She finally discussed the ‘New Jim Code’ and the various approaches undertaken to counter it. I was also able to identify correlations with the readings done for this class, which I address where relevant.

TRAILER OF THE BOOK “Race After Technology”

As Ruha Benjamin gave a brief overview of her book Race After Technology, she brought forth her three provocations, which are as follows:

1. Racism is productive, or is it?

She states that racism is productive, not in the sense of being good, but in the literal sense of its capacity to produce things of value for some, even as it wreaks havoc on others.

2. Social inputs make some inventions appear inevitable.

According to Benjamin, when we think about how racist technology shapes us, we tend to limit our thinking to the social and ethical impacts of technology, forgetting that these dynamics existed prior to the technology’s birth. So it is not just the impact of technology that matters, but also the social inputs that make some inventions appear inevitable and desirable.

3. People are forced to live in someone else’s imagination.

As Benjamin declares, imagination is not an afterthought that we have the luxury to dismiss or fantasize about; it is a resource and a battleground that shapes both the inputs and outputs of technology and the social order. In fact, she states that most people are forced to live inside someone else’s imagination. In other words, racism, among other axes of domination, helps produce this fragmented imagination: misery for some and monopoly for others.

EXAMPLES OF RACIST TECHNOLOGY

• Citizen app

Ruha Benjamin then turned to real-life practices of racism in technology, giving the example of a relatively new application called Citizen. The app sends real-time crime alerts based on a curated selection of 911 calls, and it offers ways to report, live-stream, and comment on reported crimes. It shows incidents as red dots on a map so that users can avoid supposedly dangerous neighborhoods. According to Benjamin, the Citizen app gives people the privilege of avoiding crime rather than stopping it. Likewise, Citizen and other tech fixes for social problems are not simply about technology’s impact on society, but also about how racial norms and values shape which tools are imagined as necessary in the first place.

• Racist Robots

Further, Benjamin talks about “racist robots,” another apt example of how racism works in technology. A wave of headlines seemed shocked at the very idea that artifacts can have politics; in contrast, others declared that technology simply inherits its creators’ biases. According to Benjamin, one of the challenges we now face is how to meaningfully differentiate the technologies that are used to differentiate us. This coded bias, combined with imagined objectivity, is what she terms the ‘New Jim Code’.

THE NEW JIM CODE

Building on Michelle Alexander’s analysis of the New Jim Crow, Benjamin’s New Jim Code considers how the reproduction of racist forms of social control in successive institutional forms entails a crucial sociotechnical component, one that not only hides the nature of domination but allows it to penetrate every facet of social life under the guise of progress. Benjamin provides the example of a targeted ad from the mid-20th century that enticed white families to purchase homes in a particular neighborhood of Los Angeles. Developers did so by promising “beneficial restrictions”: covenants that barred owners from selling their property to Black people or other “unwanted” groups. This was followed by the rise of the Black Power movement and the Fair Housing Act of 1968, which sought to protect people from discrimination when renting or buying a home. Benjamin then states the four conceptual offspring of the ‘New Jim Code’, around which the book’s chapters are organized:

  • Engineered inequity
  • Default discrimination
  • Coded exposure
  • Technological benevolence

There has also been strong resistance to the New Jim Code. One of the most heartening revelations is that tech industry insiders have recently been speaking out against the most outrageous forms of corporate collusion involving racism and militarism. Benjamin elaborates by citing the example of thousands of Google employees who condemned the company’s collaboration on a Pentagon program that uses artificial intelligence to make drone strikes more effective. This kind of informed refusal is certainly necessary as we build a movement to counter the New Jim Code. However, according to Benjamin, we cannot wait for workers’ sympathies to sway the industry.

Initiatives like Data for Black Lives and the Detroit Community Technology Project offer a more far-reaching approach; the former brings together people working in a number of agencies and organizations in a proactive approach to tech justice, especially at the policy level. One concrete collaboration grew out of Data for Black Lives last year, when several government agencies, including the police department and the public school system, formed a controversial joint powers agreement called the Innovation Project, giving the agencies broad discretion to collect and share data on young people with the goal of developing predictive tools to identify drug use in the city. There was an immediate and broad-based backlash from the community, with the support of Data for Black Lives. In 2017, a group of over twenty local organizations formed what they called the “Stop the Cradle to Prison Algorithm” coalition, which asks for a better process moving forward and structural input into advancing upstream interventions.

This connects to one of our class readings: in Finding Augusta, Heidi Rae Cooley talks about how people grow accustomed to Google, which, in return for its free services, stores users’ data and history in order to track their preferences and interests for targeted ads, like the one Benjamin mentioned earlier.

CONCLUSION

She concludes by citing Harvard Law professor Derrick Bell’s radical assessment of reality through creative methods and racial reversals, insisting that “To see things as they really are, you must imagine them for what they might be.”

All in all, it was a great talk, full of enlightening thoughts about technology’s racist side, something I had previously overlooked. Her points were strong and to the point, with solid examples to back them.
