By Lillian Gooden
Looking at the Margins: Incorporating Harm Reduction Into Tech
Presented by Norman Shamas and Afsaneh Rigot of Article 19
Radical Networks Conference, October 2019
Hosted at Prime Produce, NYC
I attended the Radical Networks 2019 conference on a rainy October afternoon. This conference, which centers marginalized and oppressed groups, gathers artists, experimenters, and researchers, and invites them to exchange radical ideas on technology and telecommunications. Conference participants are radical thinkers who want to use technology to help their communities while resisting systems of control and surveillance.
The sessions that initially drew me to this conference were titled “Media Infrastructures and Racialized Territorial Formations: Perspectives from the South” and “Everything Has A Resonant Frequency: On Crystals, Networks, and Crystal Networks.”
These bore a relation to my term paper, which (as of now) seeks to explore the physical aspects and environmental tolls of the Web’s infrastructure. I came to “Media Infrastructures” to hear the work of a researcher studying issues of access for the largely indigenous populations in Colombia’s remote tropical regions. I came to “Resonant Frequency” for the insights of another researcher and journalist interested in the supply chain of the minerals that power our communications, from World War II-era radios to smartphones.
Harm Reduction is Radical
But my attention was captivated by Shamas and Rigot’s presentation on harm reduction, entitled “Looking at the Margins.” To set the context, Norman Shamas began with a definition of harm reduction: practices that ensure one’s survival and aim to minimize harm from certain activities, typically those that are illegal. With the aid of a tweet by @ReyBee10 (2018), they framed it as a practice that “was started by sex workers, queer & trans PoC, people who use drugs, people in the streets saving their own lives and all the intersections thereof—not by public health folks.”
They presented a few examples of community harm reduction practices, such as needle exchanges and safer sex education. Central to the idea of harm reduction is the acceptance of pleasure as a normal part of human life; an abstinence-only approach does harm by withholding potentially life-saving information. Shamas and Rigot might argue that harm reduction is design justice in motion, as it involves those most affected by structures of domination designing their own solutions (Costanza-Chock, 2018).
In addition, Shamas highlighted humanitarian assistance for migrants crossing the desert in the form of food and water. Those providing assistance, they said, act in awareness of and opposition to structural oppression: they believe that migrants crossing the border deserve to live. If we are invested in reducing harm, we must shun stigmatization and judgment and work to mitigate risks instead. Harm reduction is inherently radical.
Harm reduction in tech matters because we currently lack a way to provide systemic support for people who are being harmed while engaging in certain activities—without stigmatizing them. The presenters brought up two case studies to demonstrate methods of designing harm reduction measures and to show their effectiveness. My focus is on the first of the two.
Case Study: Queer Dating Apps in Hostile Societies
Afsaneh Rigot, the second presenter, introduced a case study involving queer dating apps in Egypt, Lebanon, and Iran. In all of these countries, queer people are targets of persecution by government officials and fellow citizens alike. Of course, criminalizing sexuality has never prevented queer people from seeking and enjoying love. They have endured despite the risk of arrest and abuse.
The team at Article 19 sought to learn about queer dating app users and their needs, and to design harm reduction solutions around those needs. Taking a design justice approach, they worked with local groups in Egypt, Lebanon, and Iran to get a sense of the environment on the ground. As they embarked on their research, the team understood that “the full inclusion of people with direct lived experience of the conditions [they were] trying to change” was crucial, to use Costanza-Chock’s words from “Design Justice.”
User Interviews Yield Revelatory Findings
It was established early on that participants in these countries had no interest in quitting their dating apps despite the risks, so proposed solutions had to support continued app use. Rigot pointed out that elsewhere, trainings around risky behaviors tend to be prescriptive, talk down to users, and fail to meet them where they are. The solutions such sessions produce fail users by misunderstanding them and their needs.
For example, the discussion around safety on queer dating apps typically centers on privacy and geolocation. It is often recommended that users disable geolocation in order to protect themselves. This advice seems reasonable enough, doesn’t it?
However, in talking to queer communities in this study, Rigot and Shamas found that, surprisingly, geolocation was one of the features that made users feel most secure when chatting with others on dating apps. Many respondents said that knowing the person they were engaging with lived in their town made them feel safer!
This revelation underscores the importance of learning about locally desired applications or services—one of the ethical guidelines for fieldwork laid out by PERCS at Elon University. It is crucial to tailor any solution that you are designing to the specific needs of your users, and to uncover those needs through dialogue and direct engagement.
The Article 19 team found that devoting energy to developing geolocation solutions would not have been the best use of their time and resources, especially as many users already employed tactics such as GPS spoofing to preserve their anonymity on queer dating apps. Instead, users expressed that they would find it beneficial to have legal resources embedded in the apps they use in the event that they found themselves a target of government surveillance or other abuse.
In their findings, the Article 19 team also identified a desire for app icon cloaking. Suspected “deviants” in the countries surveyed sometimes found their property, including the information on their phones, subject to search. A queer dating app discovered by a government official could be grounds for penal action. By cloaking their app icons, users might be able to keep themselves safe(r) by disguising, say, Grindr as something as innocuous as a calendar or calculator app.
Perhaps this reality seems distant from a US perspective, as we live in a country where a queer person can legally adopt children, marry their beloved, and even run for president. However, the reality is that despite our tenuous legal protections, many queer Americans live in daily fear of persecution, discrimination, and violence. Solutions like the above can still be of use even to those who do not live in such overtly hostile environments. Designing solutions for the most marginalized in society will yield applications that can protect all users.
Works Referenced:
Costanza-Chock, Sasha. (2018). “Design Justice: Towards an Intersectional Feminist Framework for Design Theory and Practice.” Proceedings of the Design Research Society 2018. https://ssrn.com/abstract=3189696.
PERCS: The Program for Ethnographic Research & Community Studies. “The ethics of fieldwork.” Elon University. http://www.elon.edu/docs/e-web/org/percs/EthicsModuleforWeb.pdf.
ReyBee10. (2018, Oct 10). “#HarmReduction was started by sex workers, queer & trans PoC, people who use drugs, people in the streets saving their own lives and all the intersections thereof— not by public health folks….respect the origins and beware co-optation..to paraphrase @HarmReduction #HarmRed18” [Twitter post]. Retrieved from https://twitter.com/ReyBee10/status/1052975455748452352.