Don’t Call Us Edge Cases – Designing From the Margins

Afsaneh Rigot | Feb. 04, 2022

The first time I heard about “edge cases” was in 2016. I was consulting with a well-meaning technologist about human rights violations I had documented that were made possible through the weaponization of social media, messenger, and dating apps. This person explained to me that these cases would be seen by companies, developers, and technologists as “edge cases”: cases coded as outliers, situations that don’t impact the general public. These were cases not to be overly concerned about, in effect furthering the othering of the marginalized populations they concern.

It made me furious that the experiences of such important groups of people would be labelled “edge cases”: a term that categorizes the abuses they endure as low-priority anomalies just because they don’t fall into the “main use cases”. This is harm experienced by those who don’t fall into the dominant user groups, geographical focuses, or financial brackets.

This well-meaning person was also not wrong: there have been many instances where community members, local experts, and I documented and attempted to find solutions to prevent harm caused by certain technologies, only to have corporations remind us of how “edge” the cases and reports were. They would assure us that the cases would be taken back and examined, yet with little to no follow-through. This compartmentalization and turn of phrase, when it comes to human contexts and impacts, is harmful on many counts; it diminishes the massive impact that technology has on people’s lives, and therefore also the culpability of the implicated corporations and companies.

The cases within my mandate that we discussed were predominantly from the Middle East and North Africa: cases of queer people being identified, targeted, entrapped, arrested, and/or prosecuted with the use and support of various communication technologies, because their identities are criminalized. There are also similar cases where inadequate security and privacy disincentivize vulnerable people, such as refugees, from reporting abuse or violence by law enforcement authorities, for fear that the tracked data will lead to their deportation or incarceration. The political, social, and legal infrastructure leaves such groups without any societal or legal protections; they also often bear the brunt of the law, which systematically leaves them on the sidelines and at the edges of society.

In development, an “edge case” is seen as an atypical or less common use case for a product. Edge cases are seen as more difficult to solve and as cases that impact fewer people. The “move fast, break things” ethos was also built on the belief that you can develop products for the masses without solving edge cases. Even if we hear that our tech giants and the Silicon Valley ethos have evolved past the “move fast, break things” era, “edge case” thinking still embodies western-centrism and has disproportionate impacts on vulnerable and/or hard-to-reach communities. Human rights abuses via tech are only responded to after the fact, when they could be reduced or avoided if technology were designed with these impacts in mind from the outset of the design and engineering process.

Communication technologies and tools are not designed for these “edge cases”, yet they are still advertised to and relied upon by their most vulnerable users. Often the people affected by such cases are not seen by the creators of major technologies. The people most likely to suffer these impacts are not wealthy or likely to buy more; they are not in the US or UK markets; they are not white, and they do not fit the heteronormative user scenarios that can be easily described and coded for. But what if they were sought out from the inception of the engineering and design process? Instead of “edge cases”, what if they were seen as central cases that help us design better for all? With an understanding of who is most impacted by social, political, and legal frameworks, we can also understand who would be most likely to be a victim of the weaponization of certain technologies. Reframing how we see an “edge case” becomes fundamental here.

Of course, now that I have been a little more involved with the workings of technologists, developers, and tech companies, I know the stance is often not as crass. Many companies adopt varying levels of scoping, as well as inclusive and participatory design frameworks. More often, companies are forced to own up to the negative social impacts of their technology on people far beyond their targeted audiences and do some “damage control”. However, things are still far from adequate, and the designers of our major tools still predominantly focus on groups with the most institutional and hegemonic power. We are not designing, from the outset, for historically disinvested, oppressed, and marginalized folks, especially those outside “Western” frames of influence. Rather, we include them in the later stages of implementation, or even just bring them in for some tickbox consultations when a tool is already made. These cases continue to be treated as edge: as low-priority cases to be eventually dealt with.

Although many designers and experts suggest moving away from “main use” cases (the opposite of edge cases), there has not been a push to focus on those most at the margins: those without infrastructures of legal protection, and those historically impacted by social and political oppression. In Design for Real Life, Eric Meyer and Sara Wachter-Boettcher suggest that designers depart from the concept of “edge cases”, as it marginalizes users and their inherently complex human realities by treating them as mere consumers. For example, certain features may cause emotional harm for everyday users in extreme circumstances, and these need to be accounted for; they suggest redefining “edge cases” as “stress cases”. But Meyer and Wachter-Boettcher do not go far enough: they are still concerned with the emotional circumstances of [presumably white, American] mainstream users who are commercially incentivisable and who remain within the intellectual vision behind these products. The cases that I and many others are documenting are far more complex than this, and they are evidence of a much higher degree of harm towards groups that are not as commercially incentivisable and that exist far outside the range of vision of technology designers.

The fault for the resistance to identifying and focusing on these cases should not be placed solely on engineers or designers. It is rather a symptom of capitalistic production and consumerism. Or, simply, it’s the good ol’ “we just didn’t intend for these things to happen, as we didn’t know”. Sadly, the era of not identifying these cases and pleading “unintended” consequences has passed. We see daily instances of the impact of tech on historically oppressed and marginalized groups, especially as this tech is rarely, if ever, designed with them in mind. We’ve seen technology’s dangerous impact on sex workers; Black, brown, and Indigenous people; refugee and immigrant communities; as well as many queer communities. Technology is often used to support the structures that oppress them.

In Design from the Margins (DFM), my current work-in-progress at TAPP, I argue that there are strategies to reduce these harms (especially human rights abuses carried out through the weaponization of tech) and increase protections if we start the design process from the cases seen as “extremes”, “outliers”, or “edge cases” from a human impact perspective. For example, in my research at ARTICLE 19, we entered and worked with Middle East and North Africa (MENA) LGBTQ communities, many of whose members were from rural parts of each country studied or held doubly marginalized identities. In US/EU design processes, these would be deemed the extreme scenarios, the “edge cases” that are rarely engaged with. Yet by focusing on these cases, which I call “decentered cases” (the term I’m currently using, and one I will expand on in the forthcoming DFM concept launch), we brought changes such as the “discreet app icons” to Grindr. This feature lets people change their app icon to look like a calendar, a calculator, or other variations, so that in an initial police search the app escapes notice and they bypass that risk. This was a feature created solely on the basis of these “extreme” cases, but it proved so popular with Grindr’s general user base that it went from being fully available only in “high-risk” countries to being available internationally for free, because people anywhere in the world might have personal reasons to want to maintain their privacy.

Other similar changes were brought to the apps based on the documentation of these cases and the wants of those impacted in the community: for example, more vital security features were released globally on Grindr based on the ARTICLE 19 research, which in turn increased its user counts. This is designing from the margins based on what’s deemed “edge”. It brings more innovation and sustained growth, but for once not at the expense of those so often forgotten. The latest disappearing messages options on WhatsApp were significant changes based on the documentation of the use of digital evidence against queer people in MENA, and on requests from lawyers who represent these cases (this will be detailed in a forthcoming in-depth report that I will be publishing with the support of ARTICLE 19 and the Berkman Klein Center at Harvard). If these cases of the folks most impacted and at risk can be designed for, then other decentered cases can also be designed for. It is a question of reprioritization, industry will, and reconstruction of what is seen as edge.

The reality is that addressing these cases and reframing them as central is not just about improving the design process; it is also about justice, and about addressing the human rights abuses caused by technology when it is used in a context it was not designed for. It is about addressing the effects of western-centrism on vulnerable and/or hard-to-reach communities. Understanding the needs, wants, and methods of these communities not only protects them, it makes for better technology internationally. By re-centering and repositioning the vital experiences of the decentered, we create tools that protect us all.

A huge thank you to the brilliant Hadas Z, Kendra Albert, and Maggie Delano for their reviews and comments on this piece.  

For more information on this publication: Belfer Communications Office
For Academic Citation: Rigot, Afsaneh. “Don’t Call Us Edge Cases – Designing From the Margins.” Perspectives on Public Purpose, February 4, 2022.
