There is tremendous momentum right now to “do something”—to hold Facebook to account. This is laudable and necessary, but it is also a distraction from the underlying existential challenge. What truly matters is how Facebook-like systems—whether run by a large company or mediated via a decentralized protocol—impact our capacity to understand, trust, and govern at every scale.
Our north star cannot simply be reining in Facebook and its ilk. That may feel cathartic—but it would be a mere blip in history on the descent into dystopia.
Within the next twenty years, we will need to navigate a host of crises that would challenge even the most resilient national and international institutions. As just one example, climate change may not yet destabilize every country directly—but it is already triggering resource scarcity and local conflicts; the ensuing mass migrations will continue to elevate saber-rattling xenophobes to power, likely sparking escalating wars.
Unfortunately, our existing world order is simply not up to the task of tackling such collective global problems—as we’ve seen alarmingly with the pandemic, not to mention more localized human rights abuses.
But what does all of this have to do with Facebook et al.? A lot—both fortunately and unfortunately. Facebook is an example of a social technology: a system that facilitates collective communication at global scale. Social media, search engines, and messaging can all be treated as social technologies (as can other systems for collecting, aggregating, and routing communication). All such systems, including Facebook, implicitly influence crucial aspects of society:
- Who gains power and influence—e.g. the careful thinker, or the confident sensationalist
- How we intuit what ‘matters’—e.g. the climate crisis or pizzagate
- Who we become—e.g. intentional or reactive
How we make sense of the world and ourselves determines what we want to do—and what sorts of actors win the competition for attention determines who ultimately gets to choose.
Of course, our communication systems are not the only obstacle to addressing complex global challenges. Entrenched power structures, emergent crises, and sheer status quo inertia are much bigger culprits—but they are by definition hard to move. Communication systems are one of our primary ways to work through the blockers to action—and if they fail us, then we will fail.
This means that we cannot simply aim to hold the companies that mediate our communications accountable for their harms—we must also ensure that they actively live up to their potential to support the public interest. This calls for a different sort of north star:
Social technologies must facilitate understanding, trust, and wise decision-making.1
Concretely, this means that product design decisions, recommendation systems, and policies should all prioritize this goal—even over growth—just as drug companies must (albeit imperfectly) prioritize positive clinical outcomes over usage.
But how do we do that?
The Social Technology Compass Questions
To navigate any challenging terrain—in this case, the structure and impacts of both currently existing and potential social technologies—we need a compass. Such a compass should help determine whether one is heading toward that north star, or in a very different and dangerous direction. The compass questions aim to do just that. They ask: What is rewarded? What is internalized? What is possible? What is easy? Who decides?
As a concrete example, consider the first question, “What is rewarded?”, and how it applies to social media platforms like Facebook, in contrast to the issues captivating public and regulatory attention. For many years, much of the public dialogue focused on content moderation, enforcement, and censorship—essentially the question of “What should be removed?” Most answers to this question that also support free expression2 would have negligible impact on our north star, particularly given the significant tradeoffs involved. This is not to denigrate the trust and safety professionals doing this difficult work—it really does matter, and can even save lives. But their impact pales in comparison to the influence of a platform environment where the most reliable way for a politician or media organization to gain attention is to call out an outgroup—“the other”—something now validated by empirical cross-platform research.
Consider the alternative: a platform structured in such a way that the most reliable way to “win” the attention game is to (returning to our north star) support understanding, trust, and wise decision-making. Of course, there are many challenges to this—it is frankly much harder to pull off, and such a platform may find user growth harder to come by than a more Facebook-like competitor would. Content removal is unlikely to be the primary tool—instead the focus would be on product decisions and algorithm design that help us overcome some of our baser instinctual reactions.
Thankfully, there are now more conversations that go beyond takedowns and instead focus on downranking “problematic” content, but even the focus on “demotion”, “algorithmic amplification”, and so on does not get at the core issue. Reducing the spread of explicitly “problematic” content (whatever that means) may reduce the benefit of causing harm, but it does not get us to an environment where the way to succeed is to support understanding, trust, and wise decision-making. Moreover, other purported fixes, such as smaller platform companies, decentralization, and individualized control over one’s feed or product experience, would barely move the needle toward our north star. In fact, many of these might make the problems worse, or harder to fix. In contrast, it is likely possible to measure the extent to which content is divisive or unifying along societal fault lines (without anyone even needing to know what those fault lines are), and to use that measure to reward more unifying content with more attention—across an entire platform.
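To make this last idea less abstract, below is a toy sketch of how such a “bridging” reward signal could work. It is only one plausible mechanization, under strong simplifying assumptions (a tiny dense reaction matrix, binary reactions, a single dominant divide recovered via matrix factorization), and it is not a description of Facebook’s or any platform’s actual ranking system; the data and function names here are hypothetical.

```python
import numpy as np

# Toy reaction matrix: rows are users, columns are posts.
# +1 = positive reaction, -1 = negative reaction, 0 = no reaction.
R = np.array([
    [ 1.0,  1.0, -1.0,  1.0],   # users 0-1 lean one way
    [ 1.0,  1.0, -1.0,  1.0],
    [-1.0,  1.0,  1.0,  0.0],   # users 2-3 lean the other way
    [-1.0,  0.0,  1.0,  1.0],
])

# Factorize the reaction matrix. The leading left singular vector tends to
# capture the dominant axis of disagreement among users: a "fault line"
# inferred from behavior alone, without anyone having to label it.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
positions = U[:, 0]  # each user's position along the inferred divide

def bridging_score(reactions: np.ndarray, positions: np.ndarray) -> float:
    """Score a post by its minimum approval rate across the divide.

    A divisive post draws approval from only one side, so its minimum
    cross-side approval is low; a uniting post scores high on both sides.
    """
    def approval(side: np.ndarray) -> float:
        return float((side > 0).mean()) if side.size else 0.0
    return min(approval(reactions[positions < 0]),
               approval(reactions[positions >= 0]))

# Higher-scoring posts would then be given more distribution.
scores = [bridging_score(R[:, j], positions) for j in range(R.shape[1])]
print(scores)  # the cross-divide posts outscore the one-sided ones
```

In a real system the reaction data would be sparse and high-dimensional, and the score would feed into a ranking model rather than a printout, but the design choice is the same: estimate divisiveness from behavior, then tie distribution to bridging rather than to outrage.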
---
The other compass questions aim to provide an analogous perspective—they don’t direct toward anything that isn’t in some sense obvious, but they help refocus our attention toward solutions that actually help us get to that north star. To use them effectively, one helpful approach is to consider social technologies as a kind of “game,” and to apply the compass questions in that context:
- What is rewarded? → How the external world is impacted by the game; i.e. how moves can impact resource allocation and need fulfillment.
- What is internalized? → How players are changed by the game; i.e. in a way that impacts which moves they choose to make.
- What is possible? → The moves that players can make.
- What is easy? → The costs of moves for the players.
- Who decides? → How the rules of the game are determined.
This makes clear that the first two questions are about impacts, the next two about actions, and the final question about governance. As an example, consider the “game” of the Facebook newsfeed. Here are some of the resulting answers to the compass questions:
- What is rewarded? → Content producers, particularly politicians and media organizations, are rewarded most for attacks against the outgroup.
- What is internalized? → Users internalize that the outgroup is deeply bad, and are therefore more likely to intrinsically choose to share and applaud negative content about the outgroup—even independent of any reward for that behavior.
- What is possible? → Users can react to a post with an anger emoji, which increases how many people will see that post.
- What is easy? → Users can reshare content nearly instantly.
- Who decides? → Mark Zuckerberg ultimately determines what the recommendation engine rewards, and therefore what is seen in the feed, likely influenced by external partisan political pressures.
It is also illuminating to consider the flipside of these compass questions, such as “What is not internalized?” For example, one thing that is most definitely not internalized is the capacity to competently navigate the information stream presented to a user by a platform like Facebook. This is something that a Facebook news feed could explicitly aim to support, and Facebook has taken tiny steps to that effect (which have had empirically measurable positive impacts), but it has not invested significantly in this.
Reimagining Social Technology
The north star and compass questions illustrate the need for a more explicit understanding of where our information ecosystem is heading. Building on them, a series of proposals will aim to provide sketches of a map—showing paths that might take us past current dead-ends and toward potential answers to those questions.
The first proposal in the series (at platformdemocracy.com) focuses on the question of “Who decides?”, and introduces the concept of platform democracy and platform assemblies, building on years of stakeholder conversations and incredible work by deliberative democracy innovators, researchers, and national governments around the world. It shows that there is a way to get past the ‘dead-end’ belief that only corporations and governments can determine platform policy and hold platforms accountable.
Future proposals will extend this to other forms of collective decision-making and explore answers to the other compass questions: both the status quo and alternatives that could move us much closer to the north star. Beyond the core proposals, additional outputs will provide context on the compass questions and the empirical and pragmatic foundations for the proposals, and will explore their applicability to current urgent issues.
Ultimately, the goal is to help map out a positive pragmatic vision of what social technologies could be—and perhaps need to be—if we are to have a world worth living in.
Keep informed about this work via the newsletter and Twitter. These frameworks and proposals are continuously evolving; please do reach out with thoughts, feedback, and potential collaboration via the email listed at aviv.me.
Ovadya, Aviv. “Holding Platforms Accountable Is Not Enough. We Need A ‘Compass’ For Social Technologies.” November 9, 2021.