Societal divides as a taxable negative externality of digital platforms
An exploration of the rationale for regulating algorithmically mediated platforms differently, by Helena Puig Larrauri
Talk to any peacebuilder trying to disseminate an alternative, complex narrative about a conflict, and they’ll tell you a version of the same thing: the odds are stacked against peace on social media platforms. It’s always been an uphill battle, but these days it feels like there’s no countering the digitally amplified hate and division.
You don’t even have to be a peacebuilder: this message probably resonates intuitively. We all know online toxicity is a problem. So what would it take to unstack these odds? Over the past months, I’ve been doing some thinking about this question as part of Ashoka’s Next Now initiative. The result is this paper, which in essence argues that online polarization should be seen as a negative externality of surveillance capitalism, just like carbon is a negative externality of industrial capitalism — and as such, it should be taxed.
Negative externality: an unintended cost or consequence that affects people who didn’t choose to incur it
Understanding online polarization as a negative externality
Digital platforms were not set up in order to distribute divisive content or create polarizing interactions. Rather, digital platforms produce these societal divides as a by-product of a profit maximizing strategy: that is, polarization is a negative externality of their business model.
The concept of negative externalities is commonly used to explain the impact of industrial capitalism on the environment and on public health. Simply put, an externality is an uncompensated effect of production or consumption that affects society outside of the market mechanism. Where there is a negative externality, the private (or company) costs of production are lower than the social costs of production. In other words, there is no disincentive to produce this negative externality because it is not priced in the business model.
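To make the gap between private and social cost concrete, here is a minimal numeric sketch of how a tax on an externality (a Pigouvian tax) works. All figures and names (`PRIVATE_COST_PER_UNIT`, `EXTERNAL_COST_PER_UNIT`) are invented purely for illustration, not drawn from any real data:

```python
# Toy illustration of a negative externality and a Pigouvian tax.
# All numbers are hypothetical, chosen only to show the mechanism.

PRIVATE_COST_PER_UNIT = 10.0   # cost the company pays per unit produced
EXTERNAL_COST_PER_UNIT = 4.0   # cost borne by society, unpriced by default

def private_cost(units: int, tax_per_unit: float = 0.0) -> float:
    """Cost as seen by the producer, with an optional per-unit tax."""
    return units * (PRIVATE_COST_PER_UNIT + tax_per_unit)

def social_cost(units: int) -> float:
    """True cost to society: private cost plus the externality."""
    return units * (PRIVATE_COST_PER_UNIT + EXTERNAL_COST_PER_UNIT)

# Without a tax, the producer's costs understate the cost to society:
assert private_cost(100) < social_cost(100)

# A tax set at the external cost per unit closes the gap, so the
# producer's incentives now reflect the full social cost:
assert private_cost(100, tax_per_unit=EXTERNAL_COST_PER_UNIT) == social_cost(100)
```

The point of the sketch is only the last line: once the tax equals the external cost, avoiding the externality becomes cheaper than paying for it.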
Applying this framework to digital platforms explains why the production of polarization goes unchecked. Producing hateful or polarizing content is not necessarily profitable to platforms: polarizing content would have to be a very large fraction of all content to change engagement metrics in a business-relevant way, and its presence drives away advertisers, damages reputation and invites regulation. The challenge is rather that polarization is a difficult-to-control side effect of optimizing for the type of engagement that platforms do want to maximize. Pollution is not profitable to industrial capitalism – it’s the manufacturing that is profitable, but reducing pollution is expensive. Similarly, polarization is not profitable to surveillance capitalism – it’s the engagement that is profitable, but reducing polarization is expensive.
Why taxation and not another policy?
Current policy approaches to address the negative societal impact of digital platforms, whether through content moderation or changes to algorithm design, require the collaboration of companies. These approaches face a fundamental challenge: it is in the financial interest of digital platform companies to spread engaging content; divisive content is engaging; and so platforms have no incentive to improve their accuracy in detecting polarizing content or to reduce its distribution. Digital platform companies will never invest as much in content moderation as would be required, and are unlikely to make changes to algorithms that limit engagement.
We need a different policy approach, one that addresses the financial incentives that lead platform companies to build algorithms whose feedback loops and unintended consequences result in polarization. If we understand online polarization as a negative externality, then there is a policy argument in favor of creating a financial disincentive to its production.
Data “pollutes” the social good when the collection and use of human behavioral data affects society as a whole, beyond those whose data is collected, and separate from the harm to their individual privacy or agency. If we can understand that polarization is a negative externality of surveillance capitalism, much like carbon emissions are negative externalities of industrial capitalism, then a polarization tax is akin to carbon taxes.
Taxing the polarization footprint
Understanding that online polarization is both expensive for platform companies to avoid and a negative externality borne by the social fabric sets the scene for taxation as a policy response designed to elicit behavior change.
The most direct way to introduce a tax that addresses the negative societal impact of algorithmically mediated platforms would be to agree on a measure of online polarization, and tax companies according to how much of this negative externality they produce. This might superficially seem similar to external content moderation policies that impose fines on platforms that fail to adequately moderate certain categories of content. The main difference from those content moderation policies is that a taxation policy regulates viral, polarizing content and interactions in the aggregate. In other words, the policy and its enforcement mechanisms need to be set up not to identify individual violations for categories of content that must be removed, but to measure a pattern of content consumption, distribution and interaction across the platform.
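As a toy illustration of what measuring "in the aggregate" could mean, the sketch below computes a hypothetical "polarization footprint": the impression-weighted share of a platform's attention that flows to divisive content. The `ContentItem` structure, the `divisiveness` scores and the aggregation rule are all assumptions invented for illustration; no such standard metric yet exists:

```python
# A deliberately simple, hypothetical "polarization footprint" metric:
# rather than flagging individual posts, it summarizes how much of a
# platform's attention flows to divisive content in the aggregate.

from dataclasses import dataclass

@dataclass
class ContentItem:
    impressions: int      # how many times the item was shown
    divisiveness: float   # score in [0, 1] from some upstream classifier

def polarization_footprint(items: list[ContentItem]) -> float:
    """Impression-weighted share of attention going to divisive content."""
    total = sum(item.impressions for item in items)
    if total == 0:
        return 0.0
    weighted = sum(item.impressions * item.divisiveness for item in items)
    return weighted / total

# A platform that amplifies its divisive items scores higher than one
# where the very same items exist but receive few impressions:
amplified = [ContentItem(9_000, 0.9), ContentItem(1_000, 0.1)]
buried = [ContentItem(1_000, 0.9), ContentItem(9_000, 0.1)]
assert polarization_footprint(amplified) > polarization_footprint(buried)
```

Note what the comparison at the end shows: the two platforms host identical content, so an item-by-item moderation policy would treat them the same, but the footprint differs because distribution differs.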
Although taxing a measure of online polarization would be the most direct way to apply the logic of taxing a negative externality to address the societal harm of surveillance capitalism, it may not be viable given the definitional and measurement challenges. Implementing this policy would require piloting approaches to measuring the “polarization footprint” of digital platforms, and testing the viability of a taxation regime based on that measure.
Taxing proxies for online polarization
If measuring online polarization is too complex or controversial, what might be taxable proxies for online polarization?
One option might be to tax the databases used to power recommender algorithms. In effect, this would be similar to taxing the “data footprint” of digital platforms rather than their “polarization footprint” by imposing a tax on very large databases (or, equivalently, introducing a tax schedule designed to disincentivize very large data hoarding). The tax rate could potentially be non-linear, imposing a higher rate the more data is collected (so that Facebook pays more for each additional unit of data than a small social app).
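A non-linear rate of this kind could work like a progressive income tax, with marginal rates rising by bracket. The brackets and rates below are invented for illustration only; the point is that each additional unit of data is taxed at a higher marginal rate, so holding ever-larger databases gets ever costlier per unit:

```python
# Sketch of a hypothetical progressive (non-linear) tax on data volume.
# Thresholds are in abstract "data units"; rates are invented numbers.

# (threshold in data units, marginal rate applied above that threshold)
BRACKETS = [(0, 0.00), (1_000, 0.01), (100_000, 0.05), (10_000_000, 0.20)]

def data_tax(units: float) -> float:
    """Tax owed on `units` of stored behavioral data, bracket by bracket."""
    tax = 0.0
    for i, (threshold, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if units > threshold:
            # Tax only the slice of data falling inside this bracket.
            tax += (min(units, upper) - threshold) * rate
        else:
            break
    return tax

# A small app holding 10,000 units pays a far lower *average* rate per
# unit than a platform holding 100 million units:
small, large = data_tax(10_000), data_tax(100_000_000)
assert large / 100_000_000 > small / 10_000
```

Under this invented schedule a small app owes a token amount while a data-hoarding platform's average rate approaches the top marginal rate, which is exactly the incentive gradient the paragraph above describes.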
There is one important counterargument to this policy: data use has both negative and positive externalities. Concretely for digital platforms, not all uses of data to recommend content result in societal divides. In fact, there is evidence that personalization creates a lot of value for users. In other words, personalized recommendations are more likely than non-personalized ones to produce polarizing outcomes, but they also produce many positive ones. Implementing this policy would require research into the connection between personalization and content targeting based on certain kinds of personal data, on the one hand, and polarization outcomes, on the other, in order to understand whether a non-linear tax on large databases would actually reduce polarization.
The paper also looks at options to tax data centres, data processes and data brokerage — and in summary concludes that none of these proxies adequately addresses online polarization (though some of these taxes might be well suited to addressing other kinds of negative societal impact, such as environmental damage or privacy violations).
Where to next?
There are credible policy pathways to make online polarization more expensive in order to reduce the impact of digital platforms on societal divides. This is the start of a longer and much needed debate. We invite challenges to the assumptions that lead us to argue for taxation, and discussion of the viability of the concrete tax regimes proposed.