Opening remarks at the Build Peace 2020 conference by Helena Puig Larrauri, Director and Co-Founder of Build Up.
Build Peace is a conversation that evolves over the years and that reflects what is around us. In the first four years, we really focused on the opportunities afforded to peacebuilders by digital technologies — on technology as a tool for peace, for shifting power towards local voices. There was a sense of excitement at all we could do.
Over the past two years, the conversation has shifted. At some point we all began to recognize that technology was not a tool external to a conflict context, but an integral part of that context. We reflected on the duality of how technology is affecting conflict dynamics: there’s more polarization and fragmentation; there’s also more plurality and inclusion.
And then came the COVID-19 pandemic. As peacebuilders, we need to pay attention to what the pandemic is doing to the fabric that keeps our societies together. What divisions are being exacerbated, and how does that shift the balance of power? And of course many of the effects of the pandemic on conflict have nothing to do with digital technology, but rather with the social, political and economic inequities that the pandemic is deepening.
But some are specific to digital technology. In a nutshell: social distancing means more of our lives are taking place online, and it’s made it imperative that we dig a little deeper into understanding the socio-technical ecology, this evolving system we exist in, because it is changing how conflicts unfold and how we engage with conflict. This year I keep asking myself:
How much is it worth risking polarization and fragmentation in order to build a truly plural and inclusive society? How do we influence this balance in our socio-technical ecology?
Which is a lot. The sketch diagram below is my attempt at digging deeper. Let me talk you through it.
Most of our attention to date has been focused on the tip of this pyramid — the deliberate use of digital technologies to promote division, to attack social cohesion. The next layer under it is still about deliberate, if more covert, uses of technology. I think we’re all becoming increasingly aware of these dangers. Platforms are addressing them with rules and terms of service. Peacebuilders are addressing them through content moderation and digital literacy programs.
When you think about it, this is pretty limited. The things that are happening in these two layers are signals of deeper divisions that technology is worsening by amplifying human weaknesses. It’s important to address hate speech, disinformation, recruitment and so on, but we’re peacebuilders, so I think we need to go deeper and address the issues that are central to social cohesion.
That’s what is at the bottom of this pyramid. As peacebuilders, we don’t think social cohesion is everyone agreeing all the time. That is the peace of the graveyard — it’s definitely not the peace I want. Social cohesion is not the closure of debate, but a space in which pluralism can flourish peacefully. To hold this space, we work against our neurological hardwiring to define groups by exclusion and to communicate that exclusion aggressively. We try to foster nonviolent communication that enables the personal and collective transformation needed for this space to hold. When we work on peacebuilding, when we try to solve for social cohesion, we’re essentially addressing issues that are at the center of the human condition.
That’s why the third layer in this pyramid worries me particularly. Beneath the overt use of digital technologies to foster division is this broader, systemic effect that digital technologies are having on human communication and on human neurology.
Think about disinformation and misinformation. Just so we’re clear on the distinction: disinformation is creating and spreading incorrect information to intentionally deceive or manipulate others; misinformation is spreading incorrect information without the intent to deceive. Most disinformation is aimed at alienating a person or a group — in other words, disinformation uses a lie to close debate, to simplify the narrative rather than allow for plurality. But then that lie spreads, it becomes misinformation, because our brains kind of love heuristic simplicity.
And it spreads more because most digital technologies amplify this very human weakness. Most digital platforms have a surveillance capitalist business model: their monetization strategy is to gather large amounts of data so they can build models or profiles of every individual user that help predict their actions. To get that data, they need users to engage again and again; and to drive engagement, they favor content that is simple and that we already agree with. It’s not that algorithms couldn’t push a different kind of content. Cathy O’Neil says that “algorithms are opinions embedded in code”. The opinion of most platform technologies is that they need more engagement, more of our attention on stuff, any stuff, and fostering complexity doesn’t serve this attention extraction model.
Ok, so that’s already pretty bad, but there’s another thing that I think is worse at this level. Because even if an algorithm is presenting me with stuff I agree with, I could go elsewhere, right? I could behave differently — I could post and share nuanced content. So why don’t people simply refuse to spread disinformation? Why do even your reasonable friends post crazy stuff? The other way that digital technologies drive engagement is by pinging us all the time, and creating an addiction to chasing those pings. We post for the likes, we share for the laughs. The design of notifications is largely influenced by persuasive design techniques pioneered at Stanford University’s Persuasive Technology Lab. Its theory of change is to hardwire habits — a manipulation tactic that changes the incentives our brains respond to.
Algorithms and nudges are resulting in a gradual, slight, imperceptible change in your own behavior and perception — it is almost indistinguishable from magic. Ultimately, the subtle behavior change that surveillance capitalism trades in means we have less control over who we are and what we really believe — as individuals (because habit hardwiring is deeply influencing us) and as a collective (because we engage with different content and so have less of a shared understanding of reality).
And you might think that this is only affecting the digital space. Honestly, I don’t think there’s so much of a distinction anymore — between who we are online and who we are offline — but there are also many offline spaces where the effect of surveillance capitalism, of data being used to influence our perception of reality, is obvious. One of those is urban space, and South Africans know this better than most. In recent years, smart CCTV systems have been rolled out across most major cities. These systems are powered by facial recognition; they scrutinize people’s demographics and movements for a pre-coded set of unusual behaviors that only thinly disguise a racial bias, and they exacerbate post-apartheid injustice in urban space.
Let’s take a breath with adrienne maree brown. Because maybe, as she says:
“Things are not getting worse, they are getting uncovered. We must hold each other tight and continue to pull back the veil.”
I actually think we’re all going through this unveiling in 2020 — that somehow the pandemic has made this socio-technical soup we’re swimming in so much more evident.
Some people are saying the way to address this is through individual behavior change. I just want to be clear that I don’t think we should all go analog. Evan Greer made a comment on Twitter that I love in response to the recent Netflix documentary on social media: “One of my problems with ‘The Social Dilemma’ is that it makes the same mistake a lot of tech observers are making: it treats social media as if it’s cigarettes — something that’s addictive and bad with no value at all. The Internet is more like sex, drugs and rock & roll.”
Abstinence is not the answer — my life is better due to sex, drugs, rock & roll, and the internet. Behavior change is part of the answer, but it’s not to say don’t do it, rather do it differently. At Build Up, we run programs that intervene on social media platforms to foster more plural, complex conversations, and through doing so to influence our own consciousness around human communication. I think there’s a lot more that can be done in this space — to move away from behaviors in digital spaces that result in identity polarization and social division.
It’s going to take more than individuals changing their behaviors to alter this socio-technical ecology we are in. It’s not about regulating content and publication, because there is great liberation in the ability of anyone to speak up and be heard. The internet has dismantled the concentration of information and communication power in the hands of a few, and we should protect that. I guess I’m saying we need to risk some fragmentation and polarization for the sake of a plural and inclusive society. As Evan Greer puts it: “If we fall into the trap of framing the Internet as cigarettes, rather than recognizing its complexity as more like sex, drugs, and rock and roll, then we’re playing into the hands of those who would love to see our voices censored and a return to traditional power structures.”
The key issue is not content — not the top two layers of the pyramid I shared above — but how content spreads and captures our attention. What would it take for platforms to be built to support the best in us rather than exploit the weaknesses of the human condition? I think this will only happen when we dismantle some of the surveillance capitalist business models that many digital technologies rest upon. We could demand an ethical code for algorithm and nudge design; we could tax companies on the amount of data they hold, or move toward community-owned data.
It’s complicated, this unveiling, but I think as peacebuilders we have so much to do in this space — and the content of this conference is a testament to that. Sometimes, when it gets a bit much, I turn to this poem by Maggie Smith for inspiration.
Want to hear more about social justice and the pandemic in the digital age? Until November 8, 2020, you can still join us at the virtual Build Peace 2020 conference.