by Build Up (Twitter / Facebook)
Between June and December 2017, Build Up, in partnership with MIT MISTI and with funding from HumanityX and The City of The Hague, ran a pilot program that explored interventions to address polarization on Facebook and Twitter in the USA. We learned a lot: from how to work with APIs to how to craft reflective conversations on social media.
We think it worked: we have evidence that the interventions made people reflect on the way they were engaging on social media. Even more: this pilot gives us confidence that we can create positive, depolarizing change that can outlive our intervention.
Note: If you want a deeper dive into this project, find our full report here.
Polarization is happening to us
The core premise of The Commons is that a majority of people in the USA are not actively driving polarization. Rather, polarization is happening to us. Over the last decade, researchers studying the impacts of emerging ICTs (Information and Communication Technologies) have posited that political groupings have become siloed, increasing the polarization of public discourse. Moving people from passively accepting a context that escalates conflict to constructively engaging in mediating dialogue in their society is an enormous challenge.
In the USA, a plethora of initiatives that leverage ICTs to encourage constructive dialogue and engagement online has emerged over the past few years. However, many of these initiatives reach very few people, and mostly people who are already predisposed to depolarized behaviors.
That’s the gap we are trying to fill: finding people who don’t realize polarization is happening to them. The Commons identifies people engaged in political discussions about the USA on Twitter and Facebook, analyzes the likelihood that they are polarized or polarizing based on their behavior, uses automation (bots) to engage with them, and organizes a network of trained conversation facilitators to follow up on automated contact.
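As a concrete illustration of the first step in that pipeline, here is a minimal sketch of how candidates might be identified from one-sided hashtag use via the Twitter v1.1 standard search API. The hashtag lists, the scoring rule, and the credential handling are illustrative assumptions, not the pilot's actual code.

```python
# Minimal sketch: find users tweeting with strongly partisan hashtags and give
# each a rough "lean" score. Hashtag lists and the scoring rule are illustrative
# assumptions, not The Commons' actual pipeline.
import requests

SEARCH_URL = "https://api.twitter.com/1.1/search/tweets.json"  # Twitter v1.1 standard search
BEARER_TOKEN = "..."  # hypothetical credential

LIBERAL_TAGS = {"resist", "singlepayer"}        # assumed examples
CONSERVATIVE_TAGS = {"maga", "buildthewall"}    # assumed examples


def search_tweets(query, count=100):
    """Fetch recent tweets matching `query`."""
    resp = requests.get(
        SEARCH_URL,
        headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
        params={"q": query, "count": count, "result_type": "recent"},
    )
    resp.raise_for_status()
    return resp.json()["statuses"]


def lean_score(tweet):
    """Crude lean score: +1 per conservative hashtag, -1 per liberal hashtag."""
    tags = {h["text"].lower() for h in tweet["entities"]["hashtags"]}
    return len(tags & CONSERVATIVE_TAGS) - len(tags & LIBERAL_TAGS)


def find_candidates(topic_query):
    """Return users whose recent tweets on a topic use one-sided hashtags."""
    candidates = {}
    for tweet in search_tweets(topic_query):
        score = lean_score(tweet)
        if score != 0:  # keep only clearly one-sided tweets
            candidates[tweet["user"]["screen_name"]] = score
    return candidates


if __name__ == "__main__":
    print(find_candidates("#maga OR #resist immigration"))
```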
The two strategies that worked
To maximize our learning, we ran three interventions on Twitter and Facebook over six months, iterating on our approach each time. In the end, we found two winning automation strategies that had a high conversion rate into conversations with facilitators.
On Twitter, the most effective strategy was to tweet messages that used the most liberal and the most conservative hashtags about political topics, pointing out that the conversation on that topic was polarized and suggesting that people were not being heard. When people responded positively to an automated tweet, they were automatically assigned to a trained facilitator who started a conversation by addressing them on Twitter. Our Twitter bot contacted 880 people with this strategy; 11.7% responded positively to the bot, and 50% of those engaged in conversation on Twitter with a trained facilitator.
The facilitator gave the person space to identify what they were already doing (if anything) and, if they were not doing anything, offered options for what they could be doing. In addition, facilitators sought to recruit commitments to action. One of those commitments involved joining us in a phone conversation, extending the online engagement to a real-time mode of communication and building a deeper community.
On Facebook, the most effective strategy consisted of posting specific prompts on The Commons Project Facebook page, promoted with micro-targeted ads. When people responded positively to a Facebook post, they were automatically assigned to a trained facilitator who started a conversation by addressing them on the comment thread on Facebook. The ads were targeted towards the "most polarized cities", based on political campaign donations identified through this Crowdpac article. We ran two types of ads: (i) city-targeted ads that asked people whether they recognized a political divide in their city, and (ii) topical ads, using the same cities as geographic targets, that focused on key topics more likely to be divisive, such as immigration, gun control, and healthcare. Our Facebook ads reached 24,971 people; 874 responded positively to an ad, and 332 of those engaged in conversation on Facebook with a trained facilitator.
The ads proved to be a successful means of engaging people, significantly more successful than our attempts to reach out directly on comment threads. The most popular, in terms of comments, reactions and shares, were the ads that spoke directly to polarization in cities. The conversations we engaged in with people who commented on the ads varied widely in depth and quality. Throughout the conversations, we worked hard to invite people to take action, either by recommending our list of resources or by inviting them to a phone call facilitated by our team.
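To put the two funnels side by side, here is a quick back-of-the-envelope calculation using the figures reported above; the intermediate counts are our own arithmetic, not additional reported data.

```python
# Back-of-the-envelope comparison of the two funnels using the figures
# reported above; the intermediate counts are our own arithmetic.

# Twitter: 880 contacted, 11.7% responded positively, 50% of responders conversed.
tw_contacted = 880
tw_responders = tw_contacted * 0.117          # ~103 people
tw_conversations = tw_responders * 0.50       # ~51 conversations

# Facebook: 24,971 reached, 874 responded positively, 332 conversed.
fb_reached, fb_responders, fb_conversations = 24_971, 874, 332

print(f"Twitter:  {tw_responders:.0f} responders, ~{tw_conversations:.0f} conversations "
      f"({tw_conversations / tw_contacted:.1%} of all users contacted)")
print(f"Facebook: {fb_responders / fb_reached:.1%} response rate to ads; "
      f"{fb_conversations / fb_responders:.1%} of responders went on to converse")
```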
Impact that outlived our interventions
The results described above measure the success of our different engagements in the moment; they make no claims about overall impact on behaviors beyond the intervention. We have also gone a step further and explored whether we can see any change among candidates in overall behaviors related to polarization on social media, beyond the behaviors explicitly encouraged through the intervention. The ultimate objective of The Commons is to affect the overall polarization of conversations on social media by eliciting a change in the behavior of social media users that encourages more connection across the aisle, exposure to a diversity of views, and identification of shared values. Unlike other initiatives that attempt to tackle polarization, we do not explicitly encourage reaching across the aisle through our interventions. Rather, we encourage a reflection on polarization that (we believe) will have an effect on a user's general behaviors on social media, and may eventually encourage some to become active "connectors".
In designing this pilot, we put in place methods to attempt to measure this overall change. This was an ambitious goal, and for both technical and methodological reasons, we found very little quantifiable evidence of overall behavior change. On Facebook, a February 2018 change to the policy governing the API data available from public pages made it impossible for us to measure change as we had hoped. On Twitter, we identified users to be contacted by the bot using a set of behaviors on social media, and then split them randomly into control and treatment groups for our interventions. Comparing the follower / following behavior of treatment and control users does appear to suggest some impact: the treatment candidates look to be more interconnected and drawn towards the center than the control candidates. We are fully aware that this is a tentative result; we do not claim it is a significant change. In future implementations, regression analysis based on this data could provide more robust measures.
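For illustration, here is a minimal sketch of the kind of treatment/control comparison described here, using networkx. The random split, the density metric used as a proxy for interconnectedness, and the placeholder data are our assumptions rather than the pilot's actual analysis.

```python
# Illustrative sketch of a treatment/control comparison of follower/following
# behavior. The random split and the "interconnectedness" metric (density of
# the follow graph within each group) are assumptions, not the pilot's code.
import random
import networkx as nx


def split_candidates(candidates, seed=42):
    """Randomly assign identified users to treatment and control groups."""
    rng = random.Random(seed)
    shuffled = list(candidates)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]


def group_density(follow_edges, group):
    """Density of follower/following links among the members of one group."""
    members = set(group)
    g = nx.DiGraph()
    g.add_nodes_from(members)
    g.add_edges_from((a, b) for a, b in follow_edges if a in members and b in members)
    return nx.density(g)


if __name__ == "__main__":
    # Placeholder handles and follow edges standing in for real Twitter data.
    users = [f"user{i}" for i in range(20)]
    edges = [tuple(random.sample(users, 2)) for _ in range(60)]
    treatment, control = split_candidates(users)
    print("treatment follow-graph density:", group_density(edges, treatment))
    print("control follow-graph density:  ", group_density(edges, control))
```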
We also carried out a small number of interviews (with both Facebook and Twitter users) to gather a qualitative measure of change based on reported experiences, which allowed us to glean some further information. A common theme from these interviews was that a real human conversation online, grounded in a desire to listen, can have a powerful effect on both facilitators and candidates. A number of people remarked how rare it was to have a real conversation about politics online.
Continuing to build The Commons
Based on the experience, outcomes and lessons learned from this pilot, we believe The Commons offers an ethically sound approach to tackling polarization online, with the potential for large-scale impact.
Concretely, we believe The Commons could be scaled up to target more people in the USA. We also have a sense that The Commons approach and methods could be replicated in other contexts where Facebook and Twitter are shaping public conversations and contributing to growing polarization.
When we started The Commons, we also set out to start a conversation about the role that bots (or automated data interventions) can play in peacebuilding. You can read more details about the pilot in our full evaluation report. We look forward to hearing from you. For more, follow Build Up on Facebook and Twitter.