The Ecosystem of Mistrust
Disinformation threatens elections in several ways: misinformation can prevent people from voting, falsely portray candidates to sway voters, or undermine the election process and outcomes. The science of addressing disinformation has focused on why people share it, how to dissuade people from sharing it, and how to correct misinformed beliefs. Most of what we know about fighting mis- and disinformation is about preventing everyday people from believing it. The problem for election security, however, is that many people already believe disinformation. In a recent SSRS poll, 37% of Americans still believe the 2020 election was stolen, and roughly half believe an election will be stolen in the next few years.
Disinformation sharing is highly concentrated. For example, in our research on the 2016 election, 80% of disinformation on Twitter came from just 0.1% of people. Those who will spread disinformation about the midterms are likely those already spreading it about the 2020 election. Fortunately, they represent only a tiny percentage of Americans. But in aggregate, they reach many people. In that same study, the average person saw disinformation in 1 out of every 100 tweets from those they followed. While small, this finding means everyone likely saw at least some disinformation.
So, when addressing disinformation, there are two types of people to consider. The true believers are the 0.1% of people superspreading disinformation. The vulnerable bystanders are the 30-60% of people watching the conversation who might be swayed to believe the lie. In another post, I’ll go through research that can be used to help address the 30-60%. Today, I’ll focus on the 0.1%.
Addressing True Believers
As mentioned, we have little research on how to address people who already believe disinformation. But there are two analogies in the literature that I believe can inform our approach. The first is about extremism and belief change: how we change deeply held beliefs about the world. The second is about socialization. Finally, deplatforming works, and sometimes it is the only way to address extremists.
Deradicalization
Changing people’s beliefs about the world is different from changing their beliefs about information. Deeply held beliefs are part of someone’s identity, and challenging them is often interpreted as a personal threat. Misinformation corrections are unlikely to work on true believers. The literature on belief change has given us broad outlines of how to change deeply held beliefs. However, the process requires building a relationship, finding common ground, and slowly working away at the many misconceptions. It’s not something you can do on social media.
Lesson 1: Deradicalizing takes a long time. Keep the focus local.
Instead, this deradicalization process has to happen where you have the time, energy, and relationships: the local level. Outreach to true believers has to focus on local political parties and campaigns, social media groups, and influencers.
Lesson 2: Deradicalization can mean reducing belief in disinformation, but it can also mean reducing the threat to the integrity of elections.
Many Americans believe elections are already rigged, whether because of the need for private fundraising or the entrenchment of the two-party system. Whether these beliefs are right or wrong, these people did not storm the Capitol on January 6th and are not arguing that state legislatures should be able to throw out election results. Combating disinformation, even among true believers, can mean recommitting to democratic norms, the rule of law, and honesty about elections.
De-socialization
Inside Organized Racism, Kathleen Blee’s study of how people become white supremacists, offers an eye-opening insight. People don’t start with white supremacist beliefs and then join the local supremacist organization. Instead, most of them made friends with white supremacists who slowly introduced them to the extremist lifestyle and, with it, extremist ideas. What keeps them there is as much the camaraderie of friends as the desire to oppress non-whites. Similarly, those who got out had to find a new group of friends. The takeaway is that getting into and out of extremism is partly a socialization process.
Lesson 3: Encourage people to look critically at who and where their facts are coming from and encourage them to listen to credible sources.
In the context of disinformation strategy, de-socializing people means challenging the organizations and people they get their information from, whether those are news sources, Facebook groups, or political campaigns. In 2020, much of the disinformation came from people who knew nothing about election processes, let alone the particular election procedures of particular districts. People didn’t believe these claims in a vacuum. They believed them because the people they listen to told them they were true.
De-platforming
In the United States, the First Amendment protects individuals’ freedom of speech from government restriction. It does not protect individuals from Facebook’s terms of service, and it does not protect people who make threats or incite violence. With extremists, it is important to know where to draw the line. Militias can run training drills for overthrowing the government, but when they formulate a plan to kidnap the governor, the government steps in.
Similarly, elections communicators can engage with those spreading disinformation to deradicalize or desocialize them. But these conversations can lead to harassment campaigns, doxing, and other unprotected behavior. Rather than trying to deal with such behavior on a case-by-case basis, it is important to decide ahead of time what level of harassment you are willing to tolerate and to know what recourse you have for protecting yourself.
Lesson 4: Familiarize yourself with deplatforming processes and have standards in place for requesting an account be deplatformed.
Deplatforming superspreaders reduces the spread of disinformation. When it comes to election misinformation or harassment, platforms have made a public effort to enforce their rules. It is important to know how to report violations of platform rules and how to hold platforms accountable to their own terms. In the same way you call the police when someone breaks the law, you can call on platforms when someone violates their rules.
Using the dashboard for true believers
What distinguishes true believers from bystanders is often just how much disinformation they engage with and whether they know the lingo of the different conspiracies, claims, or affiliated groups. If someone uses the phrase “donkeypox,” for example, it’s a good indicator of a true believer because of how narrowly the term is used.
Finding true believers. You can use the midterm elections dashboard to find the disinformation circulating about the election as well as keywords being used in tandem with that disinformation. Then, when you see those words or hear someone discussing the stories, you can get a sense of how connected they are to disinformation.
Formulating deradicalizing responses. If you hear someone discussing or sharing a news item or making a strange claim you believe to be disinformation, you can look it up in our dashboard to see whether and how that topic is being discussed among disinformation sources. By understanding how this information is constructed by disinformers, you can see the kinds of evidence being brought to bear to justify the claims and begin to develop counter-messaging to address those beliefs.
If you want access to the dashboard: