Conspiracy theories have been around since basically forever, but until now most of them were fairly harmless. Their reach was usually limited to a few people here and there, and they were often read for entertainment. Some theories even had a positive impact on society, by forcing authorities to be transparent about major events, like assassinations and terrorist attacks.
The QAnon conspiracy theory is different: not only has it received widespread media attention, it has also had an actual negative impact on society. What sets QAnon apart from other conspiracy theories, and what can be done about it?
This week’s paper takes a look at the history of QAnon, its relationship to other conspiracy theories, and possible strategies to counter conspiracy thinking.
Internet culture has always been a bit weird. In the early 2000s few places were as weird as 4chan, an image board where users could anonymously publish images, memes, and shitposts. The board gave rise to the hacktivist collective Anonymous, which played a role in the Occupy Wall Street protests and Arab Spring uprisings in 2011. But 4chan also became known for its /pol/ (politically incorrect) board, which attracted white nationalists, conspiracy theorists, angry young men, and edgy teens.
The 4chan community often joked about child pornography (“CP”). A well-known example is the Pizzagate conspiracy theory, which involved presidential candidate Clinton and her 2016 campaign manager Podesta (“CP”), a pizza parlor named Comet Ping Pong (“CPP”), and cheese pizza (“CP”). To anyone familiar with chan culture it’s pretty clear that “Pizzagate” is a joke, but somehow the theory managed to make its way to mainstream social media platforms like Twitter, YouTube and Facebook, where some parts of the community took the joke theory seriously.
The following year, an anonymous chan user who claimed to be an intelligence or military insider started posting “proof” of various pre-existing conspiracies. They quickly became known as Q, because they claimed to have top (“Q-level”) security clearance for classified information.
Like Pizzagate, QAnon also quickly spread to other platforms, where users without strong critical information literacy skills (including the US president) proselytised their interpretations of Q’s messages. It’s estimated that as many as one-third of US Republicans currently believe QAnon to be at least partly true.
Democracies need an informed citizenry that is capable of critical thinking, but the pervasiveness of misinformation and disinformation poses challenges for many people. Clearly conspiracy theory thinking should be countered. But how?
One method involves threshold concepts. For example, we typically think of authority as inherent to a thing or an individual. Another way to think about authority is through the dual lenses of cognitive authority and second-hand knowledge: some knowledge is gained through direct observation and experience, but most of what we know is learnt from other people that we trust, based on our perception of their competence and expertise, both of which evolve over time. Most people ascribe cognitive authority to well-known sources like scientists and mainstream news outlets, while QAnon followers ascribe cognitive authority to Q based on those same principles.
Other important components of trust evaluation have to do with the purpose of the information, who is responsible for creating it, and how it is created; does the source intend to be accurate, transparent and trustworthy? For example, one might ask a QAnon adherent how accurate Q’s predictions are and what quality control mechanisms exist for Q’s messages. Naturally, this should lead to the conclusion that Q’s messages are bullshit.
When confronted with misinformation and disinformation, librarians and journalists are often quick to recommend fact-checking techniques and tools. However, these don’t work very well for conspiracy theorists, who believe that fact checkers deliberately keep “inconvenient truths” out of the conversation, and who often reject all authoritative and mainstream sources outright. This is problematic: while healthy skepticism encourages us to evaluate information critically, global skepticism leads to a suspicious mindset.
The solution is cognitive flexibility, which embraces open-mindedness and acknowledges where we might be wrong. Two things that can help us get there are metacognition and critical thinking, which help people to evaluate and correct themselves. As an example of what happens without these two things: a 2010 study found that students with low metacognitive skills would jump into faulty decision making, despite being very well aware that they didn’t understand the sources they had been given!
Research from the social sciences suggests that conspiracy ideation stems from multiple factors, many of which have to do with lack of trust, and feelings of uncertainty and anxiety:
People who believe in conspiracy theories might have a greater need to find an explanation for random occurrences or feel a need to be seen as unique.
Conspiracy theorists are more likely to overattribute events to hidden forces, purposes, and motives.
People with a suspicious mindset who accept the plausibility of one nefarious cover-up are also more likely to buy into a whole scheme of conspiracies, even if some of those conspiracies contradict each other.
People who distrust others around them are also more likely to distrust institutions and society more broadly.
Our tribal instinct is to classify the world into “Us” versus “Them”. With a little bit of paranoia, it’s easy to overanalyse information until we see hidden motives and signs of deceit everywhere.
Perceived exploitation, sustained vulnerability, feelings of powerlessness and alienation make some people turn to conspiracy thinking, especially when they are welcomed by “receptive” communities like QAnon.
Those with lower levels of education and analytical thinking are more likely to see causal intentionality everywhere and suffer from “myside bias”, in which they evaluate, generate, and test hypotheses in a way that favours their own opinions. Having said that, educated people are not immune to conspiracy thinking.
I already mentioned above that fact-checking conspiracy theorists is rarely effective. You also shouldn’t engage in counter-arguments when conversing with conspiracy theorists. Labelling a person’s belief a “conspiracy theory” may even backfire, because conspiracy theories have a kind of romanticism to them.
Instead, it’s better to act as a bridge between their world and (y)ours. Don’t offer contrary information, but let the conspiracy theorist explain their logic while you listen empathetically, use careful language (don’t make it about them, say “and” instead of “but”), and be ready for de-escalation strategies in case the conversation turns heated.
People who believe in conspiracy theories are often simply looking for certainty. Showing compassion and interest helps to build trust. This might not help much at first, but over time you might be able to help them get rid of their biases.
QAnon originated on an image board as a joke theory, but quickly spread to other platforms by people who took it seriously
People with low information literacy, critical thinking, and metacognitive skills are more prone to conspiracy ideation
Conspiracy theorists often suffer from lack of trust, and feelings of uncertainty and anxiety
Don’t try to (immediately) fact-check conspiracy theorists; build trust first using nonviolent forms of communication