Why do previously reasonable people go down the rabbit hole of conspiracy theories, and what can be done to bring them back?

A handful of problems can plausibly be put forward as obstacles to solving all other problems: climate change and the corruption caused by money in politics are two that come to mind. Whatever other problem you might be trying to solve, chances are that at least one of those two will get in your way.
But a third problem is joining that group: the explosion of conspiracy theories and the disinformation they spread. Want to control the pandemic? You’ll wind up dealing with people who think Anthony Fauci has been behind the virus all along, or that the vaccines contain microchips that track your movements.
Want to cut greenhouse gas emissions? You must have been duped by the conspiracy that is using the climate change hoax to institute a global socialist dictatorship.
Worried about the state of our democracy? You obviously don’t understand that there’s nothing to save, because all our elections are already rigged. Millions of illegal immigrants are allowed to vote! And dead people. And the servers that count our votes are actually in some other country.
Whatever else you might want to focus on is a waste of time anyway. It’s just a distraction from the blood-drinking child-sex ring that controls the world. That’s the real problem.
David Neiwert’s book. I first heard of David Neiwert when he was writing the Orcinus blog. Already in 2004, he was warning about the right-wing drift towards fascism, but doing so in a responsible way, i.e., actually defining fascism and checking current developments against that definition rather than just throwing around loaded words. (That’s why he called the drift of American conservatism in 2004 “pseudo-fascism”: it had the seeds, but they hadn’t fully sprouted yet.)
His 2020 book Red Pill, Blue Pill: how to counteract the conspiracy theories that are killing us is a quick read that is full of insight. It falls into a few separable parts:
- a history of conspiracy theories from the medieval blood libel to the Yellow Peril to the Red Scare to QAnon. I found this fascinating, but if you don’t, you could skip over it.
- why conspiracy theories are attractive and who they attract
- how someone can get drawn in
- what can be done to pull someone out
The title comes from the red-pill/blue-pill choice Morpheus gives Neo in The Matrix. The red pill represents awakening to the hidden reality that other people fail to see or refuse to see. Conspiracy theorists often talk about the moment they were red-pilled.
How to tell real conspiracies from conspiracy theories. A question I often raise on this blog is whether a term actually means something or is just an insult. Political correctness, cancel culture, critical race theory — do they have any content beyond being pejorative labels?
You might think conspiracy theory is another term with little objective meaning — just “a theory other people believe, but I don’t”. But Neiwert uses the term more precisely than that: A conspiracy theory isn’t just a theory about a conspiracy, it’s a theory that goes against everything we know about actual conspiracies.
People really do conspire sometimes, but actual conspiracies (Watergate, say) are narrow in scope, limited in time, and involve a fairly small number of conspirators. Cross any of those three lines, and odds are excellent that your conspiracy won’t stay secret long enough to achieve its goals.
Conspiracy theories, on the other hand, postulate vast conspiracies that control everything and yet operate in the shadows for decades or even centuries. The Illuminati has been manipulating world politics since the 1700s, and the conspiracy of blood-drinking child abusers is so large that QAnoners expect thousands of arrests and executions will be needed to stamp it out. The goal of the reptilian conspiracy is to control the Earth, forever.
Some conspiracy theories start with a plausible hypothesis. It’s not crazy, for example, to wonder if alien civilizations exist, or if alien explorers might have visited this planet. But such speculations become conspiracy theories when countervailing evidence that would at least prune the branches of an ordinary hypothesis gets explained away by expanding the conspiracy to completely implausible proportions.
Conspiracy theory epistemology. Conspiracy theorists are hard to argue with because they literally think differently. A conspiracy theory catches on not because it is well supported by evidence, but because it connects a lot of dots. The wider and wilder a theory is, the more interest it generates.
A person trained in mainstream critical thinking will want to pick out a small part of a theory and nail down whether it is true or false before moving on to other parts. But a community of conspiracy theorists isn’t interested in that kind of analysis. The attraction of the theory is its broad sweep; whether any particular part of it is true is almost irrelevant. For example, the fact that JFK Jr. did not return from his apparent death a few weeks ago probably did not disillusion most of the people who came to Dallas expecting to see him.
Think about the attempts to debunk Trump’s Big Lie of how the election was stolen from him. (Neiwert’s book came out before the election, so the Big Lie is not discussed.) Debunkers are fighting a hydra: There is no single explanation of how the election was supposedly stolen, but rather dozens of independent theories of rigged voting machines, hacked servers, boxes of ballots appearing from nowhere, dead voters, fraudulent mail-in ballots, illegal voters, and so on. Debunk one, and the theory’s proponents shift to another. (And as soon as your back is turned, the theory you debunked will rise again.) The conspiracy constantly grows as even Republican officials — Brad Raffensperger, the Michigan Senate — refuse to validate it.
It is not inherently crazy to believe that elections can be stolen. But by now the Big Lie is clearly recognizable as a conspiracy theory.
Psychology of conspiracy theorists. The experts Neiwert quotes paint a consistent picture. People who feel a lack of control in their lives are attracted to conspiracy theories for two main reasons:
- The conspirators become scapegoats. They — not me — are to blame for the way the world (and my life) is going. Rather than falling victim to random events or societal trends, I have an enemy: Illegal aliens have taken my place in the economy, and the Jews helped them do it.
- The theory inserts the believer into a more hopeful, more powerful narrative. By learning about the conspiracy, the believer has joined a heroic resistance group that will expose and ultimately defeat the evil conspirators.
These underlying motives explain why conspiracy theorists reject debunking evidence: Evidence was not the primary reason they bought into the theory in the first place.
How people get drawn in. If your first contact with a conspiracy theory is full-blown nuttiness, you’ll probably turn away without a second thought. No one hears out of the blue that the British royal family are shape-changing alien reptiles and thinks, “I should look into that.”
But even the wildest conspiracy theories have a plausible-looking public face. Jeffrey Epstein, for example, appears to have really maintained a stable of under-age sex partners he could offer his global-elite-level friends. Understandable concern that missing children might have been kidnapped for sex leads many people to read social media posts or watch YouTube videos that slowly introduce them to the QAnon theory that such a child-sex ring has world-dominating power.
And once you start investigating one conspiracy theory, you will run into others that connect even more dots that you had always wondered about. The people you meet in one conspiracy-theory online community will introduce you to other conspiracies, which interlock in weird ways.
Social media algorithms accelerate this process. If you watch a video that raises plausible-sounding doubts about the effectiveness of masks or vaccines, YouTube will then suggest more radical videos suggesting that vaccine side-effects are being covered up, and then others claiming Covid was engineered by the Chinese — maybe with the connivance of the CDC — to attack America.
If you’d run into that last video first, it probably would have made no impression. But YouTube has groomed you to accept it.
A common mistake. Neiwert points out something I had not thought of: The way we typically research new topics increases our vulnerability. Dylann Roof is a case in point: His journey into mass murder began with a simple Google search for “black on white crime”.
The problem is that the phrase “black on white crime” is primarily used by White racists. (The people who do academic research on crime seldom break things down that way. But how would you know that if you hadn’t thought about this topic before?) If you read one of the White-racist articles Google sends you to, you’ll run into other phrases that are part of that worldview (and seldom occur elsewhere). Google them, and you’re on your way down the rabbit hole.
Instead, Neiwert recommends a media-consumption practice that media-literacy expert Michael Caulfield calls SIFT: Stop, Investigate the source, Find better coverage, and Trace claims to their origin.
Before reacting to something you see on social media, and rather than continually going deeper into a topic, take a moment to stop and investigate the source: Who is making this claim? What other claims have they made? How credible are they?
Then try to find a more reliable source for the same information. And once you have, trace claims back to their origins: If, for example, so-and-so is supposed to have said something outrageous, see if you can find a full transcript or a video. If a new law is supposed to do something horrible, what law is it exactly? And what does it really say?
Where the rabbit hole goes. It’s striking how many parallels there are between conspiracy theories and drug addiction. The drug provides a feeling that life is getting better, while actually making it worse; so the perceived need for the drug grows.
If someone’s underlying problem is a lack of efficacy in life, believing in a conspiracy is not going to fix it. Instead, a conspiracy obsession will pull a person away from their support system, alienating friends and relatives. But each loss in the real world makes the conspiracy a more important part of the life they still have. Believers become ever more attached to other conspiracy theorists, and to the fantasy that someday (after the Storm comes, say) their former friends and loved ones will see the truth and come back to them begging forgiveness.
Eventually, the believer has no human contacts outside the conspiracy-theory community. And since many of the other conspiracy theorists are broken in one way or another, conspiracy-related relationships tend to be brittle. Groups often fracture, or turn against individual members.
When someone has given up everything for a conspiracy-theory obsession, and then feels rejected by the conspiracy-theory community too, the stage is set for violence.
As with drug addiction, not everyone goes all the way. Most casual users of illegal drugs never become street people who will do anything for their next fix. For many, similarly, Trump’s Big Lie is a relatively harmless way to meet people online and channel an otherwise amorphous rage. They have learned not to discuss their conspiracy-theory hobby with normies, and they will never storm the Capitol or beat police with flagpoles.
But the possibility is always there, and it’s hard to say what will send someone into a tragic spiral.
Can you pull a friend out? Independent of the negative effects conspiracy theories have on our democracy and our social cohesion, many of us know and care about individuals whose lives are being sucked down that rabbit hole. Is there anything we can do to help?
The closing chapter of Neiwert’s book is a 15-step plan based on research he gleans from a number of sources. Before explaining the steps, he warns that the plan doesn’t always work, that it takes a lot of effort, and that if you aren’t really committed to it, you can make things worse.
The gist of the program is that people are pulled out of conspiracy theories when they’re ready and through personal relationships with people who care about them. Again, I see addiction parallels: You’re not going to pull a friend out of drug addiction during the early phase when the drug seems to make everything wonderful.
So the underlying idea is to stay in an honest relationship with your friend — not pretending to agree about stuff you think is crazy, because they’ll eventually see through you — until they hit a crisis of their own and are looking for a way out. Don’t try to argue them out of it by assembling counter-evidence, because evidence is not the point. Instead, try to understand the needs the conspiracy fills and how it fills them. Compassionate listening plays a bigger role than passionate explaining. Keep your other shared interests alive and even expand them if you can. And wait.
The final pages of the book are about the holes conspiracy theories fill in society, and how we can close them. This is largely a work in progress. For example: Conspiracy theories are largely a problem of trust, and who can claim that our current institutions are 100% trustworthy? Long-term, the challenge is to create more trustworthy ways of getting information, and to rebuild our power structures to be more transparent and more responsive to people’s real needs.