
Does the red pill have an antidote?

Why do previously reasonable people go down the rabbit hole of conspiracy theories, and what can be done to bring them back?


A handful of problems can plausibly be put forward as obstacles to solving all other problems: climate change and the corruption caused by money in politics are two that pop to mind. Whatever other problem you might be trying to solve, chances are that at least one of those two will get in your way.

But a third problem is joining that group: the explosion of conspiracy theories and the disinformation they spread. Want to control the pandemic? You’ll wind up dealing with people who think Anthony Fauci has been behind the virus all along, or that the vaccines contain microchips that track your movements.

Want to cut greenhouse gas emissions? You must have been duped by the conspiracy that is using the climate change hoax to institute a global socialist dictatorship.

Worried about the state of our democracy? You obviously don’t understand that there’s nothing to save, because all our elections are already rigged. Millions of illegal immigrants are allowed to vote! And dead people. And the servers that count our votes are actually in some other country.

Whatever else you might want to focus on is a waste of time anyway. It’s just a distraction from the blood-drinking child-sex ring that controls the world. That’s the real problem.

David Neiwert’s book. I first heard of David Neiwert when he was writing the Orcinus blog. As early as 2004, he was warning about the right-wing drift towards fascism, but doing so in a responsible way, i.e., actually defining fascism and checking current developments against that definition rather than just throwing around loaded words. (That’s why he called the drift of 2004 American conservatism “pseudo-fascism”. It had the seeds, but they hadn’t fully sprouted yet.)

His 2020 book Red Pill, Blue Pill: How to Counteract the Conspiracy Theories That Are Killing Us is a quick read that is full of insight. It falls into a few separable parts:

  • a history of conspiracy theories from the medieval blood libel to the Yellow Peril to the Red Scare to QAnon. I found this fascinating, but if you don’t, you could skip over it.
  • why conspiracy theories are attractive and who they attract
  • how someone can get drawn in
  • what can be done to pull someone out

The title comes from the red-pill/blue-pill choice Morpheus gives Neo in The Matrix. The red pill represents awakening to the hidden reality that other people fail to see or refuse to see. Conspiracy theorists often talk about the moment they were red-pilled.

How to tell real conspiracies from conspiracy theories. A question I often raise on this blog is whether a term actually means something or is just an insult. Political correctness, cancel culture, critical race theory — do they have any content beyond being pejorative labels?

You might think conspiracy theory is another term with little objective meaning — just “a theory other people believe, but I don’t”. But Neiwert uses the term more precisely than that: A conspiracy theory isn’t just a theory about a conspiracy, it’s a theory that goes against everything we know about actual conspiracies.

People really do conspire sometimes, but actual conspiracies (Watergate, say) are narrow in scope, limited in time, and involve a fairly small number of conspirators. Cross any of those three lines, and odds are excellent that your conspiracy won’t stay secret long enough to achieve its goals.

Conspiracy theories, on the other hand, postulate vast conspiracies that control everything and yet operate in the shadows for decades or even centuries. The Illuminati has been manipulating world politics since the 1700s, and the conspiracy of blood-drinking child abusers is so large that QAnoners expect thousands of arrests and executions will be needed to stamp it out. The goal of the reptilian conspiracy is to control the Earth, forever.

Some conspiracy theories start with a plausible hypothesis. It’s not crazy, for example, to wonder if alien civilizations exist, or if alien explorers might have visited this planet. But such speculations become conspiracy theories when countervailing evidence that would at least prune the branches of an ordinary hypothesis gets explained away by expanding the conspiracy to completely implausible proportions.

Conspiracy theory epistemology. Conspiracy theorists are hard to argue with because they literally think differently. A conspiracy theory catches on not because it is well supported by evidence, but because it connects a lot of dots. The wider and wilder a theory is, the more interest it generates.

A person trained in mainstream critical thinking will want to pick out a small part of a theory and nail down whether it is true or false before moving on to other parts. But a community of conspiracy theorists isn’t interested in that kind of analysis. The attraction of the theory is its broad sweep; whether any particular part of it is true is almost irrelevant. For example, the fact that JFK Jr. did not return from his apparent death a few weeks ago probably did not disillusion most of the people who came to Dallas expecting to see him.

Think about the attempts to debunk Trump’s Big Lie of how the election was stolen from him. (Neiwert’s book came out before the election, so the Big Lie is not discussed.) Debunkers are fighting a hydra: There is no single explanation of how the election was supposedly stolen, but rather dozens of independent theories of rigged voting machines, hacked servers, boxes of ballots appearing from nowhere, dead voters, fraudulent mail-in ballots, illegal voters, and so on. Debunk one, and the theory’s proponents shift to another. (And as soon as your back is turned, the theory you debunked will rise again.) The conspiracy constantly grows as even Republican officials — Brad Raffensperger, the Michigan Senate — refuse to validate it.

It is not inherently crazy to believe that elections can be stolen. But by now the Big Lie is clearly recognizable as a conspiracy theory.

Psychology of conspiracy theorists. The experts Neiwert quotes paint the following picture: People who feel a lack of control in their lives are attracted to conspiracy theories for two main reasons:

  • The conspirators become scapegoats. They — not me — are to blame for the way the world (and my life) is going. Rather than falling victim to random events or societal trends, I have an enemy: Illegal aliens have taken my place in the economy, and the Jews helped them do it.
  • The theory inserts the believer into a more hopeful, more powerful narrative. By learning about the conspiracy, the believer has joined a heroic resistance group that will expose and ultimately defeat the evil conspirators.

These underlying motives explain why conspiracy theorists reject debunking evidence: Evidence was not the primary reason they bought into the theory in the first place.

How people get drawn in. If your first contact with a conspiracy theory is full-blown nuttiness, you’ll probably turn away without a second thought. No one hears out of the blue that the British royal family are shape-changing alien reptiles and thinks, “I should look into that.”

But even the wildest conspiracy theories have a plausible-looking public face. Jeffrey Epstein, for example, appears to have really maintained a stable of under-age sex partners he could offer his global-elite-level friends. Understandable concern that missing children might be kidnapped for sex leads many people to read social media posts or watch YouTube videos that slowly introduce them to the QAnon theory that such a child-sex ring has world-dominating power.

And once you start investigating one conspiracy theory, you will run into others that connect even more dots that you had always wondered about. The people you meet in one conspiracy-theory online community will introduce you to other conspiracies, which interlock in weird ways.

Social media algorithms accelerate this process. If you watch a video that raises plausible-sounding doubts about the effectiveness of masks or vaccines, YouTube will then suggest more radical videos suggesting that vaccine side-effects are being covered up, and then others claiming Covid was engineered by the Chinese — maybe with the connivance of the CDC — to attack America.

If you’d run into that last video first, it probably would have made no impression. But YouTube has groomed you to accept it.

A common mistake. Neiwert points out something I had not thought of: The way we typically research new topics increases our vulnerability. Dylann Roof is a case in point: his journey into mass murder began with a simple Google search for “black on white crime”.

The problem is that the phrase “black on white crime” is primarily used by White racists. (The people who do academic research on crime seldom break things down that way. But how would you know that if you hadn’t thought about this topic before?) If you read one of the White-racist articles Google sends you to, you’ll run into other phrases that are part of that worldview (and seldom occur elsewhere). Google them, and you’re on your way down the rabbit hole.

Instead, Neiwert recommends a media-consumption practice that media-literacy expert Michael Caulfield calls SIFT: Stop, Investigate, Find, Trace.

Before reacting to something you see on social media, and rather than continually going deeper into a topic, take a moment to stop and investigate the source: Who is making this claim? What other claims have they made? How credible are they?

Then try to find a more reliable source for the same information. And once you have, trace claims back to their origins: If, for example, so-and-so is supposed to have said something outrageous, see if you can find a full transcript or a video. If a new law is supposed to do something horrible, what law is it exactly? And what does it really say?

Where the rabbit hole goes. It’s striking how many parallels there are between conspiracy theories and drug addiction. The drug provides a feeling that life is getting better, while actually making it worse; so the perceived need for the drug grows.

If someone’s underlying problem is a lack of efficacy in life, believing in a conspiracy is not going to fix it. Instead, a conspiracy obsession will pull a person away from their support system, alienating friends and relatives. But each loss in the real world makes the conspiracy a more important part of the life they still have. Believers become ever more attached to other conspiracy theorists, and to the fantasy that someday (after the Storm comes, say) their former friends and loved ones will see the truth and come back to them begging forgiveness.

Eventually, the believer has no human contacts outside the conspiracy-theory community. And since many of the other conspiracy theorists are broken in one way or another, conspiracy-related relationships tend to be brittle. Groups often fracture, or turn against individual members.

When someone has given up everything for a conspiracy-theory obsession, and then feels rejected by the conspiracy-theory community too, the stage is set for violence.

As with drug addiction, not everyone goes all the way. Most casual users of illegal drugs never become street people who will do anything for their next fix. Similarly, for many people Trump’s Big Lie is a relatively harmless way to meet people online and channel an otherwise amorphous rage. They have learned not to discuss their conspiracy-theory hobby with normies, and they will never storm the Capitol or beat police with flagpoles.

But the possibility is always there, and it’s hard to say what will send someone into a tragic spiral.

Can you pull a friend out? Independent of the negative effects conspiracy theories have on our democracy and our social cohesion, many of us know and care about individuals whose lives are being sucked down that rabbit hole. Is there anything we can do to help?

The closing chapter of Neiwert’s book is a 15-step plan based on research he gleans from a number of sources. Before explaining the steps, he warns that the plan doesn’t always work, it takes a lot of effort, and if you aren’t really committed to it you can make things worse.

The gist of the program is that people are pulled out of conspiracy theories when they’re ready and through personal relationships with people who care about them. Again, I see addiction parallels: You’re not going to pull a friend out of drug addiction during the early phase when the drug seems to make everything wonderful.

So the underlying idea is to stay in an honest relationship with your friend — not pretending to agree about stuff you think is crazy, because they’ll eventually see through you — until they hit a crisis of their own and are looking for a way out. Don’t try to argue them out of it by assembling counter-evidence, because evidence is not the point. Instead, try to understand the needs the conspiracy fills and how it fills them. Compassionate listening plays a bigger role than passionate explaining. Keep your other shared interests alive and even expand them if you can. And wait.

The final pages of the book are about the holes conspiracy theories fill in society, and how we can close them. This is largely a work in progress. For example: Conspiracy theories are largely a problem of trust, and who can claim that our current institutions are 100% trustworthy? Long-term, the challenge is to create more trustworthy ways of getting information, and to rebuild our power structures to be more transparent and more responsive to people’s real needs.

What Makes a Good Conspiracy Theory?

https://www.thesuburban.com/opinion/editorial_cartoons/napoleon-s-cartoon-conspiracy-theories/image_8291e550-5e3d-523c-9d22-262dae2f4ca5.html

We’ll never get rid of them, but can we at least process them better?


On this blog I frequently debunk conspiracy theories that spread among conservatives: QAnon, Obama’s birth certificate, Dominion voting machines, Antifa’s role in the Capitol insurrection, and so on. But this week a liberal conspiracy theory kept showing up in my social-media news feeds: The accusations against Andrew Cuomo are part of a scheme to install a Republican as governor of New York, so that he can use his pardon power to protect Donald Trump from New York state prosecutions.

Debunking the Cuomo theory. Before I start using this as an example of a conspiracy theory, though, let’s dispose of the idea that it’s a sensible interpretation of events: Suppose Cuomo resigns or is impeached. His replacement is the Democratic Lieutenant Governor Kathy Hochul, who has no reason to pardon Trump. Next in the line of succession are the Temporary President of the Senate, the Speaker of the Assembly, and the Attorney General — all Democrats.

Then comes the 2022 election. New York electing a Republican governor is not unheard of: George Pataki served three terms, from 1995 to 2006. But Pataki Republicans are not exactly Trumpists, and in recent cycles Democrats have done quite well in New York. Cuomo won his last election (2018) by 23%, but he has no unique vote-pulling ability whose loss would put the governorship in danger: Biden beat Trump in New York in 2020 by 23% as well, Kirsten Gillibrand won her 2018 Senate race by 34%, and Letitia James won the 2018 Attorney General race by 27%. And the names being discussed as 2022 Republican challengers are not ones that should cause Democrats to quake in fear, particularly if a Trump pardon becomes one of the issues.

In short, raising phony accusations against Cuomo in order to keep Trump out of jail would be a wild scheme that had almost no chance to succeed. Not even Trumpists are crazy enough to invest the kind of resources even a failed attempt would require. And besides, there’s a far more mundane explanation for Cuomo’s problems: Being an asshole finally caught up to him.

My rare attempt at bipartisanship. If conspiracy theories appear in both parties, then sensible people in both parties should want to debunk them. That’s why I was pleased to see someone I rarely agree with, New York Times conservative columnist Ross Douthat, contribute to that effort a little while ago with “A Better Way to Think About Conspiracies”.

He starts with the following observation: The only way to get rid of conspiracy theories completely is to induce everyone to accept the expert consensus on everything. Not only is that never going to happen, it shouldn’t happen, because sometimes the expert consensus is self-serving or corrupt or just wrong in the ordinary people-make-mistakes way. I mean, how many experts told us that Saddam had WMDs, or that Trump couldn’t possibly beat Hillary? Worse, occasionally there are real conspiracies, like Nixon’s Plumbers or the baseball owners’ free-agency collusion.

So if we can’t just deny all conspiracies, or insist that people believe whatever the experts say, what can we do?

If you assume that people will always believe in conspiracies, and that sometimes they should, you can try to give them a tool kit for discriminating among different fringe ideas, so that when they venture into outside-the-consensus territory, they become more reasonable and discerning in the ideas they follow and bring back.

Douthat suggests a few sorting principles that can keep people from falling down the QAnon rabbit hole.

  • Simple theories are better than baroque ones.
  • Be skeptical of theories that seem tailored to reach a predetermined conclusion.
  • Take fringe theories more seriously when the mainstream narrative has holes.
  • Don’t start accepting all fringe theories just because one of them looks right to you.

To illustrate the simple vs. baroque distinction, he contrasts two origin-of-Covid-19 conspiracy theories: One says “it was designed by the Gates Foundation for some sort of world-domination scheme”, and the other that “it was accidentally released by a Chinese virology lab in Wuhan, a disaster that the Beijing government then sought to cover up”. Douthat rejects the former out of hand, but finds the latter plausible — not true, necessarily, but possibly worth investigating further.

The difference is that the Gates theory requires postulating a whole bunch of other stuff not in evidence. (What powers those nano-chips in the vaccine once they get into your bloodstream?) But the lab-accident theory just has one unusual event, after which a lot of people behave the way we know a lot of people behave: They would rather lie than accept blame. [1]

He illustrates the predetermined-conclusion point by looking at Trump’s various stolen-election theories. If you’ve ever argued with a Trumpist about this, you’ve probably observed what Douthat did: When you disprove one election-fraud theory, the Trumpist doesn’t reconsider his position, but just comes back with another election-fraud theory. If Georgia’s hand-recount disproves the corrupted-voting-machine-software theory (it does), then what about Detroit having more votes than voters? After you debunk that, what about dead people voting? And so on. The conclusion (Trump really won) remains fixed; the conspiracy theories are just roads to get there.

That should count against them.

Douthat’s point about holes in the mainstream narrative is similar to Thomas Kuhn’s account of scientific revolutions: Novel theories shouldn’t dislodge an accepted theory unless the accepted theory is having trouble explaining anomalies. As Einstein reflected, “If the Michelson–Morley experiment had not brought us into serious embarrassment, no one would have regarded the relativity theory as a (halfway) redemption.”

The example Douthat gives is Jeffrey Epstein. Epstein’s career is so unlikely that you can hardly blame people for trying to place him in a larger story. I would point to the pee-tape theory of Putin and Trump. There is essentially no evidence of a pee tape, but Trump’s defenders have never offered an alternative explanation of why he was so subservient to Putin. Instead, they just denied what we could all see. If the alternative to the conspiracy theory is believing that the Trump/Putin news conference in Helsinki is perfectly normal behavior for an American president, then I’ll keep looking for a pee tape.

The fourth point ought to go without saying, but there is a strong pull in the opposite direction: Once you leave the mainstream, other outside-the-mainstream folks feel like compatriots. (Once you accept alien visitors, why shouldn’t Atlantis be real?) Douthat makes a good point, though: All the world’s revealed religions have stressed that not every voice that pops into somebody’s head is the voice of God. You have to practice discernment.

I’ll support him by pointing out that even though the experts aren’t always right, they usually are. So when you believe a conspiracy theory, you’re betting on a long shot. Long shots occasionally come in, but no gambler makes a successful career out of betting on one long shot after another.

My additional principles. I agree with all of Douthat’s principles, but I don’t think he goes quite far enough. I want to add some ideas that I can easily imagine him agreeing with. And even if he doesn’t …

You don’t have to accept the conventional wisdom, but you should know what it is. If you reject it, you should have a reason. Before you retweet something bizarre, take a moment to google a news story on the topic, or check some reference like Wikipedia. Is there a widely accepted explanation you hadn’t considered? Is there a reason not to accept it? If you have such a reason, fine. But at least consider a non-conspiracy explanation.

Evil people face the same problems you do. Have you ever tried to organize something? It’s hard. It gets harder the more people you need to coordinate, and harder still if it’s something like a surprise party, where it’s supposed to be secret, so you can’t just blast out an announcement.

It’s not any easier to organize something nefarious. If you can’t imagine how a richer, more powerful version of yourself could pull something off, be skeptical that somebody else is managing it.

Who are “they”? One way to avoid realizing just how big and complicated a conspiracy would have to be is to attribute it to a nebulous “them”, as Donald Trump Jr. does in this clip: “There’s no place that they won’t go. This week alone, they canceled Mr. Potato Head, they canceled the Muppets. They’re canceling Dr. Seuss from reading programs.” They who?

Everybody in a conspiracy needs a motive. The reason the baseball-owner-collusion theory was plausible (even before it turned out to be true) was that all the owners had the same financial incentive: paying their players less.

Now consider the theory that ICUs are faking the Covid pandemic. Everybody who works there needs to be in on it: nurses, doctors, cleaning staff, and so on. Either they’re not telling their loved ones, or the loved ones are in on it too. What motives could possibly unify all those people?

Very few people are motivated by evil for its own sake. A theory I heard fairly often as same-sex marriage cases were working their way through the courts was that same-sex couples weren’t actually interested in getting married, they were just trying to destroy marriage for the rest of us. We are all occasionally tempted to do something out of spite, but seriously: Would you devote a big chunk of your life to a project that gained you nothing, but just destroyed something for somebody else? Not many people would. [2]

As new information comes in, bad conspiracy theories have to grow. A good conspiracy theory might even shrink. A sure sign of a bad theory is that every objection is met by expanding the conspiracy. “They’re in on it too.”

But if you imagine organizing a conspiracy yourself, you wouldn’t be constantly trying to bring more people in, because each new person is a new risk. Instead, you’d try to identify the smallest possible group that could pull the operation off.

So if you’re on the trail of an actual conspiracy, the more you find out, the closer you should get to understanding the vision of the planner. Rather than “He’s in on it too”, you should start to realize how a small group of people really could do this. [3]

Contrast this with the nanobots-in-the-vaccine theory. Anybody who has access to a Covid vaccine might put it under a microscope and see those bots. Why aren’t they saying anything? They must all be in on it.


[1] My favorite Kennedy-assassination conspiracy theory is similar: After Oswald fires that first non-fatal shot, a Secret Service agent’s gun goes off by mistake, killing JFK. The agent’s superiors then try to cover that up, and things spiral from there.

Having brought up the Kennedy assassination, which educated my whole generation in conspiracy theories, I have to tell this joke: Two authors of JFK-assassination-conspiracy books are sharing a car as they drive to a convention where they’ll both be on a panel. Unfortunately, they are involved in a highway accident and die. But they’re both virtuous people, so they arrive together in the afterlife.

Their introductory tour of Heaven is given by God Himself, and somewhere between the infinite beach and the endless ice cream bar he tells them that there are no secrets in Heaven. “So if you ever want to know anything — about Heaven, about the Earth, about Me — you just have to ask.”

So one of the authors raises the question he’s been wrestling with for years: “Who really did kill Kennedy?”

And God answers, “Oswald, acting alone, pretty much the way the Warren Report says.”

The authors go silent for a while, until eventually one leans over to the other and whispers, “This goes up much higher than we ever imagined.”

[2] This is one reason I suspect that conspiracy theories do better among religious groups that believe in an active Devil. Unlike anybody you actually know, the Devil is motivated by evil for its own sake. And if the Devil has minions, they also are just trying to do harm.

[3] One of my favorite Kennedy-assassination conspiracy books was Best Evidence by David Lifton. (I’m not endorsing his theory, I’m just illustrating a point.) His theory revolves around how investigators think: They trust some kinds of evidence more than others, and they’ll explain away less-trusted evidence if it contradicts more-trusted evidence.

In a murder case, the best evidence is the body; or, after the body is out of reach, the autopsy. So if you could control that evidence, then you wouldn’t need to involve the whole FBI; they would naturally discount eye-witnesses who saw something that the autopsy says didn’t happen.