Jason Stanley has written an insightful book in the language of philosophers. Let me try to translate.
The popular view of propaganda is that it’s nothing more complicated than repeating the same lie over and over: Just keep telling people that voter fraud is a serious problem, Mexican immigrants are disease-carrying criminals, and more guns will solve the gun-violence problem; eventually they’ll start believing such things and repeating them to their friends. You pound a lie into people’s ears until it starts coming out their mouths.
But why do some falsehoods and misdirections catch on while others don’t? Why are some notions impervious to contrary evidence? How do they win out over truths whose perception ought to be in people’s best interest? Why, for example, will a person be unmoved by a hundred accounts of climate change from qualified experts, then listen to one crank claiming it’s a conspiracy to establish world socialism and think, “I knew it!”
In the marketplace of ideas, not all products are created equal. Some are born with inherent advantages that don’t depend on logic or evidence. How does that work?
To explain all this in an intellectually rigorous way, Yale philosophy professor Jason Stanley has to define or redefine a bunch of terms, and then argue that these concepts will survive the slings and arrows that other philosophers are likely to launch at them. For a layman like me, that makes for a slow and repetitive book (though not a tremendously long one: a little less than 300 pages). But while some of the basic ideas are familiar — motivated reasoning, confirmation bias, echo chambers, dog whistles, and so on — it’s rare to see them assembled in such a complete package.
Defining propaganda. Stanley proposes a broader definition of propaganda than just lies; it’s “manipulation of the rational will to close off debate”. In less technical terms, it’s the use of deception, emotion, misdirection, intimidation, or stereotype to eliminate certain facts or points of view from the discussion.
A specific use of a slur, for example, may not contain any false information, but instead pushes out of mind the humanity of the slurred person or group. Having police pay special attention to “thugs” doesn’t sound as bad as racially profiling young black men. Undermining “that bitch at the office” is easier to justify than driving women out of the workplace. The point of view of “thugs” or “bitches” doesn’t seem worthy of consideration.
Democratic propaganda. The canonical examples of propaganda come from totalitarian states like Nazi Germany or Soviet Russia, which had ministries of propaganda and officially sanctioned media like Pravda. But Stanley is more interested in the special problem of propaganda in countries that style themselves as liberal democracies, where ideas are supposed to be debated freely in an independent press in front of an autonomous electorate. In America, the echo chambers have unguarded exits. Why do so many citizens choose to remain inside?
Stanley points up a key difference: In a totalitarian state, you can easily recognize propaganda, but don’t know whether to take it seriously. (When Hitler cast the Jews as vermin he wanted to exterminate, a man in the street might have shrugged and said, “That’s just propaganda.”) In a liberal democracy, we take the news media more seriously, but have a harder time recognizing when it contains propaganda. (Example: Judith Miller’s NYT articles about Saddam’s WMD program.)
Another difference is that while propaganda fits perfectly into totalitarianism, it strikes at the heart of democracy: If citizens are not rational actors who use the democratic system to defend their interests and values, but instead are manipulated into some other kind of public discussion, then what’s the justification for giving them a say at all?
Two kinds of propaganda. Stanley breaks propaganda down into two types: supporting and undermining. Supporting propaganda is in some sense straightforward: It promotes what it appears to be promoting. For example, a government might raise support for its war effort by publicizing real or imagined atrocities committed by the enemy.
What’s more dangerous for a democracy, though, is undermining propaganda: appeals to public values to promote goals that in fact undermine those very values. For example, by popularizing the false belief that America has a significant voter-fraud problem, voter-suppression tactics can be put forward as necessary to defend the integrity of our elections. A laudable democratic value — integrity of elections — is used to undermine the integrity of elections.
Similarly, the false belief that Christians are discriminated against in America justified Kim Davis in denying marriage licenses to gay couples. The democratic values of equality and fairness were invoked to undermine equality and fairness.
Flawed ideology. Those examples raise another key concept in Stanley’s system: flawed ideology. A flawed ideology is a set of false or misleading ideas that are impervious to evidence. If your target audience has a flawed ideology, then your propaganda doesn’t have to lie to them. The lie, in some sense, has already been embedded and only needs to be activated.
For example, suppose you are addressing people who believe (or at least take seriously the possibility) that President Obama’s anti-ISIS policy is intentionally inept, because he’s a secret Muslim. Instead of making that claim explicitly, all you have to do is activate the flawed ideology by calling the President “Barack Hussein Obama”. Your audience will add the secret-Muslim point to whatever other criticisms you make of Obama’s moves in the Middle East.
What’s more, content that is evoked like this (and not explicitly stated) is harder for the listener to filter out. Stanley gives a non-political example: “My wife is from Chicago.” If the speaker says, “I am married”, the listener might consciously consider whether or not that is true. But “My wife is from Chicago” calls attention to the claim about Chicago, sneaking in the idea that the speaker is married.
In mid-conversation, it may be hard for the listener to specify exactly what content has been evoked by “Barack Hussein Obama”, much less consider whether it is true. Similarly, when Newt Gingrich referred to Obama as “the Food Stamp President”, he evoked all the content that had been previously associated with food stamps: that undeserving people get them because they’re too lazy to work, that most of those lazy people are black, that (because he is black himself) Obama is on their side rather than the side of hard-working white people, and so on. Challenging the explicit claim — that food stamp usage increased during the Obama administration — misses the point, because that part is true. In fact, challenging it and letting supporters defend its accuracy only reinforces their impression that the unspoken content must be true as well.
Flawed ideology is social. Once a flawed ideology exists, it gets reinforced by each use. So American Christians who believe they are persecuted closely followed the Kim Davis story, and came away more convinced than ever that they are persecuted.
But where does flawed ideology come from in the first place? Stanley roots flawed ideology in self-interest, particularly our unconscious attraction to comfortable ideas that tell us we are good and justified in what we hope to do. But the ideas most impervious to evidence aren’t just the ones that further our personal interest, but the ones that support our social identity.
Stanley gives the example of the near-universal belief among pre-Civil-War Southern slave-owners that slavery was justified, and that blacks were too lazy, stupid, and childlike to benefit from freedom. To turn away from that complex of beliefs, you would not only have to realize that your own standard of living is based on a great wrong, but you would also have to indict your parents, your church, your teachers, your friends, and your entire community for conspiring to commit that injustice. Literally everyone you had believed to be good might have to be reclassified as evil. So the stakes are far higher than just increasing the labor expense of your plantation or learning to make your own bed. Changing your ideas about blacks and slavery could change everything for you.
No wonder so few people did. Ideas like slavery could not be examined dispassionately, with evidence for and against carefully weighed. Hearing persuasive criticisms of slavery would naturally evoke fear of losing your whole sense of self, so you might seize on pro-slavery rationalizations like an overboard sailor grabbing a life preserver.
Today, the reason so few Americans leave their unguarded echo chambers is that those echo chambers are communities that define their social identity. As our politics becomes more polarized and entire states see themselves as blue or red, changing your ideas about abortion or race or Islam or guns or capitalism could mean becoming a whole new person with new friends and memberships, maybe living in a new town or neighborhood. Even your family relationships could be shaken.
Some ideologies threaten democracy more than others. As far back as Plato and Aristotle, philosophers have recognized that different forms of government are based on different values. A dictatorship values decisiveness and loyalty, an aristocracy refinement and breeding, a plutocracy wealth. A corporate state reveres efficiency and orderly procedures. Democracies are based on competing values of freedom, fairness, and equality; and a properly functioning democracy fosters a constant debate about how to balance those values and compromise each with the others.
So the propaganda that most threatens democracy isn’t the kind that argues directly for the values of another system — if people really want more efficiency, we should talk about that — but the undermining propaganda that invokes freedom, fairness, and equality to justify actions that diminish freedom, fairness, and equality.
Consequently, the flawed ideology that most threatens democracy is the self-justifying ideology of privileged groups, like the Confederate slave-owners. If our group has some unfair advantage that is based on foreclosing the options of other people, we will naturally want to believe that our advantage doesn’t really exist (there is no inequality), or that it’s actually fair because of the comparative virtues of our people and those who lack our privileges, or that the un-privileged folks are freely choosing not to do the things that (in reality) the system discourages them from doing.
Complexes of ideas that tell us such things — that freedom, fairness, and equality demand that my people keep their privileges — are so welcome that they seem obvious and natural. “Of course,” you say, “I should have seen that myself.” That nagging sense that our way of life is unjust and unsustainable vanishes. We are the good guys, and those who want to take away our advantages are the bad guys.
Former Republican Congressman Bob Inglis frames climate-change denial just that way in this clip from the movie Merchants of Doubt.
It’s not just a head thing. This is very much a heart issue. It’s not the science that’s affecting us. I mean, the science is pretty clear. It’s something else that’s causing this rejection. Many conservatives, I think, see that action on climate change is really an attack on a way of life.
The reason that we need the science to be wrong is otherwise we realize that we need to change. That’s really a hard pill to swallow, that the whole way I’ve created my life is wrong, you’re saying? That I shouldn’t have this house in the suburb? I shouldn’t be driving this car? That I take my kids to soccer? And you’re not going to tell me to live the way that you want me to live.
And along come some people sowing some doubt, and it’s pretty effective, because I’m looking for that answer. I want it to be that the science is not real.
So: personal interest leads to social identification with the people who share those interests; maintaining social identity prevents the examination of notions that would threaten our way of life, producing flawed ideology; the false information embedded in that ideology can be activated and reinforced by propaganda that may contain no false information of its own; with the result that freedom, fairness, and equality seem to demand the maintenance of our unfair and unequal advantages (“You’re not going to tell me to live the way you want me to live”), even if that ultimately means others will have their freedom diminished. The resulting beliefs are then almost impossible to refute with evidence, because any such argument is tied to a threat to the believer’s community and social identity.