Category Archives: Articles

This is What Judicial Activism Looks Like

When John Roberts was being confirmed as Chief Justice in 2005, he likened his role to an umpire in a baseball game:

Judges are like umpires. Umpires don’t make the rules; they apply them. … I will remember that it’s my job to call balls and strikes and not to pitch or bat.

This was his way of invoking a common conservative trope: that “activist liberal judges” had “legislated from the bench” to create laws that were impervious to repeal through the political process. Roberts was pledging to be a different kind of judge, one who applied the law to the facts the way an umpire applies the rulebook’s definition of the strike zone to the pitch he just saw.

The umpire analogy was always suspect. As Justice David Souter pointed out in his 2010 Harvard commencement speech, cases that can be resolved just by reading the text and applying the facts usually don’t make it to the Supreme Court.

Even a moment’s thought is enough to show why it is so unrealistic. The Constitution has a good share of deliberately open-ended guarantees, like rights to due process of law, equal protection of the law, and freedom from unreasonable searches. These provisions cannot be applied like the requirement for 30-year-old senators; they call for more elaborate reasoning to show why very general language applies in some specific cases but not in others, and over time the various examples turn into rules that the Constitution does not mention.

Constitutional values, Souter recognized, often “exist in tension with each other, not in harmony.” Resolving those conflicts in a way that stays as true as possible to the spirit behind the Constitution as a whole … that requires a judge, not an umpire.

Souter was in many ways the model of what conservatives didn’t want to see in George W. Bush’s judicial appointments: Appointed by Bush’s father, Souter had drifted into the Court’s liberal wing, the wing that conservatives accused of making up laws. Roberts was promising not to do that. He would stay objective, rather than drifting into liberal activism.

When the Court’s McCutcheon v Federal Election Commission decision came out earlier this month, we saw just how ironically things have worked out. The decision, written by Roberts and building on the Roberts Court’s earlier decisions in Citizens United and McComish, is one more step in his completely original remaking (or rather, unmaking) of campaign finance law. John Roberts has become arguably the most activist Chief Justice in U.S. history.

When you read McCutcheon, the most striking thing is the way that Roberts is talking to himself. The precedents quoted are almost entirely those of the Roberts Court itself, many written by Chief Justice Roberts.

Moreover, the only type of corruption that Congress may target is quid pro quo corruption. Spending large sums of money in connection with elections, but not in connection with an effort to control the exercise of an officeholder’s official duties, does not give rise to quid pro quo corruption. Nor does the possibility that an individual who spends large sums may garner “influence over or access to” elected officials or political parties. Citizens United v. Federal Election Comm’n, 558 U. S. 310, 359. The line between quid pro quo corruption and general influence must be respected in order to safeguard basic First Amendment rights, and the Court must “err on the side of protecting political speech rather than suppressing it.” Federal Election Comm’n v. Wisconsin Right to Life, 551 U. S. 449, 457 (opinion of ROBERTS, C.J.). Pp. 18–21.

That bright line between quid pro quo corruption (direct bribery, where a campaign contribution is exchanged for a vote or other favor) and the more general buying of influence — and the idea that the Constitution limits Congress to legislate only on the quid pro quo side of that line — is a pure invention of John Roberts. It did not exist anywhere in law or legal tradition before he joined the Supreme Court.

Roberts also cites an older decision, Buckley v Valeo from 1976, but slides over the fact that he is reversing that decision. Buckley was the Court’s response to the post-Watergate rewriting of campaign finance laws. It upheld the part of the law that restricted campaign contributions, but threw out the law’s limits on campaign expenditures. The Court reached this conclusion via an interesting piece of reasoning that Roberts has completely written over: When a candidate spends money on his campaign, he is exercising his freedom of speech, and the government needs a very serious reason to stop him. But when a contributor gives money to a campaign, he is not himself speaking; contributors are exercising their right to free association, which is also a First Amendment right, but one that is not quite so sensitive as the freedom of speech.

In other words, in 1976 money was not speech.

The 1976 Court upheld the exact kind of restriction that McCutcheon throws out: an overall restriction on the amount of money an individual can give to federal campaigns during a two-year election cycle. So McCutcheon is a reversal, though you will struggle hard to find that fact acknowledged in the text. In Supreme Court tradition, reversals are not done lightly. A major reversal like Brown v Board of Education is a historical landmark, and typically happens only as a last resort. (See David Strauss’ book The Living Constitution for an account of all the ways the Court had tried for decades to make sense of “separate but equal” before recognizing in Brown that it just wasn’t going to work.)

If there is one cardinal symptom of judicial activism, reversal-on-a-whim is it. But Roberts does not struggle at all with reversing Buckley; he simply ignores that he’s doing it. And it’s not just Buckley. In Justice Breyer’s dissenting opinion, he quotes McConnell v FEC, the last major pre-Roberts campaign finance case, which upheld restrictions on soft money contributions:

Plaintiffs argue that without concrete evidence of an instance in which a federal officeholder has actually switched a vote [in exchange for soft money] . . . , Congress has not shown that there exists real or apparent corruption. . . . [P]laintiffs conceive of corruption too narrowly. Our cases have firmly established that Congress’ legitimate interest extends beyond preventing simple cash-for-votes corruption to curbing ‘undue influence on an officeholder’s judgment, and the appearance of such influence.’

But as Breyer complains, Roberts now quotes Citizens United as if it had reversed McConnell.

Did the Court in Citizens United intend to overrule McConnell? I doubt it, for if it did, the Court or certainly the dissent would have said something about it.

Another major symptom of judicial activism is a judge valuing his own view of reality above that of the legislature. Judges are presumed to be experts in the law. But often a case hangs not on the law alone, but on facts about the world. Congress can hold months of hearings and require reports from the full apparatus of government, and so is in general better situated to investigate the state of the world than a court is. Within the court system, a district court can spend weeks or months assembling a body of expert testimony, and so higher courts typically defer to a lower court’s findings of fact. In our entire system, no one is more poorly positioned to assess the state of the external world than the Supreme Court.

Non-activist judges realize that.

Lots of reality-based issues enter into campaign finance law: How does corruption really work? How corrupting are various kinds of contributions? How diligently will contributors and political parties look for loopholes in the law? What kinds of legal restrictions are practically enforceable, and which ones require the government to prove intentions that no one can really know? How does the appearance of corruption influence the behavior of voters and the overall health of democracy?

The Bipartisan Campaign Reform Act (BCRA) of 2002 was passed after Congress had assembled massive amounts of testimony and evidence. Moreover, congressmen themselves have direct experience with the temptations towards corruption, and significant interactions with the voters. When McCutcheon came before a district court, that court upheld the law in view of the Buckley precedent, before getting to the evidence-gathering part of the trial. Breyer summarizes:

The District Court in this case, holding that Buckley foreclosed McCutcheon’s constitutional challenge to the aggregate limits, granted the Government’s motion to dismiss the complaint prior to a full evidentiary hearing. … If the plurality now believes the District Court was wrong, then why does it not return the case for the further evidentiary development which has not yet taken place?

Why indeed? Is it that Chief Justice Roberts is afraid the facts would get in the way of what he wants to do? Or is he convinced that he already knows everything he needs to know?

Here’s the kind of thing I wish Justice Roberts knew: Last week I was in my home town, where I had dinner with my best friend from grade school. We have argued politics since we were seven, and he is quite conservative today. But we found one issue where we completely agree: No bank should be too big to fail. We agreed that Congress has done practically nothing to fix the financial system after the meltdown of 2008, and neither of us was optimistic that it would.

Why not? Not because the People want banks to be too big to fail. Between the two of us, I believe we represent a fairly broad public consensus on the issue. And not because bankers are delivering sacks of cash to congressmen in quid pro quo exchange for their votes. But the broader influence of big money in politics — the kind that Justice Roberts has placed beyond legal remedy — makes the too-big-to-fail issue unapproachable. Neither I nor my friend is actively pushing for Wall Street reform because … well, what’s the point?

That’s corruption of the political process undermining democracy. And Chief Justice Roberts has decreed that nothing can be done about it.

Who Should Be Beyond the Pale?

Maybe you heard about Brendan Eich, who briefly was CEO of Mozilla. The media’s one-line summary of his story is that Eich was hounded out of his job because he opposes marriage equality for gays and lesbians. The somewhat longer version goes like this:

  • Mozilla (says Wikipedia) “is a free software community best known for producing the Firefox web browser.” (I’ve used Firefox off and on for years, and it has been my main browser for the past few months.)
  • Brendan Eich became CEO of Mozilla on March 24. He was a co-founder of Mozilla and had been Chief Technology Officer previously. The Mozilla blog said Eich “has been deeply involved in every aspect of Mozilla’s development starting from the original idea in 1998.” Back in 1998, Marc Andreessen wrote about “Brendan Eich, who single-handedly designed and implemented the scripting language known as JavaScript”.
  • The same day, the small app-development company Rarebit, founded by a married gay couple (one of whom is British and could only get permanent residency in the U.S. after marriage), blogged “It’s personal for us” and announced it would protest by removing its apps from the Firefox Marketplace.
  • On March 28 The Wall Street Journal reported that three Mozilla board members were resigning. The stated issue was not Prop 8, but that Mozilla had picked an insider rather than “a CEO from outside Mozilla with experience in the mobile industry who could help expand the organization’s Firefox OS mobile-operating system and balance the skills of co-founders Eich and [Mitchell] Baker”. The article also noted that “Some employees of the organization are calling for Eich to step down because he donated $1,000 to the campaign in support of Proposition 8, a 2008 California ballot measure that banned same-sex marriage in the state.” The Mozilla blog claims the protests came from “less than 10 of Mozilla’s employee pool of 1,000. None of the employees in question were in Brendan’s reporting chain or knew Brendan personally.”
  • Eich did in fact make the $1,000 contribution, back in 2008. The public-record listing includes Mozilla as his employer, but that’s just part of the form. Mozilla did not make the contribution, authorize it, or endorse it.
  • Negative buzz developed on Twitter and other social media. By March 31, the online dating site OkCupid was greeting Firefox users with a statement that included “Mozilla’s new CEO, Brendan Eich, is an opponent of equal rights for gay couples. We would therefore prefer that our users not use Mozilla software to access OkCupid.” The statement ended with links for downloading other browsers. (If you insisted on continuing with Firefox, though, you got through.) The OkCupid move seems to have been the trigger to turn a techie Silicon Valley controversy into a mainstream story. (I have to wonder whether OkCupid’s motive was political, or if they realized what a great publicity stunt this would turn out to be. I know I’d never heard of them before, but now I have.)
  • On April 3, Eich resigned. Mozilla insists that he was not fired or asked to resign. The next day, Mozilla insider Mark Surman blogged, “Brendan didn’t need to change his mind on Proposition 8 to get out of the crisis of the past week. He simply needed to project and communicate empathy. His failure to do so proved to be his fatal flaw as CEO.” Rarebit blogged, “I guess this counts as some kind of ‘victory,’ but it doesn’t feel like it. We never expected this to get as big as it has …”

So a better summary is more like: The personal politics of an already controversial choice for Mozilla CEO drew bad publicity to the organization, so he and Mozilla amicably parted ways. It’s still not what I would call a heartwarming story, but let’s at least be accurate.

Backlash. However it really played out, the Eich Affair has turned into an opportunity for right-wingers to denounce “leftist fascists”. Kevin Williamson at National Review wrote “Welcome to the Liberal Gulag.” Over at the web site of the conservative religion-in-public-life journal First Things, Robert George predicted:

Now that the bullies have Eich’s head as a trophy on their wall, they will put the heat on every other corporation and major employer. They will pressure them to refuse employment to those who decline to conform their views to the new orthodoxy.

A number of pro-marriage-equality writers used this incident to establish their centrist credentials and distance themselves from what the Brits used to call “the Loony Left”. Andrew Sullivan wrote: “The whole episode disgusts me.” Slate’s William Saletan denounced “the new Moral Majority” and compared Eich to people who have been fired for being gay. The Atlantic’s Conor Friedersdorf wrote two articles on the topic, arguing first that pressuring Eich to resign was a violation of liberal values, and then discussing more abstractly the question I raise in the title: When is a point of view so objectionable that good people should stigmatize it and refuse to deal with its proponents in any way? Who should be beyond the pale?

The Wide Pale. Personally, I believe in what you might call a wide pale. Ostracism and boycott have their place, but I prefer to hold them as a last resort. So I continued to use Firefox all through the Eich Affair. My pale’s limits got tested a month or two ago, when a well-known white supremacist posted comments to “The Distress of the Privileged”. Should I just delete them on principle? I decided to wait and see. He posted a few slogans, didn’t insult the other commenters, and didn’t create any disturbance requiring my intervention. The comments are still up.

The wide-pale issue is particularly important when a once-fringe movement becomes mainstream, as gay rights is beginning to. Patterns established when the movement was small and powerless need to get re-evaluated and often are not. For example, the generation of Zionists whose worldview was forged in the Holocaust had trouble taking seriously the idea that Jews could be oppressors. Or, going further back in time, the Puritans who escaped persecution in England couldn’t wrap their minds around the reality that they had become the persecuting establishment in the Massachusetts Bay Colony.

It’s over the top to say that gay rights has gotten to that point already, especially at a time when the right to marry exists in only about half the country, and states are passing laws to legitimize discrimination against gays in the marketplace. But the trends are there. Reading between the lines in the Rarebit blog (“We never expected this to get as big as it has”), I don’t think they ever envisioned themselves as the powerful side of the conflict. A constructive use of the Eich Affair would be to think these issues through.

Morality, not law. The first thing to realize about the Eich Affair is that there’s no legal issue. This isn’t about the First Amendment, because the government isn’t punishing Eich for his views. As in the Duck Dynasty flap in December, everybody involved is exercising freedom under the law: Eich freely contributed to a political campaign, his critics on Twitter and at OkCupid freely stated their objections, consumers freely decided to use or not use the Firefox browser, and Mozilla and Eich came to a free agreement that he should leave.

Of course, many of the abuses during the McCarthy Red Scare of the 1950s were expressions of freedom too. You were free to plead the Fifth Amendment when the Committee asked if you’d ever been a Communist, and all your friends and employers and associates were free to shun you afterwards.

The question is: As a culture, is this how we want to behave? Do we want to evaluate the politics of everyone we deal with, or would society be a more pleasant place if we all made a bigger effort to tolerate people we disagree with? This issue comes up every now and then on the Sift, most clearly during the Chick-fil-A boycott in 2012. In a piece I called “Is That Sandwich Political?”, I confessed to a certain can’t-I-just-eat-lunch annoyance and concluded:

[I]n general, I’m against balkanizing the economy into liberal and conservative sectors. If you really like Chick-fil-A’s food, I don’t think you should let anybody guilt you out of it … But if [Chick-fil-A CEO Dan] Cathy has left a taste in your mouth that a super-sized Coke won’t wash away, don’t let anybody guilt you about that either.

Start here: You feel what you feel. Large chunks of the economy are about giving you pleasure or making you feel good in some way. Sometimes, knowing the backstory of a product or a person ruins that good feeling and consequently ruins the product. This isn’t a rational process and you shouldn’t pretend that it is.

For example: Woody Allen movies. They’re supposed to make you laugh, but if you can’t stop wondering whether or not he sexually abused his adopted daughter, you’re not going to laugh very much. So don’t go. But it’s important to realize that this cuts both ways. Watching Ellen DeGeneres’ show is supposed to be fun. But if knowing that she’s lesbian disgusts you, you’re not going to have much fun. So don’t watch.

Part of the charm of Firefox is that you feel virtuous for using it, because you’re not helping Microsoft/Google/Apple take over the world. (For similar reasons, all the book links on the Sift go to a co-op bookstore rather than Amazon.) But if knowing that Eich was CEO messed up that good feeling for you, it made Firefox less valuable.

That’s why I have a hard time finding fault with Rarebit. As they said, it was personal for them. They were a gay couple spitting into the wind against the larger forces that had tried to keep them apart and made it hard for them to start their company. That’s a little more than just “I don’t like that guy’s politics.”

Given that you feel what you feel, though, the next question is whether you should try to get over those feelings, or instead fan them and try to engender them in others. Do you just privately decide “I’ve eaten enough Chick-fil-A in my life” or do you make a crusade out of it and try to convert others? These are the kinds of questions that become more and more important as your movement gains power and starts to have more responsibility.

The usefulness of purity standards. One point of a boycott is to bring a distant issue into everyday life. The Gallo boycott of the 1970s is a good example. The treatment of farm workers in California was easy to ignore if you were planning a fraternity party at Yale. But if some of the people you invited were boycotting Gallo wine, you had to think about it. Similarly today, eating local or organic or vegan might be a health option, but it also creates openings to evangelize against the factory farm system or its treatment of animals.

Having purity standards about what you use — refusing to ignore the moral backstory — can be an important way to balance the nihilism of the marketplace. Blood diamonds, slave labor, dolphin-safe tuna … the market tends to hide the moral implications of our consumption, and refusing to play along is sometimes appropriate. Also, in an era where one of the two major parties opposes regulations on principle, taking action in the marketplace may be the only way you can influence corporate behavior.

So there’s a balancing act to be done: I don’t want a fully politicized marketplace where I have to quiz the baker before eating her muffins. But I also don’t want to advocate a wall of separation between politics and the market.

Rules of thumb. I don’t think there’s a clear line between what should be politicized and what shouldn’t. But these are some rules of thumb I’m using.

  • Corporations are better targets than people. My main objection to the campaign against Eich was that it had nothing to do with corporate policy. No one was arguing that Mozilla was being run in a homophobic way. By contrast, Chick-fil-A contributed corporate funds to anti-gay campaigns. So if you bought their food, you were subsidizing those contributions. (More recently, they’ve been downplaying that.) More importantly, corporations are amoral institutions, so you can’t really dialog with one. Hitting it in the bottom line may be the only way to get its attention.
  • If people are targeted, did they make themselves targets or were they ferreted out? This is why I find Eich a more sympathetic figure than Duck Dynasty‘s Phil Robertson. Robertson said a bunch of ignorant, bigoted stuff to a magazine reporter. Again, that’s his right as an American. But it also means that if you’re helping make him a celebrity, you’re helping him promote those views. It’s totally legit to decide you don’t want to do that any more. Eich, on the other hand, gave $1,000 to support Prop 8, which is something any prosperous guy with his views could do. There was no sign he intended to use his position with Mozilla as a platform to campaign against marriage equality.
  • Have attempts at dialog failed? People don’t always realize the full implications of their actions, and can change their minds.
  • Is some drastic action pending that requires you to do something? During the Wisconsin union-busting conflict in 2011, I took heat from a reader for endorsing the boycott of companies supporting Scott Walker. (I sold my stock in Johnson Controls.) I felt that Wisconsin was the beginning of a nation-wide effort to destroy public-employee unions, and a major blow against the existence of all unions. Drastic action was being taken on one side, and similarly drastic action was needed on the other. Prop 8, on the other hand, was settled by the Supreme Court last summer, and all the momentum on the issue belongs to the pro-equality side.
  • Is the view you’re objecting to so reprehensible that you can’t imagine a good person holding it? In some theoretical sense I can imagine a good person being a Neo-Confederate who defends slavery, but my mind revolts when I try to flesh out that vision. Or if you tattoo swastikas on your biceps, sorry, but you’ve lost all my sympathy. On the other hand, I can disagree strongly about abortion and gay rights without demonizing my opponents. (Up to a point. If you want to implement the Biblical injunction to have gays stoned, I can’t see you as a good person.)

If you can think of other rules of thumb for these situations, leave a comment.

Slavery Lasted Until Pearl Harbor

One of the trick questions American History teachers ask their classes is: “When did slavery end?”

The answer that is both obvious and wrong is: with President Lincoln’s Emancipation Proclamation, which you might count either as 1862 (when it was announced) or 1863 (when it went into effect).

It’s a trick question because the Emancipation Proclamation by itself freed almost nobody. It only applied to the Confederate states (not the slave-holding border states that stayed in the Union), and those were precisely the places where no one was paying attention to President Lincoln’s proclamations. Those states had their own president, and he thought slavery was just fine.

The answer the teacher is probably looking for is: with the 13th Amendment, which (as the Lincoln movie dramatized) passed Congress in early 1865. The amendment is short and gets right to the point:

Section 1. Neither slavery nor involuntary servitude, except as a punishment for crime whereof the party shall have been duly convicted, shall exist within the United States, or any place subject to their jurisdiction.

Section 2. Congress shall have power to enforce this article by appropriate legislation.

It became part of the Constitution that December, when newly reconstructed Georgia became the 27th state to ratify it.

But in his 2008 Pulitzer-Prize-winning book Slavery by Another Name (which in 2012 PBS made into a documentary that you can watch free online), Wall Street Journal reporter Douglas Blackmon came to a different conclusion:

Certainly, the great record of forced labor across the South demands that any consideration of the progress of civil rights remedy in the United States must acknowledge that slavery, real slavery, didn’t end until 1945 — well into the childhoods of the black Americans who are only now reaching retirement age.

The loophole. The reason slavery was able to last so long is that the 13th Amendment has a loophole. (Did you notice it? It went right past me.) The loophole is “except as a punishment for crime whereof the party shall have been duly convicted”. So if you can rig the local laws and get the cooperation of the local law enforcement and court system, you can convict people of “crimes” pretty much whenever you want. Then they can be sentenced to hard labor, and the state or county can auction them off to the highest bidder until their sentence (or their useful working life) is up.

That’s what happened across the South when whites regained control of state governments after Reconstruction. Vaguely worded laws created crimes like “vagrancy”, enforced almost exclusively against blacks. For other crimes like petty larceny or disorderly conduct, the say-so of a law-enforcement officer or a white “victim” was sufficient to convict, particularly after blacks were disenfranchised and banned from juries. For minor crimes, justices of the peace were empowered to assess fines without a jury, and when inflated court costs were added to a misdemeanor fine, the total was often far beyond any amount that a black worker could raise. He could then be sentenced to forced labor until the state or county recouped the debt through the “rent” paid by an employer. In this way, even a minor offense could result in months or even years of forced labor without pay, under whatever conditions the employer chose.

Instead of true thieves and thugs drawn into the system over decades, the records demonstrate the capture and imprisonment of thousands of random indigent citizens, almost always under the thinnest chimera of probable cause or judicial process. The total number of workers caught in this net had to have totaled more than a hundred thousand and perhaps more than twice that figure.

In the PBS documentary, Blackmon says the convict market was driven by demand, not supply:

In the fall, when it was time to pick cotton, huge numbers of black people are arrested in all of the cotton-growing counties. There are surges in arrests in counties in Alabama in the days before, coincidentally, a labor agent from the coal mines in Birmingham is coming to town that day to pick up whichever county convicts are there.

Industrial slavery. One of the arguments made by apologists for slavery — it goes back at least to John Calhoun’s 1837 speech to the Senate, “Slavery a Positive Good”, and you can still hear it occasionally today — is that the black slaves on Southern plantations were in fact treated better than the immigrant industrial workers of the North, whose bosses did not live side-by-side with them or care about them in the personal way that, say, Scarlett O’Hara cared about Mammy.

Reading Blackmon’s book, in which slaves are used in the mines and furnaces of Birmingham’s growing steel industry, you see that (to the extent that there is anything to it at all) this observation tells you more about the difference between agrarian and industrial society than about slavery. When you compare apples to apples, the evil of slavery is undiminished: Hired field hands in the North were treated better than plantation slaves in the South, and industrial slaves in the South were treated worse than free industrial workers in the North.

That was true even under the Confederacy, but post-Reconstruction industrial slavery was far worse: The slave was rented rather than owned, and so was treated as renters typically treat property. As historian Adam Green says in the PBS documentary: a leased convict could be “worked literally to death. … when [one] worker died, one simply had to go and get another convict.”

Green Cottenham. To give his story a face, Blackmon focuses on Green Cottenham, an Alabaman arrested for vagrancy in 1908 and sentenced to work for a subsidiary of U. S. Steel in the Pratt mines outside of Birmingham.

There he was chained inside a long wooden barrack at night and required to spend nearly every working hour digging and loading coal. His required daily “task” was to remove eight tons of coal from the mine. Cottenham was subject to the whip for failure to dig the requisite amount, at risk of physical torture for disobedience, and vulnerable to the sexual predations of other miners — many of whom already had passed years or decades in their own chthonian confinement. … Forty-five years after President Abraham Lincoln’s Emancipation Proclamation freeing American slaves, Green Cottenham and more than a thousand other black men toiled under the lash at Slope 12. Imprisoned in what was then the most advanced city of the South, guarded by whipping bosses employed by the most iconic example of the modern corporation emerging in the gilded North, they were slaves in all but name.

Cottenham died of disease before his sentence was up and was buried in an unmarked grave near the mine.

Tip of the iceberg. It’s tempting to compare Blackmon’s 100,000-200,000 estimate to the four million slaves held at the start of the Civil War and see at least some progress. But leased convicts were just the extreme edge of a more general slavery.

Another common practice was for wealthy whites to pay the misdemeanor fines of able-bodied blacks, in exchange for a “contract” pledging to work for a specified period of time. Once “employed”, they were chained and subject to the whip. Often they were kept beyond their contract period, because they had no way to claim their freedom.

One level removed from this were the sharecroppers, who by contract could only sell their crop to their landlord, for whatever price he named. Typically they borrowed from the landlord to buy seed, and never got out of debt. Bankruptcy laws did not apply to them, and running out on a debt was illegal — and could result in being sold to work the mines in Birmingham. Similarly, if you worked as a house servant or in a shop, the name of your white employer was your only defense should the sheriff come looking for “vagrants” he could sell to U.S. Steel.

Freedom was largely an illusion, not just for the leased convicts, but for all blacks.

False dawn. All this was clearly against the federal Civil Rights Act of 1875, but the Supreme Court held in the Civil Rights Cases of 1883 that Congress had no authority to overrule state laws in this way. Effectively, the states could do as they liked, as long as they didn’t call it slavery.

In 1903, President Theodore Roosevelt appointed a U.S. attorney in Alabama who naively decided to enforce federal laws against “peonage” — slavery for debt. A federal judge took his indictments seriously, and a handful of whites were put on trial. But as it became clear that these were not isolated cases, and that truly enforcing the law would disrupt the entire economy of the South, the Justice Department lost its nerve. The attorney was re-assigned, the attorney general got another job, and the main defendant was pardoned without ever spending time in prison.

Pearl Harbor. When the U.S. entered World War II, the Franklin Roosevelt administration realized that the continued existence of involuntary servitude in the South undermined the propaganda war against the Axis. Less than a week after Pearl Harbor, Attorney General Francis Biddle issued a directive to all federal prosecutors instructing them to prosecute cases of “involuntary servitude and slavery”. Finally, the law would be enforced.

It was a strange irony that after seventy-four years of hollow emancipation, the final delivery of African Americans from overt slavery and from the quiet complicity of the federal government in their servitude was precipitated only in response to the horrors perpetrated by an enemy country against its own despised minorities.

Significance today. Taking this story seriously reframes the Civil Rights movement and the entire history of race in America. Those who marched with Martin Luther King were not just the grandchildren of slaves; some had probably been slaves themselves. Likewise, when the Supreme Court demanded the desegregation of schools in 1954 or President Johnson signed the Civil Rights Act of 1964, the South was not a century past slavery, but only a few years.

In my previous posts about race, I have often run into comments about the long history of black crime, or comparisons to the Chinese, many of whom were also brought to America under forced-labor conditions in the 1800s. But that “long history” evaporates if the original post-slavery “crime wave” was actually instigated by whites seeking to re-enslave African Americans.

And no American race or ethnic group faced anything remotely resembling the black experience. Whatever hardships the Chinese or the Irish or any other immigrant group faced, once things turned around, they turned around. Only blacks experienced multiple false dawns, where rights were granted only to be later taken back or ignored. When today’s blacks look skeptically at authority or seem paranoid about the hidden intentions of whites, they are not reacting to the slavery experiences of great-great-grandparents they never met, but possibly of the parents who raised them.

In short: Slavery is a much fresher wound than most of us have been led to believe.

Not Primarily Students, Not Really Amateurs

A labor ruling knocks the wind out of the fantasy of the amateur athlete. The Raiderettes sue. And we’re heading towards another baseball strike.


For some reason I kept running into stories about sports and labor this week. The big one was that Northwestern University’s football players are a step closer to unionizing. A ruling by National Labor Relations Board regional director Peter Sung Ohr says:

In sum, based on the entire record in this case, I find that [Northwestern University]’s football players who receive scholarships fall squarely within the [National Labor Relations] Act’s broad definition of “employee” when one considers the common law definition of “employee.”

Previous rulings (against graduate-student teaching and research assistants unionizing at Brown in 2004) don’t apply because the football players “are not primarily students”. Northwestern says it will appeal to the full NLRB, and from there I’d be surprised if the courts didn’t get involved. This probably won’t be resolved for years.

Citizen Kain

It sounds weird to think of “amateur” athletes organizing. But it’s hard to argue with the claim that a college football player is there to play football, not to be a student. When I was a teaching assistant at the University of Chicago in the 1980s, I had come there to be a graduate student and was teaching to defray the cost. But football players come to Northwestern to play football, and take classes primarily to maintain their football eligibility.

To me, the players don’t resemble amateurs as much as interns. They work very hard in a business that makes an enormous profit, but pays them no salary. Many submit to this deal because they’re building a resume towards a salary-paying job they hope to get later in the NFL. Most of them won’t get that job.

The best thing I saw on the issue was a clip from ESPN’s Outside the Lines, which focused on the ringleader of the unionization movement, Northwestern quarterback Kain Colter. (Colter exhausted his NCAA eligibility this season, and is hoping to catch on with the NFL as a receiver. CBS Sports rates Colter as the 48th best receiver in the draft, or the 393rd best overall prospect. In other words, his goal of playing in the NFL is improbable but not completely absurd.) (Northwestern is fighting this, but I’d put Colter on the cover of my pamphlet: Northwestern shapes leaders who change the world.)

OTL traces Colter’s radicalization to the experience of his uncle, former All-American defensive back Cleveland Colter, who injured his knee in his junior year and was never drafted by the NFL. Cleveland continues to have knee issues to this day, but his medical coverage from USC ended with his playing career. This is not uncommon: a player may never make a dime from football, yet carry lifelong expenses related to the wear-and-tear on his body.

Meanwhile, the NCAA and the school can market the player’s name and image, but the player can’t. (Quarterback Johnny Manziel was suspended for half a game this season on the charge that he signed football memorabilia for money.) This system is being challenged in court by former UCLA basketball star Ed O’Bannon.

Often the player doesn’t even get a degree. In that respect, Northwestern behaves better than most universities. BleacherReport says it has the highest graduation rate: 97%, compared to 47% at third-worst Oklahoma. But even at Northwestern, the time commitment of football prevents athletes from receiving the kind of education Northwestern offers its typical student. (The findings in Ohr’s ruling indicate that football is the first commitment of a scholarship athlete; he can only take classes that don’t conflict with football practice.)

ESPN’s Jay Bilas comments on the claims that recognizing players’ rights would kill NCAA sports:

It’s amazing how the rest of us can operate in a free-market system and the world doesn’t spin off its axis, but if athletes got it, boy we’d be in trouble. … People who support the NCAA structure as is, including some politicians, say it’s going to change fundamentally, all those [non-revenue-generating] sports are going to go away, we’re just going to have football and basketball. That’s a doomsday scenario scare tactic, and really it’s shameful because it’s just not true. … But even if it were, we lay all the responsibility on the athlete: If those greedy athletes who may ask for more than a scholarship were to get what they want, all this would go away. We don’t say that about coaches who are making $8 million a year. And we don’t say that about administrators who are making millions. … Nobody says, “Hey, the wrestling program’s going to go away if we pay you this much money.”


U-N-I-O-N

Meanwhile, the Raiderettes could use a union (though they probably won’t get one), because the Oakland Raiders aren’t treating their cheerleaders very well. Several sued the team in January, charging that their $1250 annual salary works out to less than $5 an hour even before hair-and-make-up expenses (which the team demands but doesn’t pay for), and that they are subject to “fines” for offenses like bringing the wrong pom-poms to practice.

Forbes has estimated the value of the Raiders at $825 million. They made $19.1 million in 2012, which is low compared to most NFL teams.


And expect a baseball strike when the current contract with the players’ union expires after the 2015 season. As stupendous as those nine-figure superstar contracts sound, the players as a whole are making an ever-smaller percentage of the league’s revenues: 40% at last count. That’s down from 56% in 2002 and still falling. Hardball Times writes:

Most other major sports leagues have salaries close to half of league revenues, and baseball players were actually doing slightly better than that until the last 10 years, when suddenly they started getting a smaller share.

Expect the union to want to turn this situation around, while baseball’s owners have consistently been the most pig-headed in any major league sport. If we lose less than the whole 2016 season, I’ll be surprised.

And revenue is only part of the story of a baseball franchise. As a capital asset, a major league baseball team has been one of the best investments around. After all, it’s a collector’s item, one of a limited edition of 30. And Nate Silver points out one of the consequences of rising inequality: As the rich get richer, more people can afford to bid on a baseball team.

Denouncing overpaid players is a crowd-pleasing tactic, but at least the players do something for their money. What exactly do the owners add to the game? They have become the private custodians of a city’s civic pride, and they collect a massive rent on that. They profit from an antitrust exemption that allows them to limit their competition and to decide which cities can and can’t have major-league teams. (If you started a new team, the major league teams would refuse to schedule games with you at any price, which in every other industry would be an illegal restraint of trade.) Most of the teams’ wealth was actually created by government, not by their owners’ entrepreneurial creativity.

Do the owners provide any value for their billions? Back in 1979, Allan Jacobs published a story in Harper’s, “The Civil-Service Giants”, in which San Francisco took over its baseball team under eminent domain. It was intended to be humorous, but if such a takeover really happened, if every team wound up being owned by its city, who would know the difference?

The Real Politics of Envy

Whose message is actually capitalizing on envy and resentment?


Tuesday, Politico reported the latest example of — this is happening so often we need to give it a name — Plutocrat Persecution Psychosis:

“I hope it’s not working,” Ken Langone, the billionaire co-founder of Home Depot and major GOP donor, said of populist political appeals. “Because if you go back to 1933, with different words, this is what Hitler was saying in Germany. You don’t survive as a society if you encourage and thrive on envy or jealousy.”

Yes, Langone is echoing fellow PPP sufferer Tom Perkins, who recently warned in The Wall Street Journal that a “Progressive Kristallnacht” against the 1% is on its way. (Apparently, only being allowed to vote once — in spite of all his money — chafes on Perkins. Those of us free from the burden of vast wealth can barely hope to imagine what other persecutions he suffers.)

I could sympathize if some terrorist group were burning down mansions, or assassinating “malefactors of great wealth”, as Teddy Roosevelt used to call them. But no, this Nazi-like persecution seems to consist mainly of calls to raise our low taxes on the very wealthy (and their corporations), to insist that they pay their employees a somewhat higher minimum wage, and a few rhetorical flourishes that fall far short of having the President of the United States refer to you as a malefactor of great wealth (or, as Teddy’s cousin Franklin put it a few years later, “unscrupulous money changers”).

But let’s ignore the over-the-top Hitler reference — many others have taken Langone to task for that — and focus on Langone’s underlying points:

  • There is a growing politics of envy in America.
  • Liberal rhetoric about inequality is based on that envy.
  • The primary push towards envy and resentment in our politics comes from the Left.

I figure this is the venom that is supposed to stay in the public’s bloodstream after the Hitler-barb is plucked out. That’s how these things work: If Langone had compared your moustache to Hitler’s, and you denied it without calling sufficient attention to the fact that you don’t have a moustache, what would stick in the public mind is the vague sense that your moustache is probably more like Stalin’s, or maybe Ming the Merciless’.

Before addressing any of that, let’s spiff up the terminology a little: envy here is actually short for envy-based resentment. By itself, envy is just wishing that some aspect of another person’s life could be part of my life, and it isn’t necessarily destructive. (If I envy a friend’s ability to speak French, maybe I’ll go take a class.) Consumer capitalism couldn’t function without this non-destructive kind of envy. If Americans looked at the neighbor’s fancy new car and just said, “Good for him!” the economy would probably collapse or something.

Resentment, on the other hand, wishes others harm, and envy-based resentment means wishing people harm because they have some advantage I wish I had. (Somebody ought to give that fancy new car a dent or two.) So, for example, as a writer I envy Stephen King’s ability to fill a complicated plot with interesting characters. But that’s benign, because I don’t resent him — I don’t wish bad things would happen to him to even the score between us. (If good things would cause him to finish his next novel faster, I’m all for them.)

This distinction is important because of course the rest of us envy the rich. (Think of all the places I would have gone if traveling were as simple as telling my pilot to fire up the jet.) But whether we resent them, and whether that resentment motivates our politics, is another matter entirely.

It’s an article of faith among the very rich that liberal policies (like progressive taxation and regulations that sometimes block the most direct path towards amassing even greater fortunes) are primarily motivated by resentment: We lesser mortals want the government to even the score a little by inflicting some pain on the lords of wealth. Part of Mitt Romney’s core message (said in almost the same words in interviews here and 11 months later here) was: “If one’s priority is to punish highly-successful people, then vote for the Democrats.” And CPAC front-runner Rand Paul echoed that sentiment in his 2014 State of the Union response:

If we allow ourselves to succumb to the politics of envy, we miss the fact that money and jobs flow to where they are welcome. If you punish successful business men and women, their companies and the jobs these companies create will go overseas.

The idea that you might just want to raise revenue by getting it from the people who would miss it the least; or that even though you have nothing against the rich personally, you think that a vast and growing gap between rich and poor is unhealthy for society … that just doesn’t figure. The only conceivable reason you might support a policy the rich don’t like is because you are burning with resentment and want to see them punished for having more than you do.

Jonathan Chait examined this claim and could find no supporting evidence — not even in columns promoting it. (That’s why Langone had to specify “with different words”. You can’t defend his point if you restrict yourself to what people are actually saying.) Politicians, no matter how liberal, are not promising to wreak vengeance on the 1%.

In practice, the politics of class emerge from the context of budgetary choices, where Democrats have positioned themselves against low taxes for the rich for the sole reason that it would come at the expense of more important fiscal priorities. … Gore, Kerry, and Obama were all making the exact same point: Clinton-era tax rates for the rich needed to stay in place not because the rich needed to be punished, but because cutting those rates would create more painful alternatives, like higher structural deficits or cuts to necessary programs.

But does that mean that resentment isn’t a factor in politics or that no one is trying to fan that flame? No, it doesn’t, because resentment-stoking is a constant drumbeat from the Right. Consider, for example, this ad that the Club for Growth ran in Wisconsin in 2011 during Governor Scott Walker’s successful campaign to bust the state employees’ unions.

All across Wisconsin, people are making sacrifices to keep their jobs. Frozen wages. Pay cuts. And paying more for health care. But state workers haven’t had to sacrifice. … It’s not fair. … It’s time state employees paid their fair share, just like the rest of us.

The ad doesn’t promise that anything good will happen to “the rest of us” if the unions are broken. You could imagine an argument similar to the Gore/Kerry/Obama point about taxes: “We’re sorry that we can’t fully fund the pensions of our hard-working teachers and other state employees, but something has to give and we’d rather keep taxes low and spend our limited resources on other priorities.” But instead, this ad is about punishing the state employees, because their unions have shielded them from the kind of employer aggression that has victimized private-sector workers; so let’s bust their union and make them suffer the way other working people suffer.

That’s pure resentment, a political movement very directly trying to “encourage and thrive on envy”. If someone knows of anything nearly that explicit coming from the Left, I’d like to see it.

Or recall the Right’s campaign against Sandra Fluke, when she had the audacity to defend ObamaCare’s contraception mandate. (Rush Limbaugh became the face of this campaign, but he was far from alone, as this timeline makes clear.) Limbaugh’s focus wasn’t that his listeners would benefit from cancelling the mandate. (That would be a hard case to make, since it’s possible that the prevented pregnancies save insurance companies more money than the contraception costs.) Instead, he pounded on the notion that Fluke is a slut: She’s having so much sex she can’t afford her contraception (as if the pill worked that way). He painted a picture in which Fluke has the kind of sex life Limbaugh’s older male listeners can only wish for, so they should want to screw that up for her.

Resentment.

Or consider the way the Right campaigns against the poor. Remember the “lucky duckies” who don’t have to pay income tax (because they’re too poor)? Or the way that Fox News made one lobster-eating surfer a symbol of all Food Stamp recipients? (Jon Stewart’s take-down of this whole campaign is priceless.) Somehow, “America’s poor are actually living the good life,” as a promo for Fox’s “Entitlement Nation” special put it — and all without working like you do. Don’t you wish you could get by without working? Don’t you want to screw the people who (you imagine) do? Take something away from them? Maybe harass them with drug tests that cost more than they save? Because the point isn’t to save money — or to do you any good at all — it’s to inflict harm on people who might be getting away with something you daydream about.

That’s the primary way the politics of resentment affects our economic debate. It’s not directed at the rich by the Left, but at the poor by the Right.

Across the board, one side is trying to encourage and thrive on envy and jealousy: It’s the Right, not the Left.

Does Paul Ryan Care About Poverty Now?

Ryan’s new report obscures a broad consensus about the government’s role in helping the poor.


On the surface, poverty appears to be one of America’s most polarizing issues. Liberals contend that people are poor because our economy doesn’t provide enough opportunities to get ahead. Conservatives argue that people are poor for personal reasons, because they are too lazy or feckless or drug-addicted to take advantage of the opportunities the system offers (or could offer if not for government interference).

Both sides can find examples to back their case. No matter how hard it might be to get out of poverty, some people muster heroic efforts and make it, while others confound every attempt to help them. In between are the people who jump at the brass ring, but somehow don’t manage to jump high enough. Maybe the ring could be lower, maybe they could jump higher — you can frame it either way. You could look at almost any failing individual and say, “He could be doing more.” But you can also find plenty of poor people whose struggles make you ask “Why does it have to be this hard?”

The partisan debate obscures something important: Underneath the polarized opinions about poor people in the abstract, Americans share a broad consensus about the kinds of people who need help and the kinds of things that should be done to help them. For example, just about everyone believes that the best way out of poverty is to get a good-paying job. Conservatives sometimes try to claim this position as their own, but in fact it’s pretty much universal. (Liberals disagree about how to create good jobs, not the value of getting one if you’re poor.)

That get-out-of-poverty-by-working plan might fail for one of four reasons:

  • There are no jobs for people like you.
  • The jobs you can get don’t pay enough to keep you out of poverty.
  • There are good-paying jobs available, but you don’t have the skills to get them.
  • There are jobs you could get if you wanted them, but you’d rather not work.

There’s even a broad public consensus about the appropriate government role in each case:

  • If there really is no job for you, the government should either create a job (by say, funding a WPA-style public works program or subsidizing jobs in the private sector) or support you directly at some level consistent with human dignity (through old-age pensions, disability payments, or long-term unemployment insurance during deep recessions).
  • If you are working at the only kind of job available, the government should provide (or make your employer provide) the extra little nudge you need to stay out of poverty. (Hence the minimum wage, the earned income tax credit, and a variety of supplemental programs like Food Stamps.)
  • If all you need to prosper is training, the government should help you get it. (Free public schools, inexpensive community colleges, job training programs, student grants and loans, and so on.)
  • But if you just don’t want to work, the government shouldn’t help you at all. You need to learn to take responsibility for yourself.

A few people would argue with each of those positions, but not anywhere near a majority. Our substantive political arguments over poverty aren’t about what to do with the people in each category, but rather which category is typical and how well government programs target the people in the category they’re supposed to help.

So liberal rhetoric focuses on people in the first three categories, who are legitimately seeking the help that Americans are proud to provide for each other. (Some other countries may let good people starve in the streets, but that’s not who we are.) Conservative rhetoric focuses on people in the fourth category who nonetheless get benefits because they masquerade as people in one of the first three categories: They aren’t really disabled, they are getting by fine without assistance (and so blow their Food Stamps on luxuries), and they aren’t really looking for a job or training for one. They’re just soaking up government benefits because they can. If those benefits went away, they’d realize that they have to get off their butts and work.

Few seriously dispute that both kinds of people exist: those who need and deserve government help, and those who get it even though they shouldn’t. The argument is more about the number of people in each group and (more subtly) something I’ve called the mercy/severity balance: How many people who need and deserve your help are you willing to leave to fend for themselves in order to prevent one lazy guy from cashing his government check and laughing at you?

For example: The USDA estimates that about 1% of Food Stamps are illegally sold for cash, while 3% of benefits are overpayments to people who either don’t qualify or shouldn’t get as much as they got. Does this strike you as a huge scandal that brings the legitimacy of the whole program into question? Or do you focus instead on the good done for families who qualify for Food Stamps legitimately and use them as they were intended?

If somebody came up with an auditing program that would eliminate this waste and fraud, but would cost more (because paying auditors is expensive) than it saved, would you be for it or against it? What if it cost double what it saved? Ten times? A hundred?

From the conservative focus on the fourth category comes the conservative anti-poverty plan: If the way out of poverty is to take jobs that are readily available, and if the possibility of conning the government is keeping people from taking those jobs, then the way to reduce poverty is to cut government anti-poverty programs. Of course this means that some number of people who need and deserve help won’t get it, but that’s collateral damage.

Now you’re in a position to understand The War on Poverty: 50 Years Later, a report put out last Monday by the staff of the Republican majority of the House Budget Committee, i.e., Paul Ryan’s staff. In particular, you understand its conclusion:

Today, the poverty rate is stuck at 15 percent — the highest in a generation. And the trends are not encouraging. Federal programs are not only failing to address the problem. They are also in some significant respects making it worse.

The report looks extremely well supported — it has 683 footnotes, most of which reference reports by academic or government researchers (sometimes inaccurately). But looks are deceiving. For example, you’d think a statement like “Federal programs … are making it worse” would be footnoted to death. It’s not. Instead, what you’ll see if you go through the report’s review of nearly 100 government programs is a lot of “results were not demonstrated” (said about the Low Income Home Energy Assistance Program), “The program’s costs outweigh its benefits to society” (Job Corps), or “the program didn’t have any performance metrics or targets for the level of performance” (Emergency Food and Shelter Program). I quit after reviewing about half the programs, but I didn’t find a single “Poor people would be better off without this program” with a footnote to a study proving that point.

That’s typical. The ten-page Overview at the front of the report seems to float freely above the evidence collected in subsequent sections. A lot of the evidence presented in the overview is of the correlation-is-not-causation variety. For example:

The Brookings Institution’s Ron Haskins and Isabel Sawhill point out that if a person works full time, gets a high-school education, and waits until he or she is married to have children, the chances of being poor are just 2 percent.

Of course, a lot of things are probably already going right in your life if you’re able to finish high school, find a full-time job, and attract someone you’d want to marry. It’s not any great surprise that you’re not poor, and I’m not sure what there is to learn from that fact.

Only 2.7 percent of Americans above the age of 16 who worked full time year-round were in poverty, even in 2007 — before the Great Recession had taken firm hold.

Since recessions increase poverty by raising unemployment, it shouldn’t surprise anyone that people who kept their jobs didn’t get poorer. And what really should grab our attention is: There are people who work full time year-round and are still poor! Why isn’t that rate zero?

Another substantive claim of the overview is:

since the beginning of the War on Poverty … male labor-force participation has fallen dramatically. In 1965, it was approximately 80 percent. Today, it has fallen to a record low of below 70 percent. Since 2009 alone, male labor-force participation has fallen 3.3 percentage points. Among working-age men, the labor-force participation rate has fallen from 97 percent in 1965 to 88 percent in 2013. In recent years, female labor-force participation has also declined. Since it reached its record high of 60.3 percent in 2000, female labor-force participation has fallen to 56.9 percent — declining 2.5 percentage points since 2009. And among working-age women, the labor-force participation rate has fallen from 77 percent to 74 percent from 2000 to 2013.

But again, what’s the cause? The War on Poverty has coincided with a long-term reduction in male labor-force participation, but is there some reason to believe that’s anything more than a coincidence?

The implication is that anti-poverty programs encourage laziness, particularly in men. To conservatives, I’m sure this conjures up images of able-bodied 20-somethings choosing to hang around on street corners rather than look for work. (Liberals are more likely to picture multinational corporations shipping jobs overseas.) But if you get into the discussions of specific programs that the report claims discourage work, you get a different picture. For example:

[E]xpansions of the [earned income tax credit] are associated with a reduction in labor market participation by married mothers.

In other words, in poor households where both parents work, mothers are likely to cut back their hours (and spend more time at home with the kids) if the family is better able to get by on what the father makes. Another example:

SSI reduces the labor supply of likely SSI participants aged 62–64. A $100 increase in SSI benefits is associated with a 5 percent reduction in the employment rate.

In other words, people who are limping towards retirement with disabilities will retire sooner if they can afford to. That’s not exactly strapping young dudes hanging out on street corners, is it? (Maybe one of those young guys will get the job that the guy with the bad back retired from.)

So when you get down to details, the overview-level statements are less convincing than they sound. And that connects to a third point:

Congress has taken a haphazard approach to this problem; it has expanded programs and created new ones with little regard to how these changes fit into the larger effort. Rather than provide a roadmap out of poverty, Washington has created a complex web of programs that are often difficult to navigate.

Imagine what Republicans might say if Congress had taken a unified approach. Wait, we don’t have to imagine, because the Affordable Care Act was a unified program with a comprehensive vision of increasing access to health care. Republicans complained about its mammoth size and invented all kinds of scary stories about what might be hidden in that enormous law.

That’s why there are hundreds of anti-poverty programs. If liberals presented one unified program to help the poor, we’d hear about this incredible octopus that had its tentacles in everything and was thousands of pages long and had an unimaginable cost. Every story of someone abusing a poverty program would be an argument against the whole thing. So instead, we have $65 million going to a separate “Education for Homeless Children and Youth” program and $11.5 million for “Job Placement and Training” for American Indians. If you want to cut them, you have to explain why you don’t want to educate homeless children and youth, or find jobs for Indians.

This is the basic paradox of American politics: In the abstract, voters will tell you that government spends too much. But the vast majority of things that the government spends money on are popular. No one would be happier than liberals if we could create a unified vision of how to help the poor, and then fund that vision. But I doubt that’s what Ryan is talking about, and I fear that he wants to unify the anti-poverty hydra so that there is only one throat to cut.

In short, it would be wonderful if Ryan’s report represented an honest attempt to examine what works and what doesn’t, and to assemble a unified program to do what the broad consensus of Americans want done: support the unemployable, find a job for everybody who wants one, make sure that people who work full time stay out of poverty, train people who have the talent and desire to move up to skilled labor or the professions — and keep lazy people from abusing the system.

I wish I could believe it did.


An interesting side-debate raised in the footnotes of the report is whether the War on Poverty is failing at all. The report references “Winning the War: Poverty from the Great Society to the Great Recession” by Bruce Meyer of the University of Chicago and James Sullivan of Notre Dame. They propose measuring poverty by consumption rather than income, and they adjust for inflation differently. I can’t judge whether they’re right or not, but their numbers tell a different story than the official poverty rate does: their measure of poverty starts far higher 50 years ago and drops far lower today.


Another side-debate could happen over what success means. The report says:

The true measure of success is the number of people who get off these programs and get out of poverty.

And certainly everyone should agree that when a government program helps a poor person get a good job and join the middle class, that’s a success story. But limiting the suffering of the poor is a worthy goal in itself, and sometimes a government program succeeds simply by keeping things from getting worse.

For example: The United States has a terrible rate of what public-health professionals call “amenable mortality” — people who die of treatable conditions because they don’t get appropriate medical care. If the subsidies in ObamaCare bring that rate down, I’ll count that as a success.

Religious Liberty and Marriage Equality

Are the principles that protect religious liberty secure, or are recent court decisions steps on a slippery slope?


One of this week’s big stories was Arizona Governor Jan Brewer’s veto of S.B. 1062, “An Act … Relating to the Free Exercise of Religion”. Proponents claim that this law (and similar proposed laws around the country) is necessary to protect Christians from being forced to participate in same-sex marriage celebrations, in violation of their freedoms of conscience and religious liberty.

There’s one important thing you need to understand about this controversy: It’s symbolic. I went looking for cases where businesses were forced to deal with same-sex weddings and I found exactly five in the entire country.

  • In New Mexico, a photography business was successfully sued by a lesbian couple whose commitment ceremony (same-sex marriage being illegal in New Mexico) it refused to photograph. (I covered the ruling in a weekly summary last August.)
  • The Oregon Bureau of Labor and Industries ruled that a bakery had violated state law when it refused to make a wedding cake for another lesbian couple.
  • A judge in Colorado similarly ruled against a bakery.
  • A Vermont inn was sued for refusing to host a wedding reception for a same-sex couple; the owners claim the refusal was a misunderstanding. The case was settled out of court, so we don’t know what a judge would have said.
  • A suit is pending against a florist in Washington.

Some writers make it sound like these are representative examples out of many, but they may well be the only instances to date.

Last June, the Pew Research Center estimated that over 70,000 same-sex marriages had been performed in the United States, plus an uncounted number of civil unions and legally unrecognized commitment ceremonies like the one in New Mexico. In all but a handful of them, people seem to have worked out whatever differences they had. Wedding planners, photographers, bakers, dress-makers, tuxedo-rental places, florists, celebrants, meeting halls, church sanctuaries … either they approved or they swallowed their disapproval or the couples took the hint and looked for service-with-a-smile elsewhere. Or maybe they found compromises they could all live with. (“I’ll sell you the cake, but you’ll have to put the two brides on top yourself.”)

In short, S.B. 1062 does not address a practical issue. Across the country, people are behaving like adults and working things out without involving the government. Governor Brewer recognized as much in her veto statement:

Senate Bill 1062 does not address a specific or present concern related to religious liberty in Arizona. I have not heard one example in Arizona where a business owner’s religious liberty has been violated.

The uproar is also symbolic on the other side. Critics of S.B. 1062 warned about “gay Jim Crow” laws, but just as there is no flood of suits against fundamentalist Christian florists, neither are large numbers of businesses waiting for the state’s permission to display “No Gays Allowed” signs. As The Christian Post pointed out, Arizona (like many other states) has no state law protecting gays from discrimination. (New Mexico does, which is why the lesbian couple won their suit against the photographers.) So outside a few cities that have local anti-discrimination ordinances, Arizona businesses are already free to put out “No Gays Allowed” signs without S.B. 1062. If any have done so, nobody is making a big deal out of it.

What this all resembles more than anything is the argument over the constitutional amendment to ban flag-burning. Actual flag-burnings are so rare that most of the amendment’s backers couldn’t cite a particular case, but they felt very strongly about it all the same. The few cases that actually exist are merely chips in a poker game; they are symbols of some deeper philosophical conflict, but mean little in themselves.

That’s not to say that philosophical conflicts are unimportant, but they are also not urgent. Because major injustices against one side or the other are not happening every day — and depending on your definition of “major injustice” may not be happening at all — we can afford to take some time to think this through calmly: What principles of religious liberty should we be trying to protect, and are any of those principles implicated in the cases that have been decided?

In my view, one basic principle is: No one should be forced to participate in a religious ritual. That’s why I don’t want teachers leading prayers in public school classrooms, especially when the children are too young to make a meaningful choice about opting out. For the same reason, it would be wrong to sue a priest who refused to perform a Catholic marriage ritual for a marriage his church did not sanction.

Some supporters of laws like S.B. 1062 (and the pending H.B. 2481) are citing this principle, but I think we need to be careful not to stretch the definition of a religious ritual. For example, civil marriage is not a religious ritual, so neither an officiating judge nor the clerk who issues a license is participating in religion. (If they were, that would seriously violate the separation of church and state.) Requiring that they do their jobs is not a violation of their religious liberty. The fact that you don’t make the laws and may disagree with them is a normal hardship of working for the government, not a First Amendment issue.

Similarly, a wedding reception is not a religious ritual; it’s a party that happens to take place after a religious ritual. Baking the cake or DJing the music or manning the bar are not sacramental roles, and do not deserve that kind of protection.

A second principle is: No one should be compelled to make a statement against his or her conscience. This was used as a defense in the Colorado bakery case. Administrative Law Judge Robert Spencer rejected it like this:

There is no doubt that decorating a wedding cake involves considerable skill and artistry. However, the finished product does not necessarily qualify as “speech,” as would saluting a flag, marching in a parade, or displaying a motto. The undisputed evidence is that [the baker] categorically refused to prepare a cake for Complainants’ same-sex wedding before there was any discussion about what that cake would look like. [The baker] was not asked to apply any message or symbol to the cake, or to construct the cake in any fashion that could be reasonably understood as advocating same-sex marriage.

So if a wedding-reception singer refused to sing some special gay-rights anthem, I would support him under this principle. But if he refused to perform at all, or refused to perform more-or-less the same collection of songs he does for everyone else who hires him, then I wouldn’t. Leading the friends and families of a same-sex couple in “The Hokey Pokey” is not a religious or political statement that should challenge anyone’s conscience.

Weighing against these exceptions is a public-accommodation principle that got established during the Civil Rights movement: If a business serves the public, then it should serve the whole public. The point of Jim Crow laws wasn’t to protect the consciences of white business owners, it was to exclude black people from the general public. If excluding gay and lesbian couples from the general public is the purpose behind refusing to serve them, that shouldn’t be allowed.

People try to fudge this principle by creating me-or-him situations. I grew up reading Ann Landers’ advice column in the newspaper. Ann used to regularly get questions like: “My good friend says she can’t come to my wedding if my other good friend is going to be there. What should I do?” As best I remember, her answer was always something like: “Invite everyone who you want to see there. If your friend doesn’t want to come, that’s her decision.” The same idea works here: Everyone should be invited to the marketplace. If you feel that the presence of gays and lesbians in the marketplace means you can’t be there, that’s your decision. No one has forced you out. (This is my answer to the U.S. Council of Catholic Bishops, who claim “Catholic Charities of Boston was forced to shut down its adoption services.”)

The other frequently raised issue has to do with venues: Will the law force my church sanctuary to be available for same-sex marriages? The idea that a sanctified site will be used for some unholy purpose strikes many people very deeply.

The case that is always cited — often not very precisely — involves a Methodist group, the Ocean Grove Camp Meeting Association in New Jersey. The OGCMA owned a boardwalk pavilion, which the judge described as an “open-air wood-framed seating area along the boardwalk facing the Atlantic Ocean.” The Methodist group used the facility “primarily for religious programming”, but had received a tax exemption for the property the pavilion was on. One condition of the exemption was that the facility be open to the public. The OGCMA had a web page advertising “An Ocean Grove Wedding”, which cost $250 in rent. The OGCMA did not conduct or plan the weddings, and the page said nothing about Methodist doctrines concerning marriage.

Until the OGCMA turned down a lesbian couple that wanted to celebrate a civil union in 2007, no one could recall a wedding being refused for any reason other than scheduling. After the couple sued, OGCMA re-organized its use of the pavilion. It stopped advertising it to the public and sought a different kind of tax exemption available to it as a religious organization. The judge found:

[The OGCMA] can rearrange Pavilion operations, as it has done, to avoid this clash with the [New Jersey Law Against Discrimination]. It was not, however, free to promise equal access, to rent wedding space to heterosexual couples irrespective of their tradition, and then except these petitioners.

Recognizing that the couple mainly sought “the finding that they were wronged” and that the OGCMA had not “acted with ill motive”, the judge assessed no damages.

In other words, this example is not particularly scary when you know the details. The principle here is pretty simple: If you worry about the sanctity of your holy space, don’t rent it out to the public — which is good advice in general, irrespective of same-sex marriage. If you do rent it out, then we’re back to the public-accommodation principle.

In conclusion, I’m not seeing anything particularly alarming in the five cases (six, if you add the boardwalk pavilion case) that are motivating people to support S.B. 1062 or similar laws. Reasonable principles are prevailing, and I do not see a slippery slope.

So if you’re worried about your minister being forced to bless a same-sex wedding in your sanctuary or go to jail, don’t be. It’s not happening and nobody is advocating for it to happen. Nothing in the cases that have been decided leads in that direction.

Are You Sure You’re White?

Daniel Sharfstein tells the story of three families who crossed the color line, and their descendants who forgot.


One of Dave Chappelle’s most memorable bits is his portrayal of Clayton Bigsby, a blind white supremacist who doesn’t know he’s black. Bigsby writes racist books whose readers also think he’s white. He lives in a remote area with few neighbors, and only appears in public in his KKK hood. A few white supremacist friends know the truth, but they keep the secret because “He’s too important to the movement.”

Bigsby is an exaggerated version of Mr. Oreo, a character created as a thought experiment by philosopher Charles W. Mills of Northwestern. Mr. Oreo was born to parents who identified as black and he appears black himself, but he has always thought of himself and described himself as white. At some point he goes through a medical process that alters his features, hair, and skin color so that he becomes indistinguishable from whites. Is he white? Or is there an unalterable underlying reality to his blackness?

According to professors who have discussed Mr. Oreo in class, students almost unanimously judge Mr. Oreo to be black. As David Livingstone Smith explains in Less Than Human (his fascinating book on dehumanization, which devotes a lot of time to the belief that certain races are subhuman), our culture commonly believes that some personal traits are changeable (a weak man can go through a muscle-building process to become a strong man) while others, like race, are not.

We tend to think — perhaps in spite of ourselves — that black people constitute a natural kind, whereas weak people don’t. … We say a person has large muscles, but we say they are of a certain race. … A person can gain or lose muscle while remaining the same person, but we tend to think that if they were to change their race, it would amount to becoming an entirely different person.

Real life provides its own examples, some even more compelling than Mr. Oreo. In her 1949 autobiographical essay collection Killers of the Dream, Lillian Smith recalls Janie, a white-skinned little girl taken from a poor black family newly arrived in the colored part of town. (They “must have kidnapped her”, the local whites decided.) Janie was brought to live with the Smiths, and Lillian fell into a big-sister role.

It was easy for one more to fit into our ample household and Janie was soon at home there. She roomed with me, sat next to me at the table; I found Bible verses for her to say at breakfast; she wore my clothes, played with my dolls, and followed me around from morning to night.

But in a few weeks, word came from a distant colored orphanage: Janie only appeared to be white; she was “really” black and had to return to the black family who had adopted her. At first, Lillian could not see the sense in this, but eventually she yielded to superior adult wisdom.

I was overcome with guilt. For three weeks I had done things that white children are not supposed to do. And now I knew these things had been wrong.

In The Invisible Line: A Secret History of Race in America, Daniel J. Sharfstein tells a more elaborate and challenging story, one that “has been hiding in plain sight” for centuries. He describes it as a “hidden migration”:

African Americans began to migrate from black to white as soon as slaves arrived on the American shore. This centuries-long migration fundamentally challenges how Americans have understood and experienced race, yet it is a history that is largely forgotten.

Historians have long acknowledged the passing-for-white phenomenon, but considered it virtually untraceable. After all, anyone motivated to pass for white was even more motivated to hide the evidence. But the genealogy boom (empowered by easy access to records over the internet and the possibility of analyzing your DNA for information about your ancestors) has unleashed thousands of amateur investigators and turned up many new cases. Lots of Americans are not as white as they think they are, and some are starting to find out.

Sharfstein traces three families who crossed the color line at different points in American history.

The Gibsons. Prior to Bacon’s Rebellion of 1676, race was not nearly as significant in Virginia as it later became. White indentured servants had more in common with the black slaves than with their upper-class masters, and mixed-race children were not unusual. The law classed a child as belonging to the same race as its mother. Gibby and Hubbard Gibson were mixed-race children of a white mother, and so were free. They moved inland, cleared land, and intermarried with the other frontier property-owning families.

As racial standards tightened generation-by-generation, the Gibsons stayed just on the favored side of the color line, and just far enough away from the race-conscious coastal cities that few cared enough to make an issue of their darker-than-average skin. They moved to North Carolina, and then to the wild western side of South Carolina. By the time they reached Kentucky and Louisiana in the 1800s, no one remembered that the family’s race had ever been an issue.

Gibson boys became officers in the Confederate Army, and Yale-educated Senator Randall Gibson of Louisiana played a key role in the negotiations that resolved the contested 1876 presidential election by trading Southern electoral votes for President Hayes’ promise to end Reconstruction. Randall also was a major player in the founding of Tulane University, convincing Paul Tulane to revise his bequest from “serve young men in the City of New Orleans” to “serve young white men in the City of New Orleans”.

A later generation married into the Marshall Field family of Chicago. As curator of the Field Museum of Natural History, Henry Field commissioned a series of sculptures illustrating over a hundred separate “races” for the Hall of Races of Mankind that opened in 1933. He had no clue he was anything but 100% European.

If anyone out there has media connections, I think The Gibsons would make a fabulous miniseries.

The Walls. Stephen Wall was a North Carolina plantation owner who never married, but fathered several children with his female slaves. In the 1830s he appeared to be selling his children to a plantation in Alabama, but in fact this was a ruse. Instead, a family friend delivered the Wall children to a Quaker settlement in Indiana, where Stephen provided resources for them to be raised and educated.

One of those children, O.S.B. Wall, was instrumental in convincing the Ohio governor to field a black regiment in the Civil War. He recruited black soldiers across the state and became a captain, though he arrived at the front too late to see combat. After the war, Wall moved to Washington, D.C., where he became part of a budding freedman aristocracy and held several positions in the local political machine.

But D.C. became one of the first places to disenfranchise blacks after the war. When the city ran into financial difficulties in the Panic of 1873, the federal government took direct authority over local affairs, shunting local elected officials aside for decades. When Democrats (who at the time openly identified themselves as “the white man’s party”) came to power with Grover Cleveland in 1884, white supremacy followed.

Captain Wall married a light-skinned woman, and his children found that they were frequently mistaken for white. His son Stephen married a white woman, but continued to identify as the son of a prominent leader in the black community, for all the good it did him. He was repeatedly let go from his job in the government printing office without cause, only to be rehired later. The final straw came when his indistinguishable-from-white daughter was barred from the public school in his suburban neighborhood, and he lost a series of court cases to have her reinstated, despite being legally in the right. (By prevailing definitions, Isabel’s black ancestry was sufficiently diluted that she should have been considered white. But whatever the text said, the spirit of the law was to protect white families from “falling” into the black community due to the discovery of an unexpected dark ancestor, not to allow a Negro man to marry a white woman and launch his children into white society.)

The family moved, changed its name to Gates, and began passing for white. Two generations later, Thomas Murphy (a “white” Georgian with considerable prejudice against blacks) got a nasty shock from his genealogy research. “You can’t call me a racist because I is one of you,” he told his black co-workers at the Atlanta airport.

The Spencers. Freed slaves had a hard time finding a place for themselves. Slave-owners viewed freedom as a contagious notion, so they didn’t want the freedmen around, and no state wanted to advertise itself as a destination for other states’ former slaves. For many, the solution was to go someplace without a lot of neighbors.

George Freeman and Jordan Spencer (who might have been his son) were mixed-race freed slaves (of the white Spencer family) who settled in the hill country of eastern Kentucky in the early 1800s. They married sisters from a white family that passed through and left their daughters behind. When they ran into legal trouble from the local whites, Freeman stayed and hired a lawyer, but Spencer moved deeper into the wilderness. After he arrived in Johnson County, Kentucky, he didn’t exactly proclaim himself a white man; he just started acting like one. White men, for example, were required to muster with the local militia and drill, while black men were forbidden to have weapons. Spencer showed up for drills, and nobody took it on themselves to tell him he shouldn’t.

At the time, even the South Carolina Supreme Court was recognizing the extent to which race was socially constructed. In an 1835 case, Justice William Harper wrote:

The condition of the individual is not to be determined solely by the distinct and visible mixture of negro blood, but by reputation, by his reception into society, and his having commonly exercised the privileges of a white man. But his admission to these privileges, regulated by the public opinion of the community in which he lives, will very much depend on his own character and conduct; and it may be well and proper, that a man of worth, honesty, industry, and respectability, should have the rank of a white man, while a vagabond of the same degree of blood should be confined to the inferior caste.

The hill country was more focused on clans than on races, and over time the Spencers became just another clan, darker than most, but respectable in their way. Jordan’s children intermarried with other clans — some of whom were not too clear about their own ancestry — who then found it convenient to describe the Spencers as white, if they were forced to describe them at all.

Two generations later, slavery was gone and Jim Crow had begun. Suddenly, one provable drop of “black blood” might be all it took to find yourself on the wrong side of the color line. George Spencer had moved across the border to the hill country of western Virginia, where he was doing fine until a feud with a wealthier family led to rumors that the Spencers were “God damned negroes”. A slander trial ensued, with detectives going back to Kentucky to interview old people about where Jordan Spencer might have come from and whether anyone had ever suggested he might not be white. A jury found against the Spencers, but the Virginia Supreme Court threw the verdict out and the case was never retried. That was enough for the locals to go on treating the Spencers as white, maybe with an occasional wink or nod.

Summing up. We look back on American history and say that people (including our own ancestors) were “white” or “black” as if those words had some natural meaning that remained constant through time and space. But in fact, the lines between the races have fluctuated, and even the apparent rules have applied differently to one family than to another. Sometimes all you had to do to cross the color line was move somewhere new and let people make assumptions about you.

At all times in American history, being considered white has brought certain advantages, and in every generation there have been light-skinned people who didn’t see why they or their children shouldn’t have those advantages. Both sides of the racial divide have had reason to minimize this phenomenon. For whites, the fact that the color line was fluid and permeable undermined the whole concept of white superiority. For blacks, those who forsook their black heritage lent credence to the notion that African ancestry was something to be ashamed of. And those who crossed over had reason to hope no one would ever find out, including, perhaps, their own children.

But reclaiming the “hidden migration” has a role to play in ending racism and healing the racial divide. Not only is racial purity an unworthy goal, it is a myth. We have never had racial purity in America. We are a lot closer to being one big family than most of us ever suspected.


BTW, I thought I’d head off an obvious comment: I realize that this post’s title assumes the reader is white (or thinks s/he is). I ask the indulgence and forgiveness of the Sift’s non-white readers. No inclusive title I could think of brought the issue to a head quite so sharply.

What Should “Racism” Mean?

There’s a type of faux scandal that’s been happening … well, I haven’t exactly kept track, but it seems like there’s a new one every month or two. They all fit this pattern: President Obama does something that symbolically asserts his status as president, and the right-wing press gets outraged by how he’s “disrespecting” something-or-other related to the presidency.

So, for example, in January, 2010 this photo caused FoxNation.com to ask whether Obama was “disrespecting the Oval Office” by putting his feet up on the antique desk.

Of course, it didn’t take long to uncover similar photos of previous presidents, none of which had raised any particular outrage at the time. But everybody forgot again, and so we had an almost identical flap last September. “This just makes me furious,” one woman tweeted. “He was raised so badly.”

Or remember last May, when Marines held umbrellas over President Obama and visiting Turkish Prime Minister Recep Tayyip Erdogan. Horrors! He’s treating our revered warriors like servants! How dare he! It was front-page news.

Once again, it wasn’t too hard to find similar photos of previous presidents, like this one of the first President Bush, which wasn’t front-page news — or any kind of outrage at all.

Other such “scandals” involve the First Lady: Did you know that Michelle had the audacity to wear an expensive gown to a recent state dinner, like first ladies have been doing, well, forever? Compare to this 2005 WaPo column in which Laura Bush is said to look “regal” — and that’s a compliment. Until 2009, the First Lady was supposed to look regal. Remember Jackie Kennedy? But when Michelle dresses up, she’s Marie Antoinette.

The Obamas’ vacations are another issue, particularly how much taxpayers spend to protect them outside the White House. But of course when the Bush twins celebrated their 25th birthdays in Buenos Aires, nobody cared what it cost the Secret Service to keep them safe in an exotic locale. They were the president’s daughters, so of course we protected them.

The entire White House lifestyle is an issue: The Obamas are “living large” claimed National Review (and mentioned Marie Antoinette again). The Washington Post fact-checker investigated and concluded: “there appears to be no appreciable difference between Obama’s expenses and Bush’s.” If you read the NR article carefully — and most of the other articles raising this faux issue — you’ll realize they never said there was. It’s just that the Bushes living large never bothered anybody.

Town Hall criticized the extravagance of having Beyonce perform at the Obama White House. But when Frank Sinatra performed for the Reagans, nobody looked at it that way. Why would they?

Even the Obamas’ Christmas cards became an issue. This one, from 2011, disrespects the Christian holiday because it is secular and features the president’s dog:

But this one, from the Bushes in 2005, is fine.

I could go on and on. Whenever President Obama acts like the President of the United States, or the Obamas act like the First Family, it just looks wrong to a lot of people.

So here’s the $64,000 question: Is that racist?

It depends on what you think racist means. Conservatives will not only answer the question “No”, they’ll be insulted that you even raised it (and will probably launch into their canned everybody-who-disagrees-with-Obama-is-a-racist-to-you-people riff). That’s because conservatives have adopted a very restricted definition of racism: Racism is conscious hatred towards people of another race.

So, those white folks who didn’t even notice when Reagan’s or JFK’s feet were on the desk, but who see Obama’s and think “He was raised so badly” — are they also secretly thinking “Who does that uppity nigger think he is, acting like he’s a real president or something?” Maybe a few here or there, but mostly no. They aren’t consciously hating Obama because he’s black. But they can’t look at a black president the same way they looked at the 43 white presidents. Things just look different when Obama does them.

What do you call that?

I’m asking that question seriously, not rhetorically. I sympathize with people who want to reserve racism for Adolf Hitler ordering the Final Solution to the Jewish problem or George Wallace standing in the door to block black students from enrolling at the University of Alabama. The men who lynched Emmett Till or the grand jury that refused to indict them — those people were racists. I get that it doesn’t seem right to put them in the same category with the people who only just realized in 2009 that life in the White House is pretty sweet.

But all the same, lots of whites look at Obama and can’t think “president” without thinking “black president” — and they go on to judge his actions more harshly than those of white presidents. They go on to treat him with less respect than white presidents have always received — like interrupting the State of the Union to yell “You lie!” or questioning his birth certificate when there was never any reason to do so. (This satire, which applies the same standards to Ronald Reagan’s birth certificate, is hilarious precisely because it would never have been taken seriously.)

Congressmen saying it would be “a dream come true” to impeach the President (while admitting they have no evidence of an impeachable offense), or listening patiently while constituents publicly say the President “should be executed as an enemy combatant” — that would have been unthinkable during the 43 white administrations. But today it’s considered acceptable behavior.

If you don’t want to call it racism, fine. But it’s a real phenomenon; it needs a name. What do you call it?

I’ve narrowed my focus to President Obama, but really the phenomenon is much broader. For example, read Tim Wise’s “What if the Tea Party Were Black?” or just about anything about Trayvon Martin. If Michael Dunn had been a black man shooting up a car full of white boys, I doubt jurors would have bought his I-thought-I-saw-a-gun argument.

For a lot of whites who don’t harbor any conscious racial malice, things just look different when blacks do them. What do you call that?

Teasing out the different stances that might be called “racism” is at least half the value of Ian Haney Lopez’s recent book Dog Whistle Politics. Lopez notes that racism changes from one era to the next, and somebody changes it. “Racism is not disappearing,” he says, “it’s adapting.”

Lopez uses the word “racism” for most of the possible meanings, and differentiates with adjectives. Here are some of the ones he finds:

  • racism-as-hate. The most restrictive definition, and the most comforting for whites. “For the public at large, racism-as-hate provides self-protecting clarity: if racists are like those in the 1950s who screamed at black school children and burned crosses, then most everyone can safely conclude that they, at least, are not racists. … Since conservatives on the Supreme Court adopted a malice conception of racism in 1979, when using this approach the Court has rejected every claim of discrimination against nonwhites brought before it.”
  • structural or institutional racism. This is racial injustice that seems to be the fault of nobody in particular, because it’s embedded in the way society works. Vicious cycles (like poverty leading to dysfunctional behavior which leads back to poverty) may trace back to past sins like slavery or Jim Crow, but now they are self-replicating. “Structural racism is racism without racists. All that said, precisely because institutional racism implies a need to change society, it was rejected long ago by conservatives, including those on the Supreme Court who repudiated this understanding of racism in the early 1970s.”
  • implicit bias. This is the it-just-looks-different response I have been describing, or the kind that shows up in the Implicit Association Tests you can take online.
  • commonsense racism. “The social world through which we move reflects centuries of racism that extends right up to the present. But this is hard to grasp in its particulars. Instead, we see clearly only the results, and with the underlying causes hidden, we tend to accept the extant world as a testament to the implacable truth of racial stereotypes.” The commonsense racists “are not hate-filled bigots but decent folks who see racial injustice as a normal feature of society. … For many, it simply seems ‘true,’ an unquestioned matter of commonsense, that blacks prefer welfare to work, that undocumented immigrants breed crime, and that Islam spawns violence.”
  • strategic racism. New appeals to racial prejudice and new rationalizations for racial injustice don’t create themselves. When the old racial manipulations stop working, somebody figures out new ones. “Strategic racism refers to purposeful efforts to use racial animus as leverage to gain material wealth, political power, or heightened social standing. … [B]ecause strategic racism is strategic, it is not fundamentally about race. … [S]trategic racists act out of avarice rather than animus.”

Lopez retells a lot of American history to illustrate how when one avenue for racial injustice was blocked, another was usually found in short order. (His discussion of how in the Reconstruction Era convict leasing developed into a new form of forced black labor to replace slavery, and continued in that form well into the 20th century, was new and eye-opening to me.) He sees this not as blind evolution, but as clever people working out the new arrangements and constructing ways to rationalize them to the masses.

Lopez also describes the usual course of racial conversation these days: If you introduce any of the above ideas into a conversation, conservatives will interpret it as an explicit or veiled accusation of racism-as-hate; you are saying they are like the white supremacists who yelled obscenities at the black little girls trying to integrate public schools. They will experience this as an injustice, and then see themselves as the victims rather than the people whose suffering you were trying to point out.

Strategic racists have turned this into

the rhetorical punch, parry, and kick of dog whistle racial jujitsu. Here are the basic moves: (1) punch racism into the conversation through references to culture, behavior, and class; (2) parry claims of race-baiting by insisting that absent a direct reference to biology or the use of a racial epithet, there can be no racism; (3) kick up the racial attack by calling any critics the real racists for mentioning race and thereby “playing the race card.”

“Most racists,” Lopez recognizes, like the South African whites he met during the apartheid era, “are good people. This is not a book about bad people. It is about all of us.” Most whites — even the most conservative whites — are not haters. But so many on the Right have been trained in the recast-yourself-as-the-victim reflex that it has become hard to have any kind of discussion at all about the more subtle and pervasive forms of racism. And until we get to the bottom of that, our democracy will always be vulnerable to the manipulations of the strategic racists.

Sam We Am

We’ve seen this movie before, we know the lines, and we know what role we’re going to wish we had played.


Last week, All-American defensive tackle Michael Sam let the world know that whichever NFL team drafts him will have the first openly gay player in American major league sports.*

This week the sports world responded, and the discussion had a quality I didn’t expect: It was old. As ESPN said when they broke the story:

In 2014, “Gay Man to Enter Workforce” has the everyday-occurrence sound of a headline in The Onion.

The objections to Sam joining the NFL rehash the ones the public just rejected in the debate over ending Don’t-Ask-Don’t-Tell and letting gays serve openly in the military. If you look further back in history, those arguments are a rehash of what Truman heard when he let blacks into the military, or Branch Rickey heard when he brought Jackie Robinson to the Dodgers. (And they’re not that different from the arguments against letting women into businessmen’s clubs or blacks into white schools.)

By now, we’ve got this conversation’s number. It’s 42.

We’re told NFL teams will avoid drafting Sam so as not to screw up their “locker room culture”. In 2011 and in 1948, people worried about military “unit cohesion” and “morale”. It was code for: “We already have bigots, and they’ll be upset.”

That code doesn’t fool anybody anymore. If bigots cause a problem, it’s on them.

We’re told players will feel oogy, because, you know … showers. We’ve heard that before: about gays in the military, and about blacks, too, if you go back that far. It’s hard to reconstruct the argument now — I guess something about blacks was supposed to contaminate whites in some way — but in 1948 it was a big deal: Young white men from Jim Crow states couldn’t even use the same urinals as black men, so how could the Army expect them to shower together?

We’re told the NFL isn’t “ready” for gay players, as if baseball had been ready for Jackie Robinson or racing for Danica Patrick. Decades ago that seemed like a good point — maybe if we prepare for a few more years everything will go smoothly — but today it’s a fat pitch, a batting-practice lob. Ta-Nehisi Coates hit it over the fence like this:

The NFL has no moral right to be “ready” for a gay player, which is to say it has no right to discriminate against gay men at its leisure

In 2014 we know how this movie comes out, and we know the lines. That’s how people you never would have picked out as gay rights advocates are able to be so forceful and eloquent. Like Dale Hansen, the sports anchor at ABC’s Channel 8 in Dallas:

Since you know the lines, you get to pick your role. We can all be Atticus Finch this time, if we want to. Former NFL receiver Donte Stallworth had the strong-but-reasonable thing down pat when he wrote this for ThinkProgress on Friday:

Michael Sam will only be a distraction if his organization, head coach, and teammates let him become one because of their own biases and lack of leadership. … In my experience with Bill Belichick, the head coach of the New England Patriots, I feel he would handle this by not making it a big deal to begin with. Bill would walk in on day one, as he does every year, and tell his players that he expected them to treat everyone in this organization with respect and a professional attitude. Anything less in that organization is intolerable.

What about the other Super Bowl coach Stallworth played for?

John Harbaugh, the head coach of the Baltimore Ravens, … [would] tell players to handle their problems in the locker room as a family would. If they had something to say, they should discuss it with each other, man to man, brother to brother, as a family. Harbaugh would tell us that if there were any issues among the team that we should hash it out in the locker room or a team meeting. If that failed, he’d tell us to come see him in his office or to go see general manager Ozzie Newsome, who has an open door policy and is always there for players to have an honest talk. Those guys would help players figure out their options or other ways to address whatever problems they had.

Those are separate ways to handle it, but they’re both effective because they both address the fundamental point: that this isn’t something that should distract players from doing the job they’re being paid to do. When you have strong leadership from your head coach and other players in the locker room, that’s an easy message to send. When you don’t, it means your problems are much bigger than a gay football player.

Usually, people give inordinate credit to the fig-leaf arguments of the status quo, and it takes a long time to see through them. But because we’ve been through this before and recently, this story has moved really fast. In just a few days, the question has flipped from “Will Michael Sam be a problem?” to “Is your team professional enough for Michael Sam?” After all, Sam’s college teammates at Missouri could handle having a gay teammate. They went 12-2 and finished the year ranked #5 in the country.** If your NFL team can’t deal with the situation as well as a bunch of amateur college kids, what’s the matter with you?

Overnight, the “manly” reaction flipped from being homophobic to having the maturity to respect your teammates, even if they’re different from you.

Before his announcement, the consensus judgment on Michael Sam was: He won’t be a superstar in the NFL, but he can play. He can help a team win games. At some point in the middle rounds of the draft, he’ll be the best player on the board.

Sam didn’t change any of that by telling us he’s gay.

So when he’s at the top of the board, the onus won’t be on him, it will be on the general managers of the teams. What are you saying, GMs, if you let him go by and draft somebody less talented? You’re saying that you think your players (who you signed) are immature and unprofessional, and that your coaches (who you hired) don’t have what it takes to handle them. You’re saying that you care more about making your job easy than about winning.

When you reach that point, NFL general manager, I’ve only got two words for you: Man up.


* Jason Collins would have had that distinction if any NBA team had signed him this year. But he was a journeyman veteran whose career might have been over anyway.


** Missouri students deserve some credit too. When 14 members of the Westboro Baptist Church hate group came to campus to demonstrate against Sam, hundreds of students wearing “Stand with Sam” buttons and “We are all CoMo Sexuals” shirts formed a human wall. (Googling “como sexual” didn’t get me anything enlightening. I assume it means Missouri (MO) students together (co) to support people of all sexual preferences.)
