Shut up or die. It’s hard to think of a more frontal assault on the basic values of Western freedom than al-Qaida’s January slaughter of French journalists for publishing cartoons they disliked. I disagree with what you say, and I’ll defend to the death my right to make you stop saying it: the battle cry of neo-medievalism. And it worked. The New York Times, in reporting the Charlie Hebdo massacre, flinched from printing the cartoons. The London Telegraph showed the magazine’s cover but pixelated the image of Muhammad. All honor to the Washington Post and the New York Post for the courage to show, as the latter so often does, the naked truth.

Illustration by Arnold Roth

The Paris atrocity ought to make us rethink the harms we ourselves have been inflicting on the freedom to think our own thoughts and say and write them that is a prime glory of our Bill of Rights—and that its author, James Madison, shocked by Virginia’s jailing of Baptist preachers for publishing unorthodox religious views, entered politics to protect. Our First Amendment allows you to say whatever you like, except, a 1942 Supreme Court decision held, “the lewd and obscene, the profane, the libelous, and the insulting or ‘fighting’ words—those which by their very utterance inflict injury or tend to incite an immediate breach of the peace,” though subsequent decisions have narrowed those exceptions, extending protection to much profane and sexually explicit speech. A 1992 judgment further refined the “fighting words” exemption, ruling that the First Amendment forbids government from discriminating among the ideas that the fighting words convey—banning anti-Catholic insults, for example, while permitting slurs against anti-Catholics. In other words, government can’t bar what we would now call “hate speech”—speech that will cause “anger, alarm or resentment in others on the basis of race, color, creed, religion or gender.”

This expansive freedom prevails nowhere else on earth. European countries, and even Canada, have passed hate-speech laws that criminalize casual racial slurs or insults to someone’s sexual habits. An Oxford student spent a night in jail for opining to a policeman that his horse seemed gay. France, which has recently fined citizens for antigay tweets and criminalized calls for jihad as an incitement to violence—a measure that our First Amendment would allow only if the calls presented a “clear and present danger”—also (most improperly) forbids the denial of crimes against humanity, especially the Holocaust. The pope has weighed in as well, with the platitude that no one should insult anyone’s religion—or his mother.

I am as scandalized by Holocaust denial or race baiting as anyone else, but I think Madison right to say that the proper response is not criminalization but argumentation. In a remarkable foreshadowing of John Stuart Mill’s 1859 classic, On Liberty, Madison wrote in 1800 that it is to free speech and a free press, despite all their abuses, that “the world is indebted for all the triumphs which have been gained by reason and humanity, over error and oppression.” Only out of freewheeling discussion, the unbridled clash of opinion and assertion—including false, disagreeable, and unpopular opinions, Madison believed no less than Mill—can truth ultimately emerge. So it is troubling to see that the camel of repression has gotten its nose under the constitutional tent through laws that make employers liable for tolerating speech by some employees that allegedly creates a “hostile environment” for others. The Court ought to squelch such an affront to the First Amendment. And it is equally troubling that state and federal laws have created such a thing as a “hate crime.” All that should matter to the law is whether the perpetrator of a crime acted with criminal intent, not whether that intent rested on an outlandish opinion.

As John Stuart Mill observed in On Liberty, though, it is not law but “stigma which is really effective” in silencing “the profession of opinions which are under the ban of society.” So it ought to be with Holocaust denial or racial slurs. Yet when scorn stifles the free expression of opinion that is unorthodox or unfashionable, but over which reasonable men can differ—and that could prove incontestably true, as has happened often enough—trouble begins.

Let me give you an example from my own experience. Over the course of a year or two as the 1970s turned into the 1980s, I lost all my friends, for saying what I had recently come to believe. I was teaching at Columbia, and my friends were my English department colleagues, along with some of what used to be called the New York Intellectuals. But I was moving rightward politically, pushed by the reality I saw all around me in emphatically ungentrified Morningside Heights.

In those days, the War on Poverty was in full swing. Welfare, as a kind of reparation for racism, was a come-and-get-it proposition, and as newly destigmatized out-of-wedlock childbearing skyrocketed, one in seven New Yorkers went on the dole. Meanwhile, make-work, affirmative-action jobs on the city payroll mushroomed, along with taxes to fund them. Tax-subsidized housing projects loomed like menacing outposts of disorder over down-at-the-heels neighborhoods like mine and threatened to invade such bastions of hard-won, quasi-suburban middle-class respectability as Forest Hills, Queens. Though the era’s national emphasis on school desegregation had turned the focus of urban education from learning to racial equality, here in New York a bitter, racially charged 1968 teachers’ strike had pushed it more toward racial antagonism, as happened in Boston six years later, when court-ordered school busing for racial integration began. Also under the banner of racial equality, New York’s public colleges had opened their doors to all comers, ready or not; and, as graduation rates fell into the teens, standards fell still faster, so that few of the small band of graduates could get real jobs, and few real learners could get real learning.

The streets and parks grew squalid and menacing, as police turned a blind eye to so-called victimless crimes, from loitering, disturbing the peace, and public urination, to retail dope dealing and solicitation by prostitutes of every gender. Deinstitutionalized madmen panhandled with desperate aggression, when they weren’t too far gone merely to babble or hit. On my stretch of Broadway, able-bodied bums, seeing what easy touches my students were, swelled the beggarly ranks, with no interference from the police, wary of accusations of racism—and one bum killed one of my neighbors. We all lived in constant fear of violence, for crime became epidemic, nasty, and brutish.

So all my liberal nostrums had gotten a fair trial, and this was the result. If we do not learn from reasoning upon our observation and experience, what do we learn from?

Maybe the criminal isn’t a victim, I hazarded at one dinner party. Maybe he’s to blame for his actions, not “society.” Maybe the real victim is, well, the victim. Shocked silence, as if I had flatulated. “That’s racist,” one guest muttered to her plate, tacitly admitting the not-to-be-mentioned truth that criminals were disproportionately minority. Then conversation resumed on another topic, as if no noxious disturbance had occurred—certainly not one that polite society would acknowledge. In those days, every right-thinking person knew that crime had its “root causes” in poverty and racism, and to understand that was to excuse the criminal, who might even be a justified, if somewhat heavy-handed, rebel against oppression, for which we around the comfortably plentiful dinner table were ultimately responsible.

Later, I opined to another friend, a music professor, that rent control was an injustice to the landlord, confiscating what was rightfully his—and this in my friend’s rent-controlled apartment. “Do you want me to be homeless?” he spluttered incredulously. “Do you want to evict me from New York?” However tactless—one doesn’t speak about the Fifth Amendment takings clause in the house of the rent-controlled—I really wasn’t being personal. But alas, so ended another long and cherished friendship.

But however gauche, such opinions stood the test of experience. When I said these insensitive things, New York was dying, with 1 million well-educated and prosperous residents, along with two-thirds of its big corporate headquarters, streaming out of town. But the following two decades of activist policing that treated a crime as a crime regardless of the race of the criminal or victim, along with the realization that the victim of “victimless” crimes against public order was the city itself, turned Gotham back into the glittering metropolis to which so many flock today. And the steady erosion of rent controls helped fuel a gentrification boom, which culminated in a building boom that included fancy apartment towers for all the rich foreigners who felt safe having their families and investments here.

Later still, at Diana Trilling’s dinner table, I committed yet another of my irrepressible faux pas. Turning to Christopher Lehmann-Haupt, then the august daily book reviewer of the then-august New York Times, I asked, in all seriousness, “Don’t you think the whole effort of modernism—in architecture, in literature, in music, in painting—might have been a huge dead end, from which Western culture will painfully have to extricate itself?” Shocked silence again, though all these decades later, the question still seems inexhaustibly interesting to me. But again, conversation resumed as if I hadn’t spoken and wasn’t there. As soon enough I wasn’t, for the invitations stopped.

Thus I learned the truth of Mill’s argument that social stigma can be as powerful as law in silencing heterodox opinion, except for people rich enough to be “independent of the good will of other people.” Everyone else who utters “opinions which are under the ban of society . . . might as well be imprisoned as excluded from the means of earning their bread.” No more academic career for me (fortunately, it turned out).

What prompts me to tell such slight tales is that they mark an early stage of a trend that increasingly threatens American freedom—the closing of the universities to the free and critical examination of ideas. As I didn’t know then, universities have been centers of real inquiry only for brief brilliant moments in centuries of scholastic murk. While eighteenth-century Glasgow and Edinburgh were beacons of the European Enlightenment, for instance, the Oxford and Cambridge of that time harbored countless dry-as-dust pedants and hard-drinking timeservers for every Isaac Newton. And there were no more ardent Nazi supporters than German university faculties, intellectual dynamos in the nineteenth century, once they became Judenrein. So the closed-mindedness of today’s universities is nothing new.

But it is especially troubling, because it’s not just the elites who go to college in America anymore, by contrast with eighteenth-century Oxford’s mostly highborn students, who could ensconce themselves in their own hard-drinking and gambling clubs and snobbishly ignore their hard-drinking dons. In our own day, the remarks that Constitution signer William Livingston made about American colleges more than 250 years ago still hold true: the doctrines kids learn there “pass from the memory and understanding to the heart, and at length become a second nature.” When the students grow up, these doctrines shape the culture and the laws, “appear[ing] on the bench, at the bar, in the pulpit, and in the senate.” So the intellectual intolerance now so strong on the nation’s campuses, the hostility to Mill’s politically incorrect “opinions under the ban of society,” is pregnant with a threat to the freedoms of thought, speech, and press that are the foundations of American liberty, if the students bring this intolerance into adulthood.

The examples are so numerous that they become a blur, so it’s worth enumerating a few specifics, starting with the days when junior science instructors couldn’t get tenure without endorsing the theory that an asteroid impact caused a sun-blocking dust cloud that triggered the extinction of the dinosaurs. Denial would undermine the then–politically correct theory that atomic warfare would start a “nuclear winter” fatal to earthly life, save perhaps some worms and microbes—so we had better ban the bomb. Another impermissible scientific hypothesis, raised by Harvard president Larry Summers—that biological differences between men and women might account for the paucity of top female math and science professors—cost him his job, for gender-theory orthodoxy outlawed such still-unsettled questions. The refusal of college students so much as to listen to speakers whose viewpoint they think they dislike has become notorious, ever since Brown students shouted down a 2013 lecture by Ray Kelly, then New York’s police commissioner, and graduating classes from Azusa on the Pacific to Brandeis on the Atlantic, with Smith and Rutgers in between, refused last year to listen to Charles Murray, Ayaan Hirsi Ali, Christine Lagarde, and Condoleezza Rice.

College speech codes, outlawing whole lexicons of politically incorrect words and encyclopedias of heretical ideas, have become infamous, and courts, when asked, have struck them down, only to see them replaced with “trigger warnings”—cautions that Huckleberry Finn or The Merchant of Venice might cause distress to black or Jewish students, for example, who might therefore not want to read them. Oberlin has supplied teachers with a trigger-warning guide, advising them to consider not assigning works that could spark upset because of their “racism, classism, sexism, heterosexism, cissexism, ableism, and other issues of privilege and oppression,” such as Chinua Achebe’s novel Things Fall Apart, which could “trigger readers who have experienced racism, colonialism, religious persecution, violence, suicide and more.” What more? one wonders—and at the University of California at Santa Barbara, the answer is rape. And now we have the campus microaggression hysteria, outrage over instances of supposed—and certainly unintended—racism, sexism, and the like too microscopic to be discerned by any but the most exquisitely sensitive moralist, with a hair-trigger sense of grievance. (See “The Microaggression Farce,” Autumn 2014.)

If it sounds as though we are back in the days when ladies fainted at the mention of the legs of pianos, which had to wear skirts for decency, and when one couldn’t utter words “that would bring a blush into the cheek of the young person,” as Dickens jeered, we are. The Columbia Law School Coalition of Concerned Students of Color claimed that its members were “falling apart” over the failure of grand juries to indict cops for the deaths of Michael Brown in Ferguson, Missouri, and Eric Garner in Staten Island—they were so “traumatized,” in fact, “by the devaluation of Black and Brown lives,” that they were now inhibited “from sleeping at night.” After having so long borne “the burden of educating the broader community about issues that have wreaked havoc on our psyches and lives . . . with unfailing grace,” they now needed to demand “that the community care for us too,” by postponing their exams. The equally sensitive dean readily acceded—though high-profile defense lawyer Benjamin Brafman sharply noted: “If law students cannot function with difficult issues like these, maybe they should not try to become lawyers.” But for pure, three-hankie schmaltz over the Ferguson and Staten Island events, the Columbia students are no match for Harvard College dean Rakesh Khurana, who wrote “with great humility” of his “hav[ing] watched and listened in awe of our students, faculty, and staff who have come together to declare with passion, grace, and growing resolve that ‘Black Lives Matter’ and to call for justice, for ally-ship, and for hope.” Transcending sentimentality and reaching the pure empyrean of incoherence, the good dean concludes: “The diversity of our student body at Harvard College should be on the forefront of this paradigm shift.” As to thinking clearly, arguing vigorously, and writing incisively, what is that, compared with feelings?

Illustration by Arnold Roth

There are three grave problems here. First, you can’t learn much if you are unwilling to listen to ideas that challenge your self-righteous orthodoxy, nor can you even understand exactly what your orthodoxy means until you have had to think hard enough about it to defend it vigorously. All that “critical thinking” that college students were supposed to have learned before they arrived on campus and refined once they got there seems to involve nothing more than indoctrination in contempt for the politically incorrect ideas of the supposedly unenlightened. True, the humanities departments, where the race, class, and gender orthodoxy is central to the subject, are losing enrollment, but science, engineering, and business students also marinate in the all-pervasive atmosphere of such ideas, shaping their political and social assumptions, which become badges of enlightenment and superiority.

Second, the constant social pressure of having to monitor everything you say, lest some unguarded politically incorrect utterance lose you friends, dates, status, or even employment, makes for (pardon the fifties’ expression) boring conformists, apparatchiks afraid to think for themselves—quite the opposite of the sturdily independent, resourceful, thoughtful, plainspoken, and creative character that used to be the American ideal. Take the case of Smith College president Kathleen McCartney, who joined her students’ “shared fury,” she said, as “we raise our voices in protest” against the grand jury decisions in Ferguson and Staten Island. Trouble is, she raised her voice in the wrong slogan, declaring that “All lives matter,” when the approved chant was “Black lives matter.” How could she be so disgracefully discriminatory in her nondiscrimination? her scandalized undergraduates exploded. A modern college president may be the very definition of an apparatchik, but there is something humiliating to human nature in the cringingly self-abasing apology that McCartney fairly sobbed out, without even having to be carted off in a dunce cap to a reeducation camp, as if she were her own Maoist cultural-revolutionary commissar. What would it take to make characters like this pull the lever at Treblinka?

John Stuart Mill worried that in intellectual matters, “society has now fairly got the better of individuality.” He feared that “everyone now lives as under the eye of a hostile and dreaded censorship. . . . Thus the mind itself is bowed to the yoke: even in what people do for pleasure, conformity is the first thing thought of; they like in crowds; they exercise choice only among things commonly done; peculiarity of taste, eccentricity of conduct are shunned equally with crimes, until by dint of not following their own nature they have no nature to follow: their human capacities are withered and starved.” Such intellectual conformity, he argued, squashes “the qualities which are the distinctive endowments of a human being. The human faculties of perception, judgment, discriminative feeling, mental activity, and even moral preference are exercised only in making a choice.” In 1859, when the brilliantly irascible Thomas Carlyle and the bracingly judgmental John Ruskin were writing their great works, when the inimitable Charles Dickens was peopling the Victorian imagination with “Dickens characters” drawn from the eccentrics he saw all around him, this warning was, Lord Macaulay thought, like crying “Fire!” in Noah’s flood. But it was a prophetic warning. Look at President McCartney and ask yourself how many such bland, interchangeable items of mortality man the bureaucracies that now organize our society.

Third, this educational climate has already ushered in an era of what left-wing academic Herbert Marcuse praised as “liberating tolerance” in 1965. In his day, he claimed, America’s system of “civil rights and liberties” permitted “opposition and dissent”—as long as they didn’t lead to “violent subversion” of “established society,” in which the interests of the laborer, the consumer, and the intellectual must always yield to those of the boss, the producer, and the college administrator. It is, at bottom, a “repressive tolerance.” Real tolerance—“liberating tolerance,” wrote Marcuse (in Orwellian Newspeak)—“would mean intolerance against movements from the Right and toleration of movements from the Left. As to the scope of this tolerance and intolerance: . . . it would extend to the stage of action as well as of discussion and propaganda, of deed as well as of word.” (See “Illiberal Liberalism,” Spring 2001.) Well-funded Stasi-like groups such as UnKochMyCampus, reports the Wall Street Journal’s Kimberley Strassel, are already at work, seeking “trusted informants” among faculty and students to target and harass the few remaining conservative professors whose thought runs counter to “progressive values” and might “undermine environmental protection, worker’s rights, health care expansion, and quality education.”

Sure enough, the self-righteous oppression that calls itself tolerance has moved out from the universities into the larger culture. On Election Day 2008, California voters passed Proposition 8, outlawing same-sex marriage. In the pure spirit of liberating tolerance, the campaign against the proposition included boycotts of the businesses of big donors to the pro–Proposition 8 effort, and the boycotts only expanded once the proposition had passed, helped by online interactive maps showing who the donors were, where they lived, and where they worked. Some received death threats; others got envelopes of white powder in the mail (harmless, it proved, but scary). One, Mozilla CEO Brendan Eich, was forced out of his job once two married gay Silicon Valley executives took their business away from his company in disgust and activists laid siege to his board, which apologized for not firing him quickly enough. “Taking a public stand on Eich means painting a target on yourself,” one tech executive told a columnist. With his usual courage, blogger Andrew Sullivan summed up: “If this is the gay rights movement today—hounding our opponents with a fanaticism more like the religious right than anyone else—then count me out. If we are about intimidating the free speech of others, we are no better than the anti-gay bullies who came before us.” In 2010, a federal judge threw out the state’s gay-marriage ban.

Activists tarred Proposition 8 as antigay bigotry. It wasn’t. A person with no animus against homosexuals can reasonably believe that the only justification for the state to get involved in marriage—formerly a church concern—is that it has an interest in encouraging, by inheritance laws, the reproduction of society by the strong two-parent families that, mountains of research show, raise the happiest and most successful children. For someone who remembers the 1960s push to get the government out of the bedroom, the current urgency of homosexuals to drag the government back to bed seems bizarre. You don’t need Lois Lerner or Barack Obama to tell you that you really love your partner—and a limited-government conservative would insist that it is none of the government’s business. As for those who have a religious objection to homosexuality, it’s hard to see how the First Amendment’s protection of religious freedom can permit a Colorado court to order a devout baker to make wedding cakes for gay couples—or, given the First Amendment’s ban on a government establishment of religion, to order a nearby baker, in a case that is wending its way to court, to make a Bible-shaped cake with an antigay inscription that she believes to be loathsome bigotry. Indiana has just passed a law reinforcing the First Amendment rights of religious bakers, and the corporate Big Brothers are out in force to punish the state for it.

The larger point, of course, is that “liberating tolerance” can create a climate of opinion in which reasonable discussion of the merits of controversial topics such as gay marriage is impossible, in which Congress will make laws that courts will say pass First Amendment muster when they don’t, and in which unaccountable agencies of the administrative state will issue rules abhorrent to the First Amendment—a process that will begin with laws and regulations forbidding hate speech and that will end who-knows-where. If you believe in free speech, unfortunately you must sometimes hear sentiments that would bring a blush to the cheek of a young person. Since the civil rights and women’s rights revolutions have succeeded, even though the critical-race theorists and women’s studies professors don’t want to hear such job-killing news, there is no rationale whatever for hate-speech laws.

Finally, we should view with special alarm any attempt to muzzle political speech, such as outgoing Attorney General Eric Holder has been threatening. As Madison insisted, with all the earnestness of his character, “the right of freely examining public characters and public measures, and of free communication among the people thereon, . . . has ever been justly deemed, the only effectual guardian of every other right.” So recent rumblings from Democrats on the Federal Election Commission about regulating political commentary on the Internet are disgraceful (the more so because that body shouldn’t exist, since the Constitution allows only Congress to interfere with the states’ organizing of elections). And, of course, the IRS’s interference with Tea Party groups is flat-out tyranny, for all its bureaucratic banality.

Equally wrong are campaign-finance laws, which, happily, the Supreme Court’s Citizens United decision has begun to undo. In the American political system, based on man’s natural right to life, liberty, and property, money should talk. The core of Madison’s worry about the “tyranny of the majority” in Federalist 10 was that the unpropertied many might vote themselves the property of the rich few—whether by disproportionate taxation, abolition of debts, inflation to erode savings and investments, “an equal division of property, or . . . any other improper or wicked project”—which the Founders believed would be no less a tyranny than an absolute monarch’s expropriation of property. Madison argued that the clash of many competing interests in such a big republic as America would prevent such democratic tyranny from occurring; but he proved wrong. Though the original Constitution required direct taxes to be apportioned among the states by population, the Sixteenth Amendment, ratified in 1913, imposed a graduated—that is, unequal—income tax that the Socialist Labor Party had first called for in the 1880s. Once the original Constitution’s shield against wealth redistribution disappeared—and also in 1913, the Seventeenth Amendment substituted popular election of senators for their election by state legislatures, which the Framers had thought would choose protectors of wealth—the redistribution juggernaut inexorably gathered speed. (See “It’s Not Your Founding Fathers’ Republic Any More,” Summer 2014.) By 2010, the richest 40 percent of households were paying 106.2 percent of federal income taxes, with the top 5 percent paying 57 percent, while the bottom 40 percent of tax filers paid minus 9.1 percent, thanks to refundable low-income tax credits, food stamps, and Medicaid, along with Social Security and Medicare, which are also income-transfer programs, in which poorer recipients get back a much higher proportion of what they paid in than do richer households.

So for just over a century, American politics has been a contest between the Founding Fathers’ vision and the Socialist Labor Party’s, most recently expressed by President Obama’s false and supercilious taunt to entrepreneurs that “you didn’t build that.” The chief protection the propertied now have for the Founders’ belief in the inalienable right of Americans to own their own property, secure against the tyranny of the majority, is their ability to speak up to defend it and to support candidates for office who pledge to do so—with as much money as they like, the Founders thought. And this is a fundamental right, for it determines whether government gets to decide how much to take from each according to his ability and give to each according to his need, or whether individual citizens will make their own decisions on such matters, which is what self-government means.

If you need to offend by speaking up in defense of liberty, don’t be shy. It’s your most precious possession.
