It is a platitude that we cannot defend the humanities without slipping into platitudes. Why is that? Part of the answer involves the corrosive impact of contemporary intellectual fashion. We are besieged by a resurgence of positivist scientism—the transformation of science from a method to a metaphysic, promising precise answers to age-old ultimate questions. Yet while pop-neuroscientists, evolutionary psychologists, and other defenders of quantifiable certainty have beaten back postmodern philosophical critiques, the postmodern style of ironic detachment has flourished. The recoil from modernist high seriousness, epitomized by the turn from Abstract Expressionist painting to Pop Art, has persisted long after Andy Warhol displaced Jackson Pollock as the celebrity artist du jour. As a signifier of the dominant cultural tone, the furrowed brow has been largely eclipsed by the knowing smirk. The commitment to searching out deep truths has yielded to the celebration of playing with surfaces (in the arts) or solving problems (in the sciences). The merger of postmodern irony and positivist scientism has been underwritten by neoliberal capitalism—whose only standard of value is market utility.

This convergence of postmodern style, positivist epistemology, and neoliberal political economy has turned a whole class of words into the stuff of platitude. Old words that used to mean something—ideals, meaning, character, self, soul—have come to seem mere floating signifiers, counters in a game played by commencement speakers and college catalogs. Vague and variable as their meanings may have been, there was a time when the big words of the humanities still carried weight. They sustained yearnings and aspirations; they sanctioned the notion that the four-year transition from adolescence to adulthood might be a time of exploration and experiment.

This idea has not disappeared entirely, but the last time it flourished en masse was forty years or so ago, in the atmosphere pervaded by the antiwar counterculture. Indeed one could argue that the counterculture of the 1960s and early ’70s involved far more than the contemporary caricature of sex, drugs, and rock ’n’ roll. It was in part a creation of young people who wanted to take college education seriously, to treat it as more than mere job training. Beneath the slogans and excess, the counterculture contained a probing critique of the instrumentalist mentality that managed the Vietnam War—the mad perversion of pragmatism embodied in the American major’s words: “it became necessary to destroy the town in order to save it.” Writers like Albert Camus, Martin Buber, and Dietrich Bonhoeffer may have been more often cited than read by young people in the 1960s and ’70s, but those writers’ presence in countercultural discourse suggested the urgent question at its core: How can we live an ethical life amid the demands of illegitimate power?

One place to explore answers to that question was the liberal-arts curriculum. During the late 1960s, even at my conservative Southern university, humanities enrollments soared as students packed English, philosophy, and history courses—posing fundamental questions, resisting conventional answers. The old words still had meaning, and were being called to account. Literature provided a language for challenging “the insolence of office” that was epitomized in government lies—and for exposing the technocratic hubris embodied in Ahab’s boast: “All my means are sane; my motive and my object mad.” Nothing better captured the madness of the managerial rationality behind the Vietnam War and the nuclear-arms race; this was how we learned what we were up against. Many students, myself included, acted on the unarticulated assumption that reading, reflection, and introspection might provide the foundation of an independent self—skeptical of official pieties, capable of imagining more capacious ideas of patriotism and courage than the ones provided by the dominant culture—a self that could speak truth to power. That phrase was fresh to us then.

How times have changed. Nowadays “speak truth to power” has to be placed in inverted commas, to distance us from its earnestness. Among the educated professional classes, no one would be caught dead confusing intellectual inquiry with a quest for ultimate meaning, or with the effort to create an independent self. Indeed the very notion of authentic selfhood—a self determined to heed its own ethical and aesthetic imperatives, resistant to the claims of fashion, money, and popularity—has come to seem archaic. In an atmosphere dominated by postmodern irony, pop-neuroscience, and the technocratic ethos of neoliberalism, the self is little more than a series of manipulable appearances, fashioned and re-fashioned to meet the marketing needs of the moment. We have bid adieu to existential inwardness. The reduction of the mind to software and the brain to a computer, which originated among cognitive scientists and philosophers of mind, has been popularized by journalists into the stuff of dinner-party conversations. The computer analogy, if taken as seriously as its proponents wish, undermines the concept of subjectivity—the core of older versions of the self. So it should come as no surprise that, in many enlightened circles, the very notion of an inner life has come to seem passé.

One consequence of this seismic cultural shift is the train wreck of contemporary higher education. Nothing better exemplifies the catastrophe than President Barack Obama’s plan to publish the average incomes earned by graduates from various colleges, so parents and students can know which diplomas are worth the most in the marketplace, and choose accordingly. In higher education as in health care, market utility has become the sole criterion of worth. The monetary standard of value has reinforced the American distrust of intellect unharnessed to practical purposes: the result is an atmosphere toxic to the humanities. We need a defense of the humanities that takes these cultural developments into account; that claims more for the liberal arts than the promotion of “critical thinking” and “people skills”; that insists, without slipping into platitude, on the importance of the humanities for their own sake.


WILLIAM DERESIEWICZ, A FORMER member of Yale’s English Department, has written it. In Excellent Sheep, he presents a devastating critique of the idea that college education is simply about learning marketable skills; he also makes a compelling case for the humanities. He revives, in effect, the old words—the old quest for meaning, self, and soul. The problem is that he has attached his argument to a critique of elite higher education, even as he recognizes that the critique extends far beyond the Ivy League. He shrewdly dissects the cult of “meritocracy” on American campuses, diagnosing its elements of anti-intellectualism—the careerism, the conventionality, the managerial reduction of education to “problem-solving,” the embrace of money as the measure of all things. He acknowledges that these maladies could be found as easily at the University of Virginia or the University of Mississippi Honors Program as at Yale or Princeton, but he does not seem to recognize fully that together they constitute a plague pervading the entire society. Amid the obsession with marketable skills encouraged by neoliberal capitalism, all colleges aim to turn out excellent sheep; some are better equipped than others to do so. Some sheep are more excellent—by all the conventional criteria—than others.

Whether the students are actually satisfied to be sheep is another matter. Deresiewicz writes movingly of their anguish. No reader of his book can doubt that elite colleges are full of fearful, driven kids whose miseries include “eating disorders, cutting, substance abuse, addiction, depression...” Here are some voices from the meritocracy in training: “I only get two hours sleep per night.... I really really fear failure.... I am just a machine with no life at this place.... I am a robot just going page by page, doing the work.” It is like the mental Olympics, one student observes, but the contest never ends. Sometimes “the drug of praise” can temporarily numb the fear of failure. And sometimes it takes other drugs: “If I didn’t take Zoloft,” one former student told him, “I would hate myself.” Parents who understandably worry about their children’s mental health receive glib reassurances from administrators, who talk about how many students are depressed and how easy it is to phone the suicide hotline. The number of breakdowns is almost a point of pride, part of the price for high academic standards. A young woman of my acquaintance recalled the Old Campus at Yale (the freshman dorms) as a hive of conventional ambition; the buildings themselves seemed to buzz with ceaseless busyness. One thing is clear from Deresiewicz’s interviews: the “meritocratic” atmosphere is death to intellectual seekers, who feel they’ve been sold a bill of goods and often keep searching after they get out. Somehow the job at Goldman Sachs just doesn’t satisfy.

The trouble, for Deresiewicz, is that when you focus on problems at Ivy League universities you invite the hostility of reviewers, many of whom are associated with Ivy League universities themselves. A few might even be called excellent sheep—products of the self-styled meritocracy of recent decades. Perhaps the most egregious example is Nathan Heller’s review in the New Yorker. Heller asks “Are Elite Colleges Bad for the Soul?” and begins by describing the many forms of sleep deprivation endured by him and his classmates “early in this century” at an unspecified Ivy League university. All this makes clear that he will avoid the larger issues raised by the book and focus instead on an anecdotal defense of his own experience—a strategy followed by other reviewers as well. Deresiewicz has unintentionally invited this. So to do him justice, it’s important to emphasize that his argument stretches beyond the Ivy League, toward all of higher education in the contemporary United States—and beyond our borders to encompass the striving professional classes from Canada and the United Kingdom to China and India.

Still, there is a logic to focusing on the Ivy League; it is where the meritocratic myth flourishes in its purest form. The official atmosphere is pervaded by the unspoken rhetorical question: Aren’t we great? The relentless striving for badges of achievement is more flagrantly and broadly present on elite campuses than anywhere else. The Ivy League is where the members of the American ruling class (or at least a good chunk of it) learn that they have power and wealth because they deserve it. They are meritorious. Their credentials confirm it.

The catch is that the students have to keep acquiring more evidence of their excellence—beginning, after they graduate, with a job that pays at least $100,000 a year. You remain haunted, they say, by “the feeling of being a failure if you don’t continue to amass the blue chip names” and prodded by “the need to keep on doing the most prestigious possible thing.” Yet some still fear that they have missed something, some passionate pursuit of a success that can’t be measured by conventional criteria.

High-achieving children are the products of “high-achievement parenting,” another development of recent decades, performed by “parents who fill up their own brittle selves with their children’s accomplishments,” in the withering judgment of the psychotherapist Madeline Levine, whom Deresiewicz cites at length. His favorite example of an abusive parent is Amy Chua, whose Battle Hymn of the Tiger Mother celebrated her own authoritarian insistence on her children’s feverish striving. Once again he picks the most virulent form of the sickness he wants to diagnose.

However strict or permissive their upbringing, children destined for elite schools display a “self that forms in response to parental expectations,” an “affable, competent, adult-oriented personality.” Not all parents embrace the meritocratic agenda, but even if they resist it, their children are swept along by the broad upper-middle-class culture of achievement. Its darker dimensions include “junior careerism, directionless ambition, risk-aversion, Hobbesian competitiveness,” and “monumental cynicism.” There’s no there there. Education comes to be seen as “not far from game theory, an algorithm to be cracked in order to get to the next level.”

The preoccupation with process over purpose, means over ends, has long been a feature of the technocratic mind, which despite occasional countercultural protests (as in the 1960s) has dominated American universities since the late nineteenth century and now seems poised to render other forms of thinking invisible. The focus on mastering technique rather than grappling with substance means that too often higher education “does nothing to challenge students’ high school values, ideals, practices, and beliefs,” as Deresiewicz observes. How can it, if it has no vision of what an educated human being should be, as Allan Bloom complained nearly thirty years ago in The Closing of the American Mind? It is interesting how often Deresiewicz cites Bloom, the bogeyman of the politically correct left in the 1980s, who was nothing if not a passionate defender of the humanities. Resistance to technocratic imperatives cuts across conventional political boundaries.

In recent decades, au courant educational ideologues have put technocratic imperatives in a determinist idiom—the train has left the station, etc.—and have added a dose of management jargon. The most egregious management-speak is the near-universal use of a customer-service model for what universities do. As Deresiewicz observes, commercial values are the opposite of pedagogical ones. If you are interested in students’ long-term welfare, don’t give them what they want—don’t be afraid, he tells professors, to stand on your own authority, to assume you know something your students don’t, which they might profit by learning. The very fact that he has to make this obvious point suggests the parlous state we are in. The easy equation of students with consumers confirms Deresiewicz’s conclusion that the schools “finally don’t care about learning at all”—or about teaching. “Teaching is not an engineering problem. It isn’t a question of transferring a certain quantity of information from one brain to another,” he writes, implicitly challenging the current fashion of online education. On the contrary: “‘Educate’ means ‘lead forth.’ A teacher’s job is to lead forth the powers that lie asleep within her students. A teacher awakens; a teacher inspires.” Not every teacher can measure up to this exalted standard, but its presence can at least make us try. By comparison, when it comes to motivating teachers, the commercial model offers nothing.

The emptiness of management jargon, applied to traditional moral concepts, is nowhere more apparent than in the ubiquity of the word “leadership.” Once upon a time leadership was considered a duty, an accompaniment of privilege. Now, Deresiewicz writes, it’s little more than “an empty set of rituals known only to propitiate the gods.” Like so many other ideals of the meritocracy (“innovation,” “creativity,” “disruption”), indeed like the meritocrats themselves, “leadership” lacks content. And where content is absent, power pours in. We are left with Mark Edmundson’s witty summation, quoted by Deresiewicz: a leader is “someone who, in a very energetic, upbeat way, shares all the values of the people who are in charge.”

The people in charge make sure that their charges inhabit “an atmosphere of constant affirmation” characterized by “the relentless inculcation of prosocial behavior.” This is how elite colleges produce “team players”—but so do many other sorts of institutions, and so they have for many decades. The difference is that team players from Ivy League schools are more likely to end up as team captains.

To the question “What’s the point? What’s this team for, anyway?” the answers are as vacant as they have always been in management literature; only now they reflect the diminished expectations of our neoliberal moment. As Deresiewicz says, the dominant ethos is: “Forget about ideals and ideologies and big ideas, those scourges of the twentieth century. Just pick a problem and go to work on it. The notion is technocratic, and bespeaks the kind of technocratic education students get today.” Of course its inspiration is not the plodding gray technocracy of the mid-century corporation, but the hipness of the high-tech entrepreneur. Deresiewicz is rightly suspicious of the idea that this new social formation constitutes a “creative class.” As he writes: “The suspicion arises that the small-scale/techie/entrepreneurial model represents the expression not of a social philosophy...but of the desire for a certain kind of lifestyle”—autonomous, hip, and rich.

Still, not everyone, even among the elite, is seduced by this trendy vision. Deresiewicz has spoken to many young people who resist it. They are “ardent, curious, independent—looking to college for meaning, not skills; looking to the world for possibility, not security. What they told me, invariably, was that they felt abandoned by their institution.” But it is not just the elite universities that have abandoned them; it is our entire leadership class, beginning with the president himself. During the 2008 campaign, Obama gave stirring speeches in Austin, Texas, and Madison, Wisconsin, where he insisted on the importance of music and the arts in any educational program. For a presidential candidate to be saying these things seemed too good to be true—as in fact it was. Once in office, Obama embraced the neoliberal education agenda of marketization and privatization, epitomized by his reliably anti-intellectual secretary of education, Arne Duncan. Where are intellectual seekers supposed to find legitimation for their search?

In Deresiewicz’s book, for starters. He does not mince words: “An undergraduate experience devoted exclusively to career preparation is four years largely wasted. The purpose of college is to enable you to live more alertly, more responsibly, more freely: more fully.” The key to this process is “developing the habit of skepticism and the capacity to put it into practice. It means learning not to take things for granted, so you can reach your own conclusions.” So it comes down to an effort at self-culture, as Emerson would have said. And self-culture involves an inward turn: it is “through this act of introspection, of self-examination, of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul. And that is what it means to develop a self.” Deresiewicz, the son of Orthodox Jewish parents, is not himself religious. But he finds religious language—beginning with the marriage of self and soul—inescapable in describing the intellectual quest fostered by the liberal arts. “People go to monasteries to find out why they have come, and college ought to be the same,” he writes. It takes real courage to make such claims amid the market-driven discourse of contemporary higher education.


THE CONSEQUENCE OF THIS soul-making odyssey—or at least an early way station on a lifelong journey—is precisely the kind of self that resists the siren song of contemporary intellectual fashion, a self that is fortified against disappointments and failure. “A self is a separate space, a private space,” Deresiewicz writes, “a space of strength, security, autonomy, creativity, play.” This is a romantic modernist vision, thoroughly at odds with postmodern and neoliberal notions of selfhood. And like the romantic modernists of the 1960s, Deresiewicz sometimes slips into formulaic oppositions—such as the one he poses between the young and their parents, whom he falsely assumes to epitomize the constraints of conventional expectations. He is right, though, to recognize the difficulties involved in choosing an independent path—the puzzled looks, the people who wonder why you didn’t fulfill your promise.

But if you’ve taken the humanities seriously you can withstand the puzzled looks. As Deresiewicz writes, the liberal arts curriculum remains “the best training you can give yourself in how to talk and think”—“to reflect...for the sake of citizenship, for the sake of living well with others, above all, for the sake of building a self that is strong and creative and free.” You read literature, philosophy, and history because “you don’t build a self out of thin air, by gazing at your navel. You build it, in part, by encountering the ways that others have done so themselves.” And the wider and more varied the definition of the canon, the better—the more examples you have of alternative ways of thinking and being in the world. As Bloom wrote (and Deresiewicz quotes): “The most successful tyranny...is the one that removes the awareness of other possibilities.” It was as if the conservative curmudgeon had foreseen the techno-determinists of our own time, for whom the train has always left the station and (in Margaret Thatcher’s words) “there is no alternative” to the neoliberal system. The prerequisite for independence is the realization that there are indeed other possibilities than the ones handed down by conventional wisdom.

A sense of possibility, as Deresiewicz acknowledges, is a product of class privilege. And indeed the humanities have historically functioned as the playground of the rich, before they get down to the real work of running the world. (A friend of mine, a Yale professor, once said that part of ruling-class socialization was listening to a guy with a beard talk about Marx.) Yet the humanities need not be reduced to a mere luxury. Abundant testimony exists from teachers in night-school classes, even in prisons, that comparatively uneducated students can respond to great literature with passion and intelligence. That encounter can be life-changing. A student of mine at Rutgers, a Navy veteran, found that reading Heart of Darkness forced him to come to terms with his own dark experiences in the first Gulf War. Conrad led him to Melville and W. E. B. Du Bois, to exploring the mysteries of the divided self. It was a bumpy ride, but he came out of it more alert, more aware, and more fully engaged with the world.

So why shouldn’t everyone have a shot at this experience? Deresiewicz thinks everyone should. And he knows it’s more than a matter of affirmative action. In fact he recognizes what a hollow charade that policy has become—a legitimation of existing privilege. Quoting Walter Benn Michaels, he writes, “the (very few) poor people at Harvard...reassure the (very many) rich people at Harvard that you can’t just buy your way into Harvard.” Deresiewicz realizes that the only affirmative action worth the name is a policy that takes class as well as gender and ethnicity into account. But ultimately affirmative action can never be more than a Band-Aid on the carcinoma that afflicts higher education—the primacy of technocratic, monetary standards. We need to create a world, he writes, “where you don’t have to go to the Ivy League, or any private college, to get a first-rate education.” Of course it is already possible to do that at many fine state universities. But they are struggling to stay afloat amid the systematic impoverishment of the public sector that has lasted for decades and has only accelerated in the past few years. The most egregious among many recent examples is the assault on the University of Wisconsin by the Republican governor, Scott Walker. Since 1989, state spending on higher education in the United States has dropped by half—a fact few commentators mention as they bewail the rising cost of college. Of course tuition will rise under these circumstances: somebody has to pay. As Deresiewicz acknowledges, public higher education is suffering the same fate as K–12 education, not to mention public-health initiatives and other essential government services: they are all “starved of funds, then blamed for failing to deliver.” So it is clear that the problems of higher education involve far more than misplaced meritocratic mythology at Ivy League schools; they are part of a general moral and political crisis.

The question remains: What is to be done? Despite his focus on the Ivy League, Deresiewicz supplies valuable ammunition for embattled defenders of the humanities, who too often have been reduced to mumbling about corporate recruiters’ preference for English majors. It is time to go on the offensive, and he has done so in fine style. Arguing for the importance of the humanities is by no means a merely academic gesture. As the antiwar counterculture of the ’60s learned, the liberal-arts tradition has a radical edge; it is a prod to the moral imagination, a seed-bed of political possibilities. The first step toward challenging illegitimate power is the recognition that you can indeed take that step—that there are alternatives available to the future on offer. As a peace-activist colleague of mine in Missouri said, when students wondered where to begin challenging the enormity of the nuclear-arms race: “Well, you start where you’re at.”

Jackson Lears is the Board of Governors Professor of History at Rutgers University and editor in chief of the Raritan Quarterly Review. His books include No Place of Grace: Antimodernism and the Transformation of American Culture, 1880–1920 (1981), Something for Nothing: Luck in America (2003), and Rebirth of a Nation: The Making of Modern America, 1877–1920 (2009).
Published in the May 1, 2015 issue of Commonweal.