Kenneth L. Woodward, January 29, 2024

This essay is a Cover Story selection, a weekly feature highlighting the top picks from the editors of America Media.

In the marketplace that higher education has become, the nation’s largest universities will now offer courses in almost anything short of carjacking if their research shows there are students willing to sign up. In Illinois, for example, where selling pot became legal in 2020, the state’s universities recently agreed to offer courses in how to cultivate and market weed. At Ohio State, a student can major in welding. And the University of Oregon offers a sequence of courses on Sports Product Design.

Higher education has traveled a long way from John Henry Newman’s The Idea of a University. The life of the mind is now the life of the paycheck. Character formation is as obsolete as mandatory chapel. What we have in this country now is what I call “Education for Employment.” As Miguel Cardona, the current Secretary of Education, has put it, “Every student should have access to an education that aligns with industry demands and evolves to meet the demands of tomorrow’s global workforce.”

Vocational training is nothing new on the collegiate level. Universities have been awarding degrees in business and agriculture (now agribusiness) for more than a century. What is new is that vocational training, often of a narrowly tailored kind, has expanded to include much more of what is taught in undergraduate classrooms. Indeed, universities now see themselves as businesses—in marketing terms, a brand—and increasingly are organized on business models. The prevailing view of college curricula seems to be: If it doesn’t sell well, we won’t teach it. It’s a matter of supply and demand.

A transactional relationship

Unsurprisingly, that is the way many students and their parents see undergraduate education too. The basic model is purely transactional. For the price of tuition, room and board, students receive a piece of paper that essentially certifies where they spent the previous four to six years. The better the brand, the more valuable the piece of paper. But with few exceptions (like architecture) it doesn’t certify what, exactly, the graduates have learned.

Without that piece of paper, they are unlikely to be granted a job interview. But in that interview, they are also unlikely to be asked about what they learned, or for a copy of their grades. After a half-century of grade inflation, employers understand that an “A”—the grade most often given, according to one national study—is meaningless.

What we are witnessing is the commodification of education in the service of a society where what matters most in life is work. The commodification process begins long before a student steps onto a college campus—and among the economically fortunate, sometimes even before a child is born. In the nation’s suburbs, as any real estate agent will tell you, parents with means and ambition for their children usually choose where to live based primarily on the reputation of the school district. It is an investment in their children’s future, with acceptance at a good college the initial payoff.

In big cities like New York, where competition is keen for acceptance at private and the few high-quality public schools, children often sit for their first interview at age 3 at prestige preschools that promise entry into the kind of kindergartens that lead to acceptance into elite primary schools—and so on through the steeplechase that secondary education has become. The cost can easily run to $750,000 per child—with college yet to come.

Suburb or city, savvy students discover early on that high school is as much about resume-building as it is about mastering basic intellectual skills. Grades matter on those resumes, but not without a mix of activities like sports, theater, volunteerism, summer internships—anything that might make their resume stand out from the others. The novelist Norman Mailer had an apt phrase for this sort of personal promotion. He called it “Advertisements for Myself.”

No wonder, then, that many students have come to regard schoolwork itself as their “job.” Repeated studies show that, over a lifetime, workers with a four-year college degree earn nearly two-thirds more than those without one. And given the exorbitant cost of an undergraduate degree—sticker prices of $80,000 and more at the most selective colleges and universities—who can criticize the student or parent for regarding a good starting job as a return on investment?

And yet, a quarter century into the third millennium, the United States remains a nation of high school graduates. Only 37 percent of Americans age 25 and over hold a four-year college degree. Moreover, there are four million fewer American students in college today than there were 10 years ago. As a result, since the year 2000, the United States has dropped from second to 16th among developed nations in the proportion of 25- to 34-year-olds holding a bachelor’s degree.

There are several reasons for this slide, but one of them is clearly the expense. Since 1981, the average cost of a degree from a four-year college has risen 156 percent, adjusted for inflation. No other sector of the economy comes close to that level of increase. As a result, an increasing number of high school graduates and their parents are subjecting higher education to a different kind of cost/benefit analysis. One recent study found that more than half of Americans (56 percent) do not think that a college degree is worth the cost, while another study of students ages 14 to 18 found that 60 percent of them think education beyond high school is unnecessary.

All this suggests that the time is ripe for rethinking the assumption that the purpose of acquiring a college degree is to snag a job at graduation.

Education for Employment

For one thing, despite the wide array of specific job-related majors and minors that our huge public universities offer, barely one in four college graduates eventually works in the field defined by his or her college degree, according to research by the Federal Reserve Bank of New York. This shouldn’t surprise. In highly developed capitalist societies, the demands placed on the workforce are malleable and mercurial. But the main reason why undergraduates seldom end up working in the field of their major course of studies is much simpler: Most of them enter college with no idea of what sort of work they would like to do or are fit for, and most of them exit college the same way.

And then there is the confusion generated by grade inflation, an educational disease that is particularly virulent in the liberal arts, humanities and business courses for which there are fewer objective measures of achievement like those found in math and science. At Yale University, according to a 2023 study by an economist on its faculty, nearly 80 percent of grades awarded to undergraduates the previous academic year were A or A-. At Harvard University a year earlier, the percentage was the same. On elite campuses like these, the pressure to inflate grades stems in large part from students’ hopes of gaining admittance to equally elite graduate schools.

At many state universities, however, the function of grade inflation is to keep students moving through the system. The pressure on fastidious faculty members can be overwhelming: from aggrieved students who think the school owes them a respectable grade-point average; from parents who don’t want to deal with college dropouts; and from administrators who want to avoid any loss of fees and tuition. Grade inflation satisfies all these constituencies—except, of course, the people who hire these graduates only to discover that their abilities have been wildly over-advertised.

Basic to undergraduate education is the ability to read, write and understand complex sentences and paragraphs. And yet the complaint most often registered by job interviewers—and here the evidence is mostly published grievances rather than hard data—is that a great many college graduates can do none of these things. Spelling is left to spell check. For decades, even law schools have had to provide writing courses that begin with basic grammar. Now, with the emphasis on education as vocational training, it is very possible for students to graduate from college without ever having to write a lengthy paper or read an entire book.

The decline in reading and writing skills mirrors the wholesale abandonment of the study of the humanities and liberal arts—indeed, of what education itself meant from the ancient Greeks through the rise of the university in the Middle Ages to the creation of the American “multiversity” in the 1960s. Across the country, majors in English, history, foreign languages and the classics have declined so precipitously that some graduate schools have put a temporary hold on accepting new doctoral candidates: There aren’t enough job openings or interested students for them to teach.

To cite just one representative example: Miami University, a selective public university in Ohio serving 20,000 students, is considering the elimination of 18 undergraduate majors with fewer than 35 students each. They are mainly liberal arts staples like French and German, history, religion and the classics. By contrast, The New York Times reports that Miami last year had 600 students majoring in computer science, 1,200 in marketing and 1,400 in finance. That’s Education for Employment.

And yet, it is far from obvious that a degree in business, for example, is a better guarantee of future employment than, say, a degree in philosophy. On the contrary, according to figures from the Educational Testing Service, philosophy majors with only an undergraduate degree enjoy higher median mid-career earnings than business or chemistry majors—and they display a much higher trajectory of salary growth from entry to mid-career. Philosophy majors are also among the highest scorers on the LSAT and GMAT tests for admission to law and business schools.

Or consider history majors: Once a hugely popular major, history now accounts for less than half a percent of all bachelor’s degrees awarded in the United States. Yet, according to a 2020 study, history majors have a lower unemployment rate than business management, economics or communication majors. And their average salaries are only marginally lower than those of students who majored in those other fields.

But to cite such research is to challenge Education for Employment on its own terms. What we need is a different model for undergraduate education, one that better comports with the experience of learning itself. I call it “Education for Enjoyment.”

Education for Enjoyment

In “The End of the English Major,” a widely discussed New Yorker essay on the steep decline of the humanities in higher education, author Nathan Heller found that those college students who persist in acquiring a liberal arts education must constantly defend their choice to family and friends who ask, “What are you going to do with that?” As one college junior complained to Heller, “It’s hard for students like me, who are pursuing an English major, to find joy in what we are doing.”

Joy! That is what gets lost when the purpose of higher education is employment. And yet without it, there is no real education. That is because human beings are hardwired to learn—and to delight in doing so. Consider the evident pleasure we see in infants as they begin to notice their surroundings and then learn to name the things they see and touch. Through the gift of language, children internalize the world around them and in that process come to a rudimentary knowledge of themselves as well. We were all such Adams once.

Growing up, much of what we learn—and most of what we remember—occurs informally through parents and others to whom we are emotionally attached. I’m thinking here of tricks of craft or secrets of the kitchen absorbed from doting grandparents as well as parents. These are actually early forms of apprenticeship under the tutelage of willing mentors—two essential ingredients of formal education as well.

Classroom learning is something else. It isn’t always pleasurable: There are courses we must pass, whether they interest us or not, basic skills like reading, writing and math to be mastered, painful or not—and not all teachers are mentors to their students. Joy in learning does not mean that we should enjoy every class we take. No one does. On the contrary, Education for Enjoyment depends in part on required curricula because that is the only way teenage students can discover which subjects engage their interest and which do not, which intellectual disciplines they are good at and which they are not—in short, who they are and are not.

Education for Employment fetishizes the college diploma as a job credential. Education for Enjoyment takes a more realistic approach. It recognizes that not everyone is emotionally or intellectually suited for college-level studies, at least not at age 18. (I know I wasn’t.) Millions of Americans prefer to learn by doing and enjoy doing so. Education for Enjoyment also recognizes that much of the vocational training that undergraduates elect to pursue on college campuses could be learned better, faster and much cheaper through apprenticeship programs in the military and other on-the-job settings. Journalism, for example, is a humble craft best learned by doing, and its practice thrived long before the invention of journalism schools. Abraham Lincoln, it is worth remembering, learned the law by observing how an older lawyer practiced it.

Fortunately, cracks are beginning to appear in what President Biden has called the work world’s “paper ceiling”: the use of a college degree as a filter for judging job applicants. As a 2017 study led by researchers at Harvard Business School documented, millions of job postings listed a college degree as a requirement for positions that were currently held by workers who, in fact, lacked that paper credential themselves—proof that the jobs did not require a college degree.

What they did require were certain skills that only experience can provide. Thus, according to a recent report from the Brookings Institution, “a shift to skills-based hiring instead of degree-based hiring would unlock economic mobility for millions of workers who have been overlooked for decades because they lack college certification.”

So how might the idea of enjoyment reconfigure education for the fortunate minority of American youths who are able to attend college away from home for four or more years and enjoy what university public relations departments promote as the full “college experience”? That is what most adolescents imagine when they think about “going to college.”

Education for vocation

On the undergraduate level, I would argue, education is fundamentally the pursuit of answers to three questions: What in life is worth doing? What, among the many options, would I like to do? What, after experimentation, have I found I’m good at? (In religious terms, what talents has God put at my disposal?) These are the unknowns that education at this level seeks to make known to and operative for students, regardless of their field of study. In this view, undergraduate education is essentially a pedagogy for discerning one’s calling in life.

A calling is not a subject you can major in. Nor is it an experience you can dial up at will. It is at once more personal and more encompassing. Reflecting its biblical origins (think of Abraham and the prophets), a calling is a way of life we feel summoned to. It is the conviction that there is a particular path for you in life fitted to your talents, one that moves you passionately, arouses moral commitment and answers the human need for personal fulfillment.

There is immense joy in recognizing one’s calling. There is also real sadness if, later in life, you discover that the path you took is not the one you would have chosen, had you given the matter greater attention. Callings are seldom clear without a few years of trial and error. In my own case, after graduation, I attended law and graduate school before realizing that what I could and should do is be a writer. But that fire was already kindled by mentors in my undergraduate years. It is a fire any dedicated college teacher can ignite.

This is why the decline in the study of the liberal arts in general and humanities in particular should be of grave concern to anyone who cares about the future of American higher education. One of the great strengths of the liberal arts is precisely that they are not designed to fit a student to a job, like a nut to a bolt. The habits of mind and heart they develop—the ability to understand complex texts, to write and speak in complex sentences, to analyze moral issues and make informed judgments, to understand, appreciate and perhaps create works of the imagination—are not merely useful. They emancipate. They are skills that make human beings human, and in exercising them there is great enjoyment.

Conversation as a way of life

But the importance of the humanities extends far beyond the sharpening of skills. All learning is a form of conversation. Economists and scientists as well as engineers and artists learn by studying precedents. Marcel Duchamp studied the old masters before painting his “Nude Descending a Staircase,” and in his great modernist novel, Ulysses, James Joyce redeployed Homer’s Odyssey. There can be no innovation without tradition. Ignorance of the past, as T. S. Eliot knew, breeds provincials of the present.

In the humanities, the past is forever present. Just as we are all born into a language system not of our own choosing, but without which we could not even think, so through the study of the humanities—literature and history, theology and philosophy, and the main ideas in social sciences—we are born again into an ongoing conversation with the traditions on which cultures and civilizations depend. Understood this way, conversation is not merely the ability to talk well (though a really good conversation, like a really good wine, needs no justification); it is a way of life that enlarges the meaning and enjoyment of our own.

Of course, not everyone who goes to college is drawn to this way of life. But given the current emphasis on Education for Employment, it is now very possible—indeed likely—that most students graduate from college unaware that the kind of conversations I’ve described even exist.

For readers who are also unaware, here is a story that illustrates what a life of conversation can come to mean.

During a journalistic assignment to the former Soviet Union, I met an impressive young poet, Irina Rashnikova, a Christian convert who had been imprisoned for 18 months because she was outspoken about her faith. In prison, she had no one of her breeding to converse with and was allowed no books. No matter. In her memory were the great Russian writers she had already read, and they provided an ongoing conversation. Inspired by these interior dialogues, Irina produced a series of poems of her own, writing on the walls of her cell with bars of soap because she wasn’t permitted pencils or paper. After committing them to memory, Irina washed the lines away before the guards could discover her forbidden activity. Smuggled out of prison on cigarette paper, the poems eventually became a celebrated book.

Irina’s story can also serve as a parable of another—and inevitable—kind of confinement. Those of us who have survived into healthy old age eventually feel the circle of life contracting. Siblings, friends, classmates and former colleagues move on or away or die, foreclosing cherished conversations. Children and grandchildren are beloved but intermittent presences. But those of us who have enjoyed a life with Dante and Dickens, Hopkins and Eliot, Conrad and Cather—or any of the others whose works sit spine erect on my shelves—are never really alone. Like God in prayer, they are always there for conversation.
