Over on the Scholar’s Stage forum, one member asks why the number of American university students choosing history as their four-year degree has been declining since the 1970s. He offers three hypotheses:
1) College students themselves valuing degrees with more defined career paths. As Schmidt notes, this is more complex than “I want a lot of money after graduation.” Psychology and “Arts Management” don’t hold a candle to Chemical Engineering in terms of money, but interest in them is increasing, probably because [they] have more defined career paths than history.
2) A variation of Bryan Caplan’s “signaling” hypothesis. This thesis argues that the value of education is in proving that “Jim is intelligent and conscientious,” not that “Jim acquired XYZ skills/knowledge.” In the supposed past, all college degrees had a strong signal of intelligence and conscientiousness, because fewer people went to college. In the present, history has less signaling value than more rigorous subjects that weed out students. I don’t think history SHOULD purposely weed out students…not only is it a cruel practice, but at this point it really can’t afford to lose more students! But the point stands.
3) The ideological polarization of the college-going populace. This is a more complex point than you might think. The rise of X Studies degrees means that the ardent Lefties who would have entered history are now entering X studies. The History professoriate tilting hard to the Left, and regularly engaging in Left-wing polemics on social media, probably alienates conservatives [who] would have chosen history. I think this is a much smaller consideration than 1) and 2), but it’s probably not nothing.1
“M.,” “The Decline and Fall of the Discipline of History?,” Scholar’s Stage: Forum (29 June 2022).
All three of these hypotheses have explanatory power. Let us cover each in turn.
For the first hypothesis, there is a great deal of evidence that college students in the 2010s saw their college experience as a form of professional, not personal, development. Earlier generations tended to conceive of the university as a place for intellectual growth and self-discovery; in the aughts that changed, with students now putting top priority on raising their socioeconomic status2. This is partly a function of rising college costs, which make idealistic visions of the university as intellectual playground a financially ruinous delusion. It is partially a function of rising inequality, which makes the divide between graduates and drop-outs stark. It also reflects the broader student base that now attends university. The wealthy students of days past did not need their university experience to change their material circumstances. Their families could provide anything the university failed to instill in them.
See, for example, data presented in Catherine Rampell, “Why Do Americans Go To College? First and Foremost, Because They Want Better Jobs,” Washington Post (17 February 2015).
History, a major only tenuously attached to any legible career path, was always going to lose out in the face of these trends.
There are some things that history departments might do to blunt these forces. One of the challenges that history departments face is that even if their students are as intelligent or as driven as those students in business, biology, or psychology, the achievements of the history major are not legible in the world of commerce. I made this point in a small memo I wrote to the history professors of my university a decade ago, when there was talk of closing the department down for lack of students. I suggested that history majors be required to take several courses outside the major. This requirement would entail four or so courses in a practical skillset: GIS mapping, documentary filmmaking, data visualization, or something of this sort. Higher level history classes would require not only term papers but also historical projects that used these skills—say, a Google Maps skin that traced Hannibal’s journey across the Alps, or a small documentary on a historical figure they discovered in the archives. This sort of project would require research skills similar to those of a term paper, but the end result would be legible to people who had never taken a history class beyond high school. The end goal would be for every student in the major to have a small portfolio of historical projects that could be whipped out at the beginning of any interview.
This sort of approach wouldn’t reverse the drastic decline in history majors, but it might help stanch the flow and encourage some students who might otherwise be afraid to declare the major.
In my mind the second hypothesis is the most important. One rough way to test this hypothesis would be to look at whether the decline in history majors across the United States is also seen at elite institutions. If 1) degrees are primarily a social signal, 2) the value of that signal is declining as a larger percentage of the population goes to college, and 3) employers have thus shifted to selecting on majors to separate wheat from chaff, it follows that students attending selective Ivy League institutions will feel less pressure to choose majors that signal their intelligence and discipline. University attendance may no longer be a mark of exclusivity; Harvard attendance certainly still is.
I turned to the U.S. News and World Report for data on this point. Behold: approximately one in ten Harvard undergraduates major in history. Between 6% and 7% of the undergraduates at Yale and Princeton do the same—approximately the same percentage as the national rates in the 1970s.3
For the national data, see Benjamin Schmidt, “The History BA since the Great Recession,” Perspectives on History (26 November 2018). For the school data, see the U.S. News and World Report pages for Harvard, Yale, and Princeton.
I appreciate the non-ideological framing of the third hypothesis. It strikes me as obviously true, but I see a further wrinkle in the story. This wrinkle might be significant enough to qualify as a fourth hypothesis instead of a mere elaboration of the third. It goes something like this:
In the 1960s, when history and English majors were among the most popular on campus, America was a very different place. This was an America where most kids memorized reams of poetry in school, where one third of the country turned on their televisions to watch a live broadcast of Richard III, and where listening to speeches on American history was a standard Independence Day activity. The most prominent public intellectuals of this America were people like Lionel Trilling (literary critic), Reinhold Niebuhr (theologian), and Richard Hofstadter (historian). This was a world where the humanities mattered. So did humanities professors. They mattered in part, as traditionalists like to point out, because these professors were seen as the custodians of a cultural tradition to which most American intellectuals believed they were heirs. But they mattered for a more important reason—the reason intellectuals would care about that birthright in the first place.
Americans once believed, earnestly believed, that by studying the words of Milton and Dante, or by examining the history of republican Rome or 16th century England, one could learn important, even eternal, truths about human nature and human polities. Art, literature, and history were a privileged source of insight into human affairs. In consequence, those well versed in history and the other humanistic disciplines had immense authority in the public eye. The man of vaulting ambition studied the humanities.
It is hard to pinpoint the exact moment when this changed. As discussed in an earlier piece at the Stage, the last poet whose opinion anybody cared about was probably Allen Ginsberg. The last novelist to make waves outside of literary circles was probably Tom Wolfe—and he made his name through nonfiction writing (something similar could be said for several other prominent essayists turned novelists of his generation, like James Baldwin and Joan Didion). Harold Bloom was the last literary critic known outside of his own field; Allan Bloom, the last with the power to cause national controversy. Lin-Manuel Miranda is the lone playwright to achieve celebrity in several decades.
The historians have done a bit better. While nobody alive today has the following or authority that an Arthur Schlesinger Jr. or Richard Hofstadter had in the ‘50s, some historians have gotten close to the halls of power. The ideas of John Lewis Gaddis and Bernard Lewis had immense sway in the Bush II administration. Historians like Adam Tooze and Timothy Snyder, experts in authoritarian menace, have a great deal of cachet. But it is remarkable how small a role historians play in many of our public debates. Despite the amount of attention lavished on the problem of a rising China, we witness none of the sharp historical controversy over revolutions and terrors past that historians waged during the Cold War. The most ferocious historical debate of recent times was unleashed not by a historian but by a journalist.
If the humanistic disciplines no longer have public cachet, then who now speaks to the public?
One of the most important intellectual developments of the 21st century—something that I hope to highlight in my own book on this era—is the growing prestige and public authority of two modes of thought. I struggle to label the first group with one word—I waffle between “the modelers,” “the data heads,” and “the social scientists.” This group comprises psychologists, cognitive scientists, computer scientists, data scientists, statisticians, economists, quantitative sociologists, geographers, anybody who uses the word “computational” in front of their job description, and anybody whose main method of public engagement is a dynamic data visualization. Data is the watchword of these folks, empiricism their vocation, science their title.
In many ways these intellectuals embodied the zeitgeist of the Obama era. Under this banner we find people as different as Steven Pinker and Nate Silver, Hans Rosling and Richard Dawkins, or Alison Gopnik and Cass Sunstein. These are the sort of people who contribute to Edge. It is their sort of writing that the early Vox pretended to. At its best, the rise of this intellectual style led to the creation of entire new fields of intellectual inquiry (such as the marriage of evolutionary anthropology, cross-cultural psychology, and quantitative sociology now called “cultural evolution”). At its worst, it led to your great aunt’s favorite TED Talk.
The authority of the modelers has fallen from its Obama-era zenith, but their way of thinking still holds powerful sway in the public mind (and has something close to monopoly status in Silicon Valley). Rising to meet the modelers are the intersectionalists. The two approaches have more in common than is initially apparent. Both modes idealize the counterintuitive insight. Practitioners of each believe that the masses are content to live in a world of surface realities; both groups secure their public standing by tearing away the truisms of everyday life to expose the truer workings of the world. Both scientist and intersectionalist are keepers of the secrets, blessed with analytic tools that allow them to see the patterns and deconstruct the processes that guide our lives.
The two differ most strikingly in their relationship to virtue. The authority of the modelers lies in their claims to objectivity; they seek truths that stand outside faction. The intersectionalists claim that this is impossible; indeed, the strength of their methods rests explicitly on the moral force of their injunctions.
Neither tribe is entirely congenial to the traditional humanists. The quarrel between the “two cultures” of the humanities and the sciences is well known. Less commonly understood is the struggle between the humanists and the intersectional theorists. The literary critic Mark Edmundson laid out the basic conflict in an essay for Harper’s a decade ago:
I suspect too that some of poetry’s reticence about speaking in large terms, swinging for the fence, owes to what one might call a theory-induced anxiety. In the modern-day university, the literary theorists are down the hall from the poets. What cultural theory seems to have taught the younger generation of poets is that one must not leap over the bounds of one’s own race and gender and class. Those differences are real and to be respected; the poets hear it time and again, if only as an echo from the nearby lecture hall: He who would write poetry that does not respect the politics of identity is impure, an opportunist, not to be trusted. Now, using Lowell’s “our” or Whitman’s “we” can register as a transgression against taste and morals. How dare a white female poet say “we” and so presume to speak for her black and brown contemporaries? How dare a white male poet speak for anyone but himself? And even then, given the crimes and misdemeanors his sort have visited, how can he raise his voice above a self-subverting whisper?
Poets now would quail before the injunction to justify God’s ways to man, or even man’s to God. No one would attempt an Essay on Humanity. No one would publicly say what Shelley did: that the reason he wrote his books was to change the world.
But poets should wise up. They should see the limits emanating from the theoretical critics down the hall in the English department as what they are. Those strictures are not high-minded moral edicts but something a little closer to home. They are installments in the war of philosophy against poetry, the one Hass so delicately evokes. The theorists — the philosophers — want the high ground. They want their rational discourses to hold the cliffs, and they want to quiet the poets’ more emotional, more inspired interjections. They love to talk about race and class and gender with ultimate authority, and of course they do not wish to share their right with others.4
Mark Edmundson, “Poetry Slam,” Harper’s (July 2013).
In “the ancient quarrel between philosophy and poetry” the intersectionalists represent philosophy triumphant. To the social scientists who claimed that the humanities had no objective method for tracing causation, the humanist could retort that the modelers had no method for deciding on questions of beauty, value, or virtue. (They could further claim that the models of the modelers were absurdly reductionist, but that is a debate for another day). The theorists launched a more devastating attack: they denied universal judgments of beauty, value, and virtue altogether. They shrink the human condition down to a narrow span—a span bound tightly by divisions of class and caste. In arguing that a white man cannot access or understand the experience of a black man from his own country—much less a Chinese or Indian woman who lived centuries ago—the theorists rob the humanities of their revelatory power. History, poetry, and art lose much of their purpose. Their study is reduced to tracing genealogies of the wrong and harmful.
This charge is fatal to the study of literature. What is literature, after all? Nothing more than words on a page. Its characters are fictional; its imagery, imaginary. There is no compelling reason for anyone to study the subject—as opposed to simply enjoying it—if it does not promise entry to something greater. That something greater was always the promise of “great” literature. The fundamental claim of the literary critic and the novelist is that certain transcendent truths are best explored through fiction. If a work of literature does not have the potential to change your life—or at least fundamentally transform how you think about some aspect of life—then there is little reason to include it in the general curriculum.
Students have always intuited this. But now these students hail from a culture that denies universal experience altogether. When the words and works of the past are devalued as inherently blinkered and partial, declining interest in their study should surprise no one.
Historians face a similar problem. Yes, they deal with the real, not the imaginary. But that reality is long past. The classical humanist position acknowledges the peculiar individuality of each human being, yet insists that there is something common to human life that can be explored across the human span. Plutarch believed he could find parallel lives across the centuries. Any historian who does not share this belief will bring little value to students centuries removed from the object of their study. Historians must be careful not to distort the past, of course. They must not read the problems of the present moment back onto the concerns of the past. But if there is no connection between the two, the study of the past will always be a hard sell to students living in the present.
American culture has lost faith in history as a vehicle for understanding the human experience. Our high culture questions the very concept of shared human experience. It is hard for history—or any of the humanities—to flourish in a world that does not put much stock in the human. By adopting intersectional ideology as their own, the professional humanists have confirmed that they do not believe in the promise of their own discipline. And if they do not believe in it… why should any 18-year-old student?
Finally, one last hypothesis: Americans no longer like to read. There is a lot of survey data on this question.5 The amount of time Americans spend reading has declined at a steady clip for several decades. History courses require a great deal of reading and writing. A younger friend of mine told me how she started taking a history of science class to fill a Gen Ed requirement, found herself fascinated with its material, but then dropped it for an anthropology course with a far lighter (“more manageable”) reading load. I suspect her story generalizes.
See Christopher Ingraham, “Leisure reading in the U.S. is at an all-time low,” Washington Post (29 June 2018).
This also accords with the changing demographics of the American university. As a rule, foreign students are harder-working than their homegrown counterparts. But English is their second language. Reading- and writing-intensive courses are something they will avoid.
Students increasingly despise reading. It might be that simple.