The Fall of History as a Major–and as a Part of the Humanities

From Benjamin Schmidt, “The History BA since the Great Recession,” Perspectives on History (26 November 2018).

Over on the Scholar’s Stage forum, one member asks why the number of American university students choosing history as their four-year degree has been declining since the 1970s. He offers three hypotheses:

1) College students themselves valuing degrees with more defined career paths. As Schmidt notes, this is more complex than “I want a lot of money after graduation.” Psychology and “Arts Management” don’t hold a candle to Chemical Engineering in terms of money, but interest in them is increasing, probably because [they] have more defined career paths than history.

2) A variation of Bryan Caplan’s “signaling” hypothesis. This thesis argues that the value of education is in proving that “Jim is intelligent and conscientious,” not that “Jim acquired XYZ skills/knowledge.” In the supposed past, all college degrees had a strong signal of intelligence and conscientiousness, because fewer people went to college. In the present, history has less signaling value than more rigorous subjects that weed out students. I don’t think history SHOULD purposely weed out students…not only is it a cruel practice, but at this point it really can’t afford to lose more students! But the point stands.

3) The ideological polarization of the college-going populace. This is a more complex point than you might think. The rise of X Studies degrees means that the ardent Lefties who would have entered history are now entering X Studies. The History professoriate tilting hard to the Left, and regularly engaging in Left-wing polemics on social media, probably alienates conservatives who would have chosen history. I think this is a much smaller consideration than 1) and 2), but it’s probably not nothing.[1]

[1] “M.,” “The Decline and Fall of the Discipline of History?,” Scholar’s Stage: Forum (29 June 2022).

All three of these hypotheses have explanatory power. Let us cover each in turn.

For the first hypothesis, there is a great deal of evidence that college students in the 2010s saw their college experience as a form of professional, not personal, development. Earlier generations tended to conceive of the university as a place for intellectual growth and self-discovery; in the aughts that changed, with students now putting top priority on raising their socio-economic status.[2] This is partly a function of rising college costs, which make idealistic visions of the university as intellectual playground a financially ruinous delusion. It is partly a function of rising inequality, which makes the divide between graduates and drop-outs stark. It also reflects the broader student base that now attends university. The wealthy students of days past did not need their university experience to change their material circumstances. Their families could provide anything the university failed to instill in them.

[2] See, for example, data presented in Catherine Rampell, “Why Do Americans Go To College? First and Foremost, Because They Want Better Jobs,” Washington Post (17 February 2015).

History, a major only tenuously attached to any legible career path, was always going to lose out in the face of these trends.   

There are some things that history departments might do to blunt these forces. One of the challenges history departments face is that even if their students are as intelligent or as driven as students in business, biology, or psychology, the achievements of the history major are not legible in the world of commerce. I made this point in a small memo I wrote to the history professors of my university a decade ago, when there was talk of closing the department down for lack of students. I suggested that history majors be required to take several courses outside the major: four or so courses in a practical skillset, such as GIS mapping, documentary filmmaking, or data visualization. Higher-level history classes would require not only term papers but also historical projects that used these skills: say, a Google Maps skin that traced Hannibal’s journey across the Alps, or a small documentary on a historical figure discovered in the archives. This sort of project would require the same research skills as a term paper, but the end result would be legible to people who had never taken a history class after high school. The end goal would be for every student in the major to have a small portfolio of historical projects that could be whipped out at the beginning of any interview.
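
To make the idea concrete, here is a minimal sketch of the sort of artifact I have in mind, written in Python with the open-source folium mapping library. The waypoints are rough illustrative placeholders, not a scholarly reconstruction of Hannibal’s route:

```python
# A minimal sketch of a map-based history project: Hannibal's (conjectural)
# route from New Carthage to Italy, rendered as an interactive web map.
# Requires: pip install folium
import folium

# Illustrative waypoints only (lat, lon) -- placeholders, not scholarship.
route = [
    (37.60, -0.99),  # New Carthage (modern Cartagena), departure 218 BC
    (40.71, 0.58),   # crossing the Ebro
    (42.47, 2.87),   # over the eastern Pyrenees
    (44.14, 4.81),   # crossing the Rhone near Orange
    (44.70, 7.09),   # an Alpine pass (one candidate: Col de la Traversette)
    (45.07, 7.69),   # descent into the Po valley (modern Turin)
]

m = folium.Map(location=[42.5, 3.0], zoom_start=5, tiles="OpenStreetMap")
folium.PolyLine(route, color="firebrick", weight=3,
                tooltip="Hannibal's march, 218 BC (conjectural)").add_to(m)
for point, label in zip([route[0], route[-1]], ["Departure", "Arrival"]):
    folium.Marker(point, popup=label).add_to(m)

m.save("hannibal_route.html")  # open in any browser
```

Twenty lines of code, yet the result is exactly the kind of portfolio piece an interviewer can click through without ever having taken a history class.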

This sort of approach wouldn’t reverse the drastic decline in history majors, but it might help stanch the bleeding and encourage some students who might otherwise be afraid to declare the major.

In my mind the second hypothesis is the most important. One rough way to test this hypothesis would be to look at whether the decline in history majors across the United States is also seen at elite institutions. If 1) degrees are primarily a social signal, 2) the value of that signal is declining as a larger percentage of the population goes to college, and 3) employers have thus shifted to selecting on majors to separate wheat from chaff, it follows that students attending selective Ivy League institutions will feel less pressure to choose majors that signal their intelligence and discipline. University attendance may no longer be a mark of exclusivity; Harvard attendance certainly still is.   

I turned to the U.S. News and World Report for data on this point. Behold: approximately one in ten Harvard undergraduates majors in history. Between 6% and 7% of the undergraduates at Yale and Princeton do the same, roughly the national rate in the 1970s.[3]

[3] For the national data, see Benjamin Schmidt, “The History BA since the Great Recession,” Perspectives on History (26 November 2018). For the school data, see the U.S. News and World Report pages for Harvard, Yale, and Princeton.



I appreciate the non-ideological framing of the third hypothesis. It strikes me as obviously true, but I see a further wrinkle in the story. This wrinkle might be significant enough to qualify as a fourth hypothesis instead of a mere elaboration of the third. It goes something like this:

In the 1960s, when history and English majors were among the most popular on campus, America was a very different place. This was an America where most kids memorized reams of poetry in school, where one third of the country turned on their televisions to watch a live broadcast of Richard III, and where listening to speeches on American history was a standard Independence Day activity. The most prominent public intellectuals of this America were people like Lionel Trilling (literary critic), Reinhold Niebuhr (theologian), and Richard Hofstadter (historian). This was a world where the humanities mattered. So did humanities professors. They mattered in part, as traditionalists like to point out, because these professors were seen as the custodians of a cultural tradition to which most American intellectuals believed they were heirs. But they mattered for a more important reason: the reason intellectuals would care about that birthright in the first place.

Americans once believed, earnestly believed, that by studying the words of Milton and Dante, or by examining the history of republican Rome or sixteenth-century England, one could learn important, even eternal, truths about human nature and human polities. Art, literature, and history were a privileged source of insight into human affairs. In consequence, those well versed in history and the other humanistic disciplines had immense authority in the public eye. The man of vaulting ambition studied the humanities.

It is hard to pinpoint the exact moment when this changed. As discussed in an earlier piece at the Stage, the last poet whose opinion anybody cared about was probably Allen Ginsberg. The last novelist to make waves outside of literary circles was probably Tom Wolfe, and he made his name through nonfiction writing (something similar could be said of several other prominent essayists-turned-novelists of his generation, like James Baldwin and Joan Didion). Harold Bloom was the last literary critic known outside of his own field; Allan Bloom, the last with the power to cause national controversy. Lin-Manuel Miranda is the lone playwright to achieve celebrity in several decades.

The historians have done a bit better. While nobody alive today has the following or authority that an Arthur Schlesinger Jr. or Richard Hofstadter had in the ’50s, some historians have gotten close to the halls of power. The ideas of John Lewis Gaddis and Bernard Lewis had immense sway in the Bush II administration. Historians like Adam Tooze and Timothy Snyder, experts in authoritarian menace, have a great deal of cachet. But it is remarkable how small a role historians play in many of our public debates. Despite the amount of attention lavished on the problem of a rising China, we witness none of the sharp historical controversies over revolutions and terrors past that historians waged during the Cold War. The most ferocious historical debate of recent times was unleashed not by a historian but by a journalist.

If the humanistic disciplines no longer have public cachet, then who now speaks to the public?

One of the most important intellectual developments of the 21st century, something that I hope to highlight in my own book on this era, is the growing prestige and public authority of two modes of thought. I struggle to label the first group with one word; I waffle between “the modelers,” “the data heads,” and “the social scientists.” This group comprises psychologists, cognitive scientists, computer scientists, data scientists, statisticians, economists, quantitative sociologists, geographers, anybody who uses the word “computational” in front of their job description, and anybody whose main method of public engagement is a dynamic data visualization. Data is the watchword of these folks, empiricism their vocation, science their title.

In many ways these intellectuals embodied the zeitgeist of the Obama era. Under this banner we find people as different as Steven Pinker and Nate Silver, Hans Rosling and Richard Dawkins, or Alison Gopnik and Cass Sunstein. These are the sort of people who contribute to Edge. It is their sort of writing that the early Vox pretended to. At its best, the rise of this intellectual style led to the creation of entire new fields of intellectual inquiry (such as the marriage of evolutionary anthropology, cross-cultural psychology, and quantitative sociology now called “cultural evolution”). At its worst, it led to your great aunt’s favorite TED talk.

The authority of the modelers has fallen from its Obama-era zenith, but their way of thinking still holds powerful sway in the public mind (and has something close to monopoly status in Silicon Valley). Rising to meet the modelers are the intersectionalists. The two approaches have more in common than is initially apparent. Both modes idealize the counterintuitive insight. Practitioners of each believe that the masses are content to live in a world of surface realities; both groups secure their public standing by tearing away the truisms of everyday life to expose the truer workings of the world. Both scientist and intersectionalist are keepers of the secrets, blessed with analytic tools that allow them to see the patterns and deconstruct the processes that guide our lives.

The two differ most strikingly in their relationship to virtue. The authority of the modelers lies in their claims to objectivity; they seek truths that stand outside faction. The intersectionalists not only claim that this is impossible; the strength of their methods rests explicitly on the moral force of their injunctions.

Neither tribe is entirely congenial to the traditional humanists. The quarrel between the “two cultures” of the humanities and the sciences is well known. Less commonly understood is the struggle between the humanists and the intersectional theorists. The literary critic Mark Edmundson laid out the basic conflict in an essay for Harper’s a decade ago:

I suspect too that some of poetry’s reticence about speaking in large terms, swinging for the fence, owes to what one might call a theory-induced anxiety. In the modern-day university, the literary theorists are down the hall from the poets. What cultural theory seems to have taught the younger generation of poets is that one must not leap over the bounds of one’s own race and gender and class. Those differences are real and to be respected; the poets hear it time and again, if only as an echo from the nearby lecture hall: He who would write poetry that does not respect the politics of identity is impure, an opportunist, not to be trusted. Now, using Lowell’s “our” or Whitman’s “we” can register as a transgression against taste and morals. How dare a white female poet say “we” and so presume to speak for her black and brown contemporaries? How dare a white male poet speak for anyone but himself? And even then, given the crimes and misdemeanors his sort have visited, how can he raise his voice above a self-subverting whisper?

Poets now would quail before the injunction to justify God’s ways to man, or even man’s to God. No one would attempt an Essay on Humanity. No one would publicly say what Shelley did: that the reason he wrote his books was to change the world.

But poets should wise up. They should see the limits emanating from the theoretical critics down the hall in the English department as what they are. Those strictures are not high-minded moral edicts but something a little closer to home. They are installments in the war of philosophy against poetry, the one Hass so delicately evokes. The theorists — the philosophers — want the high ground. They want their rational discourses to hold the cliffs, and they want to quiet the poets’ more emotional, more inspired interjections. They love to talk about race and class and gender with ultimate authority, and of course they do not wish to share their right with others.[4]

[4] Mark Edmundson, “Poetry Slam,” Harper’s (July 2013).

In “the ancient quarrel between philosophy and poetry,” the intersectionalists represent philosophy triumphant. To the social scientists who claimed that the humanities had no objective method for tracing causation, the humanist could retort that the modelers had no method for deciding questions of beauty, value, or virtue. (They could further claim that the models of the modelers were absurdly reductionist, but that is a debate for another day.) The theorists launched a more devastating attack: they denied universal judgements of beauty, value, and virtue altogether. They shrink the human condition down to a narrow span, one bound tightly by divisions of class and caste. In arguing that a white man cannot access or understand the experience of a black man from his own country, much less that of a Chinese or Indian woman who lived centuries ago, the theorists rob the humanities of their revelatory power. History, poetry, and art lose much of their purpose. Their study is reduced to tracing genealogies of the wrong and harmful.

This charge is fatal to the study of literature. What is literature, after all? Nothing more than words on a page. Its characters are fictional; its imagery, imaginary. There would be no compelling reason for anyone to study the subject, as opposed to simply enjoying it, if it did not promise entry to something greater. That something greater was always the promise of “great” literature. The fundamental claim of the literary critic and the novelist is that certain transcendent truths are best explored through fiction. If a work of literature does not have the potential to change your life, or at least fundamentally transform how you think about some aspect of life, then there is little reason to include it in the general curriculum.

Students have always intuited this. But now these students hail from a culture that denies universal experience altogether. When the words and works of the past are devalued as inherently blinkered and partial, declining interest in their study should surprise no one.

Historians face a similar problem. Yes, they deal with the real, not the imaginary. But that reality is long past. The classical humanist position acknowledges the peculiar individuality of each human being, yet insists that there is something common to human life that can be explored across the human span. Plutarch believed he could find parallel lives across the centuries. Any historian who does not share this belief will bring little value to students centuries removed from the object of their study. Historians must be careful not to distort the past, of course. They must not read the problems of the present moment back onto the concerns of the past. But if there is no connection between the two, the study of the past will always be a hard sell to students living in the present.

American culture has lost faith in history as a vehicle for understanding the human experience. Our high culture questions the very concept of shared human experience. It is hard for history, or any of the humanities, to flourish in a world that does not put much stock in the human. By adopting intersectional ideology as their own, the professional humanists have confirmed that they do not believe in the promise of their own discipline. And if they do not believe in it... why should any 18-year-old student?

Finally, a fourth hypothesis: Americans no longer like to read. There is a lot of survey data on this question.[5] The amount of time Americans spend reading has been declining at a steady clip for several decades. History courses require a great deal of reading and writing. A younger friend of mine told me how she started taking a history of science class to fill a Gen Ed requirement, found herself fascinated with its material, but then dropped it for an anthropology course with a far lighter (“more manageable”) reading load. I suspect her story generalizes.

[5] See Christopher Ingraham, “Leisure reading in the U.S. is at an all-time low,” Washington Post (29 June 2018).

This also accords with the changing demographics of the American university. As a rule, foreign students are harder working than their homegrown counterparts. But English is their second language. Intensive writing and reading courses will be something they avoid.   

Students increasingly despise reading. It might be that simple.


63 Comments

“Earlier generations tended to conceive of the university as a place for intellectual growth and self-discovery; in the aughts that changed, with students now putting top priority on raising their socio-economic status.”

I think you put that transition much too late. I was at Berkeley in the late 1980s, and most of the students were there to improve their socio-economic status. The big difference I see between the ’80s and the 2000s is that in the 1980s one could major in History with a reasonable expectation of getting a better job than would be available to a high-school graduate. By the 2000s, getting a degree in History (or many other humanities or social sciences) required going to law school or getting a PhD to get a job better than high-school teacher.

Hard to disagree with this. I recall my experience attending university 2007-2011. Almost every student approached it as white-collar vocational training and social networking. This was at a decent (top-40) southern school. “Soft” humanities like history were almost exclusively vehicles to law school. “Hard” (ie some math involved) humanities like economics or political science were likewise simple gateways into MBA programs and the like.

As someone with a History BA and, later, an MBA, and who is now giving advice to a college-bound son, I think you’re right. It’s especially hard to justify a history degree given the cost of college now, which is dramatically different from a couple generations ago when history was a more-popular major.

I think there has been a broader cultural shift behind students’ motivations (hypothesis 1), which has resulted in many programs becoming vocational in spirit if not in pedagogy. I would sum it up as “we don’t care to write/produce/create for the future, and thus don’t care for our predecessors who wrote for us.” Beyond the factors you mention, the causes might include a general rise in consumerism, overproduction of elites, and democratization of culture and of the pathways to wealth (both good things in my view).

Regarding hypothesis 2, it’s worth pointing out that history is still considered a rigorous course in many elite American universities, and those who read history at, say, Oxbridge (to the extent this is an issue in the UK) often end up in public life and its adjacencies.

I think the focus on theory is a bit overblown or at least pales in comparison to your other camp, the modelers. Yes, theory has diffused into the mainstream and it’s hard to avoid its influence in any semi-intellectual conversation today. But, its tenets are still debatable; the modeling mindset is so ingrained that no one would even question it. This applies not just to strictly quantitative fields but also banking, consulting, product management, and general corporate life, where every decision must have some quantifiable basis. Even philosophy, which I think you align with the intersectionalists (if I didn’t misunderstand), has become rather mathy and retains a strong analytic bias in the (American) academy.

Whatever the causes, I agree that the loss of shared human experience has resulted. I actually wrote about this recently ([my name] dot com). But history and the humanities have survived worse, and society never escapes the universal for long – at least on a historical timescale!

What frustrates me with my description of the modelers is that there is something missing from it. While it clearly has its antecedents in the ’20s-’50s heyday of social science, its modern vogue has something to do with electronics, Silicon Valley, the rising prestige of nerds, and perhaps a cultural reaction to the Bush years (think “republican war on science”). But it is hard for me to pinpoint exactly what is different about the 21st century version vs. what came before.

Accessibility of computers for statistics and visualization may be one big difference. Anyone with decent computer skills can learn to use Excel or R in a cookbook way — you can aggregate data, run statistical tests, easily produce professional-quality charts, and the whole process is easy and fun.
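
As a minimal sketch of this cookbook workflow (in Python rather than R or Excel; the dataset here is fabricated purely for illustration):

```python
# A cookbook analysis end to end: fabricate a toy dataset, aggregate it,
# run a canned significance test, and save a professional-looking chart.
# Requires: pip install numpy pandas scipy matplotlib
import numpy as np
import pandas as pd
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "group": ["A"] * 100 + ["B"] * 100,
    "score": np.concatenate([rng.normal(50, 10, 100),
                             rng.normal(53, 10, 100)]),
})

# Aggregate: mean and standard error for each group.
summary = df.groupby("group")["score"].agg(["mean", "sem"])

# Run a statistical test (a two-sample t-test), recipe-style.
t, p = stats.ttest_ind(df.loc[df["group"] == "A", "score"],
                       df.loc[df["group"] == "B", "score"])
print(f"t = {t:.2f}, p = {p:.3f}")

# Produce the chart.
summary["mean"].plot.bar(yerr=summary["sem"], capsize=4, rot=0)
plt.ylabel("Mean score")
plt.title("Group A vs. Group B")
plt.tight_layout()
plt.savefig("chart.png", dpi=300)
```

Nothing here requires statistical training beyond following a recipe.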

Easier and more fun than getting good data or understanding what you’re doing — and then you get 1000 scientific papers which don’t replicate and all of which will be cited in a book whose title is a single noun. On the other hand, at its best this accessibility means first-rate work can be produced by fewer people with almost no resources (like Youyang Gu’s COVID modeling).

Another one might be exactly what your post is about — nobody studies the humanities any more. I think those legends from the RAND golden age do recognizably have the thinking style of Silicon Valley people today. But they were part of that world you describe where everyone had exposure to the humanities, and their thought just seems deeper for it. I’m comparing, say, Warren Weaver to someone like Paul Graham.

If we had people in the bipartisan foreign policy Blob capable of historical reasoning, they would understand from the example of the British Empire in 1945 what it means to win the world war/s but lose the peace and the colonies. In this case, the American Empire may well be succeeding in bleeding the hated Russians (TBD based imo on Putin or his harder-line successor’s ability to rally the Russian people to greater sacrifices for partial to full mobilization) but loosening the seemingly tighter than ever Anglo American imperial death grip over a collapsing EU.

Despite ‘keeping the Germans down’ being Lord Hastings Ismay’s famous quip about the purpose of NATO, I think we’ll first see a defiant Orbanist Hungary, then pent-up Russian-sympathizing Bulgaria, then Croatia, then bigger dominoes like Italy joining Turkey in adopting India-like Non-Aligned Neutrality after Washington and London hose their allies this winter. That is, open support for ‘the multipolar world’, even while paying lip service to NATO.

It is always a good idea, especially when arguing against the bipartisan foreign policy Blob, not to sound stupid. “[T]he seemingly tighter than ever Anglo American imperial death grip over a collapsing EU” makes you sound stupid.

Some random reactions:

American families have gotten smaller over the last 70 years, perhaps meaning that children, and their life trajectories, loom larger for both parents and child. The proportion of children going on to college has increased over the same period, perhaps meaning that careers matter more for those whose parents never went to college. Both factors might raise the importance of signaling.

While the proportion of history majors has declined, I wonder about the absolute numbers–much of a decline there? How do Jill Lepore’s sales numbers compare to Schlesinger’s or Hofstadter’s? Maybe as the population doubled from 1960 to now we’ve not expanded the room for history?

“American culture” has changed and diversified. You can see it in areas other than history–consider classical music. I’d guess there’s lots of symphonic music written today; much of it in the form of soundtracks for movies and TV, much less of it played in concert halls to live audiences.

Thanks for tracking down this data; I did wonder myself if perhaps the number of people choosing the major was stable, and it was new students from lower-SES backgrounds being added who simply were not choosing the major. This is a good data point against that theory.

“Too much reading” might be a good explanation. I remember talking to a computer scientist who disdained the humanities as being too easy, but was shocked by the amount of required reading in a general ed history course. In terms of jobs, there are opportunities in law enforcement, especially as federal special agents. Like historians, they answer a question by using primary sources and document the results, which are then subject to peer review. This might take additional course work, which the article alludes to (currently, IRS Criminal Investigation, the folks that got Capone and Agnew, requires 15 semester hours of accounting and 9 related semester hours, which includes subjects such as economics, which a history major will probably be taking anyway). The skills of a history major would also fit with being a local or state detective, but those careers would require years on the beat before getting that job.

The “signaling” model of education was developed by Michael Spence when Bryan Caplan was about two years old. Calling it “Bryan Caplan’s ‘signaling’ hypothesis” is an error.

Mike Spence didn’t invent it either. There were any number of academic articles (and a few books) before him. Charles Peters’ Washington Monthly complained about it constantly. Of course, Spence developed the “market signaling” idea magnificently, deserving his (semi-)Nobel. Shockingly, his Market Signaling is now out of print, and Amazon doesn’t even have a used copy.

“Art, literature, and history were a privileged source of insight into human affairs.”

As an older millennial, color me skeptical. When interacting with older people, I don’t get the sense that I am interacting with a more humanities-oriented bunch than people my age or younger. This is the age group that finds NCIS, Minions memes, and Dancing with the Stars to be the pinnacle of entertainment. What a third of Americans watched on TV decades ago isn’t a good basis for comparison – there was just less to watch.

I DO think younger generations are more skeptical that the Western Canon has all the answers worth having and that their interests in art, literature and history are much less concentrated in that area, but I’m not sold on the idea that younger generations simply don’t care as much about humanities.

The point is less about the boomers than about the adults who ran the country when the boomers were teenagers. The boomers are the start of this transition to a new world.

Who will lead the country? Who will be the next Jim Mattis? How will a STEM degree educate a person to navigate global affairs, or teach them how to interact with different cultures? Who knows the history of global conflicts and their causes? There is a definite need for historians, political scientists, and philosophers to create and maintain a balanced world. It is a huge mistake to minimize the humanities.

I always find these sorts of arguments in favor of the humanities perplexing. Yes, not everything can be easily quantified and reduced to numbers – and reducing something to numbers doesn’t mean making it objective if in the process you simplify too much or leave out some critical factors. In principle, the universe *is* numbers, and so are all human affairs, of course, but the high-level, complex phenomena are so far removed from the lower-level ones that it’s essentially just a nice academic thought. We have to deal with what we have, and what we have is our brains and their fuzzy understanding of the sort of social realities they evolved to deal with.

However, those brains also evolved to give us big, BIG blind spots. They aren’t supposed to make us more just or more fair. They’re supposed to make us and our tribe WIN. That’s what evolution selects for. There are entire books full of pages written by historians, philosophers, lawyers and so on coming up with sophisticated rationalizations for the same points a random rube could make: why we’re THE BEST, why we should do things I like and not do things I don’t like, and so on. Being an expert in the humanities doesn’t make you better at handling different cultures or global conflicts, per se. It doesn’t make you more honest or more able to understand human suffering. And if you don’t at least make the effort of rooting your reasoning in being self-critical, in trying to adhere to some kind of empirical evidence or test of your ideas, it risks simply being a road to being very, very good at convincing yourself and others of whatever nonsense you wanted to believe in the first place.

Which isn’t to say that we don’t need the humanities, absolutely. Of course we do need them. But we shouldn’t pine for these idealized images of the great intellectuals and humanists of the past leading the country. Because in practice philosophers and historians are just as able to fuck these things up as anyone else.

Wrong…they’re able to fuck things up, but less so…your opinion essentially places primacy on discounting critical thinking…that it’s a coin toss as to which type would lead more successfully….critical thinking requires consideration of different choices…Excel crunches numbers

I didn’t read her comment as discounting critical thinking at all. To me, she seemed to be saying that a person steeped in the humanities can be just as deficient in critical thinking as someone who uses Excel a lot. In fact, they can be even more so.

Perhaps, you are saying, that is unlikely because someone truly educated in the humanities has to develop good critical thinking skills. I suppose that is an empirical question. But I can’t help thinking of the “no true Scotsman” story: The Scottish patriot tells his friends, “No Scotsman gets drunk.” One of his friends disagrees, ” I saw MacGregor just last night and he was falling down drunk.” “Aye,” replies the patriot, “No true Scotsman ever gets drunk.”

This is as true for the social scientists as it is for the humanist, I am afraid. Self-deception through data and modeling is just as easy as, perhaps easier than, self-deception through humanistic argument.

I majored in both history and political science. I think there are important things to be learned from both approaches. But if I had to choose, I would go with the tool set of the historians and novelists.

Life teaches you how to navigate other cultures. Go to any world-class university, and you will inevitably encounter people from different places. Clubs, internships, friends all provide hands-on experience in a way that a humanities classroom doesn’t. I enjoyed my humanities classes a lot, but I wouldn’t bring some of the academics I’ve known to a cross-cultural dinner. Modern academia encourages an absorption in minutiae that makes your own work seem to be the most important thing, which can make respecting and getting along with other cultures rather difficult.

Further, I’ve known more STEM majors to take history as a hobby than historians who have bothered to learn the quantitative skills necessary for computer or data science. Humanities seems to be more a soft skill that you can pick up outside the university.

Your source for the Richard III thing is a 2013 self-published political tract with zero copies available on Amazon.

I did look this up on Google Books and found a book about the history of NBC, which halved the number to 25 million and contextualized it with other information about the time slot in which Richard III aired, and the relative popularity of lowbrow game shows, probably among those young boomers. I agree with the larger point you’re making in this essay about the shift from following tastemakers to following “modelers,” but I think it’s worth pointing out (as George Marsden has done quite well) that the intellectual giants of the humanities, these guardians of American culture, were already deconstructing the normativity that their own status implied.

In 1910, high school students often read Macaulay’s Essay on Lord Clive, John Greenleaf Whittier’s Snow-Bound, Francis Parkman’s Oregon Trail, Longfellow’s Courtship of Miles Standish, etc. I believe most of these texts were abandoned in schools by 1960; if readers look into these titles, I think they will find that this was mostly for good reason.

(p.s. if anyone knows the Courtship of Miles Standish today, it is probably only through the Looney Tunes adaptation, which has itself been banned for 20 years at this point)

Left a final sentence off that last paragraph! Lionel Trilling was creating a new canon which he felt more appropriate to a 20th century liberal nation, throwing out the old canon I mentioned; Niebuhr was creating a new type of theology; and Hofstadter is George Marsden’s normative anti-conformist par excellence.

Much of this rings very true. One thing I think it leaves out however, is how much the shift in focus to making higher education a means for students to “raise their social-economic status” is very much a policy goal, publicly stated. The slashing of government funding of university education combined with higher tuition fees was supposed to leave students with “more skin in the game,” a constant refrain, so they would be forced to look at the transactional nature of their education and therefore far more likely to choose a path which would lead to paying off student loans rather than a path which might lead to personal/cultural fulfillment, leaving a student unable to dig themselves out from under debt obligations.

Congress has enacted several laws over the last twenty-plus years to make it ever more difficult to discharge student loan debt through bankruptcy. The downsides to getting a degree in something which will not pay are clear to see when the consequences are legally binding like this.

Look, let’s just acknowledge that reading and writing makes people lazy. Instead of using their memories and fully understanding a subject, they just recite what’s on the page. Without context, the written word can be twisted to mean whatever you want it to mean. Those who read and write merely appear educated rather than being so. We should really go back to memorization and discussion, as our forebears did.

Not even a tiny spike from the celebration by US universities of the 800th anniversary of Magna Carta in 2015. Oh, wait, that’s right, the event was barely mentioned at US universities. Quite logically, of course, as mention Magna Carta and the hue and cry will arise “that only applied to the elite.” Which is true, but this misses that, proclaimed each year, its ideas wore upon the people, and over the centuries more and more came under its sway, even to the point of the abolition of slavery.

The humanities haven’t defended themselves over the last 40 years. I saw Jordan Peterson recently say, “People don’t have ideas. Ideas have people.” What ideas does the modern history department offer that might recruit people to learn more?

I’m curious if there is an increase in the number of political science and government majors?

Dear Tom,
Using the same data I mentioned above:
1970-71: 27,482 political science and government majors
2018-19: 36,715 political science and government majors
An increase, but not as much as in economics.
Also, for completeness:
1970-71: 33,263 sociology majors
2018-19: 26,702 sociology majors
A drop, but not as large as in history.

Many PoliSci/Government majors are doing it as a sort of vocational training. They want to work in government and make the world a better place. Many of them implicitly believe, “The public sector is about the public interest; the private sector is about private greed.” Ironically, their education will give them many, many examples where the first part of that is not correct. But most of them will keep the faith.

My hunch was that there was a flight from history to these majors since they’re somewhat adjacent.

Terrific piece… Sad, but I think true.

I have two engineering degrees and an MBA. I remember, when I was finishing my BMSE in the ’80s (Clarkson University, Potsdam, NY), that there was active discussion about creating a 5-year engineering program that added one year’s worth of coursework in the humanities as a way of making engineers better, more well-rounded leaders. I remember thinking that it was a wonderful idea. One more year to study history, philosophy, etc. would have been so valuable!

I wish this idea could come to life. The first-year salaries of engineers make that a tough proposition, however. I feel it would benefit society at large if this modified degree program could happen.

Regarding hypothesis 3, why hasn’t history expanded to accommodate other cultures, then? I find Sei Shonagon to acknowledge universal truths more than Dante Alighieri (how many people have engaged in gossip vs. had divinely inspired midlife crises?), and yet, she wasn’t taught in any of my classics courses. If history insists on staying white, male, and Western, it shouldn’t be surprised when people move on to other studies.

Maybe the problem is that real history has been flushed down the toilet and a new fake woke history has taken its place? Imagine if a professor was brave enough to teach the real history of slavery? How there are various different forms of slavery throughout time? Or that slavery was the norm of human civilization until just a couple hundred years ago? If we actually tried to teach real history, not the touchy-feely stuff that passes for academic rigor, then maybe more people would take history and could apply its lessons. But when you cover up and lie about the past, it becomes irrelevant and we will continue to make the same mistakes of the past.

There are many, many history courses that take this line. Type in “comparative history of slavery syllabus” and see what you find.

Sure, if you go to a university and choose one of the history classes that will cover it. I got my CS degree without taking any history, so all I have is the standard high school American history. So my conception of slavery is the popular one, where all slaves were black and lived in the southern states, maybe traded in the Caribbean if I remember that. The worst-treated slaves are in Uncle Tom’s Cabin, a book that nobody reads (but should) and which is fairer to slave owners than the modern conception. (Since the goal was to convince people slavery was bad enough to go to war over, that is saying something.)

Of course I’m an American; I don’t really know how other countries see slavery.

Noel Lenski and Catherine M. Cameron’s “What Is a Slave Society?” is a nice collection of essays on slavery across the world and across history.

I would say most of #2 and a little of #3 and #1.

Of course, 2 seems kind of true to me? The now-conservative religious reverence for old books from someone’s cultural tradition makes no sense to me, and I can’t help but notice that everyone seems to think that the old books from *their* cultural tradition are the exalted ones.

Modern writers seem way better than writers of the past, as are modern inventors, philosophers, etc.

Can you find a better writer of history than Samuel Eliot Morison? His Harvard PhD dates from 1912. But, of course, Mr. Morison would never coin so euphonious a phrase as “way better.”

Professor Admiral Samuel Eliot Morison was the last Harvard professor to commute to work on horseback.

What will you learn majoring in history that you couldn’t learn at the public library plus YouTube?

The ability, inclination, and confidence to question and evaluate the histories at the public library and YouTube.

One real reason not to study history in a university setting is that one may read history for oneself and gain more insight and understanding than by spending years being lectured to by liars and cheats. The lecturer in my 1966 freshman history class was a revisionist who didn’t even have a firm grip on the characters and dates, much less the cause and import of events, of the history of the United States of America on which his compulsory course was based. In the decades since, it has gotten much easier to gain access to original sources at the same time that the curriculum in most universities has become decreasingly fact-based.

I do not share this judgement. If you compare the rigor of a Frederick Jackson Turner to a Hofstadter, it is clear that the standards for citations and evidence only rose from the mid-20th century forward. If you have the fortitude to read through histories written in the woke interpretive frame, and disregard the theoretical apparatus they use, you find a strong core of primary sources underneath, often a stronger core than in older works.

But not always. And in truth, most of the “Woke” interpretations are just reworkings of ideas and themes that historians of the 1920s or 1930s originally developed, but with more jargon and less readable prose. It is not hard to find condemnations of America as imperialist, white supremacist, etc. from a century ago.

Philosophy major here (from early 1980s). This was another traditional major for future lawyers (and also future clergy, although the mainline churches that favored such a cerebral approach were already on the decline, along with their cushy clergy jobs). Logic was always its own subgroup, and had as much in common with math or computer science as with the ethicists and metaphysicians they shared a department with. Of course some reckless students just picked philosophy because they had to have a major, and happened to like the subject. Me, I was interested in Eastern religions and New Age stuff (which was then entering its heyday) and intuited that philosophy had relevance to my spiritual search. I remember a lot of double majors, which is natural considering that so many basic philosophical subjects / courses overlap with other disciplines (political philosophy, philosophy of science, philosophy of religion, consciousness studies, ethics / professional ethics, etc.).

More generally, I noticed several types of motivation among university students (who typically pursued several of these goals at once): career or financial goals, drinking and partying (you may remember Murray Sperber’s “Beer and Circus”), knowledge or wisdom for its own sake (rare I know, but philosophy departments are a good place to look for them), and self-exploration or self-identity (college being a traditional time to join a cult or subculture, explore one’s sexuality, etc.). I have to imagine that universities have been having more and more trouble fulfilling these aims (especially if they try to offer all of them at once), and that the several economic disasters of the era (dot-com bubble, 2008, Covid + general global collapse) haven’t helped. Then as now, a fair number of students are “aimless” in the sense of being in college just because it’s kind of expected, and they don’t know what else to do (not that this is necessarily a bad thing).

A few random comments:

“The last novelist to make waves outside of literary circles was probably Tom Wolfe.” What about Stephen King? What about J.K. Rowling? Sure, they’re lowbrow, and as often as not communicate their political views through Twitter. On that note, while Niebuhr may go largely unread, we are not left without public theologians–there’s Eckhart Tolle, Rick Warren, Jack T. Chick (possibly the most influential 20th century theologian!), and many more.

“…the last poet whose opinion anybody cared about was probably Allen Ginsberg…” Are we counting rappers?

[But] “some historians have gotten close to the halls of power.” I’ve always wondered how many DC people follow Juan Cole. (Probably more during the Iraq War.)

I’m not counting rappers or any other song lyricists. Of course, what they are doing is in a very real sense poetry. Anyone who has an itch to do popular poetry will do that. But if you are officially “a poet”, you are only committing words to a page, and pretty much no one cares outside some literature departments.

Another aspect not touched on here is that going into academia as a history major (or any humanities major, really) is a brutal affair with very little prospect of making a living doing it. Bret Devereaux, a military historian who teaches at a university and runs a public-facing blog about perceptions of history in popular media, wrote an article called “So you want to go to grad school (in the academic humanities)?” recommending against it. He describes the process as:

“You are working anywhere from 60 to 80 hours a week, while economically precarious, often within hollering distance of the poverty line. Your coursework, in addition to being heavy, is high-stress and high-stakes… Graduate school takes emotionally well-adjusted, ultra-high-performing students and through a combination of stresses largely turns them into anxious, depressed and neurotic wrecks.”

If someone makes it through all of that, there still isn’t a guaranteed job at the end of it, in academia or elsewhere. With conditions like these, it’s a wonder that anyone is studying history at all.

As a current college student, I think “Americans don’t like reading” is a very accurate representation of what’s happening, but I’d like to qualify that: Young Americans dislike and experience difficulty with critical reading.

Psychology majors, business majors, even computer scientists have to read. But they read texts that contain the truth outright. No psychology major needs to worry that their readings are lying to them. STEM students have difficulty critically reading research in (and not in) their fields! Untrained experts (read: journalists) often have the same problem. But they can get by without actually thinking critically or analytically about any of this stuff.

History, literature, and other words-based humanities are quite different. If you cannot critically read, compare, and analyze text, you will be incapable of succeeding. So students go to majors that involve passive reading, majors where they read the truth as it’s written in words and not as it must be discovered.

I worked for three decades in the research department of a Wall St. “bulge bracket” firm and interviewed many job candidates. I did not ask, “What are the five signs of conservative accounting?” for the answers to such questions are easy to find or teach. Instead, I asked, “Tell me about some movie you have seen or book you have read.” A good history major or an English lit major would have shined. They would know how to identify the crux of the matter and in the future should be able to summarize a forceful investment idea. I would prefer to hire a history major who is smart enough to earn a CFA over a business major with an MBA who may only know the capital ratios.