Public Intellectuals Have Short Shelf Lives—But Why?


Several months ago someone on twitter asked the following question: which public thinker did you idolize ten or fifteen years ago but have little intellectual respect for today? [1] A surprising number of people responded with “all of them.” These tweeters maintained that no one who was a prominent writer and thinker in the aughts has aged well through the 2010s.

I am not so harsh in my judgments. There are a few people from the last decade that I am still fond of. But the problem is inevitable. This is not a special pathology of the 21st century: when you read intellectuals of the 1910s talking about the most famous voices of the 1890s and early 1900s you get the same impression. You even get this feeling in a more diluted form when you look at the public writing of the Song Dynasty or Elizabethan England, though the sourcing from those eras is spottier and there was no ‘public’ in the modern sense for an individual living then to intellectualize to. But the general pattern is clear. Public intellectuals have a shelf life. They reign supreme in the public eye for about seven years or so. Most that loiter around longer reveal themselves as oafish, old-fashioned, or ridiculous.

To give you a sense of what I mean by this, consider the career of a public intellectual who peaked in the early aughts. Thomas Friedman is now the butt of a thousand jokes. He maintains his current position at the New York Times mostly through force of inertia, and secondarily through his excellent connections within the Davos class and his sterling reputation among those who think as that class does. But this was not always so. Let us review Friedman’s climb to prominence:

Thomas Friedman earned his BA in Mediterranean Studies in 1975; a few years later he obtained a prestigious Marshall scholarship to study at Oxford, where he earned a master’s degree in Middle Eastern Studies. By age 26 he was a reporter in Beirut, and at age 29 he had won his first Pulitzer (for up-close reporting on a war massacre). He would win another Pulitzer as the New York Times’ bureau chief in Jerusalem, and at age 36 would write his first award-winning book, From Beirut to Jerusalem, a recapitulation of his years of reporting in those two cities. This put Friedman at the top of the “Middle East hand” pack. That is a nice place to be, but it is still far away from the position of household public intellectual.

To get there Friedman would first transition to reporting from Washington DC as a White House correspondent. A few years later (now at age 41) he would be given a foreign affairs column at the New York Times, moving him a step further into the opinion business. I attribute his transformation from minor public commentator to Voice of the Zeitgeist to two events: first, the publishing of The Lexus and the Olive Tree in 1999 (when he was 46 years old), the first of several books that would lay out his theory of globalization; second, the terrorist attacks of September 11th, which allowed him to write columns that drew on both his long personal experience in the Middle East and his newer interest in globalization. These were the columns that won him his Pulitzer for commentary in 2002 and made him a central voice in the debates over America’s response to the terrorist attacks and the invasion of Iraq. I place Friedman’s peak in his 52nd year, when his most famous book, The World is Flat, was published. It was also around this time that opposition to Friedman was at its peak, with bloggers and columnists alike writing long diatribes against him.

Friedman would close out the decade with another book and three documentaries. These were mostly restatements of his columns (which in turn drew heavily from ideas he first introduced and developed between Lexus and The World is Flat). Friedman was still a part of the national conversation, but his perspective had lost its originality. His columns began to bleed together. This is the era when “Friedman Op-Ed Generators” went viral. Increasingly, Friedman was not argued against so much as joked about. By 2013 or so (just as he was turning 60) Thomas Friedman was done. Not technically so—between then and now he would rack up two more books, hundreds of columns, and heaven knows how many appearances at idea festival panels and business school stages. But intellectually Friedman was a spent force. His writing has been reduced to rehashing old rehashes, his columns the rewarmed leftovers of ideas grown old a decade ago. It is hard to find anything in his more recent books or columns that has mattered. He is able to sell enough books to live comfortably, but you will have difficulty finding anyone under 50 who admits they have read them. Friedman lingers still as a public figure, but not as a public intellectual. His thinking inspires no one. The well has run dry.

But why?

The easy answer is that the world of 2019 is not the world of 2002. What seemed compelling at the turn of the millennium is not compelling now. A man whose worldview has not budged in two decades has nothing to say to a world that has changed tremendously in that same time. But this answer is not really sufficient. It is hard to remember now, but there was once a time when the insights of Thomas Friedman read as fresh and strikingly original. That his ideas seem so banal and obvious today is in many ways a measure of how successful he was at popularizing them in the early 2000s. The real question to answer is this: why are so many public intellectuals capable of generating insight, originality, or brilliance at the beginning of their careers, but utterly incapable of fresh thinking a decade later?

Let me offer two hypotheses. One is psychological, the other sociological.

Analytic brilliance is not constant over the course of life. Both general intelligence and more nebulous measures of creativity have clear peaks over the course of a lifespan. Here is how one textbook describes research on this question (I’ve taken out the parenthetical references to various source studies for ease of reading):

In most fields creative production increases steadily from the 20s to the late 30s and early 40s, then gradually declines thereafter, although not to the same low levels that characterized early adulthood. Peak times of creative achievement also vary from field to field. The productivity of scholars in the humanities (for example, that of philosophers or historians) continues well into old age and peaks in the 60s, possibly because creative work in these fields often involves integrating knowledge that has crystallized over the years. By contrast, productivity in the arts (for example, music or drama) peaks in the 30s and 40s and declines steeply thereafter, because artistic creativity depends on a more fluid or innovative kind of thinking. Scientists seem to be intermediate, peaking in their 40s and declining only in their 70s. Even within the same general field, differences in peak times have been noted. For example, poets reach their peak before novelists do, and mathematicians peak before other scientists do.

Still in many fields (including psychology) creative production rises to a peak in the late 30s and early 40s, and both the total number of works and the number of high quality works decline thereafter. This same pattern can be detected across different cultures and historical periods…. 

What about mere mortals? Here researchers have fallen back on tests designed to measure creativity. In one study, scores on a test of divergent thinking abilities decreased at least modestly after about age 40 and decreased more steeply starting around 70. It seems that elderly adults do not differ much from young adults in the originality of their ideas; the main difference is that they generate fewer of them. Generally then, these studies agree with the studies of eminent achievers: creative behavior becomes less frequent in later life, but it remains possible throughout the adult years.[2]

I suspect the underlying mechanism behind this pattern is brain cell loss. Neuroscientists estimate that the average adult loses around 150,000 brain cells a day; in the fifty years that follow the end of brain maturation (ca. years 25-75), the average brain will lose somewhere between 5-10% of its neurons.[3] Fluid intelligence begins declining in a person’s 30s.[4] This implies that most humans reach their peak analytic power before 40. Crystallized intelligence holds out quite a bit longer, usually not declining until a person’s 60s or 70s. This is probably why historians reach peak achievement so late: the works that make master historians famous tend towards grand tomes that integrate mountains of figures and facts—a lifetime of knowledge—into one sweeping narrative.
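For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch in Python. It uses only the figures quoted above; the 365-day year and the 25-75 age window are the only assumptions I have added, and the implied baseline counts are simply what the quoted numbers entail, not figures from the cited source.

```python
# Back-of-envelope check of the neuron-loss figures quoted above.
# Only the numbers from the text are used: ~150,000 neurons lost per day
# over the roughly fifty years between the end of brain maturation (~25)
# and age 75.

DAILY_LOSS = 150_000                    # neurons lost per day (quoted figure)
YEARS = 50                              # roughly ages 25 to 75
total_lost = DAILY_LOSS * YEARS * 365   # cumulative loss over the window

print(f"Cumulative loss: ~{total_lost / 1e9:.1f} billion neurons")

# The text puts this at 5-10% of the brain's neurons; the baseline count
# implied by each end of that range is:
for share in (0.05, 0.10):
    implied_total = total_lost / share
    print(f"  if that is {share:.0%} of the total, the implied baseline "
          f"is ~{implied_total / 1e9:.0f} billion neurons")
```

The loss works out to roughly 2.7 billion neurons over fifty years, which fits the quoted 5-10% range if the baseline count is on the order of 27-55 billion neurons.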

Thus most humans develop their most important and original ideas between their late twenties and early forties. With the teens and twenties spent gaining the intellectual tools and foundational knowledge needed to take on big problems, the sweet spot for original intellectual work is a person’s 30s:  these are the years in which they have already gained the training necessary to make a real contribution to their chosen field but have not lost enough of their fluid intelligence to slow down creative work. By a person’s mid 40s this period is more or less over with. The brain does not shut down creativity altogether once you hit 45, but originality slows down. By then the central ideas and models you use to understand the world are more or less decided. Only rarely will a person who has reached this age add something new to their intellectual toolkit.

Recognizing this helps us make sense of many interesting aspects of human social life. I think often about Vaisey and Lizardo’s 2016 study, which demonstrated that most shifts in social attitudes occur not through changes in attitudes at the individual level, but through intergenerational churn.[5] Old attitudes die because the generations that hold them literally die off. Such is the stuff of progress and disaster.

Such is also the problem of the public intellectual. A public intellectual’s formative insights were developed to explain the world he or she encountered during a specific era. Eras pass away; times change. It is difficult for the brain to keep up with the changes.

Not impossible, just hard. And this brings my second, sociological explanation into play. There are things that a mind past its prime can do to make the most of what analytic and creative power it still has. But once a great writer has reached the top of their world, they face few incentives to do any of these things.

Consider: Thomas Friedman began his career as a beat reporter in a war zone. He spent his time on Lebanese streets talking to real people in the thick of civil war. He was thrown into the deep and forced to swim. The experiences and insights he gained doing so led directly to many of the ideas that would make him famous a decade later.

In what deeps does Friedman now swim?

We all know the answer to this question. Friedman jets from boardroom to newsroom to state welcoming hall. He is a traveler of the gilded paths, a man who experiences the world through taxi windows and guided tours. The Friedman of the 20th century rushed to the scene of war massacres; the Friedman of the 21st hurries to conference panels. What hope does a man living this way have of learning something new about the world?

More importantly: What incentive does he have to live any other way?

I have noticed that historians who transition from the role of academic scribbler to famed public voice follow a sort of pattern. Their first published work might be a monograph, perhaps a PhD thesis turned book. It will be on some narrow topic no sane person cares about, the product of months spent in one archive in one location. U.S.-British trade relations in the 1890s, perhaps, or state-led cultural imperialism in Japanese Manchuria. They may repeat this feat again, but at some point they transition to something broader—now they are writing a global history of trade regimes under the gold standard, or of empire building in the whole Greater East Asia Co-Prosperity Sphere. This work will be a brilliant, field-defining piece of scholarship, lauded (or resented) by other luminaries of their sub-discipline, read by scholars and interested laymen alike. That book will be published by an academic press; the next will be aimed at popular audiences. Our historian has now graduated fully to the role of public thinker: her next book will be on the dangers posed by trade wars writ large, or on the nature of modern imperialism. This title will be reviewed in all the famous magazines; people who have never read it will argue about it on twitter. And then everything starts to fall apart.

The trouble is that just as our historian reaches her full stature as a public name, her well of insight begins to run dry. A true fan of her work might trace elements of her name-making title back to the very first monograph she published as a baby academic. She was able to take all of the ideas and observations from her early years of concentrated study and spin them out over a decade of high-profile book writing. But what happens when the fruits of that study have been spent? What does she have to write about once she has already applied her unique form of insight to the problems of the day?

Nothing at all, really. Historians like this have nothing left to fall back on except the conventional opinions common to their class. So they go about repackaging those, echoing the same hollow shibboleths you could find in the work of any mediocrity.

You see this pattern recur again and again in the op-eds of our nation. A once-bold foreign correspondent whose former days of derring-do have already been milked for more than they are worth, a Nobel laureate two decades removed from the economic papers that gave him acclaim, a nationally known historian who has not stepped into an archive since graduate school—the details change but the general pattern is the same. In each case the intellectual in question is years removed from not just the insights that delivered fame, but the activities that delivered insight.

The tricky thing is that it is hard to go back to the rap and scrabble of real research when you have climbed so high above it. Penguin will pay you a hefty advance for your next two hundred pages of banal boilerplate; they will not pay you for two or three years of archival research on some narrow topic no one cares about. No matter that the process of writing on that narrow topic refills the well, imbuing you with the ideas needed to fill out another two decades of productive writing. The world is impatient. It does not have time to wait for you to reinvent yourself.

There are practical implications to all this. If you are an intellectual, the sort of person whose work consists of generating and implementing ideas, then understand that you are working against time. Figure out the most important intellectual problem you think you can help solve and make sure you spend your thirties doing that. Your fifties and sixties are for teaching, judging, managing, leading, and dispensing wisdom. Your teens and twenties are for gaining skills and locating the problems that matter to you. Your thirties are for solving them.

Public intellectuals who do not wish to transition in their forties from the role of thinker to that of mentor or manager are going to have a harder time of it. Optimizing for long-term success means turning away from victory at its most intoxicating. When you have reached the summit, the time has come to descend and start again on a different mountain. There are plenty of examples of this—Francis Fukuyama comes to mind as a contemporary one—but it is the harder path. For some, this will be a path worth taking. For others, wisdom is found in ceding the role of public intellectual to younger upstarts and moving on to more rewarding positions guiding the next generation of intellectual lights.

—————————————————————————————
If you would like to read some of my other jottings on psychology, you may find the posts “Historians, Fear Not the Psychologist,” “Public Opinion in Authoritarian States,” and “Taking Cross Cultural Psychology Seriously” of interest. If writing on intellectual life is more up your alley, consider “Questing for Transcendence,” “Book Notes–Strategy, a History,” “I Choose Hannah Arendt,” and “On the Angst of American Journalists” instead. To get updates on new posts published at the Scholar’s Stage, you can join the Scholar’s Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.
—————————————————————————————

[1] I’ve forgotten who, and did not bother saving the tweet—if you know who it is, sound off in the comments.

[2] Carol Sigelman and Elizabeth Rider, Lifespan Human Development, 6th ed (Belmont, CA: Wadsworth Learning, 2009).

[3] John E Dowling, Understanding the Brain: From Cells to Behavior to Cognition (New York: W. W. Norton & Company, 2018).

[4] John Horn and Raymond Cattell, “Age differences in fluid and crystallized intelligence,” Acta Psychologica (1967), vol 26, 107-129. For a very strong counter-statement that argues this fluid v. crystal distinction does not match the complexity of the data, see Joshua Hartshorne and Laura Germine, “When Does Cognitive Functioning Peak? The Asynchronous Rise and Fall of Different Cognitive Abilities Across the Life Span,” Psychological Science (2015), vol 26, iss. 4, 433–443.


[5] Stephen Vaisey and Omar Lizardo, “Cultural Fragmentation or Acquired Dispositions? A New Approach to Accounting for Patterns of Cultural Change,” Socius: Sociological Research for a Dynamic World (2016), vol 2 .


36 Comments

Or maybe time gives us some perspective and weeds out those who don't quite make the cut. After reading Marcus Aurelius' Meditations, for example, the YouTube pontifications of Jordan Peterson seem a bit lacking.

Jordan Peterson was also uppermost in my mind from the first paragraph. I never agreed with everything he said, but when he first broke I really enjoyed a lot of his videos, whereas now I find his takes and general persona so consistently embarrassing I wonder how I could ever have seen anything in him. The same’s fast happening with the Weinstein brothers. I’m tempted to think none of them had anything to offer in the first place, but that’d be reductive – I think this article’s explanation is better. It’s possible to contain a lot of good and a lot of bad.

(Can relate to you preferring to go to the source with Marcus Aurelius too – one of the main things I enjoyed with Jordan was the amateur mythological/psychological/religious/Darwinian stuff, but these days I prefer to get my spiritual fix from Buddhist & Taoist authors.)

Perhaps the ones who fade and spend their lives reiterating their perspective are more public than intellectual.

You have an interesting point here, but defaulting to a theory of brain cell loss shows an ageist bias. If you think this theory is true, then you should seek out studies correlating superior brain cell count with creativity, independent of age.

Consider an alternate theory: Creativity in a field is a function of time in that field. As you point out, some disciplines (say, math) have their greatest advances due to creativity, while other disciplines (e.g. history) have their greatest advances due to synthesis and perspective. In other words, you're more likely to be a great mathematician in your first 10 years in the field, while you're more likely to be a great historian after at least 20 years in the field. Of course, it is also true that time in the field correlates to age. But the cause-and-effect here is about time in the field, not age.

This alternate theory says that great mathematicians are young not because they're young, but because they're new at math. It happens to be unlikely that someone starts a serious career in mathematics at age 50, but if a person did so, they would have an equal chance to a young person at making great advances during the first 10 years of their career, from age 50 to age 60. The fact that this doesn't happen often can be entirely sociological rather than just biological.

I'm not asserting the alternate theory is undoubtedly true. There is science backing both theories. I'm just saying that the fact that you default to the brain cell theory shows that you have an ageist bias that you should reexamine.

The Prison Mathematics Project seems to be supportive of your theory: the founder didn’t start serious mathematical studies until his 30s, yet with good mentorship from an Italian professor he was able to publish original research in a peer-reviewed journal. All while in prison, no less.

This is an interesting piece. I’m just writing in to let you know that you spelled Steve Vaisey’s name wrong.

Similarly, movie comedians like Jerry Lewis, Robin Williams, Eddie Murphy, Jim Carrey, and Will Ferrell typically have about a half decade at the top before audiences start getting tired of them.

After a while, the public begins to figure out their shtick and anticipate it, so even the most brilliant talents no longer seem surprising.

A complementary perspective: Some people have one big, attractive idea in their lives. Once they have used up this idea, they can do one of two things: work with new ideas (novelist example: Nick Hornby) or repeat themselves (novelist example: Charles Bukowski).

If they use a new idea, chances are the new one is not going to be as attractive as the old one.

If they keep repeating the old idea, people are going to get tired of it, as per Steve Sailer's comment above (Perhaps especially so because the most attractive versions of the big, attractive idea have been used first).

In both cases, a decline in popularity.

"Penguin will pay you a hefty advance for your next two hundred pages of banal boilerplate; they will not pay you for two or three years of archival research on some narrow topic no one cares about."

That's very interesting. This makes me want to know how much value there would be in tackling some new subject with no experience, vs. tackling it with a lot of experience in a different field.

My naive expectation is that sending someone who worked completely through a previous insight to tackle a new one would generally be more thorough and produce fewer errors; as a side effect I expect getting to the broader insight faster. I separately expect someone with a lot of experience in an adjacent subject to be able to understand context faster and build on their previous insights more or less directly.

Assuming I am right, what would that be worth? Penguin might not pay, but would a university or corporation? What about the DoD or State Department?

T Greer: Analytic brilliance is not constant over the course of life. Both general intelligence and more nebulous measures of creativity have clear peaks over the course of a lifespan.

Alternative hypothesis: Simple mean reversion. Test by comparing prominence over time to age at first prominence. If it's governed largely by time since prominence and not age at first prominence, likely simple mean reversion is at play.

Do younger prominent public intellectuals seem smarter? Are the young, Woke lumpenintelligentsia set qualitatively better? They mostly seem callow, petty, foolish and shallow, more insulated from reality and society as it is, not folk who see it with more acuity. I would guess that if you identified a young public intellectual, they would not have more staying power or productivity if we could wave a wand and, Peter Pan-like, they retained neurological youth.

And then there are public intellectuals whose (ir)relevance has stayed about constant all their life: Chomsky for example. Still they seem to gather fans.

What about biologists? Many seem to be prolific even to a very ripe age. Maybe they are comparable to historians, because they synthesize vast amounts of data (thus "natural history").

Then, in the computing science, people who have invented (successful) programming languages (ones based on new paradigm), have almost as their duty to keep on developing that specific platform, instead of concocting even more languages. This could be compared to Tolkien, whose "duty" must have been to complete the legends of the Middle Earth, instead of trying to radically "renew" his writing career. ("Oh please, do not write about those hobbits and elves anymore!") But maybe he was a kind of historian as well?

Maybe there's also an aspect that "public intellectuals" who are journalists or even philosophers have much less leeway for their creativity than say poets, composers/musicians, visual artists or mathematicians, who are much less bridled by the banalities and zeitgeist of the contemporary human society?

It would be nice to know more examples of people who have stayed truly creative right to the end? Beethoven and Dostoyevsky for example?

Anon says: "Penguin might not pay, but would a university or corporation?"

Academics often *do* do this. Think of the university professors who switch to a truly novel research program once they gain tenure!

Kartturi says:

"Maybe there's also an aspect that "public intellectuals" who are journalists or even philosophers have much less leeway for their creativity than say poets, composers/musicians, visual artists or mathematicians, who are much less bridled by the banalities and zeitgeist of the contemporary human society?"

And yet the same holds true for artists as well. Very few rule their scenes for multiple decades at a time. We remember those who did that because they tend to be the very best. Even Shakespeare in his late stage was a superior playwright. But he comes from such a high that he could not help but be.

Other anon said:

"Test by comparing prominence over time to age at first prominence. If it's governed largely by time since prominence and not age at first prominence, likely simple mean reversion is at play."

But what of those who rise and rise and then fall? Common arc. LemmusLemmus' comment above strikes me as the best alternative explanation so far.

"Do younger prominent public intellectuals seems smarter?"

Than Thomas Friedman? Heavens yes.

Of course folks will have their partisan favorites. But there is a reason the people who rule the roost at NRO and the now-defunct Weekly Standard had their moment c. 2000 but struggle to develop anything useful or attractive to a conservative under 45 today.

ginsudo-

"If you think this theory is true, then you should seek out studies correlating superior brain cell count with creativity, independent of age"

I do not see why that would be true. Does a Ferrari have more parts in its engine than a pick up truck? Extra engine parts might not make a car go faster… but you take enough of the car parts out, and performance will decrease.

Anon 1 said:

"Or maybe time gives us some perspective and weeds out those who don't quite make the cut"

Undoubtedly! But you had the chance to read Aurelius before you read Peterson. So why did he strike such a chord?

Aurelius speaks to all mankind. Peterson had something to say to the young men of 2018-19 specifically. Which is part of what makes a public intellectual public, I suppose. Speaking directly to the problems of their day, not all days.

T Greer: there is a reason the people who rule the roost at NRO and the now-defunct Weekly Standard had their moment c. 2000 but struggle to develop anything useful or attractive to a conservative under 45 today.

Sure, but those seem as compatible or more with simple mean reversion in productivity and perhaps a dose of cohort ingroup loyalty (and cohort differences in needs). As much as with explanations based on older people generally simply not being able to produce as many good ideas for neurological reasons, or as being too insulated to do it.

I suppose the difference in emphasis here is part of a wider disagreement with the thrust of the world that your hypothesis seems to support, where:

Stable succession where people have structured, planned lifecycle roles and lifepaths. Success means achieving particular milestones on a particular schedule and leaning into the inevitabilities of a particular rhythm of life. Older people accept making way for the young, inculcating them carefully into a shared 'canon' of ideas and tradition, lining them up for roles as prominent defenders of the status quo (and not coincidentally, can pay off their mortgages!).

That just seems so… unnecessary, stodgy and lacking in vibrancy and competition, compared to a more liberal view of intellectual society where every individual simply "fights it out" for their own place in a chaotic marketplace of ideas, yielding nothing, hanging on as long as they can and defending their ideas and role in that sphere until their death. (Perhaps that feels frustrating in an age when the older cohort is relatively large and just as well educated as the younger and where it is expensive to build in big cities! But this age will pass.)

T Greer: I do not see why that would be true. Does a Ferrari have more parts in its engine than a pick up truck? Extra engine parts might not make a car go faster… but you take enough of the car parts out, and performance will decrease.

Sure, but brains are not cars (or watches), made for market with a careful engineered minimal level of functional redundancy, to a precisely engineered specification. Neuroplasticity can route efficiently around loss, and pruning over time can increase efficiency.

I don't know- I read National Review/NRO and the Weekly Standard for many years, and their ability to say anything to me drained away at a steady clip over the period 2003-15. I'm 49 now, so in their demographic.

I quite agree with the idea that Aurelius speaks to history and Peterson to a subset of a finite moment. I admit I am not too familiar with Peterson other than his original claim to fame in Canada in which he questioned the scope of individual ability to define material reality. Or, to be less combative, questioned the right of individuals to choose their pronouns. But his later role as a gadfly seems useful mainly to remind people of the entirety of the western intellectual tradition, including Aurelius, who were in danger of not being aware of its value or even existence.

That he would need to do so, or that it would be so controversial, showed me how far we had come from that early 90s moment when Marcus Aurelius was all the rage again, republished in multiple new editions, talked about positively in New York papers and magazines, and considered a potential voice to late modernity as he was foundational to late antiquity.

It also made me wonder if anyone in his time considered his maxims tedious, derivative, or patronizing. His audience had all of stoicism to read already.

@Kartturi

Interesting you bring up Chomsky, because he's the first name that came to mind as an example of what T. Greer is talking about (I am speaking to his foreign policy views here, not linguistics, of which I've read very little of his work). I'd argue that he only existed as a true public intellectual with new ideas and a broad audience (i.e., not only countercultural leftists) from the mid 60s, when he led Harvard/MIT faculty in taking a stand against the Vietnam War, through the mid 70s, the point at which 60s radicalism had run its course. After that, he became a niche figure who gradually developed a cult following. Obviously his books sold well and he always had an audience on campuses in the 80s and 90s, but mainstream institutions ceased to take him or his ideas seriously by 1980 (compare his prominence nationally re: the Vietnam War against that re: US involvement in Central America in the 80s). Since that time, he has been saying approximately the same thing, using the same analytical framework and providing the same critiques. Not a statement on whether he's right or wrong, but everyone who read Chomsky knew what they were getting by the time Necessary Illusions came out.

I say this as an American, though; maybe his prominence was longer-lasting in Israel or Europe.

Rock stars have fairly short shelf lives for creating famous original songs. After a while, the novelty wears off.

Most famous individuals bring a new perspective that is theirs alone. That's pretty impressive. But, if they are good at getting famous, pretty soon it's not a new perspective.

It all depends, of course, on what SORT of public intellectual you are.

The three models of 'public intellectual' that I have found that have had any currency across time and culture are these:

1. The Prophet, speaking outside the gates of the Court, speaking truth to power.

2. The Court Scholar, justifying the actions of the Court, and those in power.

3. The Court Jester.

I have noted that the few 'public intellectuals' and Prophets who have their own 'fuck you money' (like Nassim Nicholas Taleb), or their own academic niche (like Noam Chomsky) tend to have long shelf lives. They also tend to remain intellectually productive. Odd, that.

It's the Court Scholars you have to watch. You know, the ones who take the coin of the Court. It is odd to note the way that they tend to morph, and rather rapidly, too, into Court Jesters.

I like the Hornby example above. He wrote 3 books based on the same character idea (improving each time). Then he tried new stuff and became boring.

All the reasons mentioned in your post are probably true, but I think you're missing one additional, simpler, explanation: regression to the mean. Producing original intellectual works requires creativity and analytical powers, but also one's share of luck. I think it is an often underestimated part of the work of an intellectual, an explorer of ideas who can find a nugget of gold in his garden, or find only mud travelling across the continents.
This regression to the mean, combined with the loss of creativity, stimulation, and incentives, gives, I think, a reasonable explanation to the short shelf-life of public intellectuals.

In a great many cases it is simply that the emperor has been shown to be naked, and public reverence has shifted to the next fad. Marcus Aurelius may be a voice for the ages, but he cannot fill every month's literary supplement.

"The Friedman of the 20th century rushed to the scene of war massacres; the Friedman of the 21st hurries to conference panels. What hope does a man living this way have of learning something new about the world?"

This is very insightful. Explains Robert D. Kaplan, as well.

I'm not convinced by the (mainly) neurological explanation, although that obviously plays a role. "Public intellectuals" tend to be stupider than real intellectuals, because they are selling monocausal and popular theories to a willing audience. Over 15 years or so the political economy changes and the audience dries up. Thomas Friedman is a good case: he sold globalization daydreams to wannabe masters of global flows in the early 2000s, but after 2008 it became clear that only a tiny number would become masters of the 'flat world', so his audience disappeared.

Real intellectuals by contrast, a Weber, a Marx, a Hayek, have complex world views fitted to the actual complexity of social systems. They don't disappear as they age (or die) because passing fads and economic booms and busts don't date the imaginative syntheses they have achieved.

Turchin, within his niche, has remained relevant by making a ten-year prediction in 2010 that is relevant to today. LOL

Too bad it's not a happy prognosis 🙁

Great article. Another thing I’ve noticed is that fame itself warps public intellectuals. There’s the living-in-a-bubble effect you describe, but there’s also the way they react to their audience and the audience reacts to them. Once you’ve gone from having a mind of your own to becoming the spokesperson for a certain viewpoint, you end up publishing more and more takes that pander to that viewpoint. Your fans love it, your critics hate it, and suddenly you’re all about pleasing the fans and getting a rise out of the critics. It seems to be incredibly difficult not to get more and more bitter and reactive – and consequently simplistic – the more you go on and the more pushback you generate. Scott Alexander writes about how publishing anti-woke takes has become low-hanging fruit now, and to his immense credit he’s avoided going down the rabbit hole of doing that and nothing but that. It’d drive up his traffic no end, but it’d also oversimplify his approach and destroy his independence of perspective.

As I read this post, I began to ponder what is required to be regarded as an intellectual with fresh ideas. Fresh ideas that seem to fade with time. That smacks of faddishness. Humans, Homo sapiens, have been around for 230,000+ years. We are still working out a concept of reality. Who are we and where are we (and maybe are we real) in an infinite universe with laws of physics that seem to change as we peer closer to atomic matter and light. There are no answers yet and maybe there never will be any basic answers to life. So, for now we will keep following the latest ideas and listen to the babble. Time and the evolution of all things will continue. We must try to understand ourselves, the essence of being human, and take our seats at the table of life with each other, and chill out.

I read From Beirut to Jerusalem shortly after it was published. And I attended lectures by Friedman on Israel and the Middle East in the 90s. I remember talking to him after one such session shortly before the election of Ehud Barak as Israel’s PM in 1999. He was enthusiastic about Barak and the prospect of peace with the Palestinians. Within 18 months, the peace negotiations at Camp David collapsed, the Second Intifada began, and it was clear that Friedman was just blowing smoke. I was done with him well before 9/11.

Always fun to think of examples of people who this is true/not true for.
Cervantes is a good one – he invented most things that were going to happen in literature for the next 400 years in his 50s.
Andre 3000 is also a great example of someone who’s taken the exact opposite path to Friedman – saw that his time at the vanguard was up and stepped away.