Author: John McGowan

Great Books 2: Institutions, Curators, and Partisans

I find that I have a bit more to say on the topic of “great books.” 

The scare quotes are not ironic—or even really scare quotes.  Instead, they are the proper punctuation when referring to a word as a word or a phrase as a phrase.  As in, the word “Pope” refers to the head of the Catholic Church.  The phrase “great books” enters into common parlance with University of Chicago President Robert Maynard Hutchins’s establishment of a great books centered curriculum there in the 1930s.  From the Wikipedia page on Hutchins: “His most far-reaching academic reforms involved the undergraduate College of the University of Chicago, which was retooled into a novel pedagogical system built on Great Books, Socratic dialogue, comprehensive examinations and early entrance to college.”  The University of Chicago dropped that curriculum shortly after Hutchins stepped down in the early 1950s, and St. John’s College is now the only undergraduate institution in the country with a full-bore great books curriculum.  Stanford and Columbia retained general education requirements for first- and second-year undergraduates that were heavily slanted toward great books well into the 1990s, but both have greatly modified those requirements in the 21st century.

These curricular issues are central to what I want to write about today.  “Literature,” Roland Barthes once said, “is what gets taught.”  It is very hard even to have a concept of “great books” apart from educational institutions, from what students are required to read, from what a “well-educated” person is expected to be familiar with.  As I wrote a few posts back (https://jzmcgowan.com/2023/07/31/americans-are-down-on-college/), we in the United States seem now to have lost any notion of what a “well-educated” person is or should be.  The grace notes of a passing familiarity with Shakespeare or Robert Frost are now meaningless.  The “social capital” accruing to being “cultured” (as outlined in Pierre Bourdieu’s work) has absolutely no value in contemporary America (apart, perhaps, from some very rarefied circles in New York). 

I am not here to mourn that loss.  As I said in my last post, aesthetic artefacts are only “alive” if they are important to some people in a culture.  Only if some people find that consuming (apologies for the philistine word) an artistic work is fulfilling, even essential to their well-being, will that work avoid falling into oblivion, totally forgotten as most work of human hands (artistic or otherwise) is. 

Today, instead, I want to consider how it is that some works do survive.  I think, despite arguments running from Hume to the present, that the intrinsic greatness of the works that survive is not a satisfactory explanation for their survival.  More striking to me is that the same small group of works (easily read within a four year education) gets called “great”—and how hard it is for newcomers to break into the list.  For all the talk of “opening up the canon,” what gets taught in America’s schools (from grade school all the way up through college) has remained remarkably stable.  People teach what they were taught.

Yes, Toni Morrison, Salman Rushdie, and Gabriel Garcia Marquez have become “classics”—and are widely taught.  But how many pre-1970 works have been added to the list of “greats” since 1970?  Ralph Ellison and Frederick Douglass certainly.  James Baldwin is an interesting case because he has become an increasingly important figure while none of his works of fiction has become a “classic.”  On the English side of the Atlantic, Trollope has become more important while Shelley has been drastically demoted and Tennyson’s star is dimming fast.  But no other novelist has challenged the hegemony of Austen, the Brontës, Dickens, and Eliot among the Victorians, or Conrad, Joyce, Hardy, and Woolf among the modernists.  The kinds of wide-scale revaluations of writers that happened in the early years of the 20th century (the elevations of Melville and Donne, for example) have not happened again in the 100 years since.  There really hasn’t been any significant addition to the list (apart from Douglass and barring the handful of new works that get anointed) since 1930.

I don’t deny that literary scholars for the most part read more widely and write about a larger range of texts than such scholars did in the 1950s and 1960s.  (Even that assumption calls for caution.  Victorian studies is the field I know best, and the older scholars in that field certainly read more of the “minor” poets of the era than anyone who got a PhD in the 1980s or later ever does.)  But the wider canon of scholars has not trickled down very much into the undergraduate curriculum.  Survey courses in both British and American literature prior to 1945 are still pretty much the same as they were fifty years ago, with perhaps one or two token non-canonical works.  More specialized upper-level courses and graduate courses are sometimes more wide-ranging.  Most significantly, the widening academic canon has not moved into general literate culture (if that mythical beast even exists) at all.

The one place where all bets are off is in courses on post 1945 literature.  No canon (aside from Ellison, Morrison, Rushdie, Baldwin) has been established there, so you will find Nabokov read here and Roth read there, while the growth of “genre courses” means Shirley Jackson and Philip K. Dick are probably now taught more frequently than Mailer or Updike or Bellow.  Things are not as unstable on the British side, although the slide has been toward works written in English by non-English authors (Heaney, Coetzee, various Indian novelists alongside Rushdie, Ondaatje, Ishiguro).

Much of the stability of the pre-1945 canon is institutional.  Institutions curate the art of the past—and curators are mostly conservative.  A good example is the way that the changing standards brought in by Henry James and T. S. Eliot were not allowed (finally) to lead to a wide-scale revision of the “list.”  Unity and a tight control over narrative point of view formed the basis of James’s complaints against the Victorians.  The rather comical result was that academic critics for a good thirty years (roughly 1945 to 1975) went through somersaults to show how the novels of Dickens and Melville were unified—a perverse, if delightful to witness, flying in the face of the facts.  Such critics knew that Dickens and Melville were “great,” and if unity was one feature of greatness, then, ipso facto, their novels must be unified.  Of course, the need to prove those novels were unified showed there was some sub rosa recognition that they were not.  Only F. R. Leavis had the courage of his convictions—and the consistency of thought—to try to drum Dickens out of the list of the greats.  And even Leavis eventually repented.

The curators keep chosen works in public view.  They fuss over those works, attend to their needs, keep bringing them before the public (or, at least, students).  Curators are dutiful servants—and only rarely dare to try to be taste-makers in their own right. 

I don’t think curators are enough.  The dutiful, mostly bored and certainly non-passionate, teacher is a stock figure in every Hollywood high school movie.  Such people cannot bring the works of the past alive.  For that you need partisans.  Some curators, of course, are passionate partisans.  What partisanship needs, among lots of other things, is a sense of opposition.  The partisan’s passion is engendered by the sense of others who do not care—or, even more thrilling, others who would deny the value of the work that the partisan finds essential and transcendentally good.  Yes, there are figures like Shakespeare who are beyond the need of partisans.  There is a complacent consensus about their greatness—and that’s enough.  But more marginal figures (marginal, let me emphasize, in terms of their institutional standing—how much institutional attention and time is devoted to them—not in terms of some kind of intrinsic greatness) like Laurence Sterne or Tennyson need their champions.

In short, works of art are kept alive by some people publicly, enthusiastically, and loudly displaying how their lives are enlivened by their interaction with those works.  So it is a public, communal thing—and it depends heavily on admiration for the effects displayed by the partisan.  I want to have what she is having—a joyous, enlivening aesthetic experience.  Hume, then, was not totally wrong; works are deemed great because of the pleasures (multiple and many-faceted) they yield—and those pleasures are manifested by aesthetic consumers.  But there is no reason to appeal to “experts” or “connoisseurs.” Anyone can play the role of making us think a work is worth a look, anyone whose visible pleasure has been generated by an encounter with that work.

The final point, I guess, is that aesthetic pleasure very often generates a desire to be shared.  I want others to experience my reaction to a work (to appreciate my appreciation of it).  And aesthetic pleasure can be enhanced by sharing.  That’s why seeing a movie in the theater is different from streaming it at home.  That’s why a book group or classroom discussion can deepen my appreciation of a book, my sense of its relative strengths and weaknesses, my apprehension of its various dimensions. 

So long as those communal encounters with a work are happening, the work “lives.”  When there is no longer an audience for the work, it dies.  Getting labeled “great” dramatically increases the chances of a work staying alive, in large part because then the institutional artillery is rolled into place to maintain it.  But if the work no longer engages an audience in ways close to their vital concerns, no institutional effort can keep it from oblivion.

Great Books?

I am currently facilitating a reading group that began with the goal of revisiting the literary works the group members read in a “great books” course forty years ago.  The original (year long) syllabus will be familiar to anyone who knows the traditional canon of the Western literary tradition: Homer, the Bible, Sophocles, Virgil, Augustine, Dante, Shakespeare, Milton, Goethe, Dostoyevsky, Joyce.  Forty years ago, there was not even a token attempt to include a non-white, non-male author—and the absence of such authors went entirely unmarked, was not on the radar screen as it were, and thus was not even considered something worth noticing or contemplating.

In the course of revisiting the class all these years later, it is not surprising that the group has felt the need to supplement the original list with works by Virginia Woolf, Toni Morrison, Jesmyn Ward, Sandra Cisneros, and Salman Rushdie.

In reading those works, some members of the group have declared (in certain, not all, instances) that the books don’t meet the standard of a “great book”; some of the books have been deemed interesting, informative, worth reading perhaps, but not “great.”

Which raises the vexed question of the standard for greatness—and that’s the topic of this post.

My first—and biggest—point is that I find the whole enterprise of deciding whether something is great or not unproductive.  It rests on the notion of a one-size-fits-all, absolute standard that is more detrimental than helpful to the appreciation of an aesthetic (or any other kind of) experience.

Is Italian cuisine “better” than Chinese cuisine?  I trust you see the absurdity of the question.  You certainly can’t appreciate the Italian meal you are eating if you are comparing it to a Chinese meal.  And, in the abstract, the general question of which cuisine is “better” is nonsense.  There is no proper answer to the question because it lacks all specificity.

Judgments of better or worse are always in relation to some standard, some criteria, of judgment.  In his book A Defense of Judgment (University of Chicago Press, 2021), Michael Clune keeps scoring cheap points by telling us that Moby Dick is better than The Apprentice.  The examples hide the absurdity of the claim.  If he insisted instead that Moby Dick is better than The Sopranos, he would almost certainly generate the kind of objection that could force him to justify his claim.  According to what criteria is Moby Dick superior, and in relation to what purposes?  Are there no contexts at all where I would prefer watching The Sopranos to reading Moby Dick? Are there specific things The Sopranos does better than Moby Dick? Am I always choosing the lesser (thus revealing my debased tastes) when I watch the show?  Would the world be a richer and “better” (that word again!) place if it only had Moby Dick in it and not The Sopranos?

I hope that makes it clear that the rank ordering of various aesthetic works is not just unhelpful, but needlessly restrictive, tending toward the puritanical.  Furthermore, it is a category error.  To respond to diversity (that there are multiple cuisines, that there are many aesthetic objects, and that they come in different genres and employ different media) by ranking all the instances it offers on one scale is to miss the pluralistic plenitude of the world.

So, the standard bearer always cries at this point, does that mean anything goes?  Are we doomed to drown in the sea of relativism? The bugbear of relativism (the contortions that writers who long to be considered “serious” go through to avoid being accused of relativism) never fails to astound me.  I hope to address these fears—akin to a “moral panic” in their intensity—in a future post.  Suffice it for now to say that relativism is trivially true.  You cannot aspire to be the world’s greatest baseball player if you grow up in first century CE Rome or in contemporary Malawi.  Your aspirations are relative to context.

What does that say about aesthetic standards?  First (again, trivially true) is that such standards shift over time.  Until 1920, general opinion was that Uncle Tom’s Cabin was a better book than Moby Dick.  From 1920 to 1980, you would have been considered a complete philistine to prefer Stowe’s novel to Melville’s.  Currently, a more pluralistic ethos prevails.  If you are considering a novel that successfully moves an audience to tears and outrage about a social injustice, then Uncle Tom’s Cabin is the ticket.  For more abstract musings on the meaning of life, Moby Dick is a better bet.  If you want a “tight,” well-structured gem of a more minimalist nature, not one of the “loose baggy monsters” that Henry James disparaged, then neither Stowe nor Melville is going to fit the bill.

Judgment, then, of a work’s quality will be relative to the standard you are applying to the work.  And also relative to the purpose for which the work was written and the purpose for which the consumer is coming to the work.  When making up a syllabus of 19th century American literature, to exclude Stowe (and, for that matter, Frederick Douglass), as was standard practice for well over fifty years, is to offer a very truncated vision of the American scene from 1840 to 1870.  Allowing some vague, unspecified notion of “better” to justify the inclusion of Emerson, Thoreau, Hawthorne, Whitman, Dickinson, and Melville, along with the exclusion of Stowe and Douglass, is not only to miss important cultural works, but also to renege on the intellectual responsibility to be self-conscious about the standards that govern one’s judgments (and the choices that follow from those judgments, along with the consequences of those choices).

OK.  Let me try to get concrete.  The whole “great books” thing, with its (most likely inevitably futile) attempt to impose “standards” on the benighted tastes of one’s contemporaries, always arises in moments of what we nowadays call “culture wars.”  One famous instance is the quarrel between “the ancients and the moderns” of the late 17th and early 18th century.  More relevant to us today is the modernist revolt against the Romantics and the Victorians.  T. S. Eliot was a central figure here, promulgating a “classicist” aesthetic standard that valued austere, non-sentimental, tightly formed, stringently intellectual (and hence non-emotional and non-personal) works over what he deemed the sloppy, sentimental, and overly rhetorical (i.e., trying to persuade the audience of some moral or political or otherwise sententious “truth”) art of the 19th century.  That the works Eliot championed were “difficult” was a feature, not a bug.  The world was awash in easy, popular art—and “high art” had to be protected from the danger of being dragged into that swamp. 

What Eliot was trying to produce was nothing less than a sea change in sensibility.  He wanted to change what audiences liked, how they responded to aesthetic objects.  Henry James (as we have already seen) was engaged in the same enterprise.  The modernist painters offer a particularly clear case of this enterprise.  Works that in 1870 were deemed “barbarous” were declared masterpieces by 1910.  (Van Gogh, who sold only one painting in his lifetime, unhappily did not live to bask in this radical revaluation, this shift in criteria of judgment, in the world of visual art.  Cezanne, to a somewhat lesser extent, also died a few years too early.)

The shift in sensibility was wonderfully summed up (in his usual pithy manner) by Oscar Wilde when he said “it would take a heart of stone to not laugh at the death of Little Nell.” (Translation: the death of Little Nell in Dickens’s The Old Curiosity Shop—which gets dragged out over numerous pages—famously moved readers to tears on both sides of the Atlantic.)

So, in short, we do have a set of aesthetic standards, promulgated by the modernists, that led to the elevation of Melville over Stowe (among many other revaluations) and that can be specified (especially when considered as negations of some of the prevailing features of “popular art”—works which, like loose women, are castigated for being “too easy”).

A list of great books, then, can be a destroyer of diversity.  (“Eleanor Rigby,” judged by criteria of profundity and musical complexity, is a “greater” song than “When I’m 64,” but don’t we want a world in which both exist and in which we listen to both?)  And such a list relies on a fairly one-dimensional set of criteria that belies the imaginative plenitude that the arts provide.  When this narrowing work is combined with the notion that all judges of any taste know instinctively what a great work looks and tastes like, without any need to spell out the grounds for their judgment, we have a specific sensibility parading as universal.  (Which, not surprisingly, mirrors the objections of women and people of color about their experiences in “unmarked,” male-shaped spaces.  There are unwritten, even unconscious, norms of behavior in such spaces that are not recognized as one alternative manner among others but are instead treated as universal.)

Does this mean all judgments of “better” or “worse” are off the table? No.  It simply means that an aesthetic work (or a meal in a Chinese restaurant for that matter) should be judged according to the criteria that guided its making.  I will admit that I find much politically motivated visual art deeply flawed.  But that is not because I have some aestheticist notion that art is always ruined by being political (another of the modernist shibboleths).  I reject any such absolute, universalist standard that says art can only do this and not that.  Rather, I think it is particularly difficult for the visual arts to make statements; they don’t have the same resources for statement-making available to novelists, poets, and film-makers (to name only three). 

Does this mean that visual artists should all eschew making works that aim at some political point? No.  Successfully doing something that is very difficult is often the hallmark of an important artist, one worth paying attention to.  The role of the audience is, in this view, to grasp what the artist is trying to accomplish—and to judge how successfully the artist accomplished that goal. Given similar goals, some artists do better work than other artists–relative to that goal.

Two last points and I am done.

The first relates to acquired taste.  An aesthetic education is always a process of learning how to appreciate, in the best case to enjoy, aesthetic objects that, at first encounter, are too different, difficult, foreign, or unfamiliar to grasp.  This process of education is midwifed by others (friends, lovers, teachers) who deeply appreciate some works of art and long to convey that appreciation to another.  The means to that sharing is a heightened apprehension of the particular features of the particular work.  The mentor guides the neophyte toward “seeing” what is there.  The one who appreciates illuminates the work, shows the newcomer what it contains that is to be valued.  People who are especially good at this work of illumination are the truly gifted teachers and critics. 

In my ideal English department (for example), the staff would include a medievalist to whom the works of that period are endlessly fascinating and enjoyable—and that professor would be a success if she communicated that enthusiasm, that appreciation, to students who entered college with no idea that there was a vastly rich repository of medieval literature to encounter and learn to love. There would be no need to disparage some works as inferior in order to champion some as deeply pleasurable and worth reading along any number of criterial dimensions. 

And that brings me to my second—and last—point.  There is absolutely no doubt that various works have been aided in the perpetual effort to escape oblivion by institutional support and inertia.  Wordsworth becomes part of the curriculum—and I teach and research about Wordsworth because of the institutional stamp of value.  Literary institutions, like all assemblages of power, work to sustain themselves.  It takes a long time for values to shift in the academy—a shorter time in the market (as witnessed by the shift in taste in painting between 1870 and 1914).  The larger point is that judgments of value do not occur in a vacuum.  There are institutional hierarchies that protect prevailing judgments and only slowly adopt re-valuations.  

Still, institutions are not omnipotent—and they tend toward ossification if they do not draw revitalizing energies from some other source.  All of which is to say that “great books” only remain alive to the extent that some people somewhere still find them of interest, of importance, and worth devoting some time to.  Here’s the last reappearance of relativism in this discussion.  A book can be as “great” as you want to claim it is, but none of its intrinsic features will ensure its survival, its still being read, its not falling into the oblivion that engulfs 99% of the artistic works ever produced.  It will command attention only while some audience finds it worthy of attention.  And that worthiness rests, in part, on the work having institutional prestige and enthusiastic champions, but also (crucially) on an encounter with it being experienced by at least some people as part of living a full and satisfying life.  The work’s survival is relative to an audience that keeps it alive.

Treason—and Trump

I have been working my way (painfully slowly) through Raimond Gaita’s Good and Evil: An Absolute Conception (2nd edition, Routledge, 2004).  It’s a brilliant, fascinating, frustrating, idiosyncratic book.  Amazingly right in places, confoundingly wrong in others—and all over the map.  I hope to write more directly about its main arguments in subsequent posts.

Right now, however, I just want to use what he has to say (in one of his digressive moments) about treason.  Here’s the most relevant passage (for my purposes here, but also for what seems to me his apt understanding of what treason is):

“Treason is a crime against the conditions of political communality.  Traitors, by ‘aiding and abetting’ the enemy of their people, help those who would destroy them as a people.  Or, they deliver their people and the conditions that make them a people—which enable them to say ‘we’ in ways that are not merely enumerative but expressive of their fellowship in a political identity—as a hostage to the improbable good fortune that their enemies will respect their integrity as a people.  Therefore, treason is not essentially, or indeed ever at its deepest, a crime against the state.  It is actually a crime against a form of civic association” (253-54).

To wit: treason threatens the very terms, the very existence, of the civic association that undergirds the state.  In reference to Trump: the crime is not against the state, but against the very conditions that make the state possible.  That is, one crucial term of American civic association is that the winner of an election gets to hold the office for which that election was held.  “We” as Americans can disagree fiercely about all kinds of things, but “we” are no longer a “we” when we do not abide by the results of elections.  The state cannot exist if its office holders are not those who have been duly elected.  There is no political community left if elections are not respected.

Another point: Gaita’s description of treason holds better for Eastman and Clark (and the others in the Georgia indictment) than it does for Trump.  The co-conspirators have aided and abetted the enemy who is aiming to undermine the constitutive civic association.  But Trump is the enemy, not one who aids the enemy.  He aims to destroy the foundational commonality that makes the political entity called the United States possible.

I could get sidetracked into the legerdemain by which Trump and his followers would insist they are not trying to destroy America, but in fact save it (make it great again).  Not worth going down that rabbit hole.  But it is notable that they act in a way that would destroy the civic association, while also aiming to keep its infrastructure intact so that there are offices for them to occupy, state functions that they can take over.  That’s why there is an argument that their efforts were an attempt at a coup, not a full scale treasonous act whose goal was the utter destruction of a polity or of a “people.”

I don’t know if much hinges on deciding whether Trump and his henchmen are guilty of a failed coup attempt or of treason.  In both cases, they are certainly guilty of breaching the constitutive rules of American political and civic life.  They have manifestly failed to uphold and defend the Constitution, as many of them swore to do when they took their various oaths of office.

Another side note: the always cogent Timothy Burke has a blog post in which he wonders how anyone with even a modicum of sense would ever go to work for Trump (whose record of treating his helpers dismally is unambiguous and exists in plain sight).  Burke doesn’t have any good answers; he can only shake his head in disbelief.  Even if Trump is on the rise, no one ever benefits from hitching their wagon to his star.  His narcissism can’t abide sharing his triumphs (and whatever fruits those triumphs yield in the way of money, fame, or power) with anyone.  Of course, Burke’s puzzlement here only echoes the wider puzzlement over the cult of Trump among such a large share of the populace.  This recent CBS poll boggles the mind.  Among Republican voters, Trump is deemed more honest than everyone else in their lives by large margins.  (Even more than intimates, although the gap there is much lower.  Only 8% trust Trump more than their family members.)  So much for Hannah Arendt’s sophisticated take on the general cynicism generated by authoritarians, that is, the notion that everyone knows they are lying but just thinks “everyone lies” and shrugs.  No: the lies are believed; they are deemed the only truth out there.  (See NOTES below for references.)

But back to Gaita.  Because treason (on his take) “undermines . . . the conditions which make it possible for a people to speak as a people, . . . the most fitting (though not, any more, practical) punishment for unrepentant traitors . . . is banishment” (257).

What a lovely thought!  I don’t actually see why banishment is “not, any more, practical.”  Surely we could send Trump abroad (wonderful to think he would flee to Saudi Arabia) and then keep him from re-entering the United States.  In any case, since banishment is not on the table, let’s at least indulge our fancies in thinking how appropriate the penalty would be in Trump’s case.  What he craves is admiration and adulation.  Deprive him of his audience, of the “people” to whom his plea for attention is made, let him fulminate in the emptiness of cyber-space entirely outside of the context (an actual civic association) to which his tweets are addressed.  Delicious.  The punishment would fit the criminal (and, possibly, also the crime) in a Dantesque manner.

NOTES

The Timothy Burke blog post on Trump’s henchmen. 

The CBS poll:  https://www.cbsnews.com/news/trump-poll-indictments-2023-08-20/

Here’s the most relevant finding in that poll, but it’s very much worth looking at the entire poll results (available through the link).

[Image: trump-truth.png (chart of the poll finding referenced above)]

Finally, the relevant Hannah Arendt passage that has been making the rounds over the past eight years as pundits and others try to come to terms with the phenomenon of the Trump cult:

In an ever-changing, incomprehensible world the masses had reached the point where they would, at the same time, believe everything and nothing, think that everything was possible and nothing was true… The totalitarian mass leaders based their propaganda on the correct psychological assumption that, under such conditions, one could make people believe the most fantastic statements one day, and trust that if the next day they were given irrefutable proof of their falsehood, they would take refuge in cynicism; instead of deserting the leaders who had lied to them, they would protest that they had known all along that the statement was a lie and would admire the leaders for their superior tactical cleverness. (From The Origins of Totalitarianism)

Arendt’s take has proved both too sophisticated and too optimistic.  What we have seen instead is that no amount of “irrefutable proof” will lead the cultists to recognize the lie as a lie.  The cultists don’t have to retreat to the redoubt of cynicism.  They just double down on their belief in the original lie—and in the figure who propagates the lies.

Americans Are Down on College

Noah Smith, at his Substack blog Noahpinion, posts poll data that shows a precipitous loss of faith in college among a wide swathe of Americans. (https://www.noahpinion.blog/p/americans-are-falling-out-of-love?utm_source=substack&utm_medium=email)

Here’s the grim chart that sums it all up:

This disenchantment with college is more marked among Republicans—which is no surprise given the profound anti-intellectualism of current-day Republican populism joined to the constant attacks upon universities as citadels of liberalism.  But Democrats also have much less faith in the usefulness of a college education.  Here’s the chart that details the demographic divides on this issue—helpfully giving us the percentage declines since 2015 in the rightmost column.

I am just back from the Tennessee mountains where I was visiting with two friends who are English professors at the University of Tennessee. So the plight of the humanities inevitably came up.  Which isn’t exactly the decline of faith in college tout court, but it is adjacent to that decline.

Anyway, my line was: we no longer have any story at all that we can tell, that feels even remotely plausible, about why someone should be conversant with the cultural heritage represented by the texts of the past.  The only rationale anyone ever advances these days is about skills acquired as by-products of reading: critical thinking, pattern recognition, attention to detail, ability to track complex arguments or emotional states complete with competing points-of-view and ambiguous data etc.

Similar arguments are used to justify instruction in writing.  Vital communication skills and all the rest. 

But even its most ardent practitioners can no longer—in the face of a culture that clearly does not care in the least—make the case for being an educated or “cultured” person, where attaining that status entails familiarity with a cultural heritage marked by certain landmarks, with that familiarity widely shared. 

When I taught at the Eastman School of Music, our dean would often lament that the audience for classical music largely consisted of people over 60.  What would happen when that audience died off? Well, so far, it turns out that the next cohort of 60-year-olds takes its place. 

I am currently experiencing something similar.  I facilitate two different reading groups (with ten participants in each) of people in their 60s who want to read classics. My sixty-year-olds in the two reading groups are hungry for encounters with “great books.”  Some of the books are ones they read in college and want to revisit.  Other selections are books they have always wanted to tackle.  So we have read Homer, Dante, Augustine, Dostoyevsky, Joyce, Woolf, Cather, Morrison, and Cervantes, among others. (Both groups have been going strong for three years.)

Are my readers outliers?  Yes and no.  They are products of the time when most students were liberal arts majors (English, History, Religious Studies, etc.) and then went on to professional careers in business, law, journalism, and even medicine. Like (I would argue) the classical music audience, they had early experiences reading “major authors” (just as the classical music audience had early experiences of learning to play piano or violin and was taken to hear orchestras).  After reaching a certain pinnacle in their professional lives (and after the kids are grown and gone, if they had them), these oldsters turn back to the classics.  Not everyone in their position makes this turn, but a fair number of people do.  They are hungry for the “culture” that they tasted for a while in youth, and now want to revisit it.

The current situation is different because, especially when it comes to books (more so than when it comes to music), the early experience is not on offer.  A certain subset of the population still gets violin and piano lessons.  But fewer and fewer young people are getting Homer, Woolf, or Conrad in either high school or college.  There is no early imprinting taking place.

And what are my readers seeking? In a word: wisdom. They are looking for life lessons, aids to reflecting on their own lives.  They are just about completely uninterested in historical or cultural context, the kinds of things scholars care about.  So the humanities have an additional problem in the context of the research university which is supposed to “produce knowledge.” Why does a society want knowledge about the cultures of the past (its own culture and other cultures) and about its highlighted landmarks?  We humanists don’t have a good answer to that one when faced with the general indifference.  We can echo the complaints of Matthew Arnold about the philistines who prevail in our society, but we lack his faith that “culture” has something precious to offer that society.  And certainly even an attempt to activate Arnold’s vision of “culture” would have little relation to what counts as “scholarship” in the contemporary university. Arnold, too, was mostly focused on gleaning wisdom (“the best that has been thought”) from the classics–although he also hoped that attention to “culture” could provide a “disinterested,” reflective place to stand that would mitigate partisan wranglings. Even in 1867, that last one seemed pretty laughable, and certainly naive.

Still, there was a time when offering wisdom, or paths to maturation, or lessons in the practices of reflection was valued as something college could (and should) do.  But such vague values carry no weight in our relentlessly economic times.  Starting in the 1980s (“greed is good”), when the gap between economic winners and losers began to widen and it also became clear that there was wealth beyond previous imaginings for the winners, return on investment became all.  The decline of support for college is pretty directly tied to a cost/benefit analysis that says the economic pay-off of a college degree has declined.

The facts of that matter are complex.  Overall, it’s still a winning economic strategy to get a college degree.  It is even unclear whether a degree in a “practical” major like business or health services carries a better economic return than a liberal arts degree in history or literary studies.  Determining the facts of this matter is complicated by the extent to which one’s social/economic starting point influences the eventual outcomes, along with where one’s degree is from (given the extreme status hierarchy in American higher education).

But it is simply wrong that college is an economic loser.  So why the decline in faith in college? First, the upfront costs are now so much higher than they once were.  People go into debt to get a college degree—and the burden of that debt weighs heavily on them precisely when they are setting out, in the most vulnerable, least remunerative years of their working lives. 

Second, the relentless attack on what is taught in college by the right-wing outrage machine. The strong decline since 2015 registered in the Gallup poll is much stronger among the groups (Republicans and those without a college degree) most susceptible to right-wing propaganda.

But we should recognize that the right-wing attack exists alongside a wider and growing sense that college’s sole purpose is job preparation. As a result, much of the traditional college curriculum simply seems beside the point, a waste of time.  The degree is what matters; the pathway to that degree is now deeply resented by many students.  It is experienced as a pointless, even sadistic, set of obstacles—and the sensible course of action is to climb over those obstacles in the most efficient way possible. (Hence the epidemic of cheating, and the documented increase in the numbers of students who think cheating is acceptable.)  What is offered in the classroom is experienced as having no value whatsoever.  The only value resides in the degree—a degree that is only slightly (if at all) connected to something actually learned (whether that be some acquired skills or something more nebulous like wisdom.) 

We humans seem particularly adept at this kind of reversal of values, making what at first was a marker of accomplishment into the aim of our endeavors.  Money becomes the goal instead of a signifier of value, valuable only insofar as it enables access to the things needed for flourishing; in a similar fashion, the degree that was meant simply to signify the acquisition of valuable knowledge is now the goal of the pursuit, with the actual knowledge radically devalued. 

Our politicians have acted on this reversal of values.  Public higher education is now driven by the imperative to deliver as many degrees as possible for the least amount of public expenditure.  That the actual educational outcomes (measured in terms other than simply the number of degrees granted) are devastated by this approach doesn’t trouble them in the least, because they buy into the general contempt for the actual content of what gets taught in the college classroom.  That the credential (the degree) is divorced from actual competence or knowledge apparently doesn’t bother them either.  It’s all numbers driven, with no attention at all to quality.

When we reached this point in the conversation among four English professors (the youngest of whom was 70), we lamented that we had become the cranky oldsters we swore we would never become, spouting the all-too-predictable “it was so much better in our day.”  Another blogger I like, Kevin Drum, spends a lot of time debunking the notion that Americans, including young Americans, are worse off today than in years past. (Link to Drum’s blog: https://jabberwocking.com/) When one adjusts for inflation, housing costs, and other economic indicators (like wages), things in the United States have been fairly steady over the past 70 years. The key point is that economic inequality has increased.  The lower half has mostly held steady, while the upper 20% has taken all of the wealth generated by economic growth over that time span.  So the have-nots are not more destitute (they are even slightly better off), but they have to witness the excesses of a wealthy class that is much richer than its counterpart was in the 1950s and 60s.

I think, however, that Drum misses the fact that economic anxiety is much higher, even if that is mostly a reaction to nominal numbers.  Facing a monthly rent of $3000 feels more daunting even if that’s only about $350 in 1970 dollars.  The same goes for college tuition and student loan debt, especially since college costs have risen faster than the rate of inflation.

So the sheer sticker shock of college costs has to be seen as one factor in the disillusionment.  Despite generous aid packages, studies show that the price is off-putting for lower income students—precisely the students least likely to know how aid works.  Add to that the fact that most aid packages also include loans, and the upfront financial burdens and risks are daunting. 

I used to say there were only three things the world wanted to buy from the US: our Hollywood-centered entertainment, our weapons, and our higher education. I think that may still be true, but we sure seem determined to undermine two of the three, leaving only our heavily subsidized defense industry standing.  Withdrawal of government support for education (shifting the costs onto students) hurts the one, while corporate greed (screwing the writers, actors, and other workers) hurts the other.

It is a truism that the periods when the arts flourish are also the periods when a nation is most prosperous; think Elizabethan and Victorian England, 5th-century Athens, early 15th-century Florence.  The 1950s and 1960s may not have been such a golden age for artistic achievement, but they were a time of economic well-being.  And that fact seems to have generated the confidence that allowed a non-utilitarian ideal of a liberal arts education to flourish.  Yes, that ideal was a “gentlemanly” one, which meant it excluded women, non-whites, and large swathes of the working class.  But the GI Bill and the massive investment in public higher education during those years were the beginning of the opening up of that model of college to larger numbers.  The retreat from that ideal is not (as Kevin Drum’s work repeatedly demonstrates) the result of America being less prosperous in 2020 than it was in 1965.  Rather, it is the result of the fact that competition for a piece of that wealth has greatly increased.  An economy that produced general prosperity (again, with the important caveat that it excluded blacks from that prosperity) has been transformed into one where the gap between winners and losers has widened—and is ever present to every player in the field.  (Why do American workers not take their vacation time?  Because they are terrified that their absence will prove they are not essential—and so they will be laid off.)  The things that our society has decided it cannot “afford” are legion (health care for all; decent public transportation; paying competitive wages to keep teachers in the classroom).  Among those things is a college education that has only a tangential relation to a specific job because it aims to deliver other benefits, ones that can’t be easily or directly tied to a monetary outcome.