Joseph North Three:  Sensibility, Community, Institution

Now we reach the point in my discussion of Joseph North’s Literary Criticism: A Concise Political History (Harvard UP, 2017) where I mostly agree with him.  I am simply going to take up some of his key terms and goals and inflect them somewhat differently.  I think what I have to say runs parallel to North, not ever much meeting him on his chosen ground, but not running athwart his formulations either.

Here are three of North’s descriptions of his project.

The first comes from a footnote on Raymond Williams and features North’s “scholarship/criticism” divide.  “Of course, none of this is to say that Williams was not deeply committed to ‘practice’ in other fields of endeavor; I merely mean to observe that he understood his disciplinary work in scholarly terms, as cultural analysis, cultural history, and cultural theory, rather than understanding it in critical terms as the systematic cultivation of sensibility.  Naturally the two are not finally distinguishable, and any powerful work of scholarship moves readers to try on different ranges of sensibility, etc. etc. But the ‘practice’ of scholarship, conceived of as cultural analysis, is necessarily neither direct nor systematic in this respect” (pg. 233, fn. 18).

The second is notable for its raising the issue of institutions.  “I only want to add that the problem facing the discipline is not an entirely new one, for in a broad sense it is much the same problem that the critical revolution of the 1920s managed to solve: the problem of creating a true paradigm for criticism—the problem of how to build an institution that would cultivate new, deeper forms of subjectivity and collectivity in a rigorous and repeatable way” (126-27).

In the third passage, he faults the work of D. A. Miller and Eve Sedgwick for its “lack of any prospect of a true paradigm for criticism—the lack of any hope of putting together a paradigmatic way to use the literary directly to intervene in the social order” (173).  Two pages earlier, he describes what I think he means by “direct” intervention.  “My point is simply that it really does make a difference to the character of the work produced by an intellectual formation when those involved feel strongly their responsibility to the needs of a fairly well-defined larger formation beyond the academy—a larger formation defined not simply by its ‘identity’ but by its character as a living movement—which is to say, really, a formation defined by its always limited but nevertheless real ability to define itself by determining, collectively, the trajectory of its own development” (171).

I can’t resist, of course, registering where I disagree with these statements. I have already made clear my skepticism that there is a rigorous or systematic way to cultivate a sensibility.  I am also astounded that North does not recognize feminist literary criticism of the period from 1975 to 1995 as a paradigmatic case of academic work tied “to a fairly well-defined larger formation beyond the academy.”  And if Sedgwick’s relation to the gay liberation movement isn’t a similar instance, may the Lord help the rest of us.  And North’s repeated (as much a tic as his use of the term “rigorous”) use of the words “true” and “really” makes him appear more Stalinist than I think he really is.  Does he really intend to shut down the pluralism of intellectual work in favor of the one true path?  Re-education camps for the critics so that they get with the program—and are taught the methods of the new systematic pedagogy.  Surely, one of the delights of the aesthetic sensibility is its anarchism, its playfulness, its imaginative ingenuity, excesses, and unruliness. I suspect that “systematic,” “repeatable,” and “direct” aesthetic education would prove counter-productive in many cases.  At least, I hope it would—with teachers and students both summoning enough gumption to rebel against the indoctrination.

Finally, I want to quibble with his description of “direct” intervention.  Work that stands in support of, or proves useful to, “larger” social movements is not direct—at least not directly political.  Here’s Judith Butler in a 1988 essay offering a straightforward description of political acts: “Clearly, there are political acts which are deliberate and instrumental actions of political organization, resistant collective interventions with the broad aim of instating a more just set of social and political relations” (“Performative Acts and Gender Constitution,” Theatre Journal 523).  That cultivating an aesthetic sensibility might play a role in encouraging someone to join that social movement is not the direct political intervention that the movement attempts through quite different means and actions than what the critic does in the classroom or in her written work.  To confuse the two does no one any good—especially if it lets the teacher/critic deem herself sufficiently political as she advances her academic career.  The teacher/critic’s contribution is valuable, but it is also indirect.

Enough with the dissents. I completely agree with North that “sensibility” is the crucial concept for the “hearts and minds” side of politics.  Cultivating a leftist sensibility is necessary, although not sufficient, for creating the kind of society we leftists want to live in.  The caveats here are familiar.  There is no guaranteed path from an aesthetic sensibility to a leftist politics. [Let me also note that the practice of close reading is not the only, or even the royal, road to acquiring an aesthetic sensibility.  Lots of people got there other ways, which casts doubt on the “systematic” and “rigorous” pedagogy, and on the fetishizing of close reading.] For many aesthetes (Nietzsche, Yeats, and Pound among them), the vulgarity and bad taste of the masses drives them to anti-democratic, autocratic visions of strong, masterful leaders of the herd.  For others (Wordsworth and Coleridge, for example), reverence for genius promotes a kind of over-all piety that leads to a quietist respect for everything that is, investing tradition and the customary forms of life with a sacred aura it is impious to question or change.  (This is the aesthetic version—articulated by T. S. Eliot as well—of Edmund Burke’s conservatism.)

But the larger point—and now we are with David Hume and William James in contrast to Kant—is that our political ideas and principles (and our ethical ones as well) are the products of our sensibility.  It is the moral passions and our moral intuitions that generate our political commitments.  James (in the first lecture of Pragmatism) talks of “temperament”—and throughout his work (from The Principles of Psychology onwards) insists that our stated reasons for doing something are always secondary; there was the will to do something first, then the search for justifying reasons.  Indignation at the injustice of others (or of social arrangements) and shame at one’s own acts of selfishness are more secure grounds for conduct than a rationally derived categorical imperative.

James seems to think of temperament as innate, as fated from birth.  North’s point is that education—a sentimental education—can shape sensibility.  I agree.  My daughter was in college at George Washington University when Osama bin Laden was killed.  Her classmates rushed over to the White House (three blocks away) to celebrate when the news was heard.  She told my wife and me that she didn’t join the celebration.  It just felt wrong to her to dance in the streets about killing someone.  Her parents’ reaction was that her Friends School education had just proved itself.

Sensibility is akin to taste.  The leftist today finds it distasteful, an offense to her sense of how things should be, to live in Trump’s America.  I will use my next post to describe the sensibility of the right in that America.  But for the left, there is outrage at the caging of immigrant children, and at the bigotry that extends to non-whites, women, non-Christians and beyond.  Fundamentally, it is the shame of living in such a needlessly cruel society, with its thousands of homeless and millions of uninsured.

I don’t know exactly how a specifically “aesthetic” sensibility lines up with this leftist sensibility.  And as I have said, there is certainly no sure path from one to the other.  But I am willing to believe (maybe because it is true at least for myself) that the aesthetic stands at odds with commercial culture, attending to values and experiences that are “discounted” (in every sense of that word) in the dominant culture.  Being placed at odds, in a spot where the taken-for-granteds of one’s society are made somewhat less self-evident, has its effect.  If what one has come to appreciate, even to love, is scorned by others, new modes of reckoning (again in every sense of the word), and new allegiances (structure of feeling) may beckon.

Here is where Hume is preferable to James.  Hume (Dewey and Mead follow Hume  here in a way the more individualistic James does not) portrays sensibility as shaped through our communal relations and as reinforced by those same relations.  In other words, even non-conformity is social.  It is extremely difficult, perhaps impossible (akin to the impossibility of a “private language” in the Wittgenstein argument) to be a solitary “enemy of the people.”  There must be resources—from the tradition, from received works of art, criticism, and cultural analysis, from a cohort—on which one can draw to sustain the feeling that something is wrong in the dominant order.

Education, in other words, can play a major role in shaping sensibility—and the community the school offers is as crucial as the educational content.  Young people discover the courage of their convictions when they find others who feel the same way, who have the same inchoate intuitions that school (in both its formal and informal interactions) is helping them to articulate.  Hence the encouragement of teachers (yes, you are on the right path; keep going; keep probing; keep questioning; trust your instincts) and of peers (those famous all-night bull sessions after our student finds her simpaticos).

Communities are, famously, ephemeral.  We can idealize them (as arguably Hannah Arendt does in her definition of “the political”—a definition that seems to exclude everything except the excited talk among equals from the political sphere).  Societies are corrupt, impersonal, hierarchical, mechanical, not face-to-face.  Communities are “known” (as Raymond Williams phrased it), informal and intimate.  A familiar narrative of “modernity” sees communities as overwhelmed by society, by the depredations of capitalism, war, and the ever-expanding state. (Tönnies)

This romanticism does not serve the left well.  Communities are not sustainable in the absence of institutions.  And they certainly cannot withstand the pressures of power, of the large forces of capitalism and the state, without institutional homes.  There must (quite literally) be places for the community to gather and resources for its maintenance.  Make no mistake about it: neo-liberalism has deliberately and methodically set out to destroy the institutions that have sustained the left (while building its own infrastructure—chambers of commerce, business lobbying groups, the infamous think tanks—that provides careers for the cadre of right-wing hacks).  Unions, of course, first and foremost.  When did we last have a union leader who was recognized as a spokesperson for America’s workers?  But there has also been the absorption of the “associations” that Tocqueville famously saw as the hallmark of American democracy into the services of the state.  Outsourced welfare functions are now the responsibility of clinics first created by the feminist and gay liberation movements to serve the needs of their communities.  Financial stability has been secured at the price of no longer being experienced as embedded members of the community; now those organizations are purveyors of services begrudgingly offered by a bureaucratic state that always puts obstacles in the way of accessing those benefits.

North is right to see that the neoliberal attack on institutions extends to the university.  The aesthetic sensibility (since at least 1960) has been bunkered in the university, having failed to sustain the few other institutional structures (little magazines, the literary reviews it inherited from the 19th century) that existed in the early 20th century.  Reading groups are well and good (they are thriving and I hardly want to belittle them), but have no institutional weight or home.  Humanities departments are about it, except for the arts scene (again, mostly woefully under-institutionalized) in some major cities.

So there is every reason to fight hard to keep the humanities as an integral part of the university.  I personally don’t think taking the disciplinary route is the way to fight this fight—but maybe I am wrong.  Maybe only claims to disciplinary specificity and expertise can gain us a spot.

More crucially, I think North is absolutely right to believe that our efforts as critics are doomed to political ineffectiveness if not tied to vibrant social movements.

[For the record, here is where I think North’s criticism/scholarship divide really doesn’t work.  Efforts along both lines can prove supportive or not to social movements.  It is the content, not the form, of the work that matters.  And I also think work that is apolitical is perfectly OK.  It is tyrannical—a mirror image of the absurd regimes of “productivity” that afflict both capitalism and the research university—to insist that everything one does contribute to the political cause.  Life is made worth living, in many instances, by things that are unproductive, are useless.]

The problem of the contemporary left is, precisely, the absence of such social movements.  The civil rights movement had the black churches, and then the proliferation of organizations: SNCC, CORE, SCLC, along with the venerable NAACP, and A. Philip Randolph’s labor organization.  It sustained itself over a very long time.  The feminist movement had its clinics, and NOW.  The anti-war movement had A. J. Muste and David Dellinger, long-time veterans of peace groups.  The Democratic Party is obviously no good unless (except when) it is pushed by groups formed outside the party, groups that act on their own without taking instructions from the party. The Bernie Sanders insurrection will only reshape the Democratic Party when it establishes itself as an independent power outside the party–with which the party then needs to come to terms.

The trouble with Black Lives Matter, #MeToo, and Occupy is that they all have resisted or failed (I don’t know which) to establish any kind of institutional base.  Each of these movements has identified a mass of people who share certain experiences and a certain sensibility.  They have, in other words, called into presence (albeit mostly virtually—except for Occupy) a community.  That discovery of other like souls is comforting, reassuring, even empowering.  I am not alone.  But to be politically effective, these movements need legs.  They need to be sustained, in it for the long haul.  And that requires institutions: money, functionaries, offices, continuing pressure at the sites deemed appropriate (for strategic reasons) for intervention.

In short (and now I am the one who is going to sound like a thirties Marxist), the left needs to make the long march through the institutions—a march begun by creating some institutions of its own on the outside to prepare it for the infiltration of the institutions on the inside.  That’s what the right has been doing for the past forty years.  While the left was marching in the street on the weekends with their friends, the right was getting elected to school boards.  Protest marches feel great, but are ephemeral, easily ignored.  Our society’s shift rightwards has come through a million incremental changes wrought on the ground by somebody in an office somewhere, by right wing hacks and business lobbyists writing legislation, by regulators letting oversight lapse, by prosecutors and courts looking the other way at white collar and corporate crime. During the Obama years, the left paid almost no attention to state-level races, ceding those legislatures to the right almost by default–with grievous consequences (not the least of which is a weak bench, unable to provide any potential national candidates between the ages of 45 and 65).

We need leftist social movements that pay attention to the minutiae, that are not addicted to the large dramatic gesture, that don’t engage in the magical thinking that a piece of legislation or a court decision solves a problem once and for all.  It’s the implementation, the daily practices of state, corporate, educational, regulatory institutions (as Foucault should have taught us) where change takes place, in often silent and difficult to perceive ways.  That’s the room where it happens—and the left has all too often failed to even try to get into the room.

Joseph North Two—Rigor and Memory (Oh My!)

It must be something in the water in New Haven.  North deploys the term “rigor” as frequently as Paul de Man, with whom he has just about nothing else in common.  I will just offer two instances.  The first comes from his closing exhortation to his readers “to secure a viable site within the social order from which to work at criticism in the genuinely oppositional sense” (211).  Success in this effort would require “a clear and coherent research program together with a rigorous new pedagogy, both of which, I think, would need to be founded on an intellectual synthesis that addressed the various concerns of the major countercurrents in a systematic and unitary way” (211).

In the Appendix, the issue is described in this way:  “How does one pursue the tenuous task of cultivating an appreciation for the aesthetic without lapsing into mere impressionism?  How does one pursue this task with a rigor sufficient to qualify one’s work as disciplinary in the scientific terms recognized by the modern university?” (217).

[A digression: nothing in the book suggests that North takes an oppositional stance toward the “modern university”—or to its notions of what constitutes a discipline, what “counts as” knowledge, or its measures of productivity.  Rather, he is striving to secure the place of literary studies within that university in order to pursue an oppositional, “radical” (another favorite word, one always poised against “liberal”) program toward modern, capitalist society.]

Rigor, as far as I am concerned, is a half step away from rigor mortis.  When I think of brilliant instances of close reading, rigor is just about the last word that comes to mind.  Supple, lively, surprising, imaginative, even fanciful.  In short, a great close reading quickens.  It brings its subject to life; it opens up, it illuminates.  The associative leaps, the tentative speculations, the pushing of an intuition a little too far.  Those are the hallmarks of the kind of close reading that energizes and inspires its readers.  What that reader catches is how the subject at hand energized and inspired the critic.

Similarly, a rigorous pedagogy would, it seems to me, be the quickest way to kill an aesthetic sensibility.  The joyless and the aesthetic ne’er should meet.

Not surprisingly, I have a similar antipathy to “method.”  Close reading is not a method.  To explain why not is going to take a little time, but our guide here is Kant, who has wise and very important things to say about this very topic in his Critique of Judgment.  Spending some time with Kant will help clarify what it is the aesthetic can and can’t do.

But let’s begin with some mundane contrasts.  The cook at home following a recipe.  The lab student performing an experiment.  The young pianist learning to play a Beethoven sonata.  The grad student in English learning to do close readings.  Begin by thinking of the constraints under which each acts—and the results for which each aims.  The cook and the lab student want to replicate the results to which the given instructions should lead.  True, as cooks and lab students become more proficient practitioners, they will develop a “feel” for the activity that allows them to nudge it in various ways that will improve the outcomes.  The map (the instructions) is not a completely unambiguous and fully articulated guide to the territory.  But it does provide a very definite path—and the goal is to get to the destination that has been indicated at the outset.  Surprises are almost all nasty in this endeavor.  You want the cake to rise, you want the experiment to land in the realm of replicable results.

The pianist’s case is somewhat different, although not dramatically so.  In all three cases so far, you can’t learn by simply reading the recipe, the instructions, the musical score.  You must actually do the activity, walk the walk, practice the practice.  There is more scope (I think, but maybe I am wrong) for interpretation, for personal deviance, in playing the Beethoven.  But there is limited room for “play” (using “play” in the sense “a space in which something, as a part of a mechanism, can move” and “freedom of movement within a space”—definitions 14 and 15 in my Random House dictionary.)  Wander too far off course and you are no longer playing that Beethoven sonata.

Now let’s consider our grad student in English.  What instructions do we give her?  The Henry James dictum: “be someone on whom nothing is lost”?  Or the more direct admonition: “Pay attention!”?  Where do you even tell the student to begin?  It is not simply a case of (shades of Julie Andrews) beginning at the beginning, a very good place to start, since a reading of a Shakespeare sonnet might very well begin with an image in the seventh line.  In short, what’s the recipe, what’s the method?  Especially since the last thing we want is an outcome that was dictated from the outset, that was the predictable result of our instructions.

Kant is wonderful on this very set of conundrums.  So now let’s remind ourselves of what he has to say on this score.  We are dealing, he tells us, with two very different types of judgment, determinative and reflective.  Determinative judgments guide our practice according to pre-set rules.  With the recipe in hand and a desire to bake a cake, my actions are guided by the rules set down for me.  Beat the batter until silky smooth (etc.)—and judgment comes in only because I have to make the call as to when the batter is silky smooth.  In reflective judgment, however, the rule is not given in advance.  I discover the rule through the practice; the practice is not guided by the rule.

Kant’s example, of course, is the beautiful in art.  Speaking to the artist, he says: “You cannot create a beautiful work by following a rule.”  To do so, would be to produce an imitative, dispirited, inert, dead thing.  It would be, in a word, “academic.”  Think of all those deadly readings of literary texts produced by “applying” a theory to the text.  That’s academic—and precisely against the very spirit of the enterprise.

Here’s a long selection of passages from Kant’s Third Critique that put the relevant claims on the table.  We can take Kant’s use of the term “genius” with a grain of salt, translating it into the more modest terms we are more comfortable with these days.  For genius, think “someone with a displayed talent for imaginative close readings.”

Kant (from Sections 46 and 49 of the third Critique):  “(1) Genius is a talent for producing something for which no determinative rule can be given, not a predisposition consisting of a skill for something that can be learned by following some rule or other; hence the foremost property of genius must be originality. (2) Since nonsense too can be original, the products of genius must also be models, i.e. they must be exemplary; hence, though they do not themselves arise through imitation, still they must serve others for this, i.e. as a standard or rule by which to judge. (3) Genius itself cannot describe or indicate scientifically how it brings about its products . . . . That is why, if an author owes his product to his genius, he himself does not know how he came by the ideas for it; nor is it in his power to devise such products at his pleasure, or by following a plan, and to communicate his procedure to others in precepts that would enable them to bring about like products” (Section 46).

“These presuppositions being given, genius is the exemplary originality of a subject’s natural endowment in the free use of his cognitive powers.  Accordingly, the product of a genius (as regards what is attributable to genius in it rather than to possible learning or academic instruction) is an example that is meant not to be imitated, but to be followed by another genius.  (For in mere imitation the element of genius in the work—what constitutes its spirit—would be lost.)  The other genius, who follows the example, is aroused to a feeling of his own originality, which allows him to exercise in art his freedom from the constraint of rules, and to do so in such a way that art acquires a new rule by this, thus showing that talent is exemplary” (Section 49).

Arendt on Kant’s Third Critique.  Cavell on It Happened One Night.  Sedgwick on Billy Budd.  Sianne Ngai on “I Love Lucy.”  I defy anyone to extract a “method” from examining (performing an autopsy on?) these four examples of close reading.  Another oddity of North’s book is that for all his harping on the method of close reading, he offers not a single shout-out to a critic whose close readings he admires.  It is almost as if the attachment to “method” necessitates the suppression of examples—precisely because a pedagogy via examples is an alternative to the systematic, rigorous, and methodical pedagogy he wants to recommend.

But surely Kant is right.  First of all, right on the practical grounds that our student learns how to “do” close reading by immersion in various examples of the practice, not by learning a set of rules or “a” method.  Practice makes all the difference in this case; doing it again and again in an effort to reach that giddy moment of freedom, when the imagination, stirred by the examples and by the object of scrutiny, takes flight.  Surely “close reading” is an art, not a science.

And there, in the second and more important place, is where Kant is surely right.  If the very goal is to cultivate an aesthetic sensibility, how could we think that the modes of scientific practice, with its vaunted method and its bias toward replicable and predictable results, would serve our needs?  The game is worth the candle precisely because the aesthetic offers that space of freedom, of imaginative play, of unpredictable originality.  If the aesthetic stands in some kind of salutary opposition to the dominant ethos of neoliberalism, doesn’t that opposition rest on its offer of freedom, of the non-standard, of the unruly, of non-productive imaginings?  Why, in other words, is the aesthetic a threat to, and a respite from, the relentless search for returns on investment and the incessant demand that each and every one of us get with the program?  That they hate us is a badge of honor; to become systematic seems a bid to join the “rationalized” world of the economic.  [Side note: here is where critique cannot be abandoned.  We must keep pounding away at the quite literal insanity, the irrationality, of the market and all its promoters.  But the aesthetic should, alongside critique, continue to provide examples of living otherwise, of embodying that freedom of imagination.]

Kant, of course, famously resists the idea that lack of method, praise of an originality that gives the rule to itself, means that anything goes.  Genius is to be disciplined by taste, he writes.  We judge the products produced by the would-be genius—and deem some good examples and others not so good.  I am, in fact, very interested in the form that discipline takes in Kant, although this post is already way too long so I won’t pursue that tangent here.  Suffice it to say two things:

1. The standard of taste connects directly to Kant’s fervent desire for “universal communicability.”  He fears an originality so eccentric that it places the genius outside of the human community altogether.  If genius is originality, taste is communal (the sensus communis)—and Kant is deeply committed to the role art plays in creating and sustaining community.  The artist should, even as she pursues her original vision, also have the audience in mind, and consider how she must shape her vision in order to make it accessible to that audience.  So we can judge our students’ attempts to produce close readings in terms of how they “speak” to the community, to the audience.  Do they generate, for the reader, that sense that the text (or film or TV show) in question has been illuminated in exciting and enlivening ways?  There is an “a-ha” moment here that is just about impossible to characterize in any more precise–or rigorous–way.

2. Taste, like genius, is a term that mostly embarrasses us nowadays.  It smacks too much of 18th-century ancien régime aristocrats.  But is “aesthetic sensibility” really very different from “taste”?  Both require cultivating; both serve as an intuitional ground for judgments.  In my next post—where I take up the question of sensibility—I want to consider this connection further.

But, for now, a few words more about “close reading.”  Just because there is no method to offer does not mean we cannot describe some of the characteristics of close reading.  I think, in fact, we can call close readings examples of “associative thinking.”  A close reading (often, though hardly always) associates disparate things—or dissociates things that we habitually pair together or consider aligned.  So Arendt shows us how Kant’s third Critique illuminates the nature of the political; Cavell enriches a meditation on finitude through an engagement with It Happened One Night; Sedgwick’s reading of Billy Budd illustrates how homosexuality is both acknowledged and denied; Ngai associates a situation comedy with the nature of precarious employment.  In each case, there is an unexpected—and illuminating, even revelatory—crossing of boundaries.  Surprising juxtapositions (metonymy) and unexpected similarities where before we only saw differences (metaphor).  Which takes us all the way back to Aristotle’s comment “that the metaphorical kind [of naming] is the most important by far.  This alone (a) cannot be acquired from someone else, and (b) is an indication of genius” [that word again!] (Section 22 of the Poetics).  There is no direct way to teach someone how to make those border crossings.

How is this all related to judgment?  Both to Aristotle’s phronesis (sometimes translated as “practical wisdom”) and to Kantian judgment.  (Recall that morality for Kant is too important to leave to judgment of the reflective sort; he wants a foolproof method for making moral judgments.  Aristotle is much more willing to see phronesis at work in both ethics and aesthetics.)  We get wrong-footed, I think, when we tie judgment to declaring this work of art beautiful or not, this human action good or evil.  Yes, we do make such judgments.

But there is another site of judgment, the one where we judge (or name) what situation confronts us.  Here I am in this time and place; what is it that I am exactly facing?  Here is where associative thinking plays its role.  How is this situation analogous to other situations I know about—either from my own past experiences or from the stories and lessons I have imbibed from my culture?  How I judge the situation, how I name it, determines what I deem it possible to make of it.  Creative action stems from imaginative judgments, from seeing in this situation possibilities not usually perceived.

That’s the link of judgment to the aesthetic: the imaginative leaps that, without the conformist safety net of a rule or method, lead to new paths of action.  If we (as teachers in the broad field of aesthetics) aim to cultivate an aesthetic sensibility, it is (I believe) to foster this propensity in our students for originality, for genius—in a world where conformity (the terror of being unemployable, of paying the stiff economic price of not following the indicated paths) rules.  Judgment, like metaphorical thinking, is an art, not a science—and cannot be taught directly, but only through examples.  It’s messy and uncertain (expect lots of mistakes, lots of failed leaps).  And it will exist in tension with “the ordinary”—and, thus, will have to struggle to find bridges back to the community, to the others who are baffled by the alternative paths, the novel associations, you are trying to indicate.

Joseph North (One)

One of the oddities of Joseph North’s Literary Criticism: A Concise Political History (Harvard UP, 2017) is that it practices what it preaches against.  North believes that the historicist turn of the 1980s was a mistake, yet his own “history” is very precisely historicist: he aims to tie that “turn” in literary criticism to a larger narrative about neo-liberalism.

In fact, North subscribes to a fairly “vulgar,” fairly simplistic version of social determinism.  His periodization of literary criticism offers us “an early period between the wars in which the possibility of something like a break with liberalism, and a genuine move to radicalism, is mooted and then disarmed,” followed by “a period of relative continuity through the mid-century, with the two paradigms of ‘criticism’ and ‘scholarship’ both serving real superstructural functions within Keynesianism.”  And, finally, when the “Keynesian period enters into a crisis in the 1970s . . . we see the establishment of a new era: the unprecedentedly complete dominance of the ‘scholar’ model in the form of the historicist/contextualist paradigm.”  North concludes this quick survey of the “base” determinants of literary critical practice with a rhetorical question:  “If this congruence comes as something of a surprise, it is also quite unsurprising: what would one expect to find except that the history of the discipline marches more or less in step with the underlying transformations of the social order?” (17).

Perhaps I missed something, but I really didn’t catch where North made his assertions about the two periods past the 1930s stick.  How do both the “critical” and “scholarly” paradigms serve Keynesianism?  I can see where the growth of state-funded higher education after World War II is a feature of Keynesianism.  But surely the emerging model (in the 50s and 60s) of the “research university” has as much, if not more, to do with the Cold War than with Keynesian economic policy.

But when it gets down to specifics about different paradigms of practice within literary criticism, I fail to see the connection.  Yes, literary criticism got dragged into a “production” model (publish or perish) that fits it rather poorly, but why or how did different types of production, so long as they found their way into print, “count” until the more intense professionalization of the 1970s, when “peer-reviewed” became the only coin of the realm?  The new emphasis on “scholarship” (about which North is absolutely right) was central to that professionalization—and does seem directly connected to the end of the post-war economic expansion.  But that doesn’t explain why “professionalization” should take an historicist form, just as I am still puzzled as to how both forms—critical and scholarly—“serve” Keynesian needs prior to 1970.

However, my main goal in this post is not to try to parse out the base/superstructure relationship that North appears committed to.  I have another object in view: why does he avoid the fairly obvious question of how his own position (one he sees as foreshadowed, seen in a glass darkly, by Isobel Armstrong among others) reflects (is determined by?) our own historical moment?  What has changed in the base to make this questioning of the historicist paradigm possible now?  North goes idealistic at this point, discussing “intimations” that appear driven by dissatisfactions felt by particular practitioners.  The social order drops out of the picture.

Let’s go back to fundamentals.  I am tempted to paraphrase Ruskin: for every hundred people who talk of capitalism, one actually understands it.  I am guided by the sociologist Alvin Gouldner, in this case his short 1979 book The Rise of the New Class and the Future of the Intellectuals (Oxford UP), a book that has been a touchstone for me ever since I read it in the early 1980s.  Gouldner offers this definition of capital: anything that can command an income in the mixed market/state economy in which we in the West (at least) live.  Deceptively simple, but incredibly useful as a heuristic.  Money that you spend to buy food you then eat is not capital; that money does not bring a financial return.  It does bring a material return, but not a financial one.  Money that you (as a food distributor) spend to buy food that you will then sell to supermarkets is capital.  And the food you sell becomes a commodity—while the food you eat is not a commodity.  Capital often passes through the commodity form in order to garner its financial return.

But keep your eye on “what commands an income.”  For Marx, of course, the wage earner only has her “labor power” to secure an income.  And labor power is cheap because there is so much of it available.  So there is a big incentive for those who only have their labor power to discover a way to make it more scarce.  Enter the professions.  The professional relies on selling the fact that she possesses an expertise that others lack.  That expertise is her “value added.”  It justifies the larger income that she secures for herself.

Literary critics became English professors in the post-war expansion of the research university.  We can take William Empson and Kenneth Burke as examples of the pre-1950s literary critic, living by their wits, and writing in a dizzying array of modes (poetry, commissioned “reports,” reviews, books, polemics).  But the research university gave critics “a local habitation [the university] and a name” [English professors] and, “like the dyer’s hand,” their nature was subdued.  The steady progress toward professionalization was begun, with a huge leap forward when the “job market” tightened in the 1970s.

So what’s new in the 2010s?  The “discipline” itself is under fire.  “English,” as Gerald Graff and Peter Elbow both marveled years ago, was long the most required school subject, from kindergarten through the second year of college.  Its place in our educational institutions appeared secure, unassailable.  There would always be a need for English teachers.  That assumed truth no longer holds.  Internally, interdisciplinarity, writing across the curriculum, and other innovations threatened the hegemony of the discipline.  Externally, the right wing’s concerted attack on an ideologically suspect set of “tenured radicals,” along with a more general discounting (even elimination) of the value assigned to being “cultured,” meant the “requirement” of English was questioned.

North describes this shift in these terms:  “if the last three decades have taught literary studies anything about its relationship to the capitalist state, it is that the capitalist state does not want us around.  Under a Keynesian funding regime, it was possible to think that literary study was being supported because it served an important legitimating role in the maintenance of liberal capitalist institutions. . . . the dominant forms of legitimation are now elsewhere” (85).  True enough, although I would still like to see how that “legitimating role” worked prior to 1970; I would think institutional inertia rather than some effective or needed legitimating role was the most important factor.

In that context, the upsurge in the past five years (as the effects of 2008 on the landscape of higher education registered) of defenses of “the” discipline makes sense.  North—with his constant refrain of “rigor” and “method”—is working overtime to claim a distinctive identity for the discipline (accept no pale or inferior imitations!).  This man has a used discipline to sell you.  (It is unclear, to say the least, how a return to “criticism,” only this time with rigor, improves our standing in the eyes of the contemporary “capitalist state.”  Why should it want North’s re-formed discipline around any more than the current version?)

North appears blind to the fact that a discipline is a commodity within the institution that is higher education.  The commodity he has to sell has lost significant amounts of value over the past ten years within the institution, for reasons both external and internal.  A market correction?  Perhaps—but only perhaps, because (as with all stock markets) we have no place to stand if we are trying to discover the “true” value of the commodity in question.

So what is North’s case that we should value the discipline of literary criticism more highly?  He doesn’t address the external factors at all, but resets the internal case by basing the distinctiveness of literary criticism on fairly traditional grounds: it has a distinct method (“close reading”) and a distinct object (“rich” literary and aesthetic texts).  To wit:  “what [do] we really mean by ‘close reading’ beyond paying attention to small units of any kind of text?  Our questions must then be of the order: what range of capabilities and sensitivities is the reading practice being used to cultivate?  What kinds of texts are most suited to cultivating those ranges? Putting the issue naively, it seems to me that the method of close reading cannot serve as a justification for disciplinary literary study until the discipline is able to show that there is something about literary texts that make them especially rewarding training grounds for the kinds of aptitudes the discipline is claiming to train.  Here again the rejected category of the aesthetic proves indispensable, for of course literary and other aesthetic texts are particularly rich training grounds for all sorts of capabilities and sensitivities: aesthetic capabilities” (108-9; italics in original).

I will have more to say about “the method of close reading” in my next post.  For now, I just want to point out that it is absurd to think “close reading” is confined to literary studies–and North shows himself aware of that fact as he retreats fairly quickly from the “method” to the “objects” (texts).  Just about any practitioner in any field to whom the details matter is a close reader.  When my son became an archaeology major, my first thought was: “that will come to an end when he encounters pottery shards.” Sure enough, he had a brilliant professor who lived and breathed pottery shards—and who, even better yet, could make them talk.  My son realized he wasn’t enthralled enough with pottery shards to give them that kind of attention—and decided not to go to grad school.  Instead, my son realized that where he cared about details to that extent, where no fine point was too trivial to be ignored, was the theater—and thus he became an actor and a director.  To someone who finds a particular field meaningful, all the details speak.  Ask any lawyer, lab scientist, or gardener.  They are all close readers.

This argument I have just made suggests, as a corollary, that all phenomena are “rich” to those inspired by them.  Great teachers are, among other things, those who can transmit that enthusiasm, that deep attentive interest, to others.  If training in attention to detail is what literary studies does, it has no corner on that market.  Immersion in just about any discipline will have similar effects.  And there is no reason to believe the literary critics’ objects are “richer” than the archaeologists’ pottery shards.

In short, if we go the “competencies” route, then it will be difficult to make the case that literary studies is a privileged route to close attention to detail—or even to that other chestnut, “critical thinking.” (To North’s credit, he doesn’t play the critical thinking card.)  Most disciplines are self-reflective; they engage in their own version of what John Rawls called “reflective equilibrium,” moving back and forth between received paradigms of analysis and their encounter with the objects of their study.

North is not, in fact, very invested in “saving” literary studies by arguing that it belongs in the university because it imparts a certain set of skills or competencies that can’t be transmitted otherwise.  Instead, he places almost all his chips on the “aesthetic.”  What literary studies does, unlike all the rest, is initiate the student into “all sorts of capabilities and sensitivities” that can be categorized as “aesthetic capabilities.”

Now we are down to brass tacks.  What we need to know is this: what distinguishes “aesthetic capabilities” from other kinds of capabilities?  And why should we value those aesthetic capabilities?   On the first score, North has shockingly little to say—and he apologizes for this failure.  “I ought perhaps to read into the record, at points like this, how very merely gestural these gestures [toward the nature of the aesthetic] have been; the real task of developing claims of this sort is of course philosophical and methodological rather than historical, and thus has seemed to me to belong to a different book” (109; italics in original).

Which leaves us with his claims about what the aesthetic is good for.  Why should we value an aesthetic sensibility?  The short answer is that this sensibility gives us a place to stand in opposition to commercial culture.  He wants to place literary criticism at the service of radical politics—and heaps scorn throughout on liberals, neo-liberals, and misguided soi-disant radicals (i.e. the historicist critics who thought they were striking a blow against the empire).  I want to dive into this whole vein in his book in subsequent posts.  Readers of this blog will know I am deeply sympathetic to the focus on “sensibility” and North helps me think again about what appeals to (and the training of) sensibilities could entail.

But for now I will end with registering a certain amazement, or maybe it is just a perplexity.  How will it serve the discipline’s tenuous place in the contemporary university to announce that its value lies in the fact that it comes to bury you?  Usually rebels prefer to work in a more clandestine manner.  Which is to ask (more pointedly): how does assuming rebellious stances, in an endless game in which each player tries to position himself to the left of all the other players, bring palpable rewards within the discipline even as it endangers the position of the discipline in the larger struggle for resources, students, and respect within the contemporary university? That’s a contradiction whose relation to the dominant neo-liberal order is beyond my abilities to parse.

Response to Michael Clune’s “Judgment and Equality”

Headnote: I was scheduled to present at the American Comparative Literature Association meeting in Chicago on March 20th.  Obviously, the meeting got cancelled.  The session was on “Aesthetic Education” and the panel members were all asked to read Joseph North’s recent book Literary Criticism: A Concise Political History (Harvard UP, 2017) and an essay by Michael Clune entitled “Judgment and Equality” (Critical Inquiry, 2018).  After reading the Clune essay, I was moved to write the response posted below.  I think it is fairly self-explanatory, even if you haven’t read the Clune essay.  After writing this response, I discovered that Clune had offered a shorter version of his plea for the authority of experts (and polemic against equality in matters of judgment) in a Chronicle of Higher Education piece that generated a fair amount of hostile response.  (You can easily find these pieces online by googling Clune’s name.)  In particular, the hostility came from the fact that the conservative New York Times pundit, Ross Douthat, wrote favorably about Clune’s position on the op-ed page of the Times.  Doubtless, Clune was chagrined to see his argument, which he thought was radically leftist, embraced by a right-wing writer.  But I don’t know that he should have been particularly surprised; to question–or to think about limiting–the claims of democratic equality is always going to play to the right’s fundamental commitment to reining in equality and democracy wherever they rear their dangerous heads.  In any case, it is to the anti-democratic implications of Clune’s argument that my piece responds.  I will post some thoughts on North’s book in the next few days.

 

In November 2008, a week after the election of Barack Obama to the presidency, I was in a New York City room full of bankers and hedge fund managers, leading a discussion on the implications of that election.  The financiers were horrified; they earnestly told the gathering that Obama and a Democratic Congress, led by Nancy Pelosi, were know-nothings who, through their ignorant meddling, were about to ruin American economic prosperity.  These men—and of course they were all men—were completely unshaken in their conviction of their competence even following the financial collapse of the previous month.  A portrait of expertise in action, offering a strong case for why the rule of experts must be tempered by the oversight of the demos.  Every profession is a conspiracy against the laity, George Bernard Shaw famously warned us.

Democracy means many things, but one of its many entailments is that elites must subject themselves to the judgment of the masses.  As experts we can deplore the ignorance of the non-initiated, but in a democracy authority is not to be had as a gift but must be earned.  Democracy is a supremely rhetorical political form.  Anyone, including the expert, who has a position they want the polity to act upon must convince a majority of her fellow citizens to endorse that policy.  Persuasion is the name of the game; and saying it again, just louder this time and standing on my credentials as an expert, is not a very effective rhetorical move.  There is a deep anti-authoritarian bias in the demos—and we should celebrate that fact.  Democracy, as Winston Churchill said, has some very obvious flaws, but it sure beats all the alternatives.

The right has eaten the left’s lunch for some forty years now.  We people of the left can scream that it hasn’t been a fair fight, but that still doesn’t provide any justification for retreating from the democratic arena into a petulant insistence on our being correct and the misled masses being wrong.  The technocracy of the EU may be somewhat preferable to the plutocracy of the US, but the “democratic deficit” is real in both cases.  Maybe democracy is always a battle between elites for endorsement from the general populace.  If that is the case, and if violence is not considered a viable or desirable alternative, then the rhetorical battle for the hearts and minds of the people is where all the action is.  It makes no sense in such a battle to begin by maligning the judgment of those people.  Depending on the capacity of the people to judge for themselves is the foundational moment of faith in a democratic society.  Yes, as Clune reminds us, Karl Marx refuses to make that leap of faith.  Do we really want to follow Marx down that anti-democratic path?

Marx, after all, also warns us that every ruling elite indulges itself with the sweet conviction that it acts in the interests of all.  We, those businessmen I spent the evening with told themselves, are the “universal class” because we bring the blessings of economic plenty to all.  In their utter belief in their own goodness, I saw a mirror image of myself and my leftist friends.  If we don’t for a moment want bankers to avoid accountability to the people they claim to serve, why would we think we deserve an exemption?  Listen to your academic colleagues rant about the vocabulary of assessment and outcomes when applied to what happens in the classroom—and you will hear an echo of what I listened to that night in New York.  Who dares to question the effectiveness of what transpires on our college campuses?

Kenneth Burke picked up the term “professional deformation” from John Dewey.  He used it to highlight the blindness that accompanies immersion in a discipline.  I think Clune is right to present judgment as emerging from the practices and institutions of a discipline. (“[T]o show someone the grounds of a given judgment is to educate them in the field’s characteristic practices,” he writes [918].)  The oddity of his position, it seems to me, is that he takes this Kuhnian point as a reason to enhance our faith in the judgments of those encased in a paradigm.  That strikes me as a very odd reading of Kuhn, taking his book as a celebration of “normal science” instead of a meditation on the difficulty of intellectual revolution because of the blinders normal science imposes.  It is only a bit exaggerated, in my view, to see Kuhn as telling us that textbooks devour their readers and turn them into mindless conformists. Yes, Clune nods to the fact that communities of practitioners “can and do manifest bias and thus serve as sites of oppression” (918), but he seems to think acknowledgment of that fact is enough to render it harmless, appealing to an unspecified “broad range of measures” (919) that can compensate for the potential oppressions.  But I read Kuhn as suggesting that it is precisely the young, the uninitiated, the outsiders (in other words, those who are least embedded in the community of practice, or even non-members of it), who are most likely to disturb its complacency, its confidence in its judgments and its blindness to its biases and oppressions.  Let’s remember Foucault’s lessons about the power of disciplines.  All concentrations of power are to be distrusted, which is another reason (besides a discipline’s in-built blind spots) to advocate for the subjection of expert judgments to external review—and not simply external review by other members of the community in question.  
I am a firm believer in the 80/20 rule; spend 80% of your effort in mastering your discipline; spend 20% of your time in wide-ranging reading and activities that are completely unrelated to that discipline.  And then use that 20% to break open your discipline’s inbreeding.

I am fully sympathetic with Clune’s desire to find in aesthetics an alternative to the norms and values of commercial society.  And that position does seem to entail a commitment to aesthetic education as the site where that alternative can be experienced and embraced.  I also believe that the democratic commitment to the people’s right to judge the prescriptions and advice of the experts does make the need for an educated citizenry a priority for our schools and universities.  The liberal arts curriculum should be aimed at making citizens more competent judges.  It is a strong indication of the right wing’s rhetorical triumph with a section of the populace that a majority of Republicans in a recent poll agreed that universities do more harm than good.  I don’t need to tell this audience that the liberal arts and the arts are under a sustained rhetorical attack.

What drives people like me and you crazy is that the attitudes adopted by the right are impervious to facts.  Climate change denial has become the poster child for this despair over the ability of the demos to judge correctly or wisely.  It is worth mentioning that the denigration of the liberal arts is equally fallacious, at least if the reasons to avoid humanities or arts classes are economic.  All the evidence shows that humanities and arts majors, over a lifetime, do just as well economically as science and engineering and business majors.  The sustained attack on the arts and humanities has more to do with a distaste for the values and capacities (for critical thinking, for sophisticated communication) they promote.

So what are we, the defenders of the aesthetic and the humanities (along with the world-view those disciplines entail), to do?  Saying our piece, only louder this time, and with a statement of our credentials as experts, won’t do.  Declaring our inequality, my superiority to you, should be a non-starter at a moment in history when increasing inequality is among our major problems.  I, frankly, am surprised that Clune is even tempted to take that route.  It comes across as pretty obvious petulance to me.  Why isn’t anyone paying any attention to me?  I know what’s what and they don’t.  Listen up, people.

In short, I stand with those who realize that judgment needs to be reconceived in ways that render it compatible with equality.  Clune is undoubtedly right that some writers have failed to face squarely the fact that judgment and equality are not easily reconcilable.  The problem, to put it in a nutshell, is that judgment seems to entail right and wrong, correct and incorrect, true and false.  To make all judgments equivalent is akin to (although not actually the same as) total relativism, the idea that every judgment is “right” within a specified context.  Contrasted to that kind of relativism, the acceptance of the equivalence of all judgments can look even more fatuous, marked with a shrug and a “whatever.”  No point arguing since there is no accounting for tastes, and no one gets to dictate your tastes to you even if they are weird, incomprehensible, obnoxious, disgusting.  One man’s meat is another man’s poison.

Faced with such epistemological throwing in of the towel, it is not a surprise that folks keep coming back to Kant.  Clune details how both Sianne Ngai and Richard Moran have recently tried to come to terms with Kant’s attempt to demonstrate that aesthetic judgments make a “demand” on others, thus raising our aesthetic preferences above a mere statement of personal taste and toward an intersubjective objectivity.  Ngai, Moran, and Clune all use the term “demand,” and the three translations of Kant’s Critique of Judgment I have consulted also use that term.  But I will confess to preferring Hannah Arendt’s translation of Kant, even though I have never been able to find in Kant where she finds the phrase that she puts in quotation marks.  For Arendt, those making an aesthetic judgment “woo the consent” of the other.  Arendt, in other words, places us firmly back into the rhetorical space that I am arguing is central to democracy.  Surprisingly, Clune never recognizes the affinity between his “community of practitioners” and Kant’s sensus communis.  What Arendt calls our attention to—especially when she tells us that Kant’s Critique of Judgment is the “politics” critics claim he never got around to writing—is the fact that the sensus communis always needs to be created, and its ongoing reconfiguration is the very stuff of politics.  Yes, judgments are deeply indebted to and influenced by the community from which they are articulated, but that community and its practices are a moving target.  Think of Wittgenstein’s image of language as a sea-going vessel that undergoes a slow, but complete, rebuild even as it never leaves the water for dry-dock.  The democratic community—and its judgments on the practices of its various sub-cultures and its elites and its experts—is continually being refashioned through the public discourses that aim to sway the public in one direction or another.

How does this understanding of the scene of politics help?  Clune, I think, provides a clue when he writes “For me to be convinced by the critic’s aesthetic judgment that James is interesting means not that I have evaluated the reasons for that judgment but that I’ve decided to undertake an education that promises to endow me with his or her cultural capacities” (926).  What gets under-thought here is what would actually motivate such a decision.  We need to invoke Aristotle in conjunction with Raymond Williams at this point.  The expert—be she a climate scientist, a heterodox economist, or a Proust scholar—wants, at a minimum, to inspire trust, and, at a maximum, the auditor’s desire to join her community of practitioners, to make its common sense his own.  It is not “reasons,” as Clune says, that are decisive here, but ethos.  I would be willing to bet that almost everyone in this room could point toward a teacher who inspired them—and inspired them exactly as the kind of person they themselves wanted to become.  What an aesthetic education offers is initiation into a particular “structure of feeling.”  It is the attractiveness of that sensibility that our political and public rhetorics need to convey.  Once again, Kant and Arendt help us here when they point to the crucial importance of the “example” to these attempts to “woo the other.”  Modelling what a life lived within that structure of feeling looks like is far more potent than pronouncing from on high that Moby Dick is superior to Star Wars.

Look at this concretely.  The rhetorical genius of the Republican party since Ronald Reagan has been to portray the professional, educated, upper-middle-class left (who occupy the “helping professions” of doctor, lawyer, teacher, social worker) as joyless scolds, continually nagging you about how all the things you do are harmful to the environment, to social harmony, to your own well-being.  They have made it a political statement to drive a gas-guzzling truck while smoking a cigarette in defiance of those pious kill-joys.  That’s the rhetorical battle that the left has been losing since 1980.  Yes, the populace scorns our expert judgments, but that’s because they have no desire at all to be part of the communities in which those judgments are common sense.  Our problem, you might say, is not how to educate—aesthetically or otherwise—those who make the decision to undertake an education, but how to make the prospect of an education appealing to those who see it as only a constant repudiation of their own sensibilities and capacities.  In short, “structures of feeling” triumph over “interests” much of the time, and the left has proved spectacularly inept at modelling positive examples of the sensibility we wish to see prevail in our society.

I shouldn’t be so overwhelmingly negative about the left.  The sea-change in attitudes (and public policy) toward LGBTQ citizens over the past thirty years cannot be overstated.  Of course, given that attitudes are, as I have argued, a moving target, changes in any one direction are never set in stone.  Constant maintenance, rearticulation, and adjustments on the fly are necessary.  The task of education, of initiation into a sensibility that has come to seem “common sense,” as both attractive and right, is always there in front of us.  I am simply arguing that the right wing has been more attuned to that educative task than the left.  Or as I am prone to say, the left goes out and marches in the street on the weekend before returning to work on Monday, while the right gets itself elected to school boards.

As a teacher, I find Ngai’s focus on “the interesting” crucial and poignant.  When we call something “interesting,” we are saying it is something worthy of attention, something worthy of pausing over and considering at more length.  And that plea for attention is certainly at the very center of my practice as a teacher.  When I declare in front of a class that this or that is “interesting,” I am inviting students into a sensibility that wants to ponder the significance of the thing in question.  But I am also pleading with them to take that first step—knowing that for many of them I am just another professor who incomprehensibly gets excited about things to which they are supremely and irredeemably indifferent.  You can’t win them all, of course.  But the effort to win some of them over is endless, never fully successful, and in competition with lots of other demands on their attention.

There is, I am arguing, no other course of action open in a democratic society.  We are, if you will, condemned to that rhetorical battle, attempting to woo our students, to woo the demos, to a particular sensibility, a particular vision of the good.  That, I will state it nakedly, is politics.  To dream of a world where expert opinion is accepted by the non-experts is to dream of salvation from politics, from its endless wrangling, its messy compromises, its inevitable mix of failures with successes.  It is to desire a technocratic utopia, in which the “administration of things” replaces the conflicts of political contestation.  No thank you.

Another way to say this is that politics is the inevitable result of living in a pluralistic universe.  There will never be full consensus, there will never be a single vision of the good to which all subscribe, there will never be an all-encompassing and all-inclusive sensus communis.  On the whole, I’d say that’s a good thing.  I would hate to live in a world where everyone disagreed with me about everything.  But I am convinced that a world in which everyone agreed with me about everything would be almost as bad.

But, but, but . . . climate change.  Please recognize that climate change is just one in a long string of existential threats that democracy—slow, contentious, ruled by greed and passion—is deemed ill equipped to handle.  Authoritarians of whatever political stripe are always going to identify a crisis that means democracy must be put on hold.  The terrible attraction of war is that it negates the messy quotidian reality of pluralism.  The dream is of a community united, yoked to a single overwhelming purpose, with politics suspended for the duration.  Thus, that great champion of pluralism, William James, could also dream of a “moral equivalent of war.”  Perhaps democracy truly is unequal to the challenge of climate change, but then the desire/need to jettison democracy should be stated openly.  Otherwise, it is back to the frustrations of political wrangling, to the hard process of winning over the demos.

So, yes, I am in favor of an aesthetic education that aims to introduce students to a sensibility that finds commercial culture distasteful and (perhaps more importantly but perhaps not) unjust.  And I want them to see that indifference to climate change is of a piece with the general casualness of our prevailing economic order toward the sufferings of others.  But I cannot endorse Clune’s picture of that educational process.  “[T]he significant investment of time and energy that this education requires—both at its outset and for a long time afterwards—is channeled in submission to the expert’s judgment that these works make particularly rewarding objects of attention.  The syllabi of an English department’s curriculum, for example, codify this submission” (926).  I have been fighting against my English department’s curriculum for twenty-five years.  The texts I want to teach in my classes are the ones I find good to think with—and I invite my students to join me in that thinking process.  (More Arendt here: her notion that judgment involves “going visiting” and that you can know a thinker’s ethos by considering the company she wants to visit—and to keep.)  What I model is one person’s encounter with other minds—the minds represented by the books we read and by the people who are in the classroom with me.  My colleagues should have similar freedom to construct their courses around the texts that speak to them—and in which they then try to interest their students.

Fuck submission.  Maybe it’s because I teach in the South.  But my students have been fed submission with mother’s milk.  What they need to learn is to trust their own responses to things, to find what interests them, to find what moves them emotionally and intellectually.  They need to learn the arrogance of democratic citizenship, which arrogates to itself the right to judge the pronouncements of the experts.  Certainly, I push them to articulate their judgments, to take it upon themselves to woo others to their view.  They must accept that they too are joined in the rhetorical battle, and if they want allies they will have to learn how to be persuasive.  But that’s very, very different from suggesting that anyone should ever take the passive position of submission.

Clune is scornful of Richard Moran’s “liberal” endorsement of freedom of choice.  So I want to end with a question for all of you as teachers.  Can I safely assume that you would deem it inappropriate, in fact unethical, to tell your students whether or not to believe in god, or what career path to follow, or for whom they should vote?  If you do think, in your position as a teacher, that you have the right to tell your students what to do in such cases, I would like to hear your justification for such interference.  Obviously, what I am suggesting here is that our sensus communis does endorse a kind of baseline autonomy in matters of singular importance to individuals.  I certainly wouldn’t want to live in a society where my freedom to choose for myself about such matters were not respected.  If some of you in the room feel differently, I am very interested in hearing an articulation and defense of such feelings.

Now we could say that our expertise as teachers does not extend to questions of career, religious faith, or politics.  But where we are experts, there we are entitled to tell a student he is wrong.  James really is interesting; Moby Dick really is better than Star Wars.  But surely such bald assertions are worthless.  How could they possibly gain the end we have in view?  Via the path of submission?  I can’t believe it.  Yes, we stand up there in our classrooms and use every trick we can muster to woo our students, to get them interested, and even to endorse our judgments after careful consideration; one of our tasks is to teach (and model) what careful consideration looks like.  And I certainly hope you are especially delighted when some student kicks against the pricks and makes an ardent case that Star Wars is every bit as good as Melville.  Because that’s the sensibility I want aesthetic education to impart.

No Salvation

Somewhere (of course I can’t find it now) in his An American Utopia: Dual Power and the American Army (Verso, 2016), Fredric Jameson tells us that utopia is merely our same human world with a slight difference.  One mistake (his book outlines legions of mistakes) is to think we can effect a total transformation of humankind and human society.  It is not that he eschews the ideal, the dream, of revolution; he only wants to downsize what we think a revolution could accomplish.  Basically, it seems he believes we can collectivize labor, but we cannot overcome social antagonism.  There is a primal fear/envy/hatred/aggression toward the Other that will persist.

I am not particularly interested in Jameson’s proposed utopia; what interest me are the ramifications of taking the position that there is “no salvation.”  Let me try to state my position starkly.  (I will then complicate matters by exploring my uneasiness with that position.)  The stark formulation: there is no once-for-all, totalizing transformation that will cure the various ills of our current lot.  No deus ex machina, no transcendence.  We are condemned to chipping away at things piecemeal, to making what small improvements we can, when and where we can.  Such improvements are themselves never secured once and for all; there will be backslidings, unexpected twists and turns, unforeseen (and often deeply evil) consequences; the powers of darkness will be ever with us and ever fighting for their side.

This position fits with a robust pluralism; there is no totality, no overarching system, and hence no special point of leverage from which the whole world can be moved.  We have to work with the tools that are to hand and we have to work on the problems that are also to hand.  Successes will be hard won—and partial.  Reliance on a totalizing revolution, on salvation, is a species of magical thinking.  Worse, it is an abdication of involvement in the here and now, a religious focus on a “better world” elsewhere.  This world is all we’ve got, so hunker down and get to work on it.

I trust you get the idea.  Radical secularism and anti-transcendentalism.  But I want to combine those positions with a radical openness.  The idea is not to create constraints, not to say with Thatcher that “there is no alternative,” or to adopt the kind of quietism that can go with Nietzschean affirmation.  No “amor fati,” please, but a continual kicking against the pricks—and every attempt to think and act creatively.  Hence the constant experimentation of James and Dewey’s pragmatism, where you don’t know what a situation might enable until you try it out, discovering its affordances and resistances in practice.

I want to avoid every form of what I have called “transcendental blackmail,” meaning ontological or “realistic” claims that declare certain things impossible from the outset.  But I am contradicting myself because I have claimed total revolution impossible, based on an ontological claim of pluralism.  Why deny to the revolutionaries their right to experiment with the possibility of total transformation?  (This becomes like James’s notorious essay “The Will to Believe” with the revolutionaries being granted the right to believe that a revolution is possible.)

What is it about dreams of total escape from the human condition that I find objectionable?  Why do I want to shut down not only the hope, but the very vocabulary, of “salvation” and “redemption”?  I am, it seems to me, partly in Nietzsche’s camp; I want to reject nihilism’s negations of this world, of the here and now.  I want to articulate some version of “affirmation” that accepts where we are—even as it also endeavors to make our current condition better.  No fatalistic resignation to no change at all; but no dream of an utterly different way of life.  In short, Jamesian “meliorism,” which looks lukewarm (and therefore to be spewed from the mouth) to the zealot.

“Sufficient unto the day is the evil thereof.”  Attending to the ordinary slings and arrows of daily life, working to ameliorate them insofar as possible, is the recommended path.

But for many that is not enough, not sufficient.  They want grander progress, grander solutions.  My rejection of their negations rests on three planks.

  1. The ontological claim that totalized solutions are not possible.
  2. The aesthetic (?) claim that total negation misses all that is beautiful and delightful in this imperfect world and society we inhabit. The perpetual sourpuss of puritanical absolutism (in whatever form it takes) is not a look I want to adopt for myself or countenance in others.
  3. The political claim that puritanical absolutism also makes its adherents condemn every reform, every change, as insufficient. Just as they cannot affirm any aspect of current life, they also cannot affirm any change in the conditions of current life.  Everything falls short of the desired total transformation.

Oliver Wendell Holmes: Violence and the Law

Holmes’s war experiences left him with the view that it all boils down to force, to the imposition of death.  “Holmes had little enthusiasm for the idea that human beings possessed any rights by virtue of being human.  Holmes always liked to provoke friends who he thought were being sentimentally idealistic by saying, ‘all society rests on the deaths of men,’ and frequently asserted that a ‘right’ was nothing more than ‘those things a given crowd will fight for—which vary from religion to the price of a glass of beer’” (369-70 in Budiansky’s biography of Holmes).

Holmes’ rejection of any “natural” theory of rights always returned to this assertion about death:

The jurists who believe in natural law seem to me to be in that naïve state of mind that accepts what has been familiar and accepted by them and their neighbors as something that must be accepted by all men everywhere.  The most fundamental of the supposed preexisting rights—the right to life—is sacrificed without a scruple not only in war, but whenever the interest of society, that is, of the predominant power in the community, is thought to demand it (376).

And he understood the law entirely through its direct relation to force.  “The law, as Holmes never tired of pointing out, is at its foundation ‘a statement of the circumstances in which the public force will be brought to bear upon men through the courts’” (435).  “Holmes’s point was that the law is what the law does; it is not a theoretical collection of axioms and moral principles, but a practical statement of where public force will be brought to bear, and that could only be derived from an examination of it in action” (244).  “[H]e would come to insist as a cornerstone of his legal philosophy that law is fundamentally a statement of society’s willingness to use force—‘every law means I will kill sooner than not have my way,’ as he put it[;] . . . he did not want the men who threw ideas around ever again to escape responsibility for where those ideas led.  It was the same reason he lost the enthusiastic belief he once had in the cause of women’s suffrage: political decision had better come from those who do the killing” (131).

Temperamentally, this is easy enough to characterize.  The manly facing up to harsh facts, to an unsentimental view of humans and their social institutions, and a disgust with all sentimental claptrap.

Philosophically, it is less easy to describe.  That where there is power there must be force is clear enough.  But what Holmes seems to miss is that the law often serves as an attempt to restrict force.  Rights (in some instances) are legal statements about instances where the use of force is illegitimate.  Certainly (as Madison was already well aware and as countless commentators have noted since) there is something paradoxical about the state articulating limitations on its own powers.

Who is going to enforce those limitations?  The answer is the courts.  And the courts do not have an army.  That’s what the rule of law is about: the attempt to establish modi vivendi that are respected absent the direct application of force.  Holmes, of course, is arguing that the court’s decision will not be obeyed unless there is the implied (maybe not even implied, but fully explicit) use of state power to enforce that decision.  But his position, like all reductionisms, does not do justice to the complexities of human behavior and psychology.  The Loving decision of 1967, like earlier decisions on child labor laws, led to significant changes in everyday social practice that came into existence with little fanfare.  There are cases where the desire to live within the law is enough; there is an investment in living in a lawful society.  Its benefits are clear enough that its unpleasant consequences (in relation to my own beliefs and preferences) are a price I am willing to pay in order to enjoy those benefits.  Of course, there are also instances where force needs to be applied—as with the widespread flouting of the Brown decision.  My point is simply that the law’s relationship to force is more complex than Holmes allows.  The law is an alternative to violence in many instances, not its direct expression.

My position fits with my notion of the Constitution as an idealistic document, as a statement of the just society we wish to be.  The law is not, as Holmes would argue, completely divorced from questions of morality and justice (more claptrap!).  That relation is complex and often frustrating, but it does no good (either theoretically or practically) to just cut the tie in the name of clear-sighted realism.  Social institutions exist, in part, to protect citizens from force.  And, yes, that can mean in some instances that state force must be deployed in order to fend off other forces.  But it also means in some instances that the institutions serve to prevent any deployment of force at all.  The law affords, when it works, an escape from force, from the unpredictable, uncontrollable, and deeply non-useful side effects of most uses of force.

In short, the manly man creates (at least as much as he discovers) the harsh world of struggle he insists is our basic lot.  True, Holmes did not create the war he marched off to at the age of twenty.  He experienced that war as forced upon him.  But he never got quite clear about who was responsible.  He was inclined to blame the abolitionists and their moral fervor, their uncompromising and intolerant absolutism.  He certainly had no patience for their self-righteous moralizing.  Still, blaming them had some obvious flaws, so he ended up converting the idea of struggle into a metaphysical assertion.  He, like Dewey and James, but in a different, more Herbert Spencer-like register, became a Darwinian, focused on the struggle for existence.  But he yoked Darwin to Hobbes; it is not the best adaptation to environmental conditions that assures survival, but the best application of force.  Of course, if the environmental condition is the war of all against all, then the adepts at violence will be the ones who survive.

All of this goes along with contempt for the losers in the battle.  Holmes had no patience with socialists or with proponents of racial justice.  The unwashed were driven by envy; “no rearrangement of property could address the real sources of social discontent” (396), those sources being the envy of the successful by the unsuccessful.  It’s a struggle; just get on with it and quit the whining—or expecting anyone to offer you a helping hand.  Holmes did accept that the law should level the field of struggle; he was (somewhat contradictorily) committed to the notion of a “fair” fight.  Where this ideal of “fairness” was to come from is never clear in his thought—or his legal opinions.  (He was, in fact, very wary of the broad use of the 14th Amendment’s language about “due process” and “equal protection of the laws.”  The broad use of the 14th Amendment was being pioneered by Louis Brandeis in Holmes’ later years on the Supreme Court.)  Budiansky is clear that Holmes is by no stretch of the term a “liberal.”

Holmes’s famous dissents from the more conservative decisions of the pre-New Deal Court are motivated by his ideal of fairness—and (connecting to earlier posts about what liberalism even means) that ideal is used against decisions that in American usage are understood as “conservative” even though those conservative decisions were based on the “liberal” laissez-faire idea that the state cannot interfere in business practices.  Holmes’s scathing dissents from the court’s overturning of child labor laws enacted by the states are usually argued on the grounds of consistency.  He says that state governments already regulate commerce (for example, of alcohol), so it is absurd to say they can’t regulate other aspects of commercial activities.

Regulation, it would seem, is always about competing interests.  Since it is inevitable that there will be competing interests, society (through its regulatory laws) is best served by establishing a framework for the balancing of those interests.  Regulation is neither full permission nor full prohibition.  It strives to set conditions for a practice, conditions that take the various interests involved into account.  But Holmes never really worked out a theoretical account of regulation—another place where his reductionism fails him.  Yes, regulations must be enforced, but they are also always a compromise meant to mitigate the need to resort to force–and to prevent anyone from having a full, free hand in the social field characterized by a plurality of different interests and aims.

A Veteran’s Worldview

I have just finished reading Stephen Budiansky’s riveting biography of Oliver Wendell Holmes, subtitled “A Life in War, Law, and Ideas” (Norton, 2019).  Like Louis Menand, Budiansky claims—and makes a very compelling case for the claim—that Holmes’ manner and beliefs were all shaped by his service in the Civil War.  Holmes was severely wounded twice (once at Ball’s Bluff in October 1861 and then again at Antietam in September 1862).  The second time (like Robert Graves) his death was reported in the newspapers.  Holmes returned to service after both wounds, but saw only limited combat after 1862 since he joined a general’s staff.  He had had more than enough—and quit the war in 1864 as soon as his three-year term of service had expired.

Budiansky does a superb job in portraying Holmes’ worldview, one that I think is shared by many veterans.  It certainly resonates with the hard-to-describe beliefs that animated my own father, who saw serious combat (although far short of the slaughterhouse that was September 17, 1862 at Antietam) in the Pacific during World War II.  At bottom, Holmes became an “it’s struggle all the way down” guy.  In the final analysis, it is force that tells—and that rules.  That is an ugly truth.  Force is relentless, mindless, brutal, and unrelated to justice or any other ideals.  People who mouth ideals or try to call others to account in the name of ideals are naïve at best, deluded hypocrites speaking claptrap.  At worst, they are moralistic despots, deploying their moral certainties to tyrannize over the rest of us.  Dewey’s pragmatist attack on “the quest for certainty” becomes in Holmes the justification of an activist pluralism.  The role of the law is to create a social field in which individuals are free to live their lives according to their own vision of the good life.  Oddly enough, this yields a positive value: basically the very English value (both Holmes and my father were over-the-top Anglophiles) of “fair play.”  Holmes’ Supreme Court decisions, in almost every instance, were directed to leveling the playing field, to denying any one person or any group more power than any other.  Thus he was a liberal in Judith Shklar’s “liberalism of fear” sense; the focus is on preventing concentrations of power.

But Holmes (and here he is also very pragmatist) did not accept that uncertainty meant nihilism.  “’Of all humbugs the greatest is the humbug of indifference and superiority,’ he wrote . . . in 1897. ‘Our destiny is to care, to idealize, to live toward passionately desired ends.’ He always dismissed the nihilistic attitude ‘it is all futile,’ which he termed ‘the dogmatism that often is disguised under skepticism.  The sceptic has no standard to warrant such universal judgments.  If a man has counted in the actual striving of his fellows he cannot pronounce it vain’” (130).

Eureka!  I can’t help but take this for the cornerstone.  It jibes with William James’s constant harping on “striving,” and it is tied to a deep commitment to a certain ideal of masculinity.  Holmes (like my father) was clear-eyed about the waste, the futility, the sheer brutal nastiness and devastation of war.  He could see that a killing field like Antietam left nothing to individual initiative, ability, or resolve.  It was all sheer chance as to whether one survived or not.  And yet, he still hung on to the time-worn notion that war was the supreme test of manhood—and thus valuable because (for reasons never examined) manhood has to be tested.  Maybe that goes back to the struggle thing; one needs to compete against others for the prize of being able, in one’s own eyes and in the eyes of others, to be accounted a man.  Since the struggle lies in front of us, the prize goes to those who most energetically strive.  And by upping the stakes to life or death in the way that combat does, manhood is fully tested.

Thus, he famously wrote (in 1884) of himself and his fellow Civil War veterans: “We have shared the incommunicable experience of war; we have felt, we still feel, the passion of life to its top . . . Through our good fortune, in our youth our hearts were touched with fire” (127).  And later, during the First World War, he wrote: “I truly believe that young men who live through a war in which they have taken part will find themselves different thenceforth—I feel it—I see it in the eyes of the few surviving men who served in my Regiment.  So, although I would have averted the war if I could have, I believe that all the suffering and waste are not without their reward.  I hope with all my heart that your boys may win the reward and at not too great a cost” (363).

That last bit strikes the note perfectly.  A real desire to avoid war joined with an equally real belief that war brings its own distinctive rewards, along with the absolute distinction between those who have the incommunicable experience of war and those who do not.  The veteran is part of the elect; he has looked into the abyss; he has seen the fundamental ugly truth of struggle, and is the better man for it.

In the implacable face of violence and death, high ideals mean nothing.  The only worthy response is to shut up and get on with it.  Grim determination, strong silence, and doing the job well are what is worthy of respect; nothing more or less.  His ideal men “were free to be egoists or altruists on the usual Saturday half holiday provided they were neither while on their job.  Their job is their contribution to the general welfare and when a man is on that, he will do it better the less he thinks either of himself or of his neighbors, and the more he puts his energy into the problem he has to solve” (137).  His contempt for intellectuals and moralists was unbounded.  “More than once he cautioned his friends about ‘the irresponsibility of running the universe on paper. . . . The test of an ideal or rather of an idealist, is the power to hold it and get one’s inspiration from it under difficulties.  When one is comfortable and well off, it is easy to talk high talk’” (131).  His attitude toward intellectuals was very close to that of George Orwell; they talked a talk they never had to walk—and they rendered the world frictionless in their images of its betterment.  It is the contempt of the self-styled man of action for the man of ideals—and is undoubtedly tied up with a cherished ideal of manhood.  And, of course, in both Holmes and Orwell, it comes from two men who were primarily men of words.  But they both had their military experience, and so could see themselves as superior to the non-veteran.

When you aspire to be a man of action, the nostalgia for combat is understandable.  What other field of action that is not contemptible does the modern world offer?  What honor is there in making more money than others?  Where, in other words, is the moral equivalent of war?  Certainly not in politics, which is even more contemptible than trade.  Holmes was determined not to become either the gloomy Henry Adams or the god-seeking William James.  He wanted, instead, to be the tough-minded realist described in the opening pages of James’s Pragmatism book.

I want, in my next post, to consider how tough-minded realism plays itself out in Holmes’ understanding of the law.  But today I will end with the way that realism renders Holmes a pluralist in an additional sense.  He is a pluralist in the John Rawls sense of believing that the central unalterable fact that liberal society must negotiate is the existence of multiple visions of the good, none of which should be allowed to trample on the others.  He is a pluralist in the Isaiah Berlin sense of asserting that, even within a single vision of the good, there are competing goods that require tradeoffs and compromises; we will never get everything we could wish for because those things cannot co-exist.  Going to the theater tonight means missing a dinner with a different set of friends.  Intellectuals, he thinks, never take the inevitability of never achieving the maximum into account in their criticisms of the men of action or in their imagined utopias.  “Remember, my friend [he wrote], that every good costs something.  Don’t forget that to have anything means to go without something else.  Even to be a person, to be this means to be not that” (131).

In sum, life’s a struggle and a real man just gets on with the job, harboring no illusion that it will be all wine and roses.  That real man is full of contempt for the complainers and idealists, the ones who aim to change the basic fact of struggle into some kind of gentler form of cooperation that tends toward ameliorating the sufferings of himself and/or others.  You just need to face up to the suffering in stoic silence, doing the best that you can for yourself and for those you love.  Because you are a man and they are depending on you, even as you have no one to depend on but yourself.  It’s a cop-out of your manhood to expect help; it’s a sign of weakness, of not being up to the struggle, to whine for help from the law, from society, from anyone.