Author: John McGowan

The Future of the Humanities

For some time now, I have had a question that I use as a litmus test when speaking with professors of English.  Do you think there will be professors of Victorian literature on American campuses fifty years from now?  There is no discernible pattern, as far as I can tell, among the responses I get, which run the full gamut from confident pronouncements that “of course there will be” to sharp laughter accompanying the assertion “I give them twenty years to go extinct.”  (For the record: UNC’s English department currently has five medievalists, seven Renaissance scholars, and six professors teaching Romantic and Victorian literature—that is, if I am allowed to count myself a Victorianist, as I sometimes was.)

I have gone through four crises of the humanities in my lifetime, each coinciding with a serious economic downturn (1974, 1981, 1992, and 2008).  The 1981 slump cost me my job when the Humanities Department in which I taught was abolished.  The collapse of the dot-com boom did not generate its corresponding “death of the humanities” moment because, apparently, 9/11 showed us we needed poets.  They were trotted out nationwide as America tried to come to terms with its grief.

Still, the crisis feels different this time.  Of course, I may just be old and tired and discouraged.  Not “may be.”  Certainly am.  But I think there are also real differences this time around—differences that point to a different future for the humanities.

In part, I am following up my posts about curriculum revision at UNC.  The coverage model is on the wane.  The notion that general education students should gain a familiarity with the whole of English literature is certainly moving toward extinction.  Classes are going to be more focused, more oriented to solving defined problems and imparting designated competencies.  Methods over content.

But, paradoxically, the decline of the professors of Victorian literature is linked to more coverage, not less.  The History Department can be our guide here.  At one time, History departments had two or three specialists in French history (roughly divided by centuries), three or four in English history, along with others who might specialize in Germany or Spain or Italy.  That all began to change (slowly, since it takes some time to turn over a tenured faculty) twenty or so years ago when the Eurocentric world of the American history department was broken open.  Now there needed to be specialists on China, on India, on Latin America, on Africa.  True, in some cases, these non-European specialists were planted in new “area studies” units (Asian Studies, Latin American Studies, Near Eastern Studies etc.).  But usually even those located in area studies would hold a joint appointment in History—and those joint appointments ate up “faculty lines” formerly devoted to the 18th century French specialist.

Art History departments (because they are relatively small) have always worked on this model: a limited number of faculty who were supposed, somehow, to cover all art in all places from the beginning of time.  The result was that, while courses covered that whole span, the department only featured scholars of certain periods.  There was no way to have an active scholar in every possible area of study.  Scholarly “coverage,” in other words, was impossible.

English and Philosophy departments are, in my view, certain to go down this path.  English now has to cover world literatures written in English, as well as the literatures of groups formerly not studied (not part of the “canon”).  Philosophy now includes non-Western thought, along with practical, professional, and environmental ethics and new interests in cognitive science.

Fifty years from now, there will still be professors of Victorian literature in America.  But there will no longer be the presumption that every self-respecting department of English must have one.  Scholarly coverage will be much spottier—which means, among other things, that someone who wants to become a scholar of Victorian literature will know there are six places where that ambition can reasonably be pursued in graduate school, instead of (as is the case now) assuming you can study Victorian literature in any graduate program.  Similarly, if 18th-century English and Scottish empiricism is your heart’s desire, you will have to identify the six philosophy departments in which you can pursue that course of study.

There is, of course, the larger question.  Certainly (or, at least, it seems obvious to me, although hardly to all those I subject to my litmus test), it is a remarkable thing that our society sees fit to subsidize scholars of Victorian literature.  The prestige of English literature (not our national literature, after all) is breathtaking if you reflect upon it for even three seconds.  What made Shakespeare into an American author, an absolute fixture in the American curriculum from seventh grade onwards?  What plausible stake could our society be said to have in subsidizing continued research into the fiction and life of Charles Dickens?  What compelling interest (as a court of law would phrase it) can be identified here?

Another paradox here, it seems to me.  I hate (positively hate, I tell you) the bromides offered (since Matthew Arnold at least) in generalized defenses of the humanities.  When I was (during my years as a director of a humanities center) called upon to speak about the value of the humanities, I always focused on individual examples of the kind of work my center was enabling.  The individual projects were fascinating—and of obvious interest to most halfway-educated and halfway-sympathetic audiences.  The fact that, within the humanities, intellectual inquiry leads to new knowledge and to new perspectives on old knowledge is the lifeblood of the whole enterprise.

But it is much harder to label that good work as necessary.  The world is a better, richer (I choose this word deliberately) place when it is possible for scholars to chase down fascinating ideas and stories because they are fascinating.  And I firmly believe that fascination will mean that people who have the inclination and the leisure will continue to do humanities work come hell or high water.  Yes, they will need the five hundred pounds a year and the room of one’s own that Virginia Woolf identified as the prerequisites, but people of such means are hardly an endangered species at the moment.  And, yes, it is true that society generally (especially after the fact, in the rear-view mirror as it were) likes to be able to point to such achievements, to see them as signs of vitality, culture, high-mindedness, and the like.  But that doesn’t say who is to pay.  The state?  The bargain up to now is that the scholars (as well as the poets and the novelists) teach for their crust of bread and for, what is more precious, the time to do their non-teaching work of scholarship and writing.  Philanthropists?  The arts in America are subsidized by private charity—and so is much of higher education (increasingly so as state support dwindles).  The intricacies of this bargain warrant another post.  The market?  Never going to happen.  Poetry and scholarship are never going to pay for themselves, and novels only rarely do.

The humanities, then, are dependent on charity—or on the weird institution that is American higher education.  The humanities’ place in higher education is precarious—and the more the logic of the market is imposed on education, the more precarious that position becomes.  No surprise there.  But it is no help when my colleagues act as if the value of scholarship on Victorian literature is self-evident.  Just the opposite.  Its value is extremely hard to articulate.  We humanists do not have any knock-down arguments.  And there aren’t any out there just waiting to be discovered.  The ground has been too well covered for there to have been such an oversight.  The humanities are in the tough position of being a luxury, not a necessity, even as they are also a luxury which makes life worth living as contrasted to “bare life” (to appropriate Agamben’s phrase).  The cruelty of our times is that the overlords are perfectly content (hell, it is one of their primary aims) to have the vast majority possess only “bare life.”  Perhaps it was always thus, but that is no consolation.  Not needing the humanities themselves, our overlords are hardly moved to consider how to provide them for others.

More Comments on What We Should Teach at University

My colleague Todd Taylor weighs in—and thinks he also might be the source for my “formula.”  Here, from Todd’s textbook, is his version of the three-pronged statement about what we should, as teachers, be aiming to enable our students to do.

  1. To gather the most relevant and persuasive evidence.
  2. To identify a pattern among that evidence.
  3. To articulate a perspective supported by your analysis of the evidence.

And here are Todd’s further thoughts:

“I might have been a source for the ‘neat formula’ you mention, since I’ve been preaching that three-step process as “The Essential Skill for the Information Age” for over a decade now.  I might have added the formula to the Tar Heel Writing Guide.  I am attaching a scan of my textbook Becoming a College Writer where I distill the formula to its simplest form.  I have longer talks on the formula, with notable points being that step #1 sometimes includes generating information beyond just locating someone else’s data.  And step #3, articulating a perspective for others to follow (or call to action or application), is the fulcrum where “content-consumption, passive pedagogy” breaks down and “knowledge-production, active learning” takes off.

The high point of my experience preaching this formula was when a senior ENGL 142 student shared with me the news of a job interview that ended successfully at the moment when she recited the three steps in response to the question ‘What is your problem solving process?’

In my textbook, I also have a potentially provocative definition of a “discipline” as “a method (for gathering evidence) applied to a subject,” which is my soft attempt to introduce epistemology to GenEd students.  What gets interesting for us rhet/discourse types is to consider how a “discipline” goes beyond steps #1 and #2 and includes step #3, so that a complete definition of “discipline” also includes the ways of articulating/communicating that which emerges from the application of a method to a subject.  I will forever hold onto my beloved linguistic determinism.  Of course, this idea is nothing new to critical theorists, especially from Foucault.  What might be new(ish) is to try to explain/integrate such ideas within the institution(s) of GenEd requirements and higher ed.  I expect if I studied Dewey again, I could trace the ideas there, just as I expect other folks have other versions of the ‘neat formula.'”

Todd also raised another issue with me that is (at least to me) of great interest.  The humanities are wedded, we agreed, to “interpretation.”  And it makes sense to think of interpretation as a “method” or “approach” that is distinct from the qualitative/quantitative divide in the social sciences.  Back to Dilthey.  Explanation versus meaning.  Analysis versus the hermeneutic.  But perhaps even more than that, since quantitative/qualitative can be descriptors applied to the data itself, whereas interpretation is about how you understand the data.  So no science, even with all its numbers, without some sort of interpretation.  In other words, quantitative/qualitative doesn’t cover the whole field.  There is much more to be said about how we process information than simply saying sometimes we do it via numbers and sometimes via other means.

Comments on the Last Post

Two colleagues had responses to my post on the curriculum reform currently in process at UNC.

First, from Chris Lundberg, in the Communications Department, who thinks he may be the source for my (stolen) list of the primary goals of university education in our information-saturated age:

“I think I might be the unattributed source for the formula!

The only thing that I’d add to what you already wrote here is to disassociate capacities from skills. Here’s an abbreviated version of my schtick (though you’ve heard it before).

The university is subject to disruption for a number of reasons: folks don’t understand the mission; the content we teach is not responsive to the needs that students have beyond Carolina; and lots of folks have a legitimate argument to teach information and skills.

One of the ways we talked about this in the conversation we had a while back was to ask “what are the things that can’t be outsourced?”—either to another mode of learning information or skills or, in the case of the job market, to someone behind a computer screen somewhere else.  So the formulation that we’d talked about was something like: if you can learn content remotely, the vocation organized around that content is highly likely to be outsourced.

So the case for the university also has to be a case about what is unique about the mode of instruction.  That’s the thing about capacities.  They aren’t just something that you learn as content; they are also the kinds of things that you have to do and receive feedback on in the presence of other folks.  Writing, speaking, reasoning together, framing arguments, etc.

The information/content part of education doesn’t make a case for the uniqueness of the university—the Internet is real good at delivering information.  You don’t need a library card anymore to access the repository of the world’s information.  What you need is to learn how to effectively search, pick, and put together a case for how that information is useful.  The capacity for sorting and seeing connections in information is the thing.  (See the “neat formula.”)

Skills (or as the folks in the corporate sector call them now, competencies) are defined by the ability to know how to perform a given task in a given context.  Their usefulness is bounded by measuring something (typically a behavior) against the demands of that context.  A capacity, OTOH, is a trans-contextual ability (a set of habits of inquiry and thought, ways of deliberating, etc.) that works across multiple contexts.  For example, the biology text my dad used was horribly misinformed about genetic expression (they didn’t know about epigenetics, micro RNA, etc.).  What was valuable about his biology class (dad was a biotech entrepreneur) was that he learned how to engage the content: what was a legitimate scientific argument; what made a study good; how to do observational research; a facility for the vocabulary, etc.  That set of capacities for thinking about science benefitted him even if the content did not.  A capacity is something like the condition of possibility for learning content—think Aristotle on dunamis here—not unlike a faculty in its function, but unlike a faculty because it is the result of learning a specific style or mode of thought and engagement.  Where faculties are arguably innate, capacities are teachable (constrained by the innate presence of faculties).  That, by the way, is what makes it hard to outsource capacity-based learning, both in terms of the mode of learning (it is harder to do lab research online) and in terms of the vocation (you can’t acquire it as effectively as you might in the context of a face-to-face learning community).

So, a big part of the sell, at least in my opinion, should be about framing capacities as the socially, politically, and economically impactful “middle” ground between information and skills—and therefore justifying both the university and Gen Ed as an element of a liberal arts curriculum.”

Second, from my colleague in the English Department, Jane Danielewicz, who puts some flesh on the bones of “active learning” and weighs in on issues of assessment:

“If we relinquish our grip on teaching primarily content, then we must also develop new methods of assessment.  Our standard tests are content focused.  To assess competencies, students must be asked to demonstrate those competencies.  Our methods of assessment will need to evaluate students’ performances rather than their ability to regurgitate content knowledge.

We should be asking students to write in genres that are recognizable to audiences in various real-world settings.  We should also strive to provide real occasions where students can demonstrate their competencies to an audience, starting with an audience of their peers and moving out from there.  For example, students can present posters or conference papers at a mini-conference (held during the exam period).

Assessment can be tied to the genre in question.  E.g., for the conference presentation (and we all know what makes a good or bad conference presentation–and should work to convey that knowledge to students), students can be assessed on how well they performed the research, made an argument, supplied evidence, and communicated (orally and visually).

Yes, classes will need to be redesigned to encourage active learning, immersive classroom environments, process-based instruction, problem-oriented class content, and appropriate assessment methods.  Many faculty are already moving in these directions, teaching in ways that develop students’ competencies.  Faculty organizations such as the Center for Faculty Excellence are (and have been) providing instruction and support for active learning, experiential learning, and collaborative learning practices.  Some of our students have built websites, started non-profit organizations (grounded in research about an issue), written family histories, presented at national conferences, and published in peer-reviewed journals.  We will be sorely disappointing our very action-oriented student body if we retrench and insist on a coverage model of GenEd.”


What Should—and Can—the University Teach?

The University of North Carolina, Chapel Hill is currently attempting to develop a substantially new “general education” curriculum.  GenEd, as it is known at Carolina, is the broad “liberal arts” part of a student’s college career; it is a set of requirements separate from the more specialized course of study that is the “major.”

Anyone even remotely connected to universities knows that changing the curriculum always ensures plenty of Sturm und Drang, gnashing of teeth, and ferocious denunciations.  Much of this is driven by self-interest; any change, necessarily, will benefit some people more than others.  At a time when students are abandoning the humanities (particularly) and the social sciences (to some extent) as “majors,” the health of those departments depends, more than in the past, on enrollment in their “GenEd” classes.  Thus, any curricular change that seems to funnel fewer students toward those classes is viewed as a threat.  Of course, an oppositional stance taken on that ground pushes the (presumably) primary responsibility of the university to serve the educational needs of its students to the back seat, displaced by internal turf battles.

But there is a legitimate, larger issue here—and that’s what I would like to address.  What does a student in 2019 need to know?  And how does our current understanding of how to answer that question relate to the “liberal arts” as traditionally understood?  At a time when respect for the liberal arts in the wider culture seems at an all-time low, how can their continued centrality to university education not only be protected, but (more importantly) justified or even expanded?

My sense is that practitioners of the liberal arts are having a hard time making the shift from a “coverage” model to one that focuses on “skills” or “capacities.”  Yes, all the proponents of the liberal arts can talk the talk about how they teach students to “think critically” and “to communicate effectively.”  So, all of us in the humanities (at least) have, to that extent, adopted skills talk—even where we fear that it turns our departments into training grounds for would-be administrators of the neoliberal world out there.  But, in our heart of hearts, many of us are really committed to the “content” of our classes, not to the skills that, as by-products, study of that content might transmit.

But, please, think of our poor students!  The vast universe of knowledge that the modern research university has created means, as any conscientious scholar knows, that one can spend a lifetime studying Milton and his 17th-century context without ever getting to the bottom.  Great work if you can get it.  And isn’t it wonderful that universities (and, by extension, our society) see fit to fund someone to be a life-long Milton devotee?  But it is futile to think our undergrads, in two short years before they declare a major, are going to master Etruscan pottery, Yoruba mythology, EU politics, the demographics of drug addiction, the works of James Joyce, and the principles of relativity.  The standard way of approaching (ducking?) this conundrum has been “survey courses.”  The “if this is Tuesday, it must be John Donne” approach.

Any teacher who has ever read the set of exams written by students at the end of those survey classes knows what the research also tells us.  Such classes are close to useless.  They are simply disorienting—and fly through the material at a speed that does not generate anything remotely like real comprehension.  The way people learn—and, again, the research is completely clear on this point—is by taking time with something, by getting down and dirty with the details, followed by synthesizing what is learned by doing something with it.  Active learning, it is called—and, not to put too fine a point on it, faculty who despise it as some fashionable buzzword are equivalent to climate change deniers.  They are resolutely, despite their claim to be scholars and researchers, refusing to credit the best research out there on how people learn.

Back to our poor student.  Not only has she been subjected to survey classes, but she has been pushed (by curricular requirements) to take a smorgasbord of them, with no effort to make the various dishes relate or talk to one another.  Each course (not to mention each department) is its own fiefdom, existing in splendid isolation from all the rest.  The end result: students have a smattering of ill-digested knowledge about a bunch of different things, with no sense of why they should know these things as opposed to other possible ones, and with no overarching sense of how this all fits together, or a clear sense of what their education has actually given them.  If we wanted to create confusion, if that was our intended outcome, we could hardly have done better.

The “content” approach, in my view, then, leads to confusion for the students and tokenism in the curriculum.  We simply cannot deliver a meaningful encounter with the content of our multiple disciplines during GenEd.  So the question becomes: what can we do that is meaningful in the GenEd curriculum?  After all, we could scrap GenEd altogether and do as the Brits do: just have students take courses in their chosen majors during their college years.  Like most American educators, I think the British model a bad mistake.  But that does mean I have to offer a coherent and compelling account of what GenEd can do—and the best way to ensure it does what it aims for.

The answer, I believe, is to define what we want our students to be able to do as thinkers and writers.  Here’s a neat formula I stole from someone (unfortunately, I cannot remember my source).  We want a student in 2019 to 1) learn how to access information; 2) learn how to assess the information she has accessed; and 3) know how to use that information to solve specific problems and to present it to various audiences in order to communicate effectively with them.  I take number 3 very expansively to include (crucially) understanding (through having some experience of) the fact that members of your audience come from very different backgrounds, with very different assumptions about what matters.  Thus, effective communication relies heavily on being able (to adapt Kant’s formula) to “empathize with the viewpoint of the other,” while effective problem solving relies on being able to work with others.  Assessing information (#2 on my list) involves understanding that there are various methods of assessment/evaluation.  Judging the features of a text or a lab experiment in terms of its technical components and the success with which they have been deployed is different from judging its ethical implications.

I think we can, if careful and self-conscious, make significant progress toward achieving the three goals stated above during the first two years of college.  I think success requires that we de-fetishize content; that we design our classes to develop the identified skills; and that we re-design our classes to make sure we are achieving them.  Assessment will come in many different varieties, each geared to evaluating students’ performances of the competencies rather than to regurgitation of content knowledge. We should be asking students to “perform” their skills, which involves (partly) the presentation of knowledge acquired through reading, research and hands-on experience, in a variety of genres for different kinds of audiences.  The quality of their performances will be the first indication of whether or not we are being pedagogically successful.

I will confess real impatience with teacher/scholars who resist all “assessment” as a dirty word.  Somehow we are supposed to magically know that we are actually teaching our students something, when (in the old curriculum) all we really knew was that the students had checked off the requisite boxes, gotten a grade, and been passed on.  It is no secret that universities have neglected the arts and sciences of pedagogy over the years—and there is no excuse for it.  If we claim to be teaching our students, we need 1) to state clearly and precisely what it is we claim to be teaching them; 2) to do the work necessary to ascertain that we are actually succeeding; and 3) to revise our methods when they are not getting the job done.

Necessarily, courses will still have “content”—and that content matters a lot!  The capacities will be taught through a semester-long engagement with some specific subject matter.  In my ideal university, the person we hire to teach medieval literature, or the history and beliefs of Buddhism, or astronomy, is someone who, in their heart of hearts, believes life is less worth living if you don’t know about their subject of expertise.  They convey that enthusiasm and conviction when they teach their classes—and gain a reputation on campus that attracts students because it is known that Professor X makes Subject Y come alive.  But Professor X also has to know, on another level, that the vast majority of her students are not going to make Subject Y their life work and that even a vaster majority of the human race will lead worthy lives knowing nothing whatsoever of Subject Y.  So we are asking our professor to also—and consciously—design her class to develop some specified capacity.  In other words, her class should model a way of thinking, and require students to put that model into practice.

The proposed new curriculum at Chapel Hill moves from the “coverage” model to one focused on skills or capacities.  I think that means we are moving from something we cannot possibly achieve to something we can, perhaps, do.  I also think the new curriculum has the distinct advantage of trying to specify those skills and capacities.  And it challenges our faculty to craft their classes with care in order to inculcate those capacities in our students.  It is a feature, not a bug, in my eyes that many of our classes will need to be modified.  The point of change is change.  Doing the same old same old is not an improvement—and I, for one, think the need for improvement is evident.

Is the new curriculum perfect?  Of course not.  We cannot know with any certainty exactly how it will play out.  The definition of the capacities and the most effective ways of transmitting them to students will have to be honed and reformed through the crucible of practice.  Any successful institution needs to fight calcification tooth and nail, continually revising itself as it moves along, with an eye firmly on the goals that motivate its practices.  The tendency of institutions to stagnate, to do something because that’s the way it has always been done and because that’s how current resources and rewards are distributed, is all too familiar—and depressing.  Change is upsetting and, as I said at the outset, some will benefit more than others from change.

In fact, I think the proposed curriculum protects the arts, humanities, and social sciences at a time when they are particularly vulnerable.  I also think the liberal arts will suffer if they stick resolutely to old models that do not respond to larger cultural shifts.  We cannot resist or even speak to those shifts if we don’t find a way of meeting our students—who come to college now with a set of needs and objectives that represent their own response to new societal pressures—at least halfway.  We also must recognize that students will, inevitably (within the “elective” system that dates back to the 1880s), make their own decisions about what courses to take.  Thus we must articulate clear rationales for them to take the various courses that will be available within the GenEd curriculum.  What I like about the new curriculum is the way that it calls us to our task as educators, asking us to identify what we believe passionately our students need to learn, and placing the responsibility for getting them there in our hands.