Tuesday, April 28, 2009
Project-Based Education? A Response to Mark Taylor
It's a frustrating piece, since it moves quickly from 'insightful' to 'crackpot' and back again.
I'll start with some glaring blind spots that made it hard for me to take the piece seriously. As is often true of faculty who have never worked in administration, Prof. Taylor takes existing institutions for granted, even as he claims to move past them. For example, if colleges redid their curricula every seven years or so – his suggested lifetime for the project-based constellations he favors – that would involve every seventh year putting entire new programs through the shared governance process, coming up with entirely new job descriptions, hiring committees, student learning outcomes, assessment mechanisms, articulation agreements, catalog copy, advisor training, and the rest. Who, exactly, would do all this in the absence of departments or permanent faculty goes unmentioned. Lest this seem like an unfair summary:
Abolish permanent departments, even for undergraduate education, and create problem-focused programs. These constantly evolving programs would have sunset clauses, and every seven years each one should be evaluated and either abolished, continued or significantly changed. It is possible to imagine a broad range of topics around which such zones of inquiry could be organized: Mind, Body, Law, Information, Networks, Language, Space, Time, Media, Money, Life and Water.
Alrighty then. Who would evaluate them? Who would define them? (And don't tell me 'the faculty in the program.' They would be hired based on the answers to those questions, not the other way around.) Who would make the decisions to 'abolish, continue, or significantly change' them? Who gets to stay after the sun sets? In the absence of continuity, how would standards develop? Who would define them? What happens to a student who enrolls during, say, the fifth year of a seven-year program? Would credits from other programs articulate? If not, would students be unable to transfer from one college to another?
These may sound picky, but they're fundamental.
Yes, the common currency of 'credit hours' is reductive, and I've gone on record saying that until we move away from seat-time-based measures, the upward cost spiral won't go away. But you don't replace something with nothing. If we do away with recognizably transferable credits, what do we replace them with? You might be able to get away from it at the toniest of SLACs, but the residential-students-with-no-attrition model describes only a very narrow niche of higher education in America.
Then, obviously, there's the matter of graduate training. I agree with Taylor that grad school in the humanities and most of the social sciences is a pyramid scheme. I also agree that mandatory retirement ages and a renewable-contract system for faculty would be vast improvements over the landed aristocracy we currently support at the adjuncts' expense. But I'm at a loss to explain where all these interdisciplinary experts will get their disciplinary expertise. Yes, a significant part of grad school involves exploring new questions. But another significant part – the part he skips – involves getting grounding in the history of a given line of inquiry. Call it a canon or a discipline or a tradition, but it's part of the toolkit scholars bring to bear on new questions. Abandoning the toolkit in favor of, well, ad hoc autodidacticism doesn't really solve the problem. If anything, it makes existing grads even less employable than they already are. I need to hire someone to teach Intro to Sociology. Is a graduate of a program in “Body” or “Water” capable? How the hell do I know? (And even if I think I do, can I convince an accrediting agency?) Am I taking the chance? In this market? Uh, that would be 'no.'
(His proposed solution of extending the change to undergraduate programs actually makes it worse. “Sorry, 'water' grad, that expired last year. We're into 'money' now. Your graduate work is so last year.” In the absence of disciplines, we'd have nothing but fads.)
So we'd have faculty hired by nobody in particular, based on ever-shifting job descriptions written by nobody in particular. They would teach, uh, whatever, to students who happen to start at the right time, and who never drop out or transfer. (“Sorry, kid, we aren't accepting new students this year. Try again next year, when the theme will be cyborgs and we'll have all new faculty to teach it.”) And the graduate students would have to hope that whatever theme they studied in grad school would happen to roll around at the teaching colleges to which they'd apply for work. Unless they didn't. Which is fine, since there's no hotter ticket on the job market than an unemployed, esoteric Ph.D.
Yes, the existing structures are clunky and overtaxed and frequently asinine. They survive because they address certain problems. The way around them is not to wish those problems away or to postulate a world in which every college is modeled on a graduate seminar at Columbia. It's to come up with alternatives that solve those problems better. Prof. Taylor's model could be a lot of fun on a very small scale, like a think tank. But as a blueprint for higher ed across America, it's a farce.
The reality of higher ed in America is mobility. People move from one institution to another all the time. We've developed an admittedly frustrating common language to make that kind of movement possible. Replacing that common language with a babel of tongues is not a serious answer, and replacing what little common knowledge that clusters of scholars share – canons or classics or traditions – with whatever seems convenient at the time would only make matters worse. Disciplines are arbitrary and flawed, but random fads are even worse. And incompatible random fads at different institutions would be disastrous.
I have a recurring dream that someday, somehow, the New York Times will hire a columnist on higher education who actually understands what s/he is talking about. Maybe we could start a graduate program on 'dreams'! Let's see, upon graduation, students will be able to...
I'm not disagreeing with you. I think there are fundamental flaws in Taylor's argument, and I love the way you describe him shifting from insightful to crackpot. But it seems like you're poking holes in his argument with the assumption that only some institutions would adopt this model. If only some institutions change, then yes, accrediting institutions and articulation agreements and governance structures would be a huge problem.
But what if EVERYTHING changed? What if we reimagined higher ed from the ground up? As a longtime reader and someone who has a great deal of respect for you, and someone who is in the early stages of a career in administration, I'd love to see you take the opposite stance on this argument, and imagine ways it (or something like it, or even something else entirely) could be possible. I would learn a lot from reading that argument.
Just my two cents. :)
"But I'm at a loss to explain where all these interdisciplinary experts will get their disciplinary expertise. Yes, a significant part of grad school involves exploring new questions. But another significant part – the part he skips – involves getting grounding in the history of a given line of inquiry. Call it a canon or a discipline or a tradition, but it's part of the toolkit scholars bring to bear on new questions. Abandoning the toolkit in favor of, well, ad hoc autodidacticism doesn't really solve the problem."
EXACTLY. I've been very involved with our interdisciplinary BA and MA programs, and my own research is interdisciplinary (though grounded within a traditional discipline). I can't state strongly enough that interdisciplinary inquiry is just not possible if we do away with the disciplines wholesale.
That said, I've been thinking a lot about ways to rethink the structure of general education requirements, and I can imagine that categories like those that Taylor suggests (though perhaps broader than "water") based in real world questions that higher education can help students to think about deeply might be more illuminating than categories like "humanities." In other words, would it be possible to keep a transferable set of courses for requirements while framing those requirements as ones that help us to answer questions, as opposed to framing them as disconnected courses that a student knows he or she must take without a clear rationale for what those courses will help him or her to do? That to me is a potentially interesting experiment.
Of course, that experiment really has little to do with what Taylor suggests.
So I read this proposal, actually, as a *faculty-centered* proposal--it speaks to the desire of the faculty to do new and interesting things, to be free to develop new directions. But it does not, it seems to me, speak to the apparent desires of our current generation of students. (And, yes, I include up-scale liberal-arts colleges in this. At my own undergrad institution, the four largest majors are education, sciences for pre-med, social sciences for pre-law, and economics for pre-MBA. Harvard is not much different from that; a huge percentage of the undergrads there take course work designed to get them into professional grad schools.)
I'm somewhat less concerned about the details of how this would work, because actual models do exist. I'm concerned about how disconnected it is from the students we purport to serve.
But there's another issue that doc brings up as well--our students and their majors. How is a first-generation college kid raised on a pig farm going to tell her parents that she's majoring in "body"? When money is tight (and it's always tight), moving higher ed into the amorphous and ill-defined seems problematic.
Taylor's solution is SO privileged. It just doesn't take into account the reality for most of our students who are still focused on the practicality of their degree.
Still, I see a lot of "needs-based" programs withering on the vine both at our universities and at our more technical colleges here in Canada, as academics find they're lagging a couple of years behind whatever the next big thing is for employers and society.
If you make the stupid assumption that there is no such thing as finite resources, the overall idea is still pretty stupid. But not as stupid as it is in the real world.
I like it.
The narrowness he writes about is not the norm in the science area that I come from. In physics, collaboration - both within and beyond the department - is and was the norm. High energy physics takes this to the extreme, where the 400-member research groups of the last century look small by comparison to today's.
I think Sarah raises a good point when asking about Soc 101. Even if traditional departments did not exist anymore, there would still be content to be taught. (I am assuming, here, that there is content taught in Soc 101.) The question of how a Dean would determine if a person is qualified to teach a course in quantum mechanics or statistics or sociology without a disciplinary degree will remain even in a "program" based system. That is the core flaw in his argument.
That "international relations" professors had no knowledge of religion's role, well, that is just a failing of the general education at their undergrad schools and of the basic cognitive function of the people involved.
Missing from the piece, too, is any sense of the role disciplines play in intellectual stewardship. What the heck--reinvent everything every 7 years! The good will rise to the top! I like the idea of short-term interdisciplinary institutes focused on problems, but as a complement to disciplinary training, not as a replacement.
While Taylor was at Williams, he was just the "humanities" professor. He could get away with all kinds of weird things in his classes because he was weird.
There was also a line in the school humor magazine: "Mark C. Taylor jealous of Jay Pascachoff's ego."
Also having met the man, he's fascinating to listen to, "moves quickly from 'insightful' to 'crackpot' and back again" sums him up perfectly. He was one of the few humanities professors willing to be the technology innovator and try out new things in teaching. So he does put his money where his mouth is.
I liked the article because it elucidated all of the major issues in non-science liberal arts programs. Also, sometimes imagining the big picture and thinking big can be useful, especially when we're thinking about how to fix something as convoluted and culturally intertwined as academia.
But yes, Taylor is totally coming at this problem from a SLAC POV. Completely. A completely different animal from a CC.
I just realized that from the library collection development standpoint, his open curriculum that changed every 7 years would be a disaster under current ways of accessing information. Unless you had universally free, accessible materials, it would be impossible.
I am wondering about the possibility of changing the scale of the interdisciplinary efforts. I'm reminded of how New York's normal, or teaching, colleges usually specialized in one kind of teaching. So you went to Geneseo if you wanted to go into speech therapy or school libraries. I was thinking that many institutions could go back to that model. You would have the same fields and training, but you would know that there's a whole bunch of both professors and part-time adjuncts working on the issue of "water." Of course, the ideal would be to have these institutions be fairly small and therefore more flexible.
Many students aren't prepared to learn in school because they are hindered by so many personal issues from their home (or lack of it).
If you want to fix education, our society must work to fix the home - particularly the family.
And the sad truth is that many American families are broken, leaving children confused, rebellious and unwilling to learn.
I do wish, however, that there were some way to try to re-imagine higher education in ways that don't lead back to "the existing structures . . . survive because they address certain problems." Or lead to hideous, Orwellian schemes, such as the alternative model of instruction proposed by the chancellor in Tennessee (you've probably already discussed this . . . http://www.insidehighered.com/news/2009/02/05/tennessee).
Anyway, your critique is right on. And I'm on sabbatical, so I'm going back to dreaming up stuff before fall, when I'll be right back teaching "second semester composition," and not, sadly, "Time."
I guess...interdisciplinary projects are beneficial since they're goal-oriented, and this holds for undergrad senior theses and such.
Who's to say that "soc101" can't be taught under Taylor's vision? Maybe it would just be called "intro to Marx, Durkheim & Weber". I suppose there's a difference between real departments and abstract academic discourses. Who's to say you can't have an "intro to Marx, Durkheim & Weber in the contemporary context of Body" class?
Project-based education is nice, but done this way, students could not learn new and different things.
So I think it's not a good idea.