Wednesday, August 26, 2009
Edupunks and Credit Hours: Fumbling Towards a Theory
An alert reader sent me (last week) this article about how the open-content movement will explode higher education as we know it. Apparently, newly empowered 'edupunks' will cobble together their own educations, thumbing their noses at The Man while reaping the lucrative rewards of globalization.
The article has more than a whiff of the self-congratulatory vanguard about it, but look past that.
The part of it that has stuck in my craw for the last week -- in the way that ideas with kernels of unpleasant truth tend to do -- is the distinction between course credit and demonstrated competence.
It's certainly true that people can learn important things outside of credit-bearing classes. And in some parts of the curriculum, even the stodgier colleges have long had provisions for students to "test out" of individual courses. The idea there is that there's little point in marching you through a course when you've already mastered its key content.
While that model isn't new, it has historically been confined to the margins. An AP course here, a CLEP there, but you still have to take enough credits to graduate. (Many colleges have 'residency' requirements that even limit the number of credits you can transfer in.) This article suggests that it's becoming easier to build your own modular degree through a program of sustained self-teaching and exams.
A few years ago I did a thought piece envisioning the rise of Efficient Degree Organizations that would act as something between aggregators and sherpas, helping students put together degree programs on their own terms. The idea there was that with online courses breaking the tyranny of geography, it made a certain amount of sense for students to start acting like internet shoppers.
Now I'm wondering if the 'credit hour' might be the line of attack.
Credit hours are bureaucratic constructs that have little to do with teaching. They're ways of breaking curricula into component parts, the better to allow for transfer, substitution, and the like. (In most states, they've also become tied to various funding formulae. We measure our enrollment both in terms of headcount -- that is, people -- and FTE's, which are denominated in credit hours.) They make inevitable a cost spiral that far outpaces inflation, since you can't increase productivity when your units are measured in time. (As the rest of the economy becomes more productive per hour and teaching doesn't, teaching becomes relatively more expensive.)
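To make the headcount-versus-FTE distinction concrete, here's a minimal sketch of the arithmetic. The 15-credit full-time load is an assumption for illustration only; actual funding formulas vary by state and system.

```python
# Hypothetical illustration of headcount vs. FTE enrollment.
# Assumes a 15-credit semester load counts as one full-time
# equivalent; real state funding formulas differ.

FULL_TIME_LOAD = 15  # credit hours per semester (assumed)

def fte(credit_loads):
    """Total student credit hours divided by the full-time load."""
    return sum(credit_loads) / FULL_TIME_LOAD

# Five students: two full-time, three part-time.
loads = [15, 15, 9, 6, 3]
print(len(loads))   # headcount: 5
print(fte(loads))   # FTE: 3.2
```

The gap between the two numbers is why a college full of part-time students can look busy by headcount and underfunded by FTE.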
Awarding some sort of recognition for task completion or demonstrated competence independent of the time it took to achieve that offers one potential way to break the upward spiral. If you manage to blast through calculus in eight weeks instead of fifteen, more power to you.
That said, though, I could easily envision the abandonment of the credit hour as relatively beneficial to those already on top -- in four years at my SLAC, I never heard the phrase 'credit hour' -- and devastating to the rest.
For students who don't already have considerable cultural capital when they walk in the door, the 'set' curriculum with semesters and credits offers a clear path. It makes the route to achievement legible, even if daunting. It defines a normative amount of time for a course of study (the "two-year degree"). And it allows faculty to push students into courses they might not choose for themselves, based on a sense of educational good. (If I had a nickel for every student at Proprietary U who asked "why do I have to take this?" I'd be a wealthy man.) Yes, business majors need to take English, and yes, many of them would avoid it given the choice.
To the extent that we move from "here's what you need to do" to "what do you want?," we both enable high achievers to cut loose -- a clear good -- and allow the less savvy to wander aimlessly, which is a real problem.
The "edupunk," as near as I can tell, is the nifty-sounding update of the autodidact. And as with the autodidact, the edupunk is susceptible to some predictable shortfalls: uncorrected blind spots, lack of broader perspective, too-early path dependence.
If colleges are going to continue to earn their keep, they'll need to address the very real economic issue of the credit hour, without forfeiting the real value created by making courses of study -- as opposed to individual courses -- legible. That means not giving up on 'general education,' no matter how much some students bitch about it. It also means getting out in front of a competence-driven currency, lest it leave us behind. It probably means making convincing arguments to the effect that an education is more than the sum of its parts. (Hint: the social and extracurricular aspects are not to be discounted.)
As disconcerting as some of that is, I'd hate to see colleges go the way of newspapers. When the mode of production changes, typically, the leading producers change, too. The mode of production of education has to change, and now, can. We'll need to come to grips with that in some sort of serious way, or others will, edupunks or not.
Everyone thought it might be a way to create flexibility but in fact it wound up creating sclerosis. Instead of seeing the 5 or 6 week exit as a reward for hard work and competence, some people tended to see it as a frustrating carrot they could never grasp, and when the weeks had passed without the work being finished, they blamed me for slowing them down with my silly insistence on "quality."
Others saw the two semester maximum not as a maximum but as a procrastinator's dream: two semesters to complete what their peers had to do in one! Kool!
The blame for a lot of the unintended consequences is mine for not being more organized, demanding, inspirational, hardassed, whatever. But that's my genuine cc experience with a different approach to hours.
In the visual arts, where I teach, credit hours (or even a degree from our elite institution) are worth next to zilch. Your portfolio gets you the opportunity, and the unfortunate truth is that many who come into the programme are guaranteed successes from day one, and others are guaranteed non-successes from day one (in that field), and no amount of hours will change the raw material of cultural capital.
If you are educating for a job with much more hierarchical and specific requirements, then for sure, knowing that someone has spent a certain amount of time on something is a useful heuristic in employment.
The credit hour is a poor fit in many cases, but as in everything in the academic game, it seems to me the usual answer is, as you so often remind us, "it depends".
The Dean's Scholars, on the other hand, was a special program for students in the top 1% of school leavers; they could basically take any courses they liked from the entire university, in any order they liked. So what did they do? The vast majority of them picked a degree they liked and followed the structure of that degree very closely. Most students like structure, and most students really don't understand what they need to study. I know I didn't when I first started uni, even though I had a pretty clear idea where I wanted to end up. These roll-your-own degrees are maybe suitable for a very small percentage of students.
But think of chemistry and physics and geology and psychology and other subjects in which a laboratory experience is essential to learning the material at any level of competence. That experience does not translate out of an organized and relatively structured environment.
(I suspect, more generally, what are usually called "professional" degrees are fairly highly subject to this phenomenon. For nursing, for example, the point is obvious. These days, getting a business degree without a fairly large amount of structured team work is all but impossible. And so on.)
We were also encouraged to take the various CLEP and DANTES tests to earn college "credits". My transcript includes about 50 credits from these sources in a variety of subjects.
There are two college tracks -- the "piece of paper" track and the "I actually learned something" track. Any system that can contain both will be clunky for either.
"Suddenly, it is possible to imagine a new model of education using online resources to serve more students, more cheaply than ever before."
"The Internet disrupts any industry whose core product can be reduced to ones and zeros," says Jose Ferreira, founder and CEO of education startup Knewton.
Online resources are more efficient economically (=more units/price)? Really? Isn't this the whole "take the course online to increase students without increasing cost" line that has been proven false again and again? And the idea that education can be reduced to ones and zeros, i.e. that learning=moving data from teacher to student, well I hardly need to respond to that.
Wake me up when this approach to education produces anything other than a minimally competent ditch digger.
The education centers would be a good fit for teachers who tended towards generalism rather than specialization and enjoyed face to face teaching. The world authority on the translation of Elamite, who might or might not be the most charismatic teacher in the world, could offer online courses, available worldwide, for aspiring Elamite translators. (You wouldn't have to move to Munich or London or whatever to study with him/her.) Teachers who enjoyed face to face teaching and were also world authorities on something or other, could split their time between duties.
Thomas Edison State, Excelsior College, and Charter Oak State are three regionally accredited institutions that already do this. No residency requirements at all -- just aggregate the necessary hours in the correct buckets and out pops a degree. These are obviously most popular with adult students, but I wouldn't be shocked if they catch on with younger crowds (in particular, the home-schooled).
This reminds me of Evergreen State College in WA, which has no majors, only self-developed programs of study.
I've known a few people who went there, mostly in the 90s, and seen exactly that phenomenon.
It's not even so much a high-achiever issue as it is an internal sense of direction. If you're self-directed, it can be a fantastic experience. If you're aimless -- or interested in lots of different things -- you can go for years and not end up with anything coherent.
Anyone who has been involved in prior learning assessment or course challenge knows that there are many students whose estimation of what they already know -- and what credit they should get for it -- is inflated, to put it kindly. This article reeks of the same self-congratulatory over-estimation. Plus which, if what is being taught or expected in the workplace or in the world were already right, or the best way to do things, then we wouldn't need college education at all.
This isn't to say that there aren't students who have acquired skills outside of college and who might need a more flexible format to complete their programs. But the model that colleges use to measure and deliver education didn't just spring overnight from the brain of some bureaucracy-obsessed administrator, and it evolves as society and students evolve. To imply that it's completely detached from the real world is simply wrong.