Wednesday, September 05, 2012
Don’t Forget Self-Interest...
Amy Laitinen, of the New America Foundation, issued a must-read report this week that provides some excellent context. Among other things, it reveals that the initial impetus for the credit hour as we know it came from Andrew Carnegie trying to find a basis for faculty pensions at Cornell, where he was a trustee. Credit hours were initially used to equate different high schools, but they quickly became the coin of the realm in higher education, even though they were never tied to student learning.
Dissent came early; Laitinen mentions serious misgivings about the overuse of the credit hour as early as the 1930s. But it solved several bureaucratic problems, and has since become, by default, the way that colleges denote work.
Laitinen notes, correctly, that the absence of content in the credit hour is made clear when one college won’t take transfer credits from another. But this is where I have to offer a friendly amendment.
Yes, it’s true that a three-credit class at college A may well have different outcomes than a similarly titled three-credit class at college B. But that’s not the only reason that transfer credits get denied.
Most of us at community colleges have been through this dance a few times. Credit hours don’t only count what students have taken; they also denote what professors have taught. Credit hours are the basis on which FTEs are calculated, which can have a direct impact on state funding. They’re how individual teaching loads are calculated, and over time, they’re part of how departmental staffing allocation decisions are made.
Which is to say, a department at a receiving school that “gives away too many credits,” as I’ve had it said to my face, potentially hurts its own claim on resources. Being too generous on transfer credits can cost a department jobs.
There’s a standard playbook for departments that want to deny transfer credits. One way is to fudge the distinction between 200- and 300-level courses. That way, a department can deny credit for transferred 200-level classes by claiming that they’re really at the 300 level. It can play with prerequisites, require idiosyncratic sub-sequences, or change the number of credits that a given course carries. Or it can just assign anything threatening to “free elective” status, which is where credits go to die.
Laitinen notes, correctly, that there’s some theoretical room to move in the definition of the credit hour, but that recent clampdowns in financial aid have made colleges wary of trying anything. (The reaction to abuses of financial aid in the for-profit sector has had a severe chilling effect among community colleges. Ironically, handcuffing community colleges actually strengthens the for-profits. You’d think someone would figure that out...) But there are also very real issues of self-interest at every level. I would be shocked to see faculty stand idly by while students were awarded non-trivial amounts of academic credit for learning in venues where the faculty did not teach; the faculty would see that as a direct threat to their continued employment. (That’s lurking below the surface of much of the discussion of MOOCs, for example.)
A reform that would actually take hold has to do more than overcome the flaws of the current system. It would also have to address, in some meaningful way, the self-interest of the various actors in place. Some people would inevitably lose something, of course, but a reform that isn’t in any of the incumbents’ interests will fall prey to interest-group politicking.
Still, kudos to Laitinen and the New America Foundation for doing the homework to explain how we got where we are. It clearly isn’t where we should be, and I suspect that the longer we cling to where we are, the more vulnerable to external disruption we’ll be. But the crafting of alternatives, if it’s going to work, will have to take seriously the less-exalted motives of the current actors. Without that, well, read the quote from 1938.