Thursday, August 23, 2007

Ranking CC's

An alert reader sent me a link to the Washington Monthly's rating of the top 30 community colleges in America.

Ranking schemes like this invariably lead to a two-track response; I know they're flawed, but I want to do well on them, anyway. Every year I check the U.S. News rankings, just to make sure that my alma mater still outranks its hated rival (which it does). The folks who do rankings like these are banking on exactly that kind of response, in much the same way that the folks who make a living tracking, say, Lindsay Lohan's every move know that we think less of ourselves for caring, but care we do, despite ourselves.

(Aside: do regular people ever 'suffer from exhaustion'? What does that even mean? Is it like neurasthenia? The vapors? Angst? Or is it just a euphemism for 'detox'?)

In this case, part of the point of the enterprise seems to be a sort of 'shaming.' In an explanatory essay, the author uses the relative successes of the higher-ranking cc's to argue that four-year schools should accomplish much more than they do, given how much more money they typically have.

Although the magazine doesn't disclose its algorithm, it apparently uses inputs taken from a national survey of student engagement; the folks who developed the survey explicitly reject its use in a comparative context. And it looks like they're right – a quick glance at the table shows that every single one of the top 20 cc's nationally, by this chart, has an enrollment below 3000, and most are below 2000. So we're left with the shocking – shocking, I say – finding that smaller settings are often more close-knit.

Well, yes. And bears crap in the woods. This is a completely useless finding if your cc happens to be bigger than that.

I'm a little alarmed that the number 3 cc – Southern University at Shreveport – lists a graduation rate of 17 percent. (And since when do cc's get to be universities, anyway?) By the magazine's methodology, it outranks Wisconsin Indianhead Technical College, which lists a graduation rate of 54 percent. (The highest ranking large cc, South Texas College, checks in at number 21 with a graduation rate of 10 percent. 10 percent! Valencia cc comes in lower, despite a graduation rate three and a half times higher. But then, it is bigger.)

It's also hard not to notice a certain regional skew – only one of the top thirty is in the Northeast, with heavy representation of Texas and Florida. Given the clear preference for smaller schools, that may just reflect Northeastern population density. It may also reflect higher Northeastern tuitions, to the extent that 'bang for the buck' was being measured. It's hard to say, given the undisclosed algorithm.

Any national ranking of local institutions will be suspect. In some states, community colleges are largely self-governing, and their service areas consist of one or two counties. Some states have statewide systems. Some states have parallel statewide systems – 'community' colleges on one side (focusing largely on transfer), and 'technical' colleges on the other (focusing largely on employment). Some cc's try to be 'comprehensive,' meaning doing both the transfer and the technical functions.

To my mind, a good cc is well-suited to its service area. In some areas, that will mean a heavier transfer focus; in others, a clear career orientation. It will produce a high number of successful student outcomes, whether that's defined as graduation, early transfer, employment in a relevant field, or certification. A measure like “GPA at subsequent college” would tell me a lot more than would any of the measures used here. If the facts on the ground show that our grads do well at their destination schools, and we don't do it by having a five percent graduation rate, then I say we're doing pretty well. If our grads crash and burn at their destination schools, then we have serious work to do. I wouldn't use “Active and Collaborative Learning” as a measure, since that's a method (or an input, if you prefer). (And I wouldn't use student self-reported scores of “Student Effort” for anything.) I'd look at outcomes. Did the cc foster a whole lot of successful outcomes, or not?

That would be astonishingly hard to do across a national sample, but it would actually lead to useful information. Telling me that teeny-tiny schools foster more close interaction doesn't give me anything I can use. (There's no way in hell the local taxpayers would pony up to break us up into several smaller campuses across the same service area. Would. Not. Happen.) Isolating the schools that tend to foster the most successful outcomes, and then looking for common denominators among them, might actually reveal something useful.

More likely, the comparisons would have to be among demographically-similar areas. Telling me that a rural Texas cc with a largely homogeneous population of 400 students has higher levels of student engagement than mine really doesn't help me. That's a different world. But are there colleges in locations relatively comparable to mine that are doing a better job of helping their students succeed? If so, I want to know how they're doing it.

I concede upfront that my cc has areas for improvement, and that there are almost certainly some better ideas floating around out there that we could usefully steal. But this survey gives me nothing I can use. It might be fun for starting barfights at a League for Innovation conference, but there's nothing in it that helps me improve my college. It doesn't even have the 'guilty pleasure' appeal of yet another article about a Pop Tart on the rocks. Thanks, but no thanks.