Thursday, September 26, 2013

The Power of the List



Now Yahoo is ranking community colleges?

Ugh.  Another list.

I’m not sure who the intended audience is.  Most community college students don’t choose from among colleges across the country; they choose locally.  In many areas, that means only one place to go; in most, no more than two or three.  Online degrees have loosened the ties to geography somewhat, but most community colleges still charge a premium for out-of-state (or, in some states, out-of-county) students.  Geography isn’t dead.  Knowing that a college in Washington got a great ranking doesn’t help a prospective student in Massachusetts very much.

Of course, there’s no shortage of rankings out there in internet land.  If you’re inclined, for some reason, you could look at national or state rankings, privately or publicly generated, each with its own set of criteria and, therefore, its own implied agenda.  Predictably, the lists generate far more attention externally than their criteria do.

Last week I was at a meeting of my counterparts from around the state.  We had a few guest speakers, the first of which was a team from a think tank preparing yet another scorecard for community colleges.  I think they expected constructive suggestions, or, at least, relatively silent fuming.  Instead they got their heads handed to them, with a cascade of comments along the lines of “do you think we don’t know that?” and “we know perfectly well how these will be read.”  I was reminded of the old saw that nobody ever erected a statue of a critic.

Imagine what would happen on campus if an administration were to rank every single professor against every other one, and then publish those rankings in the local paper and on the college website.  How would that go over?

The larger the scope of the list, the worse the errors.  Sara Goldrick-Rab made a welcome course correction this week when she realized that President Obama’s plan to rank colleges isn’t likely to lead anywhere good.  She’s right, though the reasons go beyond the ones she gives.  Community colleges in states with relatively robust four-year sectors, generally speaking, have lower graduation rates than do community colleges in states with sparse four-year sectors.  That tells you something about context, but nothing about institutional performance.  Even intrastate comparisons can be pretty misleading.  Genesee Community College, in Batavia, New York, faces a very different context than LaGuardia Community College, in New York City.  (Batavia is halfway between Rochester and Buffalo, near where I grew up.  It’s slightly farther from New York City than Richmond, Virginia is.)  The contexts are so different that the kinds of measures that lend themselves to annual budgeting just become silly.

Yet, for all their clear limitations, lists seem to be becoming more popular and more powerful.

I understand the seductive appeal of false precision.  Magazines move copies by boldly declaring that they’ve cracked the 7 steps to better sex or the 10 foods that will give you superpowers.  (In the case of U.S. News, the “best colleges” list outlasted the magazine itself.)  I always smile when the referees bring out “the chains” in football games to determine whether a team made a first down.  They plunk down the first pole by eyeballing it, then get painfully precise with the second.  The second can only ever be as accurate as the first, but the chains make an underlying arbitrariness seem rigorous and concrete.

But in those contexts, the silliness fits.  In the context of public higher education, the silliness could have real and lasting consequences.

I won’t be judging my college’s performance based on a Yahoo story.  In fact, I’d be concerned about anyone who would.  But that won’t stop the next yahoo from trying.