Tuesday, September 30, 2014

A Question for the Statisticians


In a recent exchange with one of my favorite higher education peeps, I realized that I’ve been stubbing my toe on the same issue for years now.  I’m hoping that some very smart readers have found an elegant way around it, the better to spare my theoretical feet.

Let’s say that you’re trying, for whatever reason, to measure the average earnings of graduates from a particular college or set of colleges.  And let’s pretend that actually getting the data is relatively easy, just for the sake of argument.  (“In theory, there’s data.”)  

What do you do with the students who transfer to the next level of education, instead of getting a job?  

For community colleges, the issue is students who move on to four-year colleges.  Measure their earnings a year after graduation and they’ll look markedly low, but that’s an artifact of timing.  Count those students as they are, and you get a misleadingly low average; exclude them, and you essentially exclude the highest-achieving echelon of graduates.  We have grads who have gone on to medical school and done quite well for themselves, but they don’t show up in our numbers because the gap between finishing community college and making big money as a physician is too long.  

Four-year colleges face a related issue.  How do you count the students who went on to law school, med school, or, heaven help us, grad school?  In the first year out, or even the second, they’re making peanuts.  But they’re on a track that historically has done quite well, other than certain flavors of grad school.  The future cardiologist will do fine, eventually, but by then it’ll be too late for your numbers.

Again, factoring them out means discarding the cream of the crop, which hardly seems fair.  But leaving them in builds in a systematic downward distortion.
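To make the distortion concrete, here’s a minimal sketch with entirely invented numbers: one hypothetical graduating cohort in which a slice of the class transfers on to further education, earns almost nothing in year one, and out-earns everyone else down the road.  Neither of the obvious choices, counting them at their year-one wages or dropping them entirely, recovers the long-run picture.

```python
# Toy illustration with invented numbers -- not real earnings data.
# One hypothetical cohort: "workforce" grads take jobs right away;
# "transfer" grads go on to further education, earn little in year one,
# but earn the most twenty years later.

workforce = [{"year1": 32000, "year20": 48000} for _ in range(80)]
transfer = [{"year1": 4000, "year20": 95000} for _ in range(20)]
cohort = workforce + transfer

def average(values):
    return sum(values) / len(values)

# Option 1: count transfer students at their (artificially low) year-one wages.
include_all = average([g["year1"] for g in cohort])

# Option 2: drop transfer students entirely -- i.e., drop the eventual top earners.
exclude_transfers = average([g["year1"] for g in workforce])

# The long-run picture the one-year snapshot is supposed to stand in for.
long_run = average([g["year20"] for g in cohort])

print(f"Year-1 average, transfers counted:  ${include_all:,.0f}")
print(f"Year-1 average, transfers excluded: ${exclude_transfers:,.0f}")
print(f"Year-20 average, everyone counted:  ${long_run:,.0f}")
```

With these made-up figures, counting the transfers drags the year-one average below even the workforce-only number, while excluding them throws away exactly the grads who pull the twenty-year average up.  Either way, the one-year snapshot misrepresents the long run.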

Many of these issues go away if you assume a long enough timeframe.  Look at earnings twenty years out, and you get a much clearer picture.  But if the point is to hold colleges’ feet to the fire, using data from twenty years ago isn’t likely to work.  The lag is too long.

I could understand the “just don’t count them” solution for, say, grads who drop out of the workforce to have children.  They’re playing a different game.  It isn’t necessarily as clean as that, of course; dropping out may be more appealing if your job is low-paying and dead-end.  But for the sake of simplicity, I could understand assuming choice.  

Graduates who join the military may also bring low numbers initially, which I would argue are similarly misleading.  

For the sake of simplicity, let’s leave out students who “swirl” among several different institutions.  If a student transfers three times before graduating, which college should get credit for the earnings?  

It’s tempting to just throw up one’s hands and declare the entire enterprise a fool’s errand, but politically, that’s a non-starter.  If there’s going to be “accountability,” how should we count?