Friday, June 05, 2009
Assessing Professional Development
When there's a budget crunch, professional development is usually one of the first things to get cut. Part of that is because it's inherently variable anyway, unlike salaries or utilities. (In that sense, it's more like the 'snow removal' budget line than the 'salary' line. Any given winter can be more or less snowy than the one before it, so everybody understands that that particular line item is written in pencil.) And part of that is because the costs of cheaping out on professional development take a while to show, but the savings are immediate. When you're in free fall, the objection that “people might be a little less engaged five years from now” isn't terribly compelling. It's like telling a gunshot victim to quit smoking.
Since it's looking like we'll be short of funds for some time to come, I can foresee some pitched battles coming over what little professional development funding we haven't cut yet. And I can guess that those battles won't be terribly enlightening, because we haven't really figured out what makes some professional development activities better than others.
The traditional version, at least on the faculty side, goes something like this: you have x dollars to spend for the year on subscriptions, memberships, conference travel, and the like. If you get a paper accepted somewhere and go there to present it, you'll get a little more. The idea is that faculty are the experts in their respective fields, so they're likelier to know what they need professionally. Set a few basic criteria, make them show receipts, don't pay for alcohol or pay-per-view, and call it good.
That works tolerably well when money is plentiful. But when it's scarce, and you get requests totaling several multiples of what's available, “I want it because I want it and I'm the expert” doesn't work.
I recently heard someone ask what it would look like if we applied outcomes assessment to professional development. What if we somehow measured which expenditures generated the most bang for the buck, and prioritized accordingly?
I can answer that in two words: define 'bang.'
Although I've heard the phrase 'professional development' for years, I've never really heard a coherent theory behind it. In order to define 'bang,' we'd need to specify the purpose of PD. Instead, we're running on the old “I know it when I see it” model.
I'll make it concrete. Which conference would it make more sense for the college to support: the regional conference on teaching in a given field, or the regional conference of the major disciplinary organization for that same field?
I can envision arguments for either. The teaching conference is clearly more relevant to the college, since it's a teaching institution. And the disciplinary conference is clearly more relevant, since teachers need to know what they're teaching, and need to remain excited. So who wins?
The really evil social scientist in me says this can be resolved empirically. Give one group teaching-focused development, and another discipline-focused. Do that for a few years. Then compare their course completion rates, graduation rates, student evaluations, and any other measures you normally use to gauge teaching effectiveness. If one group clearly beats the other, then you have your policy.
Of course, you'd also have a political firestorm. Because at a really basic level, there's a tension between the view of faculty as employees and faculty as disciplinary ambassadors. The former suggests that PD is really another word for 'training,' and the latter suggests that it should be almost entirely self-directed. (In practice, that usually means that it's another word for 'travel.')
Locally, I've tried to delegate these decisions to a faculty committee, but they're running into the same conceptual brick wall I used to run into. The failure is more democratic, but it's still a failure.
Wise and worldly readers – have you seen a reasonably elegant and fair way to allocate scarce PD money? If so, what's it based on?
I don't know how, or if, that would work on the faculty side, but one could certainly have faculty present or write a report. It may seem silly, but it's not only a way of assessing; it's also a way of spreading the knowledge to colleagues who couldn't attend.
I have no brilliant suggestion, but I want to complicate this by saying that sometimes what a conference does is just reconnect you to things you care about. So it may be that the "bang" for some faculty is in renewed excitement about their subject.
There are ways to plan PD programs that would cost practically no cash, predicated upon faculty members being able to spend a couple of hours a week. I do understand, very well, the energy and emotional boost one gets from meeting people face to face. What I'm claiming is that if we can't do that because of a lack of funds, there are some reasonable, if less adrenaline-stimulating, alternatives. This blog is an example, both for DD and his readers.
Speaking to travel specifically, we have a competitive small grant program in which faculty propose small PD projects, and we are always flooded with requests for conferences. Two problems: (1) quite a few people end up not going, for one reason or another, so the money is wasted; and (2) nobody has any idea what impact the funding has, because so far we have not asked what people do with the information they get, either in the application or in our required (pitifully minimal) reports. I agree with Laura: anyone who receives money ought to "pay it forward" in the form of educating everyone else.