Wednesday, June 15, 2016


Program Reviews On Stage

What if you had to present your department’s five-year program review to the President and the cabinet orally?

We tried that this week.  It was mostly wonderful.

Most colleges have some sort of regular program review cycle, in which departments or programs (or clusters, or…) do some sort of self-examination every x years.  They usually combine some standard questions asked of every program -- enrollment figures, say -- with judgments by the members of the departments.  In some cases -- and I intend to mandate this starting next year -- they need to have at least one person from off campus provide feedback, as well.  The idea there is to get around the problem of unconscious insularity.

Because program reviews are typically done by the members of the program, they tend as a genre to fall prey to certain clichés.  Having seen enough of them over the years, I’ve learned to expect most reviews to include some or all of the usual ones.

These more or less flow inevitably from the idea of self-evaluation.

The black hole into which program reviews are assumed to fall also tends to reward a swing-for-the-cheap-seats style.  If you don’t think anyone will read it anyway, the thinking goes, you might as well at least get some catharsis.  At a previous college, someone once appended a new cover page to a review done ten years prior; I guess he thought I wouldn’t look.  I was torn between admiring the panache and wanting to tell him what to do with his cover page.  (I sent it back and let him know that I noticed.  If nothing else, he learned that the black hole was a myth.)

But having an in-person moment in front of the President, the vice presidents, and the deans changes the dynamics.

At that point, fraud or catharsis becomes ridiculous.  There’s much less room for unbridled narcissism.  It’s possible to make constructive suggestions, but coming across like a comments section on a blog post about gun control will just embarrass you.  It’s impossible to deny that you’re being heard when all those eyes are staring right at you.

The presentations were strong, which wasn’t a surprise; these folks were hired for their ability to teach.  The Q-and-A sessions after each one were particularly good, because they got people off of their usual talking points.  I was especially glad that the people from other parts of the college got a chance to see academics do what they do well; I live in that world, but most of my cabinet colleagues don’t.  That’s not a shot at anybody -- they have complicated jobs in their own right -- but it was useful to shed some light on a very different way of thinking.

Predictably, the prescribed time limits fell apart.  As with academic conferences, the idea that everyone will stick to their allotted time fails so often that I wonder why we keep assuming that the next time will be different.  It’s an occupational hazard.  

Still, even allowing for some clock issues, the discussion was more focused, honest, and constructive than any I’ve seen in the old “just hand in the report” format.  It gave me hope.  And it gave my administrative colleagues some useful insight into my world.  I only wish someone had given me the idea five or ten years earlier.

Presenting to the college cabinet is an excellent idea. (Writing as someone who helped put ours together last year, with serious doubts about whether anyone beyond the head of the division read any of it.) It would also surprise me if the division head read all of it, because we got no feedback on some of the remarkable things we turned up -- some of which would have caught the ears and eyes of VPs from those "other" parts of the college. For example: web class success rates in our division appear to be inflated by students who never intended to take any tests, suggesting they enrolled merely to collect financial aid. That data came from the faculty, because it isn't in ANY college database.

I also endorse the idea of getting feedback from outside. That can come from employers of your grads in some AS or workforce programs, or from colleges that get a lot of your transfers. At least we manage to get good data about our transfers, but those weren't quite as detailed as we would like, because they don't separate early transfers (before all prereqs are complete) from on-time transfers (all frosh- and soph-level classes complete) who are ready to take classes in the major.

At our CC, program faculty do not generate program review documents; a faculty committee reviews relevant program data for all programs over a five-year cycle and makes recommendations for improvement as needed. This peer review process avoids the self-evaluation problems DD has observed. I think it works well.
"At a previous college, someone once appended a new cover page to a review done ten years prior; I guess he thought I wouldn’t look."

To me, that says as much about the process as it does about that colleague. If the faculty feel the review is not valuable, they're not going to take it seriously. Maybe the usefulness of the process to the faculty is something that needs to be addressed.