Tuesday, November 20, 2007
My cc is taking a new look at guidelines for 'advisory boards' for various 'occupational' degree programs.
(A quick definition: an 'occupational' degree refers to one designed primarily to make students employable in a given field upon graduation. Its counterpart is the 'transfer' degree, which is intended to be the first half of a bachelor's degree. Transfer degrees typically include much more 'gen ed,' and their intended audience is four-year colleges and universities, rather than employers.)
The distinction between 'occupational' and 'transfer' isn't always clean. We have several 'occupational' degrees that have, for various reasons, become transfer degrees. (I've never seen it go the other way.) Sometimes that's because of credential creep in the target industry; where a two-year degree was once enough to get a decent entry-level position, now a four-year degree is the de facto minimum. (Sometimes that's driven by employer preference, and sometimes by external legal changes.) Sometimes it's because of increased technological complexity in a given field. Sometimes the field has changed, such that even an entry-level hire is now expected to have the kind of range that would not have been expected at the entry level a generation ago. And sometimes it's driven by some enterprising four-year schools that have established degrees where none existed before, and have created their own demand.
Still, just by looking at the paths our graduates take, we can get a pretty clear idea of which programs are currently (mostly) occupational. For us, for example, Criminal Justice is largely an occupational program; most of our grads go directly into law enforcement, even if many of them later go on to finish four-year degrees while on the job. Nursing is similar.
In my observation, advisory boards tend to go through a distinct life cycle. There's the exciting initial stage, in which folks are happy to be on board, ideas are brimming, and the world is about to be changed. This slowly gives way to the bubble stage, in which everything is assumed to be fine, and the meetings are more about group bonding than substance. (Typically, this is when the dreaded 'loss of touch with reality' sets in.) Eventually, as the irrelevance of the board starts looking like a given, the 'let's skip the next meeting' stage sets in, followed eventually by the 'didn't we used to have an advisory board?' stage.
It's not unusual to see program chairs select personal friends as advisory board members, since they're likelier to say 'yes' to sacrificing the occasional evening, and unlikely to do anything threatening. This also makes the segue from stage one to stage two clean and effortless. Unfortunately, it also pretty much guarantees stages three and four.
As with making movies, casting is ninety percent of the battle. If you get people from too high on the food chain, they're often out of touch with what the front-line hiring folk actually do. (Back in the '80s, I recall hearing lots of CEOs say that the skills developed by a liberal arts education were exactly the skills the managers of the future would need. Apparently, the only people who didn't get that message were the hiring managers.) But if you aim too low, you get constant turnover and a need to reinvent the wheel on an annual basis.
There's also the danger of corporate (or employer) myopia. Back in my Proprietary U days, I saw many a corporate muckety-muck tell us with unshakable confidence just exactly what the key to business success was, less than a year before his company went out of business.
All of that said, we're looking at basic guidelines for the composition of advisory boards. I'm thinking that fewer than half the board members should have any other existing connection with the college, and that multiple employers should be represented. Beyond that, though, I'm short on ideas.
So I turn to my wise and worldly readers. Have you seen a good rule of thumb for assembling a useful and effective advisory board?