This story in IHE came along at the right moment. My state is apparently considering an early retirement program for certain kinds of public employees, which may wind up including many of the people who work at the college. Naturally, the rumor mill is aflutter. (I'm not sure if mills can flutter, but you get the idea.)
It's a strange time to try to purge staff, since we have more students than we can handle already. Reducing our staffing will only make it that much worse. But when costs are meaningfully separate from revenues, which they still are for us, it's not surprising that we'd be pulled in contradictory directions.
I'm conflicted. On the one hand, I've made no secret of my belief that part of the reason for the terrible job market for new Ph.D.'s is the lack of turnover in tenured positions. Assuming that at least some of the retirees would have to be replaced -- and in a time of record enrollments, that strikes me as a reasonable assumption -- we could cut costs and hire new faculty at the same time. Yes, some positions would probably be adjuncted-out for the usual reasons, but some would have to be replaced. If the pot were sweetened enough for some who feel ready to move on to the next phase, then we could hire some of the newest grads. And given the degree to which it's an employer's market these days, we could pick up some really amazing people. There's a real appeal to that.
On the other hand, though, these incentives are awfully blunt instruments, and they have weird side effects.
In a collective bargaining environment, the incentives would almost certainly have to be offered across the board to anyone who fits a set of bright-line criteria. (Years of service and minimum ages are the usual defaults.) There are good and fair reasons for that, but it also means a dangerous likelihood of the stars leaving and the, um, lesser stars staying. From a student perspective, this is not a happy outcome. (Yes, even at a teaching institution we have stars and lesser stars.)
Retirements can also be spotty by department, which can lead to some abrupt and very annoying staffing imbalances. With enough lead time and the flexibility to backfill as needed, of course, that can mean new openings. But if the lead time is too short, or backfilling is verboten, then we could easily wind up with, say, a top-heavy English department and a completely vacated math department. Not good.
Over time, too, early retirement incentives can become an expectation, and even a sort-of entitlement. I have had faculty tell me, to my face and in all apparent seriousness, that they're only still around because they're waiting for the next incentive. Once that expectation is out there, you live with it for a long time. A quick burst will be followed by a long lull. If the lull lasts into the next recession, then we're right back where we started, which is exactly what some people are counting on. It's a sort of perpetual motion machine that feeds on money, and that money has to come from somewhere.
Ethically, I'm a little uneasy with the idea that we have to exploit the daylights out of the young so we can pay off the old to stop working. Something about that just seems wrong. If the effects are generally positive, I'm willing to put my misgivings aside, but there they are.
If it were up to me, we'd have something closer to the Danish system. We'd have a generous welfare state combined with a relatively fluid, performance-based employment model. Instead, we have a winner-take-all system in which the immovably employed have to be bought off to create opportunity for the desperately underemployed. But you play the hand you're dealt.
Wise and worldly readers, have you seen an early retirement incentive scheme done especially well? If you have, what made the difference?
Friday, February 26, 2010
Thursday, February 25, 2010
Ask the Administrator: Pinch Hitting
A brave-or-foolhardy correspondent writes:
I am a junior, recently tenured faculty member at Average Community College. My mentor is having serious health problems, compounded by an incredibly difficult class full of student-athletes who are continuously disruptive discipline problems. The situation has gotten so bad that we are switching instructors mid-semester, and I will be taking over his hell class on Monday morning. I have two questions. First, do you have any administrative advice on how to handle a mid-semester instructor change? Second, any insights on how to engage a class like that?
So far, my plan is to retain as much of his class design as possible while making a few changes to the syllabus regarding disruptive behavior in class, and to try to change the tone a little by example. To make sure everyone understands the new rules, they will need to pass a syllabus quiz and sign a paper indicating that they understand and will follow them. For the athletes, I'm going to try engaging them by pulling lecture examples from their experience, involving their coaches in a constructive dialogue on how we can all succeed cooperatively, and enforcing the class rules and policies respectfully but mercilessly. Am I on the right track?
I haven't actually done that myself, though I've seen it done several times, usually for medical reasons. To the extent that you explain the switch to the students, I'd emphasize the medical angle. If they sense that they broke the previous instructor, it will simply embolden the worst of them.
Experience tells me that the students will seize upon the interruption as an excuse for just about anything that doesn't go their way. This will be particularly true if the instructor change is accompanied by a syllabus change. They can argue, with some warrant, that the syllabus was their contract for the class, and that it's unreasonable to change both the instructor and the syllabus at this point in the semester. The local administration may defer to this position to a greater degree than you'd expect, since there's some real legal strength to it.
My first piece of advice is to simply incorporate whatever grades have been given thus far. Don't undermine the authority of the original instructor, even in absentia; it will merely feed the fire. Any perceptible daylight between the two of you will become a cudgel used against both of you. Make it as seamless as possible. If you can keep the original syllabus entirely, all the better.
Changing the tone by example is a good start, but as I mentioned a few days ago, leading by example is often too subtle. Don't just do it; tell them what you're doing, while you're doing it. Make it clear, and repeat as needed.
I'd also be in close discussion with your dean or chair, depending on your system, to let her know what's going on. Make sure that you're listed as the new instructor of record. (That may involve notifying the union, as well.) Whatever you do, don't try to do this below the radar. When students complain -- and they will -- you don't want your dean's first reaction to be "what?" From this side of the desk, I'll just say we don't like surprises like that. At all.
Assuming that everybody is in the know, the other task will be to set expectations. The class has already, for all intents and purposes, failed. You're engaging in a salvage operation. You shouldn't present it to the students that way, of course, but that's essentially what it is. I'd be shocked if you got to the end of the semester with a bunch of happy campers. If you have the kind of administration that makes snap judgments based on student evaluations, you'd better make damn sure you get ahead of the story.
There's also the question of the class you're giving to your mentor. If he's still performing well and this group just happens to be awful, it's probably not a huge issue. But if he's starting to slip, you could be turning one problem class into two.
In terms of actual classroom techniques, I'll defer to those among my wise and worldly readers who've actually been the pinch hitter. What worked? Is there anything specific to embrace or avoid?
Good luck!
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Wednesday, February 24, 2010
Privacy and Diversity
Every so often, I'll hear some longtime employees complain that the newer cohort (of which they still consider me a part) doesn't care about the college like they did. The last time I heard this line, I asked what they meant; what made them think we didn't care? I wasn't expecting the answer I got:
"Everybody used to go out drinking together after work. Nobody does that anymore."
Well, okay. It's true that we don't now; it may be true that they did once. But what the hell does that have to do with dedication to the college?
"We used to make our entire lives revolve around the college. We worked together, played together, drank together, and gave it everything we had. Your generation doesn't do that. It doesn't care as much."
Hmm.
I think this is one of those "is it a rabbit or a duck?" moments. It's true that mandatory face time at the bar has gone the way of the typing pool. But I'd be hard pressed to call that a bad thing. Even leaving the whole drinking-and-driving thing out of it, there's something basically imbalanced about making your workplace your entire life. Workplaces change, they come and go, and they aren't really about individual people. All that forced togetherness can become coercive, and can force out or keep out people with different priorities. And the effect on family life can be devastating, judging by the divorce rates of the 1970's and 1980's.
That experience, I think, explains why I've come to a very different view of the relationship between privacy and diversity than the one I was taught in grad school.
In grad school, I was taught that the public/private distinction was a tool of patriarchal oppression, used to relegate women's concerns to the margins. I was taught that making the personal political was a necessary step towards egalitarian empowerment, and that 'problematizing' the public/private distinction pretty much wherever we saw it was the right thing to do.
And that view probably made sense in the context from which it arose. Certainly, "women's issues" (!) were often marginalized, and "privacy" covered a great many sins. And if you spend fourteen hours a day of face time at work, with coworkers at bars, and doing mandatory face time with coworkers at various events, then the private side could get pretty badly neglected. No argument there.
But there's also something to be said for being able to leave work at a reasonable hour and go home.
Part of respecting diversity is allowing people the time and space to lead different lives. Some people go home to their partners and children. Some go home just to partners. Some go home to heaven knows what. And that's okay. In fact, I'd argue that liberating people from the expectation of making their entire lives revolve around work -- getting rid of the coercive mandatory socializing at the bar, say -- is far more respectful of different life choices than "work together, play together, drink together" could possibly be.
In some ways, the public/private split actually serves real diversity quite well. When the boundaries between home and work get too fuzzy, what, exactly, does your job evaluation reflect? What are decisions based on? How much of my life do I have to make over in somebody else's ideal image?
In my cohort (and the younger one), I see a much stronger impulse to separate work from home, and that strikes me as healthy. Let people lead their home lives as they see fit, and let them have time to do it. Work hard when at work, but live your life as you see fit when you go home. Let work decisions and evaluations reflect only what happens on the job.
In my faculty (and grad student) days, I was constantly frustrated that work was never really 'over.' I could always be reading more, or prepping, or grading, or trying to publish. That "sword of Damocles" feeling made for some pretty stressful times, since it seemed inescapable.
In administration, I spend waaayyyyy more time on campus than I ever did as faculty. But when I go home, for the most part, I go home. I don't make a habit of checking my work email at night, or of hanging out socially with coworkers. The sword of Damocles hangs in the office. I don't mind leaving it there. At home, I can be fully 'present' as a father. Call that 'patriarchal' if you want -- it's certainly paternal -- but being a 'present' father to my kids matters to me. If others choose to live their lives very differently, let them. As long as we can work together well, the rest strikes me as properly private.
It's true that we don't hang out at the bar after work the way people once did. And we're freer as a result. The public/private split makes room for that freedom. I'm wildly dedicated to my work, and wildly dedicated to my kids, and they both take time. Carving out time for each requires keeping the two mostly separate. I'm plenty dedicated, as are my counterparts; I just reject the compulsion to neglect my family to prove it.
Tuesday, February 23, 2010
Assessment as Marketing
In a conversation last week with a big muckety-muck, I realized that there are two fundamentally different, and largely opposed, understandings of outcomes assessment in play. Which definition you accept will color your expectations.
The first is what I used to consider the basic definition: internal measures of outcomes, used to generate improvement over time. If you understand assessment in this way, then several things follow. You might not want all of it to be public, since the candid warts-and-all conversations that underlie real improvement simply wouldn't happen on the public record. You'd pay special attention to shortcomings, since that's where improvement is most needed. You'd want some depth of understanding, often favoring thicker explanations over thinner ones, since an overly reductive measure would defeat the purpose.
The second understanding is of assessment as a form of marketing. See how great we are! You should come here! The "you" in that last sentence could be prospective students being lured to a particular college, or it could be companies being lured to a particular state. If you understand assessment in this way, then several things follow. You'd want it to be as public as possible, since advertising works best when people see it. You'd pay special attention to strengths, rather than shortcomings. You'd downplay 'improvement,' since it implies an existing lack. And you'd want simplicity. When in doubt, go with the thinner explanation rather than the thicker one; you can't do a thick description in a thirty-second elevator pitch.
Each of these understandings is valid, in its way, but they often use the same words, with the result that people who should work together sometimes talk past each other.
If I'm a governor of an economically struggling state, I want easy measures with which I can lure prospective employers. Look how educated our workforce is! Look at what great colleges your kids could attend! I want outcomes that I can describe to non-experts, heavy on the positive.
And in many ways, there's nothing wrong with that. When TW and I bought our house, we looked at the various public measures of school district quality that we could find, and used them to rule out some towns in favor of others. We want our kids to attend schools that are worthy of them, and we make no apologies for that. They're great kids, and they deserve schools that will do right by them. I knew enough not to place too much stock in minor differences in the middle, but the low-end outliers were simply out of the question. I can concede all the issues with standardized testing, but a train wreck is a train wreck.
The issue comes when the two understandings crash into each other.
I'm happy to publicize our transfer rates, since they're great. But too much transparency in the early stages of improvement-driven assessment can kill it, leading to CYA behavior rather than candor. Basing staffing or funding decisions on assessment results, which sounds reasonable at first blush, can also lead to meaningful distortions. If a given department or program is lagging, would more resources solve it, or would it amount to throwing good money after bad? If a given program is succeeding, should it be rewarded, or should it be considered an area of relatively less need for the near future? (If you say both need more resources, your budget will be a shambles.) Whichever answer seems to open the money spigot is the answer you'll get from everybody once they figure it out.
Until we get some clarity on the different expectations of assessment, I don't see much hope for real progress. Faculty won't embrace what they see as extra work, especially if they believe -- correctly or not -- that the results could be used against them. Governors won't embrace what they see as evasive navel-gazing ("let's do portfolio assessment!") when what they really need is a couple juicy numbers to lure employers. And the public won't get what it really wants until it figures out what that is.
Monday, February 22, 2010
The Toxic Workplace Test
I've come up with a one-question quiz to determine whether your workplace is toxic.
1. When Smith attacks Jones in public in dirty, ad hominem, and generally unprofessional ways, and Jones responds by taking the high road, what happens?
a. Jones would never take the high road. Nobody ever does. It's on!
b. Jones takes the high road out of town.
c. Jones is viewed as the loser, since the high road is interpreted as weakness.
d. Onlookers divide into warring camps, and others do the dirty work for Jones.
e. Smith is viewed as the loser, having been decisively outclassed.
If the answer is anything other than e, you have a toxic workplace.
From an administrator's perspective, changing a culture that would answer a through d into one that would answer e is a real, and incredibly important, challenge. (Ideally, of course, the attack wouldn't happen in the first place, but to count on that would be foolish.)
It's difficult because the benefits of the high road usually take time to show up, but the emotional satisfaction of a sucker punch is immediate. Worse, many of the benefits of the high road are contingent upon others recognizing it for what it is, and appreciating it. (Yes, virtue can also be its own reward, but sometimes we need more than that.) That takes a certain confidence that others will understand what you're doing. In the absence of that confidence, taking the high road can feel like unilateral disarmament.
Part of the job of campus leadership is setting a climate in which people can be reasonably confident that they won't have to resort to frontier justice to defend themselves. If an expectation develops over time that it's possible to disagree in public without getting personal or nasty, then those who violate that expectation will start to find themselves isolated. I consider that a good outcome.
In my early, naive days of deaning, I thought that setting the example would be enough. It wasn't. The lead-by-example thing wasn't enough, because too many people didn't notice or get it. To the small-minded thug, in the short term, the high road can look like weakness. It also didn't address the reality that no matter how a particular situation unfolded, different people had different slivers of information about it, and interpreted it accordingly. Missing a key piece of information, or placing it in an unrelated context, could lead even well-meaning people to mistaken conclusions.
Instead, I've slowly come to realize that if you want to give people confidence that the high road will work, you have to take several steps.
First, obviously, model it yourself. This isn't enough by itself, but without this, you're sunk.
Second, explain what you're doing and why you're doing it. If you model the behavior but trust that it will speak for itself, you'll often fall victim to weird interpretations. Put your interpretation out there, preferably several times. If you can manage a 'before, during, and after' approach, all the better. And for heaven's sake, be consistent.
Third, acknowledge when you fail. Everybody does, from time to time, but some people just can't bring themselves to admit mistakes. If you let slip an ill-considered comment in a moment of frustration, don't try to justify it or pretend it didn't happen; admit it, apologize for it, and renew the commitment to higher ground. One of the benefits of this approach is that it shifts the locus of authority from the individual to the ideal. That's very much of a piece with separating the speaker from the speech, which is the basis of civility. It shows respect for others, without which there's simply no basis for taking the high road seriously. I've found over time that people who feel respected are usually much less likely to escalate conflict to unproductive levels.
Finally, be patient. Some people will catch on more slowly than others, and a few never will. Trust is built slowly and lost quickly, so you have to be willing to stick with it for a while before seeing the results you want.
The tragedy of the middle manager -- I know, boo-hoo, but stay with me -- is when you follow all of these assiduously, only to be undercut from above. I've lived through that more than once, and I can attest that it's demoralizing at a really fundamental level. Sometimes, the high road can only lead out of town. But if you have leadership that actually enables the high road, stick around. You'll miss it when it's gone.
1. When Smith attacks Jones in public in dirty, ad hominem, and generally unprofessional ways, and Jones responds by taking the high road, what happens?
a. Jones would never take the high road. Nobody ever does. It's on!
b. Jones takes the high road out of town.
c. Jones is viewed as the loser, since the high road is interpreted as weakness.
d. Onlookers divide into warring camps, and others do the dirty work for Jones.
e. Smith is viewed as the loser, having been decisively outclassed.
If the answer is anything other than e, you have a toxic workplace.
From an administrator's perspective, changing a culture that would answer a through d into one that would answer e is a real, and incredibly important, challenge. (Ideally, of course, the attack wouldn't happen in the first place, but to count on that would be foolish.)
It's difficult because the benefits of the high road usually take time to show up, but the emotional satisfaction of a sucker punch is immediate. Worse, many of the benefits of the high road are contingent upon others recognizing it for what it is, and appreciating it. (Yes, virtue can also be its own reward, but sometimes we need more than that.) That takes a certain confidence in your expectation that others will understand what you're doing. In the absence of that confidence, taking the high road can feel like unilateral disarmament.
Part of the job of campus leadership is setting a climate in which people can be reasonably confident that they won't have to resort to frontier justice to defend themselves. If an expectation develops over time that it's possible to disagree in public without getting personal or nasty, then those who violate that expectation will start to find themselves isolated. I consider that a good outcome.
In my early, naive days of deaning, I thought that setting the example would be enough. It wasn't; too many people didn't notice, or didn't get it. To the small-minded thug, in the short term, the high road can look like weakness. Leading by example also didn't address the reality that no matter how a particular situation unfolded, different people had different slivers of information about it, and interpreted it accordingly. Missing a key piece of information, or placing it in an unrelated context, could lead even well-meaning people to mistaken conclusions.
Instead, I've slowly come to realize that if you want to give people confidence that the high road will work, you have to take several steps.
First, obviously, model it yourself. This isn't enough by itself, but without this, you're sunk.
Second, explain what you're doing and why you're doing it. If you model the behavior but trust that it will speak for itself, you'll often fall victim to weird interpretations. Put your interpretation out there, preferably several times. If you can manage a 'before, during, and after' approach, all the better. And for heaven's sake, be consistent.
Third, acknowledge when you fail. Everybody does, from time to time, but some people just can't bring themselves to admit mistakes. If you let slip an ill-considered comment in a moment of frustration, don't try to justify it or pretend it didn't happen; admit it, apologize for it, and renew the commitment to higher ground. One of the benefits of this approach is that it shifts the locus of authority from the individual to the ideal. That's very much of a piece with separating the speaker from the speech, which is the basis of civility. It shows respect for others, without which there's simply no basis for taking the high road seriously. I've found over time that people who feel respected are usually much less likely to escalate conflict to unproductive levels.
Finally, be patient. Some people will catch on more slowly than others, and a few never will. Trust is built slowly and lost quickly, so you have to be willing to stick with it for a while before seeing the results you want.
The tragedy of the middle manager -- I know, boo-hoo, but stay with me -- is when you follow all of these assiduously, only to be undercut from above. I've lived through that more than once, and I can attest that it's demoralizing at a really fundamental level. Sometimes, the high road can only lead out of town. But if you have leadership that actually enables the high road, stick around. You'll miss it when it's gone.
Friday, February 19, 2010
When Documentation Fails
I've been following the Bill Reader case with interest for the last few weeks. (For the record, I don't know him, and I don't know anyone at Ohio University.)
I read it differently than most folks in internet-land. The question of the proper weight to give to considerations of 'collegiality' in tenure deliberations is a thorny one, and not where I'll focus here. I'll just note that one person's strategic vitriol is another person's hostile work environment, and that administrators who don't keep an eye out for the latter aren't doing their jobs.
Here I'll focus on the issue of documentation.
As I understand it, the university is claiming that Reader has a long history of toxic interactions. Reader is saying that nothing has ever been documented, so the university must be trumping it up.
To which I say, not necessarily. Documentation doesn't always work.
Documentation works poorly when the issue is long-simmering and cumulative. Any single comment can be explained away as hyperbolic, as heat-of-the-moment, or as a simple misunderstanding, and sometimes that's accurate. But harassment often takes the form of sustained patterns of comments or behavior, no one of which necessarily rises to an actionable level by itself. If I wrote up everybody who ever made a hostile comment, I wouldn't get much else done, and I'd foster paranoia. Among other things, that means that there's typically a breaking point at which the recipient/target decides that a line has been crossed. The accused, at this point, invariably claims that he had no idea anything was wrong. When the evidence is challenged -- and it always is -- much of it has been lost to the sands of time. The breaking point may not look like much in isolation, but it didn't occur in isolation. There's a disconnect between the fantasy of the law and the reality of human behavior.
Documentation also fails when people are so intimidated that they're afraid to sign anything. I can't tell you -- literally -- how many conversations I've had with faculty or staff in which someone makes serious complaints about somebody else's conduct, but refuses to write any of it down. They don't want to get "dragged into anything." From my perspective, this is worse than useless. I "know," but I don't. I don't have anything that the accused could even rebut. And the one who told me often walks away thinking that my lack of follow-through is a sign of a sinister agenda, rather than of a basic epistemological flaw. ("The Administration knows about it, but doesn't do anything.") I can't take anyone to task based on hearsay.
Documentation also frequently fails to convey context. Some comments carry meanings in one setting that they don't in another. In isolation, on paper, they may not look like much; in the moment, they were deliberately devastating. I've seen some very savvy bullies who know how to work that system.
Then there are the process fighters.
Although you wouldn't know it from the management textbooks, documenting offenses isn't as easy as simply writing them up. In this climate, every single step gets challenged. It's not unusual to be threatened with grievances and lawsuits simply for putting a memo in a personnel file. At that point, it's easy to default to verbal warnings, which lets someone claim, as Reader has, that nothing was ever documented.
Whether any of this applies in the Reader case, or if he's an innocent victim of mobbing and/or a toxic culture and/or an administrative vendetta, I simply don't know. I find each of those scenarios plausible, at least from this distance. Given the apparent lack of a good paper trail, I assume that he'll eventually win the legal battle. From this side of the desk, I'll just say that effective paper trails are far harder -- and less under one's control -- than most people assume. It's just not that simple.
Thursday, February 18, 2010
Allocating Positions
It may seem weird for me to ask this now, after all these years, but how does your college allocate positions?
I've seen it done in a few ways, and I have my own preference, but I've never actually seen alternatives spelled out systematically.
The most common way that I've seen has been as a sort of spoils system for gladiatorial combat among administrators. Each dean curries favor with the VP over time, and when hiring time rolls around, positions are allocated in rough proportion to the political standing of the deans.
I'm not a fan of that system, for what should be obvious reasons. It's arbitrary, it has nothing to do with teaching or staffing needs, and it redirects employee energy away from the core business at hand and towards internal politicking. Yet it survives, probably because it satisfies the egos of some powerful people. Not for me, thanks.
The second most common way is historical. In relatively flush times, this means one-for-one replacements; in leaner times, it means a de facto "take a number" system. (I suppose this wouldn't work during expansionary times, but I've never seen expansionary times, so the point is pretty abstract. Maybe someday...) This doesn't seem like any kind of improvement to me, since at its base, it's still arbitrary. Who's to say that the historical allocation is still the best? Even if it made sense when it was established -- a big 'if,' to be sure -- circumstances change over time. Worse, it gives rise to a sense of entitlement at the department level, which adds an unnecessary layer of conflict when the needs of the college have changed over time. "They took away our position." No, the college put it where it was more needed. It was never 'yours.' But good luck making that distinction when it has been the de facto standard for years.
It's also possible to allocate positions 'strategically,' which in management speak usually means deciding what the next 'hot' area will be and pouring resources into it. The argument for this 'picking winners' approach is that a college can build up a given area pretty quickly, even without a lot of money. The downsides are several. It starves out other areas, it rolls the dice on a single judgment call, and in practice it usually dovetails with the spoils system described above.
I've heard, too, of colleges using something like quotas, and calling it 'evidence-based.' The idea would be to set numerical parameters -- a given full-time/adjunct ratio, say -- and use new positions to bring outlying areas into compliance with the parameters.
This method strikes me as less objectionable, since it reduces the relevance of the courtier and actually refers to conditions on the ground. But it's still a blunt instrument. On the faculty side, adjuncts are easier to find in some fields than others, particularly during the day. Ignoring that reality will lead to some very unfortunate results in short order. Some fields -- music, foreign languages -- will always have relatively high adjunct percentages, just because the subject matter is spread over so many subfields (instruments or languages). You wouldn't add up single sections of Japanese, Russian, and Arabic and combine them into one position. And some fields have stringent external accreditation requirements that essentially take the choice out of your hands.
This method also falls apart when comparing unlike jobs. Given enough funding for one position, what's the basis for comparing a financial aid application processor, a reference librarian, and a biologist? There's no obvious common denominator.
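For readers who like to see the mechanics spelled out, the quota method described above amounts to a simple greedy rule. This is purely an illustrative sketch, not any college's actual process; the department names, section counts, and the 0.6 target ratio are all invented, and it deliberately ignores the complications just mentioned (adjunct availability, accreditation mandates, unlike jobs).

```python
def allocate_positions(depts, new_lines, target_ratio=0.6):
    """Greedy sketch of an 'evidence-based' quota allocation.

    depts: {name: (full_time_sections, adjunct_sections)}
    Returns {name: new full-time lines awarded}.
    """
    counts = {name: [ft, adj] for name, (ft, adj) in depts.items()}
    awards = {name: 0 for name in depts}

    def ft_share(name):
        ft, adj = counts[name]
        return ft / (ft + adj) if (ft + adj) else 1.0

    for _ in range(new_lines):
        # Award the next line to the department furthest below target.
        neediest = min(counts, key=ft_share)
        if ft_share(neediest) >= target_ratio:
            break  # everyone already meets the target
        awards[neediest] += 1
        # Simplification: the new hire's sections count as full-time,
        # and the adjunct section count is left unchanged.
        counts[neediest][0] += 1
    return awards

# Invented example numbers:
depts = {"English": (10, 30), "Math": (12, 18), "Music": (3, 20)}
print(allocate_positions(depts, 3))  # every new line goes to Music
```

Note how the toy example illustrates the blunt-instrument problem: a field like music, which will always run a high adjunct percentage for structural reasons, soaks up every new line under a pure ratio rule.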
My personal preference is a hybrid evidence-based/star-chamber method. Have a meeting in which the deans make arguments using the same criteria, then have a vote. Having to make arguments with reference to given criteria can smoke out the weaker claims, and a vote reduces the chances of any one person's arbitrary whims mattering. It's still flawed, though, to the extent that criteria are blunt instruments and that voting can fall prey to logrolling, wheeling/dealing, and the like.
Wise and worldly readers, I'm wondering if you've seen (or imagined) a more successful way to allocate positions. Any great ideas floating around out there?
Wednesday, February 17, 2010
Tick Tock
Last night I took The Girl to the local Father-Daughter dance, which is a big event for girls here in the K-5 set.
TG was soooo excited to go. Her dress looked like the chocolate-strawberry-vanilla ice cream cartons, with bands of white, pink, and brown. We took a bunch of pictures at home before heading out, so the Grandmas will have pictures of TG in her finest.
TW took The Boy to see the new Percy Jackson movie. Fair is fair. He gave the movie the thumbs-up.
The dance was packed, and it looked like a cheesy wedding reception. It was Dads in suits and girls in dresses. Girls dragged Dads onto the dance floor, and the Dads were mostly game. It was oddly reassuring to be surrounded by so many men my own age; I realized that for all the signs of aging I see in myself, it could be a lot worse. Also, some guys need to lay off the self-tanner. I'm just sayin'.
The musical choices were a little scattered. A few wedding reception standards: the chicken dance, the Macarena, "Twist and Shout." A Michael Jackson medley, with mixed results. (The girls were far too young to know the Thriller dance, and didn't seem to know quite what to do when the DJ did the dancing-zombie thing.) A Taylor Swift song, to which every 8-year-old girl in the place knew every word, and sang along loudly. ("MARRY ME JULIET YOU'LL NEVER HAVE TO BE ALONE...") Ke$ha's "TiK ToK," which starts with an image of using Jack Daniel's as mouthwash; probably not a choice I would have made for the grade school set. "YMCA," of course. A few songs with instructions, which reminded me of those awful square dancing units in junior high gym class. Some hip-hop with lyrics I didn't think entirely appropriate to the setting. And a few father/daughter heartbreaker songs.
Okay, a word about father/daughter heartbreaker songs. Yes, they're manipulative, and sappy, and predictable, and problematic in reinscribing blah blah blah. But when you're imagining your own daughter growing up...
When I got there, the Dads were mostly uncomfortable, and doing that sulky, reserved-menacing thing. During the faster songs, they either mostly didn't dance, or made a goofy-good-sport effort.
But during the father/daughter heartbreaker songs, they -- okay, we -- got a little weepy. And I took comfort in seeing that it wasn't just me.
TG didn't get the lyrics, which somehow made them that much more effective. She has no idea what's next. Her world is clear and safe and loving, and it mostly makes sense. When we got home, she got her usual bedtime story and contentedly fell asleep. She's trusting, because she has no reason not to be.
That will change over time, though she doesn't know it yet. She'll get her heart broken, and she'll learn that people don't always make sense. She'll develop self-doubt, and experience betrayals and disappointments and everything that goes with them. Right now, I know that and she doesn't.
I know the clock is ticking. But she doesn't, and that's okay. Let her wonder why the Dads got a little weepy. A little wonder can be my Valentine's gift to her, just as her cloudless optimism is her gift to me. For now, there are bedtime stories to read, and stuffed animals to keep company.
Tuesday, February 16, 2010
Local Control?
I know I'm supposed to believe that standardization, state or national rules, or anything beyond local control is of the devil, a corporate conspiracy, and of a piece with water fluoridation and the metric system on the list of communist conspiracies against our way of life. I get that.
But I'm still at a loss to explain why we leave some really fundamental decisions in the K-12 system to local control.
To give a really close-to-home example, my state doesn't require high school students to take four years of math. Many students stop after their sophomore year, so they can better focus on, uh, well, whatever the hell it is that they focus on for the next two years. The statewide high school exit exam doesn't test any math above the ninth-grade level, so the students can pass it early and spend two years focused on texting and football, or whatever. And apparently many of them do.
Then we wonder why so many recent high school grads place into developmental math, and we get mad at community colleges for teaching a "second" time something that may or may not have been taught a first time.
The higher education that matters is regionally accredited. It's not local, and there's a good reason for that. Given how incredibly important good K-12 instruction is for success at higher levels, I'm perplexed why we leave decent instruction in the fundamentals to chance. "We don't believe in trigonometry here in East Englishmuffin." It's not up to you.
I understand a semi-reasonable argument for local control beyond the fundamentals. A school that gives the community the extras it wants will have an easier time getting higher taxes tolerated. Fair enough. But to just opt out of history, or math, or chemistry shouldn't be an option. Those aren't extras. They're the core. They should be the first order of business. Whether to field a lacrosse team could be a local option; whether to teach biology shouldn't be.
Developmental classes are designed on the assumption that the student has seen the material before; that's why they move so quickly. But if some districts are just opting out of the basics altogether, those students are in for a rude shock. And from what years of national studies of developmental ed have taught us, students who start out behind will have a much tougher row to hoe. It's possible, yes, but it's an extra burden that many students find unbearable.
I've heard economists argue that if you don't like the choices made by your local school district, you should just move. But that so badly understates the transition costs, and so grossly overstates the access to relevant information that most parents have, that it's just otherworldly. It's like those old physics word problems that start with "assuming no gravity..." Even granting a certain theoretical truth to the assertion, it's impractical to the point of silliness.
If we aren't teaching math in the latter years of high school, just what, exactly, are we paying for? What are they doing all day? And from a systems perspective, how much sense does it make to blow off math for the last two years of high school, only to teach it in compressed form in the first semesters of community college? Wouldn't it have made vastly more sense to get it right the first time?
Call me a stalking horse for standardization, if you must, but I think four years of required math and history and science in high school makes sense. Lacrosse is fine, and electives are great, but first things first. I'm tired of watching successive waves of students crash like the soldiers at Gallipoli. We know better. Frankly, I can think of worse uses for my property tax dollars than teaching math.
But I'm still at a loss to explain why we leave some really fundamental decisions in the K-12 system to local control.
To give a really close-to-home example, my state doesn't require high school students to take four years of math. Many students stop after their sophomore year, so they can better focus on, uh, well, whatever the hell it is that they focus on for the next two years. The statewide high school exit exam doesn't test any math above the ninth-grade level, so the students can pass it early and spend two years focused on texting and football, or whatever. And apparently many of them do.
Then we wonder why so many recent high school grads place into developmental math, and we get mad at community colleges for teaching a "second" time something that may or may not have been taught a first time.
The higher education that matters is regionally accredited. It's not local, and there's a good reason for that. Given how incredibly important good K-12 instruction is for success at higher levels, I'm perplexed why we leave decent instruction in the fundamentals to chance. "We don't believe in trigonometry here in East Englishmuffin." It's not up to you.
I understand a semi-reasonable argument for local control beyond the fundamentals. A school that gives the community the extras it wants will have an easier time getting higher taxes tolerated. Fair enough. But to just opt out of history, or math, or chemistry shouldn't be an option. Those aren't extras. They're the core. They should be the first order of business. Whether to field a lacrosse team could be a local option; whether to teach biology shouldn't be.
Developmental classes are designed on the assumption that the student has seen the material before; that's why they move so quickly. But if some districts are just opting out of the basics altogether, those students are in for a rude shock. And from what years of national studies of developmental ed have taught us, students who start out behind will have a much tougher row to hoe. It's possible, yes, but it's an extra burden that many students find unbearable.
I've heard economists argue that if you don't like the choices made by your local school district, you should just move. But that so badly understates the transition costs, and so grossly overstates the access to relevant information that most parents have, that it's just otherworldly. It's like those old physics word problems that start with "assuming no gravity..." Even granting a certain theoretical truth to the assertion, it's impractical to the point of silliness.
If we aren't teaching math in the latter years of high school, just what, exactly, are we paying for? What are they doing all day? And from a systems perspective, how much sense does it make to blow off math for the last two years of high school, only to teach it in compressed form in the first semesters of community college? Wouldn't it have made vastly more sense to get it right the first time?
Call me a stalking horse for standardization, if you must, but I think four years of required math and history and science in high school makes sense. Lacrosse is fine, and electives are great, but first things first. I'm tired of watching successive waves of students crash like the soldiers at Gallipoli. We know better. Frankly, I can think of worse uses for my property tax dollars than teaching math.
Monday, February 15, 2010
Thoughts on the Alabama Shooting
Every time I hear about a shooting on a college campus, I wince. This one was especially surprising, given that the (alleged) shooter was a professor and a woman.
It seems that Amy Bishop, a professor in the biology department at the University of Alabama at Huntsville, opened fire at a department meeting, killing several colleagues and wounding several more. She has been taken into custody, and she will face murder charges.
Although it's hard to know immediate causes, Prof. Bishop had recently been denied tenure, and this was to be her final semester at Alabama.
First, obviously, my condolences to everybody at Alabama, and everybody with family and friends there. At some level, I think we all know that there but for the grace of God...
Second, though, I'm cringing as I imagine the ways this case will get used in other arguments.
Second-day coverage revealed that Bishop had previously shot and killed her brother in Massachusetts in the 1980's. The case was officially classified as an accident, though the paper trail is murky. For my money, any explanation of the Alabama incident needs to mention the Massachusetts one. Instead of using Bishop as somehow typical of a tenure case gone bad, let's keep in mind that this is a woman who shot a close family member in the chest. There's nothing typical about her.
Regular readers know that I consider the tenure system unethical, and that I've specifically taken it to task for the "up or out" moment of decision. That position isn't terribly popular in higher ed, but there it is.
But to use this case to argue against the tenure system strikes me as way out of bounds. This isn't about a typical, predictable consequence of the tenure system. It's about someone who has killed before, killing again.
To make this case about the tenure system, we'd have to imagine that shootings like these don't happen in other employment settings. But they do. They happen in corporations, schools, and all manner of public and private places. I don't begin to understand all the reasons they happen, but they do. My kids have had 'lockdown drills' at their school, so they'll know what to do if a deranged person gets in. The fact that we think to put five year olds through lockdown drills says a lot about our culture, most of it harrowing, but I'm glad they do the drills. When I was a kid, I never heard the word 'lockdown.' Then again, I never heard words like "Columbine" or "Virginia Tech," either.
The horror of this case, besides the obvious human loss, is the sheer randomness of it. Although the facts are still streaming in, it doesn't look like any reasonable measures could have prevented it. Bishop wasn't trespassing; she was still an employee there. Short of patdowns of every person every time they come on campus, she wouldn't have been stopped. Since she was never charged in the 1980's shooting, it's unlikely that even strict gun control laws would have prevented her from getting a weapon if she wanted one, or that even a rigorous pre-employment background check would have prevented her hire. On paper, as far as I know at this point, there weren't any red flags. Given how quickly the shooting apparently unfolded, even a heavy and heavily armed campus security force (or heavily armed colleagues) couldn't have stopped her in time.
From what is known now, this case doesn't provide another example for some ideological battle. It's not about gun control, or campus security, or tenure, or any of that. It's about someone who came unhinged and committed a horrible crime, and about the losses of several innocent people.
I know the internet has its own habits of mind, but for anyone out there thinking of using this case as "yet another example of...," please don't. Let's not use a deranged shooter to make points. The crime is awful enough as it is.
Friday, February 12, 2010
Ask the Administrator: No Exit?
An off-the-beaten-path correspondent writes:
I graduated from Public University back in 2008 and since then I have been a yellowpages salesman and then moved onto a scam sales company which I had presumed was a legitimate business model when starting. My question to you: I love sales but I want a more secure position and I applied for a position with a local community college as an educational planner and retention specialist. This position offers a fairly low base salary but I could see myself staying there for a long while. At the community college level is there a lot of chance for promotion? I would love to take this position but I don't want to put myself in another dead end job situation.
I won't go into the path that took you here, since it is what it is. But the question raises a wonderful point about career paths.
Many staff positions at community (and other) colleges don't lend themselves to obvious career ladders. That's not to say people can't move up; it's just to say that the paths often aren't immediately apparent. Educational planners, academic advisors, and other support staff in student service areas often hit the ceiling of their role very early on. When that happens, there's a choice to be made. (The same is true of faculty, in many ways. Once they hit "full professor" status, the only place left to go is administration, which is a fundamentally different job.)
As you correctly point out, some trades are clearer than others. Depending on the local situation, it wouldn't be unusual to see a de facto trade of higher job security for lower pay. If you don't care about the money, for whatever reason, then that's fine. But if money is an issue, a low ceiling can be a real demotivator.
On the campuses I've seen, the ed planners or academic advisors typically report to some sort of director, who, in turn, reports to a dean of something much larger -- usually 'student affairs' or something like that. The turnover rate among directors seems to be pretty low -- admittedly, that's anecdotal, and I'm open to correction on that -- but the skill set required to be taken seriously at the dean level is much broader. A dean of students or something similar requires a graduate degree and at least passing familiarity with areas like counseling, admissions, residency laws (a very big deal in some systems, not so much in others), transfer requirements, discipline, and the like. It also requires a pretty good sense of the academic side of the house, ideally through personal teaching experience. Being a good educational planner isn't nearly enough.
Over time, a work area with high security, low pay, and little or no prospect for advancement can get pretty stale, if not toxic.
If your ambition is to move up quickly, I wouldn't advise going this route. But if the job sounds appealing enough that you'd be willing to stick with it a while at its current pay level, go for it. At least here you'll be selling something worth selling.
Good luck!
Wise and worldly readers -- have you seen an issue with low career ceilings on your campus? Is there a graceful way around that?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Thursday, February 11, 2010
Attack of the Blob?
The Chronicle of Higher Ed has a few pieces this week on for-profit higher education. They're revealing and thought-provoking, though not always in the ways intended.
As regular readers know, I worked for a while at a for-profit, which I call Proprietary U. It's one of the big ones -- you've heard of it -- and it's fairly typical for the sector. It runs programs in 'hot' occupational fields, with an unapologetic eye towards placing graduates in jobs. I was on the 'general education' side of the house, which was somewhat anomalous within the culture, but not entirely; the folks in the career placement office were quite clear on the importance of communication and critical thinking skills, which were identified mostly with gen ed. Teaching there involved one part Paulo Freire to three parts Henry Higgins: economic empowerment required some pretty substantial transmission of cultural capital.
In my time there, I was at odds with what became a fairly clear expectation that the student is always right. There was also a fixation on 'responsiveness' to the market that became a sort of institutional ADD. Most curricula changed every couple of semesters, and it wasn't unusual to have three different versions of the same program running at the same time. Curricular changes were handed down from Home Office, usually just a few weeks before the start of the semester. Since those changes often involved shifts in the credit hours for various courses, students who deviated in any way from the cohort to which they had been assigned -- whether because of transfer credits, part-time status, stopouts, course failures, or whatever -- were nearly impossible to schedule. (I'd imagine that the move to much more online instruction has rendered much of that moot by now.) In one memorable instance, Home Office executed a drastic, last-minute change to a popular program, but didn't bother updating the registrar's software to reflect it. We local administrators had to move heaven and earth to patch the holes. I eventually tired of the insanity and decamped for a community college.
That said, though, I still think that the flaws were largely in the execution, as opposed to the design. Many restaurants fail, but that doesn't mean that the restaurant industry is doomed. There's no particular reason that market responsiveness has to be mindless, or that making a profit has to mean neglecting quality. (As Toyota has discovered recently, neglecting quality can hit your profits pretty hard.) To my mind, the basic mistake has been bottom-feeding. You can't undercut community colleges on price; "taxed and unsubsidized" simply can't compete with "subsidized and untaxed." So instead of going for the low end of the market, the way to win is to go for the high end. Provide that personal attention and small class size that the elites offer, but give it to the kids who couldn't get into the Swarthmores of the world.
Still, part of what strikes me about the rapid rise of the for-profits -- and make no mistake, regionally accredited degree-granting for-profits are growing fast -- is that they've been made possible by the failings of the traditional system.
One of the Chronicle pieces quotes Cary Nelson, the President of the AAUP, as
liken[ing] the for-profit sector to "the blob," an alien life form that consumed everything in its path in the 1958 Steve McQueen movie of the same name.
"The blob would shimmer and then be half again as big as before," Mr. Nelson says. "You'd turn your attention away and look back and suddenly, it's blocking out most of the sun."
Exactly wrong. The for-profits didn't come from outer space, or for no particular reason. They emerged to fill gaps in the nonprofit system. Their growth is a direct and predictable reflection of the existing system's failures. If the best the AAUP can come up with is an analogy to space aliens, they just don't get it. And if they don't get it, they won't stop it. Which, so far, they haven't.
Where do the for-profits find the credentialed faculty to teach accredited classes? Hint: is there a shortage of prospective faculty out there?
Where do the for-profits find the students to fill their classes? Hint: how much capacity have the publics added in the last, say, thirty years? And how much of that is in forms and times that are convenient for people with jobs and/or kids?
The for-profits are growing because someone has to. Looking at what California is doing to its system of public higher education, nobody should be surprised to see that the for-profits are thriving there. Students who choose them are frequently the students who couldn't find what they wanted in the public system, or who found it but got shut out by budget cuts.
As Peter Smith correctly notes in a second article, the unmistakable trend of the last few decades in the public systems has been cost-shifting to the individual student. To the extent that costs go to the students anyway, the competitive advantage of the publics diminishes. And when the cost-shifting is accompanied by watering-down of quality -- whether by adjuncting-out the faculty, or cheaping out on facilities, or stuffing classes ever larger, or any of the other deaths of a thousand cuts -- it's easy to understand why a student would choose to spend more to get more. That's even truer when getting more involves getting it at convenient times and places.
The for-profits have grown too large to be written off as irrelevant anymore. But for those who object to them, the way to fight them is not by attacking them directly. It's by offering a better alternative. Students and faculty didn't beat a path to the University of Phoenix because they read Milton Friedman. They did it because the U of P offered something they wanted in a way that made it more appealing than the alternatives. If you want to compete, you have to compete.
The blob is from this world, not some other. Until we understand that and act accordingly, it will just keep growing.
Wednesday, February 10, 2010
Charging for Quality
There's a thoughtful discussion over at Dr. Crazy's about full-time faculty workload. (The post was a response to Tenured Radical's own discussion here.) Within a recognition of the importance of context, Dr. C notes that what looks on paper like a static workload has actually been increasing in insidious ways over the years. (It's all the extra, off-the-books stuff, like advising and assessment, that consumes the extra hours.) In her estimation, some faculty have made themselves martyrs, and others have "half-assed it" in teaching, since there's really no institutional reason not to. Recognizing the limits of the strategy, she has allied herself with some colleagues to look for ways to streamline the extra tasks to allow for a more sustainable workload that doesn't shortchange students.
She notes in passing the different understandings of 'productivity' that underlie the work speedup. More students per class increases the tuition generated per professor, at least in the short run; that's one version of 'productivity.' More students per class decreases the amount of individual attention the professor can really give, which leads to a decline in the quality of feedback; that's another version of 'productivity.' Both versions are internally valid, but they don't necessarily mesh with each other. And that's where the real problem is.
From the standpoint of an individual instructor, the controllable variable (at least to some degree) is the quality of instruction. That's also what you care the most about, what you pride yourself on, and at a really basic level, why you're there.
She notes in passing the different understandings of 'productivity' that underlie the work speedup. More students per class increases the tuition generated per professor, at least in the short run; that's one version of 'productivity.' More students per class decreases the amount of individual attention the professor can really give, which leads to a decline in the quality of feedback; that's another version of 'productivity.' Both versions are internally valid, but they don't necessarily mesh with each other. And that's where the real problem is.
From the standpoint of an individual instructor, the controllable variable (at least to some degree) is the quality of instruction. That's also what you care the most about, what you pride yourself on, and at a really basic level, why you're there.
From the standpoint of trying to make payroll, though, the opposite is true. A thrilled student doesn't pay any more than does a barely-contented student. (There's presumably a minimal level at which attrition becomes an issue, but I'm assuming at least basic competence.) Students pay by the credit, the course, or the year; they don't pay by the breakthrough. The 'extras' that a great class can generate don't show up in the budget. Worse, some students actually prefer classes that don't ask very much of them. (If you doubt the truth of this, spend a day at in-person registration, just listening.) The mutual non-aggression pact between an instructor who doesn't ask very much and students who'd rather not be bothered is one of the open secrets of American higher ed, and it fits short-term institutional needs disturbingly well. There's a reason that Rocks for Jocks and Physics for Poets still exist.
(I'll add here that I agree with Dr. C that in some classes, there's really a minimum size beneath which quality can actually drop. A public speaking class with two students isn't really a public speaking class. I once had a section of six very quiet personalities, and teaching that was painful. A couple of sparkplugs would have enhanced the class tremendously. But this is really a side issue.)
The endemic conflict is that beyond a minimal level, and outside of the elites, there's no economic incentive for the institution to do better than okay in the classroom. Once you understand that, the rest follows. (There's a moral imperative, but that's a different issue.) For a college that's struggling to stay afloat financially, the short-term cost of stuffing a few extra seats into each class is dwarfed by the tangible and immediate tuition gain (or labor saving). You can blame pinheaded administrators for that if you want to, but you'd be missing the point; the math is what the math is. When the college is flush, it's possible to make a choice to have your cake and eat it too; when the college is strapped, though, the contradiction is unavoidable.
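The short-term math alluded to above is easy to make concrete. Here's a minimal back-of-the-envelope sketch; every dollar figure is hypothetical, chosen only to show the shape of the incentive, not any actual college's numbers.

```python
# Back-of-the-envelope model of the class-size incentive described above.
# All figures are hypothetical, chosen only to illustrate the shape of the math.

TUITION_PER_SEAT = 450.0       # revenue per enrollment in one course (hypothetical)
MARGINAL_COST_PER_SEAT = 20.0  # grading/materials cost of one more student (hypothetical)
SECTION_COST = 3000.0          # adjunct pay to run one more section (hypothetical)

def net_revenue(students: int, sections: int) -> float:
    """Short-term net tuition for a course, ignoring quality entirely."""
    return students * (TUITION_PER_SEAT - MARGINAL_COST_PER_SEAT) - sections * SECTION_COST

# The same 120 students, packed into 4 sections of 30 vs. spread across 6 of 20:
packed = net_revenue(120, 4)    # 39600.0
smaller = net_revenue(120, 6)   # 33600.0
print(packed - smaller)         # 6000.0 -- exactly the cost of the two extra sections
```

Notice that instructional quality appears nowhere in the function, which is exactly the point: under this revenue model, the budget literally cannot see the difference between a great section and a mediocre one, so packing the seats always wins in the short run.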
And that's the core issue. If you want to be paid for quality instead of quantity, you have to charge by quality rather than quantity. You have to align the incentives.
The elite SLAC's do a version of this by selling exclusivity. If you charge $50k a year for a small residential college, small classes are part of what you sell. There's a market for that, and you'd defeat your own niche if you watered that down too far. But most colleges and universities don't do that.
And it's not entirely clear how to do that. In olden times, I'm told, lecturers went out on public circuits, and the audiences paid them according to how impressed they were. It was lecturing for tips. But I don't see that (or anything terribly close to it) making sense now, if it ever did.
One could argue that philanthropy is a very delayed response to 'extra' quality -- quality above and beyond what the student paid for -- but the length and uncertainty of the delay makes it a difficult sell. I agree that community colleges could and should do a better job courting philanthropic resources, but I'm not convinced that this will tip the balance in most cases, particularly in the short term. One could also argue that 'prestige' is a proxy for quality, but anyone who has t.a.'ed intro classes at prestigious places (or who has taken those intro classes with t.a.'s) can tell you that the connection between prestige and quality is problematic at best. A good adjunct will often do a better job than a full-time professor who's "half-assing" it, and will do it for much less. As long as price isn't connected to quality, these perverse incentives will arise.
Since I haven't cracked this particular nut, I'll crowdsource it. Wise and worldly readers, is there a sustainable way for colleges to charge by quality rather than quantity?
Tuesday, February 09, 2010
Thoughts on Louis Menand's A Marketplace of Ideas
I read Menand's new book on the Kindle app on my iPod Touch, which means that I don't have page numbers for references. The good folks at Amazon are invited to find a way for those of us who like to cite sources to do that.
I've been a fan of Menand's for a while. The Metaphysical Club is a fantastic (and readable!) bit of intellectual history, especially for its portrait of William James. Menand has the rare ability to boil complexity down to readability without flattening out the nuances in the process, and heaven knows he does his homework. When I heard that he had a book coming out addressing higher ed, I was excited at the prospect.
It wouldn't be entirely fair to call A Marketplace of Ideas disappointing, since the expectations I brought to it aren't its fault. Menand admits at the outset that the questions he addresses are salient mostly at the rarefied level of elite graduate institutions; community colleges are mentioned only in passing, and mostly as afterthoughts. Given the perspective from which he writes, the analyses strike me as sane and grounded, but they mostly aren't the questions I would have asked. For example, he devotes a chapter to the battles over defining general education, focusing mostly on Columbia vs. Harvard. Well, okay, but that's not really my world. At my cc, we define general education mostly by transfer requirements. Lively questions around Gen Ed here have mostly to do with developmental needs, outcomes assessment, and how to offer non-standardized experiences (e.g., freshman seminars) when transfer agreements are written around checklists of traditional disciplines. Jacques Barzun has nothing to do with it.
That said, Menand's book offers some useful food for thought, even if it has to be rearranged a bit on the plate.
A few of his basic facts tell a good bit of the story. From 1945 to 1975, the number of undergraduate students in the US went up 500 percent, but the number of graduate students went up 900 percent. Since then, growth of undergrads has slowed dramatically, but graduate students just keep increasing. Menand pointed out that from 1989 to 1996, the number of graduate students in most liberal arts disciplines increased steadily, even as the number of undergrads nationally declined every year. As he correctly put it, by the 90's "the supply curve had completely lost touch with the demand curve in American academic life." That's because the incentives for individual universities are skewed in favor of producing as many ABD's as possible, whether there are full-time jobs out there for them or not. As one would expect from incentives like that, the time-to-completion figures in the liberal arts fields keep getting longer, even as the market utility of the degree continues to drop. In Menand's estimation, the predictable consequence of these conflicting trends is that all but the truest believers are screened out, and those who remain are neurotically attached to the status quo, despite its obvious unsustainability.
Oddly, though, his proposed solution is to make Ph.D.'s much easier (and faster) to achieve. By setting degree completion at a determinate length, like the three years of law school, he hopes to open up the doctorate to people who currently self-select out. The upside would be a greater likelihood of diverse approaches to scholarship. Why an already-flooded market would benefit from an even greater influx, though, is not entirely clear. Given that the liberal arts Ph.D. is largely unrecognized as an asset outside of academia, and given that the supply curve left the demand curve in the dust decades ago, one would expect that increasing the supply would be the last thing you'd want to do. My best guess at an interpretation is that Menand considers groupthink a greater problem than unemployment. I suspect that solving the unemployment problem would also solve most of the groupthink problem. History will decide.
Given Menand's well-deserved prominence, I hoped he would have used the opportunity to address the root causes of the crisis of higher ed. He gestured towards some of them with his discussions of demographics and the anomalous growth rates of the 1960's, but simply didn't address the productivity trap of measuring learning in units of seat time. He also left out most of the competing demands for public dollars, and offered only a glancing gesture at the political climate of the last thirty years. Dispiritingly, he attributed much of the economic problem to more students majoring in business. At the cc level, at least, that's mostly irrelevant; even business majors have to take gen ed classes. The real story is the farming out of the gen eds to adjuncts, which is made possible by the overproduction of ABD's and Ph.D.'s. Supply and demand curves have a funny way of finding equilibria, even if they aren't where we wish they were.
I'll give Menand credit for his usual readability, and for an unusual level of self-awareness. As advertised, his book gives an elite-faculty-eye view of the trajectory of the liberal arts in America. It just could have been so much more.
Monday, February 08, 2010
Ask the Administrator: Is Working at a CC the Kiss of Death in Academia?
A longtime reader writes:
I'm hoping you and your readers can offer some input. I'm on the cusp of receiving my PhD in English from an R1 school, having been trained by a well-known and distinguished senior scholar in my field. I went on the academic job market last year and got a couple of nibbles from small regional schools, who were reluctant to make any offers to an ABD when they had equally qualified applicants with PhDs in hand. I didn't lose hope, though, because my advisor has a 100% placement record. As things unfolded, my husband and I actually accepted short-term faculty posts at an overseas branch of our school. We planned to get out of some debt, experience a new culture, finish up our dissertations, work on getting published, and then return to the job market after a year or two. The time has come for us to return now, if at all possible, and we have both applied to tenure-track jobs at four-year universities, SLACs, and several community colleges.
In a bizarre turn of events, a tenure-track post has opened in the English department at a cc in my home state (40 minutes away from almost all of my family). My children are the only grandchildren on my side of the family, and they would love nothing more than to live near their grandparents. The cost of living is extremely affordable, and I would easily be happy living there. Would I be happy working there? I'm sure I would for a while... Forever? I don't know for sure, but I'm willing to give it my best shot. I am doing my best to take the advice of many who have commented on the plight of recent humanities PhDs and advised them to alter their idea of what kind of work constitutes academic success. I am willing to do this, and my general feeling is that, as long as I get to teach some literature courses (rather than all composition, all the time), I have some job security, and my family can put down some roots, I'd be pretty happy.
My primary concern, however, is that *if* I decided in the future that cc work wasn't something I could be happy doing for the rest of my career, would a university still be willing to hire me? Or, would I be branded with a blazing CC on my chest and laughed at when I applied for a more research-oriented position? Is there an insurmountable stigma attached to cc work? Have you (or your readers) seen a humanities scholar move from the cc-level to a SLAC or regional state school?
Never having hired at a university, I'll have to defer to my wise and worldly readers for feedback on what they've actually seen and done there. Having said that, though, my initial reaction here is similar to my initial reaction to a reader who was trying to find the perfect time to have a baby: you can control only what you can control.
Yes, I've personally seen community college faculty hired away by four-year public colleges, and once by a second-tier public university. One of the most interesting writers of my generation, Jennifer Michael Hecht, taught full-time at Nassau Community College before moving to her perch at the New School, which ain't too shabby. (I don't know her, but I recommend her book The Happiness Myth to everybody within shouting distance.) Last year my college lost a particularly wonderful junior faculty member to a four-year state college, and it has happened several times over the last few years. I'd be surprised to see a direct hop from a cc to an Ivy, but hops from cc's to state colleges happen with some frequency.
In each case, though, the candidate had something unusual. If you go simply as an accomplished teacher with a doctorate, you'll be one of hundreds. If you really want to make the leap, you'd have to do a kind of double duty while at the cc: do the cc job well, and still build a publication record (or something similar) that would make you desirable at the level you want. It can be done, but there are limits to how much most people will publish with a fifteen-credit semester load. For all intents and purposes, you'd be doing at least a job and a half, if not two. Not impossible, but not for the faint of heart.
In any event, though, I wouldn't rule out a job that offers the prospect of a sane and happy life for you and your family on the basis of a hypothetical attack of status anxiety five years from now at some hypothetical university. These things are notoriously hard to predict, and living according to other people's status anxiety is a recipe for misery. If the cc job offers the prospect of doing what you love to do, in a location that works well for your personal life, for a living wage with benefits and security, I wouldn't rule it out. I made a similar decision when I took my nifty academic pedigree to Proprietary U, where it was all teaching, all the time. It wasn't what I had envisioned when I signed up for grad school, but it paid the rent, made sense for my personal life, and eventually opened up an unexpected career path. I couldn't have predicted that at the time, but that's sort of the point.
To my mind, the only convincing argument against applying for the cc job would be if you really don't want it. If the thought of teaching developmental writing, or lots of freshman comp, or fifteen credits per semester gives you chills, then don't do it. But if you can imagine enjoying it for a while, I wouldn't look at it as a life sentence. The world is a huge and unpredictable place.
Wise and worldly readers, especially those who have hired at a university or who have made a similar leap -- what counsel would you offer? Is the c.v. stain indelible, or not?
Good luck!
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Friday, February 05, 2010
The Times Whiffs Again
Several alert readers sent me links to this article from the New York Times. It's a weirdly chipper "pick up some money in your spare time by adjuncting!" piece, written for (and apparently by) people who aren't terribly conversant in higher ed.
Depending on your angle to the universe, it could be read as refreshing, bizarre, or deeply offensive. (I fall into the 'bizarre' camp, with sympathies for the 'deeply offensive.')
First, credit where it's due: there's nothing actually false in the article. It notes, correctly, that the demand for adjunct faculty is high right now in many areas, and that the pay is generally underwhelming. It notes, correctly, that a graduate degree isn't always a hard-and-fast requirement, though from reading the piece you'd think it matters a lot less than it actually does. (At my cc, it's usually a deal-breaker outside of a few very specialized occupational programs.) It cites professional networking as a major benefit of adjuncting, which is probably true in a few niche areas, but which most composition instructors would find strange.
That said, the reality is sooo much more complex than the article suggests.
Having been a freeway-flier myself, I know it's easy to assume that all adjuncts feel exploited and really want to be full-time, but it isn't true. Many do, many don't. Adjunct gigs can make a certain sense in some situations, all of which exist on my campus:
- The full-timer who picks up an 'extra' course or two, just to supplement salary. I have a surprising number of these on my campus. Some of them are young and paying off student loans; some of them have kids in college; some, I'm told, will do anything not to go home. (I try not to pry.) These people get health insurance and salaries anyway, but the marginal benefit of another course is adjunct pay.
- People with other full-time jobs, whether on campus or off. We have full-time staff who pick up a class at night because they love teaching and/or want to pick up a few extra bucks. We also have a non-trivial number of high school teachers who like to stretch their wings a bit with an evening class. Of course, there are also the classic professionals-in-the-field, the model that adjuncting was built to fit. We actually do have a few of those -- lawyers who like to pick up the occasional business law class, say.
- Trailing spouses. Typically, they aren't trailing anyone who works here, but the two-body problem brought them to this geographical area, and a course or two fits their needs. In some cases, we get some pretty wonderful people this way. Some would probably prefer full-time employment, but some find the part-time schedule a better fit for their lives.
- Grad students trying to gain experience in the classroom. It's one thing to TA a discussion section; it's something else to teach your own class. I'd argue that you hit diminishing returns relatively quickly, in terms of future employability, but some experience is better than none. This is particularly true for folks who want to find a full-time community college position; hiring committees here are much friendlier to candidates who have taught at the cc level.
- Retirees. We have about a dozen retired full-time faculty who like to teach a class or two. (Some of them teach only in the Fall, using the Spring to travel. Looks good to me...) It's a way of staying connected, without being bogged down in the stuff that comes with a full-time gig. These folks are usually wonderful instructors, and we're happy to have them. We also get occasional retired muckety-mucks from the business or legal worlds who like to pick up a class as a way of sharing what they know and love. Again, most of the time, these work out quite well (though this group usually needs more orientation than the others).
None of this is to discount the real frustration of someone who's trying to break in, eking out a living in the meantime by cobbling together jobs that were never meant to be cobbled together. But I think it does explain, in part, why it can be so difficult to get adjuncts to organize; their interests aren't always the same. Reforms that might appeal to a freeway flier may be irrelevant to the full-timer teaching an overload, and might be actually distasteful to the retiree or the high school teacher. I've seen each of these, and untold variations.
The common denominator, though, and what really irks me about the piece, is that college teaching isn't something to be done on a lark. It's work. (Historiann did a nice piece on this -- check it out.) Doing it well requires time, focus, and a willingness to do what needs to be done. Even when it pays badly, the students don't expect -- or, to my mind, deserve -- any less. It's not an easy and fun way to pick up a few bucks. (It can be fun, but the fun is a byproduct of job satisfaction.) I've gone on record suggesting that romanticizing the task too much is a bad idea, and I stand by that, but this piece trivializes it. When the professor is in class, she's the professor, regardless of her paycheck. If she doesn't respect her own role, I don't know why the students should.
I don't expect much from the Times' coverage of higher ed, but this is really a bit much. The pay is bad enough; suggesting that anybody off the street could do it just adds insult to injury. No, thanks.
Depending on your angle to the universe, it could be read as refreshing, bizarre, or deeply offensive. (I fall into the 'bizarre' camp, with sympathies for the 'deeply offensive.')
First, credit where it's due: there's nothing actually false in the article. It notes, correctly, that the demand for adjunct faculty is high right now in many areas, and that the pay is generally underwhelming. It notes, correctly, that a graduate degree isn't always a hard and fast requirement, though from reading the piece you'd think it matters a lot less than it actually does. (At my cc, it's usually a deal-breaker outside of a few, very specialized, occupational programs.) It cites professional networking as a major benefit of adjuncting, which is probably true in a few niche areas, but which most composition instructors would find strange.
That said, the reality is sooo much more complex than the article suggests.
Having been a freeway-flier myself, I know it's easy to assume that all adjuncts feel exploited and really want to be full-time, but it isn't true. Many do, many don't. Adjunct gigs can make a certain sense in some situations, all of which exist on my campus:
- The full-timer who picks up an 'extra' course or two, just to supplement salary. I have a surprising number of these on my campus. Some of them are young and paying off student loans; some of them have kids in college; some, I'm told, will do anything not to go home. (I try not to pry.) These people get health insurance and salaries anyway, but the marginal benefit of another course is adjunct pay.
- People with other full-time jobs, whether on campus or off. We have full-time staff who pick up a class at night because they love teaching and/or want to pick up a few extra bucks. We also have a non-trivial number of high school teachers who like to stretch their wings a bit with an evening class. Of course, there are also the classic professionals-in-the-field, the model that adjuncting was built to fit. We actually do have a few of those -- lawyers who like to pick up the occasional business law class, say.
- Trailing spouses. Typically, they aren't trailing anyone who works here, but the two-body problem brought them to this geographical area, and a course or two fits their needs. In some cases, we get some pretty wonderful people this way. Some would probably prefer full-time employment, but some find the part-time schedule a better fit for their lives.
- Grad students trying to gain experience in the classroom. It's one thing to TA a discussion section; it's something else to teach your own class. I'd argue that you hit diminishing returns relatively quickly, in terms of future employability, but some experience is better than none. This is particularly true for folks who want to find a full-time community college position; hiring committees here are much friendlier to candidates who have taught at the cc level.
- Retirees. We have about a dozen retired full-time faculty who like to teach a class or two. (Some of them teach only in the Fall, using the Spring to travel. Looks good to me...) It's a way of staying connected, without being bogged down in the stuff that comes with a full-time gig. These folks are usually wonderful instructors, and we're happy to have them. We also get occasional retired muckety-mucks from the business or legal worlds who like to pick up a class as a way of sharing what they know and love. Again, most of the time, these work out quite well (though this group usually needs more orientation than the others).
None of this is to discount the real frustration of someone who's trying to break in, eking out a living in the meantime by cobbling together jobs that were never meant to be cobbled together. But I think it does explain, in part, why it can be so difficult to get adjuncts to organize; their interests aren't always the same. Reforms that might appeal to a freeway flier may be irrelevant to the full-timer teaching an overload, and might be actually distasteful to the retiree or the high school teacher. I've seen each of these, and untold variations.
The common denominator, though, and what really irks me about the piece, is that college teaching isn't something to be done on a lark. It's work. (Historiann did a nice piece on this -- check it out.) Doing it well requires time, focus, and a willingness to do what needs to be done. Even when it pays badly, the students don't expect -- or, to my mind, deserve -- any less. It's not an easy and fun way to pick up a few bucks. (It can be fun, but the fun is a byproduct of job satisfaction.) I've gone on record suggesting that romanticizing the task too much is a bad idea, and I stand by that, but this piece trivializes it. When the professor is in class, she's the professor, regardless of her paycheck. If she doesn't respect her own role, I don't know why the students should.
I don't expect much from the Times' coverage of higher ed, but this is really a bit much. The pay is bad enough; suggesting that anybody off the street could do it just adds insult to injury. No, thanks.
Wednesday, February 03, 2010
Are Green Jobs the New Metric System?
A few years ago I mentioned my bewilderment at why the failure of the push to adopt the metric system in the United States in the 70's hasn't received more scholarly attention. I remember teachers earnestly walking us through the various units -- centimeters, kilograms, Celsius degrees, etc. -- to prepare us for the Big Change. Obviously, with a few isolated exceptions, it didn't happen. It caught on in some scientific and medical applications, where it makes the math a lot easier, but never went much beyond that. I suspect there's a great American Studies dissertation waiting to be written on the whole kerfuffle.
Based on early results, I'm wondering if the Green Jobs movement will come to a similar fate.
As with the metric system, there's a certain logic to the idea that green jobs are the wave of the future. Efficiency gains are a form of productivity, and given the age of much of our public and private infrastructure, there's no denying the presence of uncaptured efficiency gains. The past few years have made it clear that energy prices can be annoyingly volatile, and that one way to insulate (no pun intended) oneself against the fluctuations is to consume less. I like the concept a lot, since it promises steady work, reduced consumption of fossil fuels, and reduced reliance on the people who own/produce the fuels. The political appeal of uniting the environmentalists, the national security enthusiasts, and the working class is obvious. And yet...
Like many cc's, mine is running several workforce development programs to train people for green jobs. Most of them involve working on buildings, whether it's doing energy audits, weatherizing, or installing specialized high-efficiency equipment (like water heaters). The instructors are experienced and pragmatic, the students are motivated, the curriculum makes sense, and...
Nobody's hiring.
The graduates of the programs aren't finding jobs. From a 'workforce development' standpoint, we're developing a workforce that can't find work.
I hope that this is mostly a function of the depressed housing market, and that things will pick up when the economy does. But I'm starting to wonder.
My house was built in the 80's. It still has the original water heater, since the original owner was a bit of a fussbudget about maintenance. (He's a contractor, and he's still local -- we've actually had him do a few jobs for us.) I've read a few things about tankless systems, and thought that they sounded pretty good. But when I asked the contractor about them, and he explained the cost and time involved in retrofitting all that would need to be retrofitted before the system could even go in, I dropped the idea. The cost of adjustment would so outstrip the annual savings that I'd never come close. When this heater dies, I'll get a slightly more efficient new one, but I'm not going nuts. It's not worth it. If I were building a new house from scratch, it might make sense, but as an incumbent homeowner, I'll pass.
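To put rough numbers on that logic -- the figures below are invented for illustration, not the contractor's actual quote -- the payback arithmetic looks something like this:

```python
# Back-of-the-envelope payback period for a tankless retrofit.
# All dollar figures are hypothetical illustrations, not real quotes.

def payback_years(upfront_cost, annual_savings):
    """Years until cumulative energy savings cover the upfront cost."""
    return upfront_cost / annual_savings

retrofit_cost = 3500.0   # unit plus gas-line/venting retrofit (hypothetical)
annual_savings = 100.0   # yearly efficiency savings (hypothetical)

print(payback_years(retrofit_cost, annual_savings))  # 35.0
```

If the payback horizon runs decades past how long a typical owner stays in the house, the rational incumbent homeowner passes, no matter how sound the technology is.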
Multiply that logic by most people, and you get a lot of unemployed technicians. The greatest gains from weatherization would come in the oldest and most poorly maintained houses, but the people who live in those houses generally don't have the cash lying around for large-scale retrofitting. Even if they did, the chances of them getting back their investments over time are minimal; the rate of residential turnover is much faster than the payback time, and a 'green' house in a slum is still in a slum, and will be valued accordingly. (That's not even taking into account the reality that the lowest-income people tend to rent. In a rental, the cost of the improvement would go to the landlord, and the payback would go to the tenant. Not much incentive there for the landlord, as long as tenants don't put much stock in energy efficiency when they look for places to live.)
In other words, I can see a broad societal need for greater efficiency, and I can see a widespread need for good jobs. But I don't see why enough people will demand the service to make the jobs sustainable. Similarly, I can concede that 1,000 is a rounder number than 5,280, but I don't see the need to convert all the road signs from miles to kilometers. The gain doesn't seem worth the cost of change.
So we're preparing students for the jobs of the future, though they could really use jobs in the present, and it's not entirely clear just when that particular future will come. I can't remember the last time I measured something in centimeters; some futures take longer than others. The students can't wait that long, and there isn't much point in developing a workforce that can't find work.
Wise and worldly readers -- have you seen more success for Green Jobs programs in your area? Is there a way to thread the needle? Can we save Green Jobs from becoming the new metric system?
Countercyclical Hiring and Flight Risk
Too many of the arguments I've read and heard for hiring more full-time faculty rely on moralistic appeals. The idea seems to boil down to a simpleminded equation of "market" with "bad" and "tradition" with "good."
Moralistic arguments don't work because they solve the wrong problem. But there's a perfectly reasonable market-based argument for hiring full-time faculty right now: buy low, sell high. Great people have never been as undervalued as they are now; this is an unprecedented hiring opportunity.
In normal times, good candidates sometimes get passed over for posing excessive 'flight risk.' The idea is that tenure-track lines don't always get renewed when vacated, so you don't want to waste one on someone who will up and leave in a couple of years. (What this says about the idea of 'market as meritocracy' I'll leave as an exercise for the reader.) Search costs are substantial, and the risk of losing the position to the next round of budget cuts is ever-present, so some places will decide strategically to go for the very-good-but-not-best candidate on the theory that he'll stay if he gets the job. After all, if he leaves, the position may just get adjuncted out.
When the market is so completely dead, though, the candidate pool gets even stronger, and flight risk diminishes substantially. After all, if nobody else is hiring, where else is the slumming superstar going to go?
I've seen this on my own campus. We haven't been able to hire much lately, but we have hired for a few scattered positions, and the candidate pools have been off-the-charts amazing. The few new folks we've brought on recently have been absurdly great. I've actually used this as justification to reallocate resources to hiring more faculty, since now we'd have a realistic shot at people who normally would have been gobbled up elsewhere. While the moral argument is constant, the market-opportunity argument is uniquely strong now. If you hire when everyone else does, you're in a war for talent. If you hire when nobody else does, talent is in a war for you. Leaving morality out of it, the time when nobody else is hiring is exactly the time to strike.
Obviously, well-endowed private colleges have an advantage here. If your college is struggling to meet the payroll it already has, then the argument for countercyclical hiring is of no more than theoretical interest. But if there's any wiggle room at all, this is the time to use it.
Come to think of it, if more people figure out this strategy, it won't work as well. Humph. Never mind; as you were, everyone. Move along. Nothing to see here...
Tuesday, February 02, 2010
Ask the Administrator: Jobs in Education Reform
A dispirited correspondent writes:
I'm wondering if you have any words of advice for those of us who are interested in paying jobs in the field of educational reform. I need to earn a living, but I'd like to further the cause if I could. I have 20 years experience in the college classroom, and am currently full time and tenured at a community college, but I am more than ready to leave the classroom. It's painful to me to have to constantly "game" statistics on so-called student learning outcomes while dumbing down the curriculum ever more to improve our "retention" and "completion" rates. We are getting ready for our re-accreditation, so we are in "full assessment mode"; I find it harder and harder to cooperate with the fundamental dishonesty of the whole process. I'd like to leave and work for real reform and excellence in education. My Ph.D. is in one of the humanities, not ed leadership, and I don't have a math/statistics background, so perhaps I'm not the ideal candidate. Then again, who is?
My knowledge of this is pretty limited, so I'll ask my wise and worldly readers to chime in and fill out the picture.
I'll start with the obvious: if your local administration is pushing you to dumb down the curriculum in the name of retention, then your local administrators are idiots. The flaws in their strategy are several and basic. If you water down the degree, you'll lose transferability over time. If you water down the classes, it will become harder to maintain order in the classroom, since students will see no reason to take sanctions seriously. If you tell creative workers that their daily work should be entirely in the service of pleasing the customer, you'll actually get more displeased customers, because the quality of the work will suffer when their morale collapses.
Assessment is another matter, but I'll just say that if it's entirely dead weight, they aren't doing it right.
That said, I'll concede that many of my administrative colleagues seem to miss the big picture. (That's one of the reasons I've stayed in administration so long. I've seen the damage lousy admins can do, and I want to prevent it.) It sounds like you want to address the big picture. Getting paid for it is the tricky part.
Obviously, one way to do that is to go into administration yourself. That will take time and patience, though, since the first rung of the ladder involves far more trivia than thought.
Another way is to look at grant-funded programs. Philanthropic agencies usually have some sort of change or reform agenda; the trick is finding an agency with an agenda you find congenial, and that needs the skills you bring. One fairly common model is a grant that funds a partnership between a social service agency and a community college, usually teaching non-credit courses for targeted populations in specific niche occupations. Introducing yourself to the continuing-ed side of your college and expressing interest in working with them can open up opportunities in ways that are hard to anticipate.
Nonprofits and various NGOs often do the kind of work that seems to speak to you, though you'd have to figure out what your unique contribution would be. If it won't be on the financial or technical side, it could be on the fundraising or publicity side. I wouldn't expect to find a full-time, decent-paying job right out of the gate, but if you're willing to take baby steps, you might be able to find a niche.
Of course, there's always writing. That's my personal fave.
I'm reasonably sure that some of my wise and worldly readers have something to offer on this one, so I'll put out an open call. Does anyone know another way to make a living while fighting the good fight?
Good luck!
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Monday, February 01, 2010
Wisdom and Knowledge
In ninth grade, I had a wonderful, high-energy social studies teacher whose favorite exhortation was "wisdom and knowledge!" He'd usually punctuate it by thwacking his yardstick against a desk while we took notes. At the time, it was an entertaining shtick, and I didn't think much about the distinction between wisdom and knowledge.
With age and experience, though, I'm beginning to appreciate the difference.
Knowledge is cumulative. The more you know, the more you know. Knowledge can be stored, transmitted, shared, hoarded, traded, taught, learned, and even brandished. People who are "quick studies" can accumulate a surprising amount of it in a short time. Both trivia and timeless truths can be knowledge.
Wisdom is different. It's about knowing the relative importance of different things, and knowing what can be downplayed or ignored. It's closer to 'intuition' than to 'knowledge,' in that it's difficult to store, hoard, transmit, etc. It's hard to 'cram' wisdom, and some people just never quite get it. Wisdom can discern the difference between the important and the trivial, or the passing and the permanent. My grandfather dropped out of the ninth grade and worked most of his life as an electrical lineman, but he was wise. He knew what to care about and what not to, and allocated his energies accordingly. I always admired that about him. His house had the worst paint job in the Western world, and his fashion sense could be described as idiosyncratic, but he didn't care; he was focused on family, politics, and baseball, and the rest was trivia. He knew exactly who he was.
We could use some of that wisdom locally, in preparing next year's budget. There's plenty of knowledge flying around: the governor wants to propose figure x, various legislators are thinking figures d through j, and state tax revenue projections fall somewhere between 'ambiguous' and 'pulling numbers out of a hat.' If you want it, there's warrant for projecting any one of dozens of different outcomes, each with different implications for what we can afford.
Wisdom, here, comes in the form of knowing what to ignore. Some numbers are just wishful thinking or political posturing, and others are actually realistic. Discerning what to discount and what to take seriously is far more important than knowing the exact figure with which someone will grandstand.
The relevance now is that any tuition increase has to be approved by the Board of Trustees, and the Board will likely be more or less open to increases depending on what it expects will happen at the state level. If the best-case scenario comes to pass, then passing a tuition increase would just seem punitive to the students. If the worst-case scenario comes to pass and we don't have a significant tuition increase, heaven help us all. For various reasons, the Board has to make its decision before the state has to make its decision, so the game is "guess the gap." By necessity, that involves weighing unknowns, which means guessing likelihoods.
If we base the proposal we bring to the Board on the latest-and-most-inside knowledge, we take the very real risk of basing a crucial decision on a number that was never meant to be taken literally. (Initial proposals are often understood by those making them to be strategic, whether highball or lowball.) If we go in with too much optimism and get it wrong, we wind up in much worse shape. If we go in too pessimistic and get it wrong, we inflict needless pain on students, and damage our own future credibility. But the answer is probably not to be found entirely in more assiduous information-gathering. In fact, in many ways, the trick is to block out the more outlandish numbers, and to go with as realistic a projection as experience-based intuition can generate.
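For what it's worth, one crude way to formalize "blocking out the more outlandish numbers" is a trimmed median: throw out the extremes as posturing and take the middle of what's left. All figures below are invented for illustration.

```python
import statistics

def realistic_projection(projections, trim=1):
    """Drop the `trim` highest and lowest figures as likely posturing,
    then take the median of the remaining projections."""
    s = sorted(projections)
    core = s[trim:len(s) - trim] if len(s) > 2 * trim else s
    return statistics.median(core)

# Hypothetical state-aid figures (in $M): the extremes are grandstanding.
print(realistic_projection([2.0, 5.0, 5.5, 6.0, 12.0]))  # 5.5
```

No formula substitutes for experience-based judgment about which figure is a lowball opening bid, of course; this just captures the instinct to discount the loudest numbers.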
Wanted: budgetary wisdom. Apply within.