There was a time when I faithfully brought lunch to work. It was economical, and it saved driving, and it seemed vaguely virtuous. But I noticed, gradually, that never leaving campus made me batty. It felt like house arrest.
On the days when I leave campus for lunch – and honesty compels me to admit that that’s most days now, except for days with lunch meetings – I don’t get that ‘trapped’ feeling. Even the cafeteria doesn’t really do the job, since I’m still very much on stage there. I have to actually get in the car and drive someplace physically separate and distinct. Just changing scene – even if it’s only a mile or two away – keeps me sane.
Sanity is good – the world could use more of it, frankly – but money is money (and calories are calories). I can’t really bring a bagged lunch into a restaurant – they get kind of picky about that – and outdoor settings (parks) are weather-dependent. Since it’s been raining for what seems like years now, anything outdoors is either soaked or steamy. Neither particularly lends itself to going back to the office. And eating in the car is just sad.
As a card-carrying introvert, a little alone time at lunch helps me recharge the batteries. I know it’s anathema to admit that in a culture that uses ‘network’ as a verb, but it’s true. I’m more balanced and more sane in the afternoon when I’ve had a brief respite in the middle of the day.
So, a question for my wise and worldly readers. Surely some of you have faced similar dilemmas. Have you found good places for bringing lunch on the lam?
Tuesday, June 30, 2009
Monday, June 29, 2009
Umps and Academics
I just finished Bruce Weber’s new book, As They See ‘Em, which is about professional umpires. As a longstanding baseball fan, I found it a hoot, but I couldn’t help noticing a few oddly comforting parallels to the academic world.
Take, for example, this excerpt from an interview Weber did with Pat O’Conner, who was at the time the chief operating officer of Minor League Baseball. He was in charge of negotiating contracts with the minor league umps’ union. Minor league umps make ten to twenty thousand dollars a year.
“They are the first line of defense for the integrity of the game,” O’Conner said to me. “I respect the hell out of these guys. They’re doing something I couldn’t do.”
In that case, I said, why are they paid so poorly?
They aren’t, O’Conner said; there is simply a difference of opinion about their stature (sic) as employees. That is, the umpires think of themselves as being on a professional career path [to the majors], and the minor leagues consider them to be neither full-fledged professionals nor full-time employees. Rather, their time in the minor leagues, he said, is an apprenticeship; their contracts are for seasonal work.
“Our program is not designed for them to be able to live on their salaries for twelve months a year,” O’Conner said. “To want to change that is to change the financial underpinnings of the entire minor league system. That being said, not many employers pay seasonal workers what we pay them.” (p. 130)
This rings a bell…why does this ring a bell…think, think…
The book goes into tremendous detail about the long, hard path to the majors that umpires have to follow. Something about this passage sounded vaguely familiar, too:
Jimmie Lee Solomon, baseball’s executive vice president, who is himself black, acknowledged that the paucity of black umpires is a problem he’s determined to solve, though that won’t be easy. As of the 2008 season, in terms of seniority and experience, the next several Triple A umpires [the highest level of the minor leagues] in line for major league jobs were white, including at least three – Chris Guccione, Rob Drake, and James Hoye – who have worked more than five hundred games each in the big leagues [as subs]. If Solomon were to promote a black umpire ahead of them, several umpires – white umpires – told me, the resentment would be fierce. (pp. 296-7)
Hmm. A paucity of full-time positions leads to a backlog of very qualified and very frustrated applicants, making hiring for diversity even more politically charged than it otherwise would be. I’ve heard of that happening somewhere before…think, think…
Much of the book is primarily of interest to baseball fans, which is to be expected. (For fans of a certain age, there’s a laugh-out-loud funny explanation of the George Brett/Billy Martin ‘pine tar’ home run incident from the early 80’s.) But the collision of an overly long and inhumane training period with a clogged pipeline for good jobs was eerily familiar.
There’s even a variation on an academic freedom dispute. As longtime fans know, the strike zone defined in the rule book exists only in the rule book. In the early 90’s the strike zone the umps actually called on the field started to change, getting both shorter and wider. For a period in the mid-90’s, a pitch could be six inches off the plate outside and still get called a strike. (For my money, this was part of what drove hitters to go steroid-crazy in the late 90’s. They had to level the playing field.) Although the umps routinely denied that the zone had moved, they also rebelled mightily against any encroachment by the front office on how the strike zone was to be called.
The conflict came to a head in the early 2000’s, when Major League Baseball enlisted QuesTec, a computerized system for determining balls and strikes. Umpires were assessed on their outcomes; umpires who got ‘bad’ scores on QuesTec – that is, whose calls differed from the computer the most – were pressured to toe the line. (Interestingly, MLB had rejected a high-tech pitch simulator – like a flight simulator, but for calling balls and strikes – as part of umpire training. The use of QuesTec was punitive, rather than formative.) For several years, a war of attrition raged between the umps, who took the position that they knew better even if they all disagreed with each other, and the league, which insisted on a uniform zone.
Here, too, the parallels were striking (no pun intended). Outcomes assessment was resented by the longtime practitioners, who asserted unaccountability as a prerogative of their station. The league used assessment in a ham-handed and even backwards way, giving credibility to some of the worst fears of the practitioners. Over time, an uneasy truce evolved in which the very worst excesses on both sides were curtailed, but nobody could really say anything was better.
Hmm. I can’t put my finger on it…
If nothing else, it’s comforting to see that some of the more persistent and annoying dilemmas of academia aren’t unique to academia. At least we don’t have our mistakes replayed endlessly on national television.
Friday, June 26, 2009
Friday Fragments
- I don't often get excited about amending forms, but if President Obama is able to simplify the FAFSA in a meaningful way, I say, Hooray! The FAFSA is the form that students and prospective students have to fill out to apply for Federally-backed financial aid, and it's worse than the 1040. It's just horrible. Kafka would have considered it over-the-top. I'm not a believer in the "it should fit on a postcard" theory, but surely there's middle ground between a postcard and a dissertation. I wouldn't be at all surprised to see a much simpler FAFSA result in more completed applications from first-generation students and students whose first language isn't English. You pretty much need a graduate degree to navigate the flippin' thing. And even for those who survive the present form, surely reducing their time spent on paperwork by a few hours is a good thing.
- TW and the kids are out of town for a few days -- not in Argentina, happily -- so I'm doing the temporary bachelor thing. It's amazing how quickly old habits come back. (She would probably use a term like "regression.") Having the house to myself is a lot of fun for the first hour or so. Then it starts to get lonely. I also hadn't fully appreciated the appetite-suppressant role the kids play. When they're gone, I eat like I'm preparing to hibernate. It's a good thing they won't be gone long, or I'd have to start buying all new clothes.
- Doing employee evaluations sucks. I will offer no further details on that.
- Farrah Fawcett and Michael Jackson were both major figures for other people my age, but neither of them meant much to me. She seemed harmless enough, but I was never a fan. (When she hit big, I was all about Lynda Carter.) And MJ went from 'oddity' to 'criminal' in my mind some time ago, when the pedophilia became too obvious to ignore. The only celebrity death I recall really affecting me emotionally beyond the initial 'that's too bad' was Kurt Cobain. I remember my Dad's reaction when Elvis died, and I remember a whole bunch of people around me being sad when John Lennon was killed, but those were both more salient to previous generations. Cobain and I were nearly the same age, and I remember really admiring the way he embodied the contradictory impulses towards both cynicism and hope that were very real to me then. He also had a contrarian sense of humor that I found refreshing. When Courtney Love read his suicide note over loudspeakers to a crowd outside their Seattle house, doing a running commentary as she read, I was riveted. Shortly after his death, I remember Andy Rooney dismissing his depression as a sort of affectation; that was the last time I paid attention to Andy Rooney. Since then, I've cut Courtney Love more slack than she probably deserves, but hey.
- Along the lines of age and generation, I've hit the age at which I'll hear baseball players' names and immediately recognize them as "that's so-and-so's kid." Last week I saw part of a Brewers game, and during the few minutes that TW watched with me, Prince Fielder came to bat. I let slip something like "he looks just like his father," which elicited a pretty good eye-roll from TW. I haven't yet hit the "get off the lawn!" stage, but it's probably inevitable.
- TB's end-of-year report card was a smashing success. Among the piles of school detritus he brought home were several notebooks' worth of stories he had written while waiting for others to finish assignments, and a pair of mash notes from girls in his class. He's mostly excited that now he gets to stay up later reading his books. We've got some summer stuff lined up, but we're building in some "figure out for yourself what to do" time, too. I suspect he's up to it.
Thursday, June 25, 2009
A Real Forehead-Slapper
Why do so many states require only two years of math in high school?
In a discussion this week about the struggles we have with developmental math classes, someone mentioned that this state, like so many others, requires only two years of math in high school. That means that even many brand-new grads come to us not having done math in two years, and having stopped out before they even got to trig. Then everyone is shocked at low pass rates in developmental math.
We have anecdotal evidence that suggests that students who actually take math for all four years of high school do better in math here than those who don't. We also have anecdotal evidence that bears crap in the woods. Why the hell do the high schools only require two years of math?
Wednesday, June 24, 2009
An Unmarked Car
Many years ago, in one of those gender theory seminars, I remember a remark to the effect that men have the privilege of being able to choose to dress 'unmarked' in a way that women don't. The idea was that American culture had settled on several different uniforms for men, depending on the context, and that men have the option of wearing those uniforms if they want to fit in and not draw particular attention to what they're wearing. Since there isn't a similar understanding of a uniform for women, women have to make conscious decisions about how they dress (and others feel free to draw conclusions about them based on those choices). They don't have a 'default' option the way men do, and they don't have the option of not calling attention to what they're wearing.
(Whether that's still true for women, I'll leave to the collective wisdom of my wise and worldly readers.)
There was enough truth to that for it to stick with me. At work, I can wear, say, a gray suit, and be both situationally appropriate and utterly impersonal. On dress down days, the alternate uniform of tie-less Oxford and khakis (or a close variant) gets the job done. There's nothing terribly interesting about either ensemble, but that's precisely the point. I don't have to think about them, and neither does anybody else. They're like driving unmarked cars. I go where I want without calling undue attention to myself.
Except that they aren't. Over the last couple of weeks, on three separate occasions, I've run into people from the college out in the world, and they've all had the same reaction. "I didn't recognize you without the suit."
Hmm. If the markings were truly neutral, that wouldn't happen.
Uniforms carry meanings of their own, of course. Although it's somewhat dated, I still sometimes hear Administration referred to as "the suits." (For the record, academics don't wear suits quite the way businesspeople do. On the milder side, we blow off the "button-down collars are for sport jackets" rule, which is fine by me. On the more severe side, well, let's just say that some of us need Garanimals sewn into our clothes, and some have apparently never heard of 'ironing.') But even allowing for that, it's still striking to be told, repeatedly and in apparent sincerity, that the suit simply erases the person. I can't blame it on what I was wearing in civilian life, either -- it's not like I put on a spiky Goth number and pasted a Mohawk toupee over the bald spot. I was just dressed like a suburban dad, which, in fact, I am.
The civilian clothes carry markings of their own, admittedly. At TG's preschool graduation, I saw another Dad in a jumpsuit with his name sewn on a patch. I was doing the Oxford-and-khakis thing. It wasn't hard to guess who had the office job. But even allowing for that, it's not like I was somehow out of character when I wasn't recognized.
There's unmarked, and then there's unmarked. The late Mitch Hedberg once theorized that the reason all those photos of Bigfoot are blurry is that Bigfoot himself was blurry. Maybe the clothes carry meaning, and I'm just indistinct.
Hmm.
Wise and worldly readers -- has something like this happened to you?
Tuesday, June 23, 2009
Delicate Cutters
Okay, I’m a little late to this one, but there’s a nifty exchange between Notorious Ph.D. and Historiann about the perversities of budget cuts at their respective institutions. The comments are worth reading, too. What starts as a fairly standard-issue set of complaints about budget cuts sort of backs into a thoughtful discussion of job expectations and reciprocity in the workplace.
Among other things, it makes me glad that I work where I work. We have our financial issues, God knows, but we haven’t done anything as drastic and destructive as furloughs or salary cuts, let alone layoffs. We’ve cut travel and release time, as well as a whole bunch of back-office expenses that faculty tend not to notice but that actually matter quite a bit. We have a salary freeze, which is annoying, but nowhere near as annoying as furloughs or cuts. (For the record, cuts are worse than furloughs. Future salary increases are percentages of base pay. Furloughs don’t affect base pay, but cuts do. You won’t see a difference at the time, but you’ll see it down the road.)
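To put rough numbers on the furlough-versus-cut point, here's a minimal sketch in Python; the $60,000 salary, 3% raises, and 5% cut are invented for illustration, not figures from any actual college.

```python
# A furlough docks one year's take-home pay but leaves base pay intact;
# a cut lowers base pay, so every later percentage raise compounds on
# the smaller figure. All numbers are invented for illustration.

def project(base, annual_raise, years, cut_pct=0.0):
    """Year-by-year base salary after an optional up-front cut."""
    salary = base * (1 - cut_pct)
    history = []
    for _ in range(years):
        history.append(round(salary))
        salary *= 1 + annual_raise
    return history

furloughed = project(60_000, 0.03, 5)            # base pay untouched
cut = project(60_000, 0.03, 5, cut_pct=0.05)     # 5% cut to base pay

for year, (f, c) in enumerate(zip(furloughed, cut), start=1):
    print(f"Year {year}: furlough base ${f:,} vs. cut base ${c:,}")
```

Both salaries grow at the same rate, but the cut one starts lower and never catches up; that's the difference you see down the road.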
(Also for the record, while I’m at it, I recall that one of the selling points of the 401(k) or 403(b) was supposed to be dollar-cost averaging during market dips. With so many companies and some colleges suspending matching contributions to retirement accounts, that argument has been conclusively discredited. Averaging doesn’t help you if you don’t get the money to buy low. If you only get matching contributions during high times, then by definition, you’re buying high, which is a loser’s strategy.)
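The same point in toy form: dollar-cost averaging only works if you're still buying during the dips. The share prices and match amount below are made up for the example.

```python
# A fixed contribution buys more shares when prices are low. If the
# employer match is suspended during exactly those low periods, the
# match only ever buys near the highs. Invented numbers throughout.

prices = [10, 6, 5, 7, 12]   # a hypothetical market dip and recovery
match = 100                  # hypothetical employer match per period

good_times = [p for p in prices if p >= 10]

shares_all = sum(match / p for p in prices)
shares_good = sum(match / p for p in good_times)

print(f"Avg price paid, matched throughout:    ${match * len(prices) / shares_all:.2f}")
print(f"Avg price paid, matched in good times: ${match * len(good_times) / shares_good:.2f}")
```

Matched straight through, the average purchase price lands near the bottom of the range; matched only in good times, it lands near the top, which is the buying-high problem in a nutshell.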
Working at a community college makes some dilemmas easier. Although we have a tenure system, we don’t have a research expectation. That means that cutting travel funding or course releases may be frustrating and annoying, but it doesn’t directly threaten anybody’s ability to earn tenure. Tenure is earned by teaching well and by doing enough college service to carry your weight. The cuts we’ve enacted, as distasteful as they’ve been, haven’t threatened either of those. The same could not be said of, say, eliminating research leaves for junior faculty while leaving the publication requirement intact.
Were I in a similar position at a college with a serious publication requirement for faculty, I’d advocate adjusting tenure expectations to match available resources. Research expectations for tenure have ratcheted so comically high for so long that a little downshifting wouldn’t hurt, and it would carry the added virtue of basic fairness. (At least, fairness within the confines of the tenure system itself, but that’s another post.) It would also allow some recognition, albeit unintentional, of the complete collapse of scholarly publishing. How, exactly, you’re supposed to get published when the presses are closing and you aren’t able to travel is beyond me. A belated recognition of reality is better than no recognition at all.
Here, my version of that is leaving class sizes alone. Even though there are obvious short-term savings to be had by stuffing the classrooms fuller, it strikes me as watering down our core function. Letting go of a special project is one thing; letting go of attentive teaching is something else altogether. So pet projects come in for more scrutiny than usual, and many good ones don’t make the cut. But English comp doesn’t get any bigger, and nobody’s quest for tenure is preemptively doomed. First things first.
Wise and worldly readers – have you seen cuts on your campus that strike at the heart of the mission, or that make tenure effectively impossible?
Monday, June 22, 2009
Passages
Last week was a three-hanky special.
The Girl had two graduations: one from her gymnastics class, and one from her preschool. And I had my first Father's Day since Dad died.
The gymnastics class was easy. They had a little performance for the parents, complete with loud music and bright outfits. (TG rocked the blue tutu.) Parents were everywhere, wielding all manner of camera and video technology. (I brought the 'flip video' thing, which is about the size of a pack of cigarettes. I'm just old enough to think of James Bond when I use it.) TG did some balance-beam walking and a few tuck-and-rolls. Actual exchange:
The Wife: Why do they call it a tuck and roll? Why not just call it a somersault?
TG (slowly): Because first you tuck, and then you woll.
So that's that.
The preschool graduation was harder on the parents. They did a little slideshow of highlights from the year, complete with a soundtrack of sad ballads; by the end, the parents were reduced to quivering piles of jello. (The highlight for us was a picture from the day The Boy came in to read the class a story. He must have been memorable, because as soon as that picture came up, several other kids yelled “that's TB!”) The kids also did a few songs and skits, with construction-paper props and lots of percussion instruments.
Even at this age, you can see distinct personalities in each kid. One kid was the class ham. Another was utterly terrified of going up there. A few of the girls are already a little princess-y, and some of the boys were a little more rough-and-tumble than others. TG did her parents proud, holding her ground and not getting distracted. The teachers have commented before that she's the moral compass of the class, which I don't mind admitting pleases me endlessly.
The contrast between the parents and the kids was striking. For the kids, it was just another day, albeit with parents there. For the parents, it was tears and hugs and frantic exchanges of phone numbers. TG was more focused on the cake than on anything else. Bless her, she has no idea why we had such a hard time.
Father's Day was mostly lovely, with some sadness around the edges. The kids made crafts, which they presented while beaming with well-earned pride. TW took us out to brunch, and I got a couple of books I'd been looking forward to reading.
Over the last week or so, leading up to Father's Day, I couldn't help but think about Dad. It wouldn't be in long, focused ruminations, just the occasional thought that would give me pause.
The one that really threw me was when I realized that with both grandfathers and now Dad gone, I'm the oldest male in the family. I didn't expect that to happen just yet. Growing up, I had Dad, but I also had my Grandpa (on Mom's side) as a sort of role model. Now, it's just me. TB does have his Grandpa on TW's side, which helps tremendously, but this was the first Father's Day I didn't have to shop for.
That's not a crisis, it's not unique, it's not anything that plenty of others haven't gone through. I get that. It's just a little tough to reach the end of Father's Day and to realize, for the first time, that there isn't a phone call still to make.
Sorry to get maudlin. I'll get back to analytical/ironic tomorrow. I just couldn't do it today.
Friday, June 19, 2009
Ask the Administrator: Picking Winners
A new correspondent writes:
With all this talk about green jobs and the more than usual uncertainty about the shape of the future job market, I've been curious of late about how community college deans, departments, and counselors cope with the issue of occupational forecasting.

While I'm guessing CCs in Arizona are expanding their solar installation programs and CCs in North Dakota focus on wind turbine construction and maintenance, it strikes me that in much of the country (I could be wrong on this) there may be a great deal of uncertainty about where the jobs of the future will come from.

If that is the case, I'd be real curious as to how decisions are made both within departments and across colleges as to which programs to expand and promote to students and which to reduce and/or cut.

Any thoughts on this? Are there clear local occupational forecasts? Is there a clear process as to how those decisions are made and processed? I'd love to hear your thoughts on this.
It's a great question; I wish more people would ask it. (Arne Duncan, I'm thinking of yoooouuuu...)
I'll admit to considerable uneasiness anytime I hear arguments like “X is the wave of the future. We need a program to prepare students for all those jobs!” Partially that's because I entered grad school in the early 1990's, prepared to capitalize on the Great Wave of Retirements; we all know how that wave turned out. Partially it's because I worked at Proprietary U during and after the dot-com boom, so I saw an entire industry go from “desperate for talent” to “desperate to survive” almost overnight. Partially it's because I'm seeing our Nursing grads suddenly struggle to find work, after many years during which new grads could write their own tickets. And partially it's because so many of the giant corporations of my youth are unrecognizable now, if they still exist at all. (Government Motors? Really?)
If I knew what the hot industry would be five years from now, I'd buy stock in it. I don't, and neither does anybody else. I read somewhere that at Clinton's economic summit in 1992, nobody used the word “internet.” (You'd think Al Gore would have!) Back then, Kodak thought its major competition was Polaroid. Remember Polaroid? Hell, remember Kodak?
At the root of my unease, I think, is the constant conflation of 'job training' with 'economic development.' They are not the same thing. In fact, they can actually be in conflict with each other.
'Job training' is very short-term and specific. It's teaching someone how to do basic tasks for a particular job, often with a particular employer. It usually leads to relatively entry-level work in industries that require more education to move up. The idea is to give people on the economic margins a quick path to a paycheck. It fits people for slots that already exist.
And that's where it hits its limits. It works only to the extent that it fits the jobs that actually exist. If the jobs aren't out there, the training doesn't amount to much.
Economic development doesn't result from filling pre-existing slots. It results from creating new ones.
Creating new ones requires people with initiative, some business know-how, drive, creativity, and great communication skills. It also requires time, access to capital, and some kind of safety net for failure. Although any given business can succeed abruptly, the payoff from an educated population accrues slowly and in the aggregate. It doesn't appear in statistics done six months after graduation.
Today I heard rumors of a forthcoming announcement from the Obama administration for more money for job training programs at community colleges. If anyone up there is listening, please please please keep in mind that the old training model doesn't fit large chunks of the new economy. In reality, the boundary between 'training' and 'transfer' is blurring, since more jobs require more education than they used to. And for long-term growth, as opposed to short-term patching, training isn't close to the answer.
Back at Proprietary U, we graduated gazillions of students into an industry that barely exists anymore. The only courses they took back then that are still relevant, oddly enough, are the general education classes. Industries come and go, but the basics – the ability to synthesize information, to connect the dots, to communicate – endure. Those are job skills. Let's not funnel the resources away from the source of actual long-term growth, in hopes of training more call center reps. Those can, and will, be outsourced.
My proposal for long-term prosperity: combine an educated population with national health insurance (since going without health insurance is a colossal barrier to starting a new business) and a focus on providing the kinds of public goods that lead to all manner of positive externalities – basic research, mass transit, that sort of thing. If that sounds a bit Scandinavian, well, Norway and Sweden aren't doing too badly these days. Iceland followed our model instead, and effectively collapsed. In places with plenty of smart people running around, where the cost of failure isn't so awful, it's not shocking that Nokias and Ericssons pop up. Here, we get Wal-Mart. We can train people to work at Wal-Mart, and there may be times when that's the least-bad short-term option. But it's not the same thing.
Now, to answer the actual question.
On the ground, we pick programs based largely on either needs expressed by local employers, or the availability of grants. Neither is perfect, but they're what we have.
Thanks for the question! I hope the Obama administration uses its initiative wisely.
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Thursday, June 18, 2009
Reality Show?
Several commenters have recently suggested a Dean Dad reality show. It's not gonna happen – the pseudonym would be pretty much shot, and cameras don't do me any favors these days – but it's fun to think about.
The pilot episode:
Opening shot – the family at dinner.
(Farting sound)
TB, TG: DADDY!
DD: I didn't do it.
TW: Yes, you did.
(Cut to individual interviews)
DD: I really didn't.
TG: Silly Daddy.
TW: That's disgusting.
TB: (guilty smile)
montage, pop song, commercial break
Sweeping shot of college campus, students milling around, etc.
Shot of DD staring intently at computer screen.
Other dean walks in.
OD: Do you have that list of programs that haven't done their annual reviews yet?
DD: I think so. Let me check.
Shot of DD typing intently at computer. Montage, pop song, commercial break.
Return from break. Shots of autumnal campus, attractive students, city buses.
Speeded-up footage of middle-aged people walking into meeting room. Repeat three times.
Speeded-up footage of us walking out. Speeded-up music on soundtrack. Commercial break.
Return from break. Emperor's Theme from Star Wars plays against shot of office.
Kim Kardashian enters.
KK: I'm Kim! I just decided spontaneously to show up here unscripted for no particular reason!
DD: Who the %&^#(*)% are you?
KK's cell phone goes off. Ringtone of current pop hit. She leaves.
DD makes Jim Halpert face at camera. Montage, pop song, closing credits.
Scenes from next week:
Other Dean: Could you believe that?
Cut to DD, looking nonplussed. Angry faux-punk soundtrack.
Yeah, that should work. There's nothing teenagers like more than watching balding middle-aged people in offices. What could possibly go wrong?
Wednesday, June 17, 2009
The Holding Tank
Our summer enrollments are breaking records, and anecdotally, that seems to be a function mostly of two variables: a lack of summer jobs, and a more transfer-focused student body. Both of those variables largely track the recession.
In discussions of the cost of higher education, 'opportunity cost' comes up a lot. Basically, it's the money you would have made during the time you spent taking classes, had you worked instead. The kid who goes straight to work at 18 is probably more flush at 21 than the kid who went straight to college, since he didn't suffer the opportunity cost. Over time, there will usually be a more-than-compensating difference in future earnings, but at the moment, it's no contest.
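A back-of-the-envelope version of that comparison, with invented wages and with tuition and loan debt left out entirely:

```python
# Opportunity cost in miniature: the early worker is ahead at 21 because
# the student earned nothing for several years; a durable wage premium
# eventually flips the comparison. Both wages are hypothetical, and
# tuition and debt are ignored to keep the sketch simple.

work_wage = 25_000   # hypothetical pay straight out of high school
grad_wage = 35_000   # hypothetical pay with a degree, starting at 22

worker = student = 0
for age in range(18, 33):
    worker += work_wage
    if age >= 22:
        student += grad_wage
    if age in (21, 25, 32):
        print(f"Age {age}: worker ${worker:,} vs. graduate ${student:,}")
```

At 21 it's no contest in the worker's favor; around age 31 the graduate's cumulative earnings pull even, and the gap keeps widening from there.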
I've been hearing complaints of a terrible lack of summer jobs, at the exact same time that I've been noticing record enrollments in our summer classes. In a perverse way, the Great Recession has effected a huge discount in the opportunity cost of education. If the choice is between work and school, that's one thing. If the choice is between unemployment and school, that's something else.
While I'd like to believe that the major driver of new enrollments is a general cultural enlightenment, it seems likelier that the 'holding tank' function of college is what's really at work. What better time to get the sheepskin than when there aren't any real jobs to be had anyway? You aren't missing anything. And if you time it right, you might emerge with a credential just as the market picks up again, giving you a ticket to ride the updraft in a way you couldn't otherwise.
In the US, the way we count 'unemployment' doesn't always match real life. ('Discouraged' workers don't count, for example, even though their unemployment is the source of their discouragement.) But one way it sort of matches real life is with students. We don't consider students to be unemployed, even if they're looking for work, and there's some validity to that. Studenthood is a kind of economic limbo. It's neither employed nor unemployed; it's just sort of hovering outside the market. Even for people from relatively moneyed backgrounds, there's an acceptance of 'student poverty' as a life stage. It's a phase during which poverty isn't held against you culturally or psychologically. Being unemployed and poor at 30 may feel like a verdict; being a student and poor at 20 is just following the script.
(Graduate student poverty falls between the two. It's still 'student,' and still expected at some level. But it's also at a later stage of life, and what seems cute at 20 just feels sad at 27. By the latter part of my grad school trek, the economic gap between me and most of the rest of my age cohort started to get pretty depressing. TW and I met when I was in grad school. After we got married, she confided that when she first saw my Gradmobile, she started to wonder. I couldn't blame her.)
Trading unmoored poverty for student poverty makes a lot of sense, even if you're still basically broke. It's a different script, and it offers more hope for a happy ending. Besides, student loans are much better deals than credit card debt, and they come with deferments tailor-made for recessions. They even cover health insurance, unlike most entry-level jobs. As holding tanks go, this isn't bad.
I just hope the recession breaks soon enough for everyone to pay back those loans. If not, we'll be hearing about the student loan bubble in a couple of years.
In discussions of the cost of higher education, 'opportunity cost' comes up a lot. Basically, it's the money you would have made during the time you spent taking classes, had you worked instead of taking classes. It's the cost of money you didn't make. The kid who goes straight to work at 18 is probably more flush at 21 than the kid who went straight to college, since he didn't suffer the opportunity cost. Over time, there will usually be a more-than-compensating difference in future earnings, but at the moment, it's no contest.
I've been hearing complaints of a terrible lack of summer jobs, at the exact same time that I've been noticing record enrollments in our summer classes. In a perverse way, the Great Recession has effected a huge discount in the opportunity cost of education. If the choice is between work and school, that's one thing. If the choice is between unemployment and school, that's something else.
While I'd like to believe that the major driver of new enrollments is a general cultural enlightenment, it seems likelier that the 'holding tank' function of college is what's really at work. What better time to get the sheepskin than when there aren't any real jobs to be had anyway? You aren't missing anything. And if you time it right, you might emerge with a credential just as the market picks up again, giving you a ticket to ride the updraft in a way you couldn't otherwise.
In the US, the way we count 'unemployment' doesn't always match real life. ('Discouraged' workers don't count, for example, even though their unemployment is the source of their discouragement.) But one way it sort of matches real life is with students. We don't consider students to be unemployed, even if they're looking for work, and there's some validity to that. Studenthood is a kind of economic limbo. It's neither employed nor unemployed; it's just sort of hovering outside the market. Even for people from relatively moneyed backgrounds, there's an acceptance of 'student poverty' as a life stage. It's a phase during which poverty isn't held against you culturally or psychologically. Being unemployed and poor at 30 may feel like a verdict; being a student and poor at 20 is just following the script.
(Graduate student poverty falls between the two. It's still 'student,' and still expected at some level. But it's also at a later stage of life, and what seems cute at 20 just feels sad at 27. By the latter part of my grad school trek, the economic gap between me and most of the rest of my age cohort started to get pretty depressing. TW and I met when I was in grad school. After we got married, she confided that when she first saw my Gradmobile, she started to wonder. I couldn't blame her.)
Trading unmoored poverty for student poverty makes a lot of sense, even if you're still basically broke. It's a different script, and it offers more hope for a happy ending. Besides, student loans are much better deals than credit card debt, and they come with deferments tailor-made for recessions. They even cover health insurance, unlike most entry-level jobs. As holding tanks go, this isn't bad.
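To put rough numbers on the 'better deal' claim, here's a toy compound-interest comparison. The 6% and 20% rates are assumptions I picked to make the shape visible, not quotes of anybody's actual terms, and it ignores that some subsidized loans don't accrue interest in deferment at all.

```python
# Toy comparison of debt growing untouched for a few years, as it might
# during a deferment. Rates are assumptions, not actual loan terms.
def balance_after(principal, annual_rate, years):
    """Balance with annual compounding and no payments."""
    return principal * (1 + annual_rate) ** years

debt = 20_000
print(f"Student loan at 6%, 3 years:  ${balance_after(debt, 0.06, 3):,.0f}")
print(f"Credit card at 20%, 3 years:  ${balance_after(debt, 0.20, 3):,.0f}")
```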
I just hope the recession breaks soon enough for everyone to pay back those loans. If not, we'll be hearing about the student loan bubble in a couple of years.
Tuesday, June 16, 2009
The World Won't End in 2012
We're pretty sure it'll be fiscal year 2011.
This article struck a chord, since I've been hearing the exact same thing on campus.
State and local tax receipts have been plummeting for some time, and still are. (The rate of decline has slowed a bit, but in absolute terms, the trend is still downward.) For this fiscal year and next, we have stimulus money to cushion the decline. We're being cautious with how we use it, since we know it's non-recurring, but at least it's there to give us some time to think through what we're doing.
For FY 2011 – meaning the fiscal year that starts July 1, 2010 – we're told the stimulus will be gone, at least from higher ed. And the odds of tax receipts having bounced back to 2007 levels by then are vanishingly low. I've been hearing phrases like “that's when the wheels fall off” and “that's when we hit the wall.”
I've seen three different reactions to the foretellings of doom, each rational in its own way.
One is simple denial. The future is unreadable, a year ago we thought we had money, who knows what might happen in another year? Besides, you people were all gloom-and-doom-y this year, and a pile of money just landed on us. All will be well, this too shall pass, so stop crying wolf. Use the stimulus money to compensate for this year's cut, and leave the future to the future.
Psychologically, it's understandable, but it's also incredibly dangerous. GM used this strategy for many years, and actually caught a few breaks along the way. But sooner or later the money fairy doesn't drop by anymore, and the truth hits. I'd like to avoid going the way of Pontiac.
The second I'd describe as “smoke 'em if ya got 'em!” I've heard some intelligent people argue that we should treat the coming year as a sort of Fat Tuesday, a last blowout before a long dry spell. (The article's mention of large numbers of administrators planning their own retirements to coincide with the end of the stimulus package strikes me as consistent with this. Stick around for the party, then go home.) If we're looking at several lean years to come, and possibly a lower baseline for many years beyond that, then let's fund sabbaticals and pet projects and Special Events while we still can. Get while the getting's good, because who knows when it'll be good again?
Again, there's some truth to this, but it strikes me as basically fatalistic. A year goes surprisingly fast – any parent of young children can attest to that – and then what? Squeezing off one more hit is not a plan.
The third, which I consider the best of a bad lot, takes the stimulus money as an opportunity to pay for things that lower the college's long-term operating costs. Money spent on, say, energy efficiency lowers our baseline expenses in future years, making it slightly easier to weather future cuts.
Psychologically, this is the eat-your-vegetables solution. It's unsatisfyingly pedestrian. Emotionally, it just doesn't match the scale of what we're facing. But it makes sense. Unlike the other two, it cuts future expenses. Better, it cuts them in ways that don't compromise our mission, and that can be sustained over time. If the predictions of gloom and doom turn out to be overstated, then the future windfall can go into cool stuff. If the predictions are spot-on, then the lower operating costs will at least cushion the blow. Unlike the other two, it doesn't rely on either luck or omniscience to bail us out.
Even if the third option works, though, the payoff in future savings won't be anywhere near the magnitude of the cuts we're hearing bandied about. This makes the sell harder, since even success would result in, at best, a mild palliative. Saving a couple hundred thousand is great, but if the state cuts several million, we're still in for a world of hurt.
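The mismatch in scale is easy to see with made-up numbers; neither figure below comes from any actual budget.

```python
# Hypothetical figures only: even a successful efficiency project
# covers a small fraction of a cut an order of magnitude larger.
annual_savings = 200_000     # assumed payoff from stimulus-funded projects
state_cut = 3_000_000        # assumed recurring cut in appropriations

gap = state_cut - annual_savings
print(f"Savings cover {annual_savings / state_cut:.0%} of the cut; "
      f"${gap:,} still has to come from somewhere.")
```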
I'm still hoping that the economy turns around faster than anyone expects, that tax revenues skyrocket, that we get the hell out of Iraq and realize the savings from single-payer health care. But the odds of all of that coming to pass before next July are not encouraging.
I'm too young to retire, so that's out. Maybe I can fall back on the relative safety of show biz...
Monday, June 15, 2009
Salaries – Public or Private (or both)?
The indefatigable Lesboprof has a thoughtful post up about whether salaries should be public information. She makes several great points in favor of publicity, including preventing discrimination and giving rookies a fair sense of the going rate.
I agree, but will take it even farther.
One really basic benefit of publicity is that it will frequently put the lie to the tiresome claims of 'bloated administrative salaries' that usually constitute the first salvo in academic politics. I make substantially less than my predecessor did several years ago, and that was true at my previous job as well. My counterparts here also make far less than you'd expect, given their qualifications, performance, and scope of responsibility. (This year's raise: 0.) Put that out there, and put the finger-pointing to rest. Structural problems are structural, not personal.
At colleges where that isn't the case – where the bloat is actually real – then shedding light can only help. It's a win-win either way.
(The only level at which this falls apart is with Presidents, since they typically get some substantial portion of their compensation in 'allowances' for housing, a car, etc. I'll admit not quite understanding this, since it seems bound to lead to issues. I'd rather take the equivalent in salary – even with the tax hit – to have the privilege of being able to stop for milk on the way home without filling out an expense report, or of being able to paint the flippin' living room without anybody's permission. But that's me.)
Public salaries also make it much harder for cowardly administrators to cut side deals. On behalf of those of us who are actually trying to do the right thing, this is good news. Yes, there are currencies other than money – course releases, office locations, travel money, etc. – but taking a really big one off the table can limit the abuses. Since public institutions aren't publicly traded, there's no issue of stock options substituting for salary, which is what led to so many abuses elsewhere. My cc doesn't, and couldn't, issue stock. What you get is what you see.
Public salaries can also serve as useful counterarguments to those in the popular press – I won't name any names here – who like to claim that academics are getting fat at the public trough. Look at what people actually make at the cc level. With a few exceptions in some very specific regions of the country, these numbers don't suggest any kind of boondoggle. If anything, including adjuncts in the overall list – I'd insist on that – should give a sense of just how inexpensively cc's generally are run. Yes, some of the four-year and graduate institutions might rather sweep that particular fact under the rug, but it's true. Cut our budget, and we start cutting functions.
At a more fundamental level, though, I'd love to get past the idea of academia as some sort of calling, and recognize that it's a job. Treat it as such. The 'calling' idea, I think, is part of why so many adjuncts allow themselves to be exploited for so long. They just can't imagine doing anything else, and/or don't want to admit defeat. (A calling is supposed to be deeply personal. If you can't get anywhere in your calling, what does that say about you? Rationally, that's crap, but psychologically, it's powerful.) As long as they hold pre-modern, romantic notions of the 'profession,' they're ripe for the picking. We need to disenchant the job, which means, among other things, putting it all out there. Yes, that may have a depressing effect on graduate school admissions. That would be a sign of success. The goal here is to stop talented young people from throwing themselves into the sausage grinder. Warning them upfront that even a 'win' – a tenure-track job – isn't all that much of a win economically might just dissuade some, which can only help.
I'll go farther, though. It's ludicrous that public institutions should be the only ones with open books. My modest proposal: open salaries by law for every employer in America. Let's see where the real bloat is. Hint: it ain't community colleges.
Lesboprof's arguments seem to me just as valid for the private sector as for the public. Rookies should be able to learn the going rate, and discrimination shouldn't be able to hide behind a corporate veil. In an era of government bailouts, the idea that professors making $45,000 a year are open to scrutiny but bankers making a dozen times that, aren't, is insane. Let's see where all the money is going – not just public sector money – before we start judging just exactly who's exploiting whom.
I've long suspected that the taboo against talking about salaries served certain interests over others. Here's a chance to see. How much does the talking head on Fox News get? How much does my functional equivalent at an HMO get? No more of this 'selective transparency' crap. Let's get it all out there, and have a real discussion about priorities.
Transparency is great, but not just for the public sector. I'm tired of uberwealthy commentators cherry-picking the occasional anomaly from the public sector for political purposes, while remaining immune from scrutiny themselves. Fair is fair. Open the books, and let the chips fall where they may. Let everybody get a sense of the going rate. Then let the real debate begin.
Friday, June 12, 2009
The Seven Secrets of...
Every summer, blog traffic slows down a bit, which usually occasions some reflection on better marketing. So, some musings on titles...
“The” is so much more powerful than “a,” or even a number. Compare:
A List of Ways to Run Better Meetings
Okay, it's clear, but I'm bored before I even get to the end of the phrase.
7 Ways to Run Better Meetings
The '7' suggests at least some level of discipline (or finitude), but it still lacks a certain urgency.
The 7 Secrets to Better Meetings
Much better. Did your tricks make the cut? Are there really only seven? Such authority! Such confidence! The “The” implies finality, and invites guessing, which is a form of buy-in. And 'secrets' is so much better than 'ways.' Urgency and mystery! And anyone who knows 'secrets' must have special access to something!
The definite article (“the”) somehow lends both gravitas and urgency, which is a neat trick. It inspires a certain faith that the author actually knows something about something. Is the faith warranted? If you actually want to know, you're already hooked.
Of course, there's still the limiting factor of 'better meetings' as a topic. Since 'meetings' are kind of business-y, and the default business culture is male, it's useful to throw in gratuitous celebrity references from a category that officially appeals to men: athletes, the powerful, and hot chicks.
Joe Torre's 7 Secrets to Better Meetings
Okay, cool, there's a guy-approved celebrity with a track record in a guy-approved field, and I've still got the 'secrets' thing going. I've also got a bulletproof excuse for lots of anecdotes about famous people. In old media -- books -- this would be enough. But in internet land, if you really want traffic, it's all about the hot chicks.
Audrina Patridge Shares the 7 Secrets to Better Meetings
Close. Almost there. But there's still that 'better meetings' angle. Word order may help.
The 7 Secrets to Meeting Audrina Patridge
Okay, I'm in the ballpark, but it's a little stalker-y. I'm trying to generate readers, not restraining orders. A subtle change should do it.
Audrina Patridge's 7 Secret Meetings Revealed!
Perfect. Add some pictures and bullet points, use the phrase "web 2.0" a lot, and I'm good to go.
Or I could just make peace with the summer lull. Either way is good.
Thursday, June 11, 2009
Sometimes It Actually Works
Like Tolstoy's unhappy families, every bad meeting is bad in its own particular way. Some elements of lousy meetings are common enough to be recognizable from afar: domination by blowhards, poorly constructed agendas, leaders playing “guess what I'm thinking.” But even without the obvious hazards, meetings can go wrong in so many ways that those of us who endure more than most learn pretty quickly to lower our expectations.
Maybe that's why this one came as such a welcome surprise. Once in a while, the planets align, and a meeting you fully expect to be nothing more than pedestrian actually achieves something that could not have been achieved any other way.
It started inauspiciously. It was an end-of-year wrap-up for a task force. There was the usual perfunctory recap of the year, an outline of things to come next year, a few questions, some jokes, and a bit of news. Then someone brought up an email flame war that had ensued a few weeks earlier.
As with so many conflicts, it was both heartfelt and fundamentally stupid. It grew out of a real-life version of the old game “telephone.” A condensed version:
- Group A is served by Program A, which does a good job.
- The leaders of Program B, which offers similar services, decide to target Group B, on the theory that Group A is already served.
- Someone on the front lines hears that Program A is for Group A and Program B is for Group B.
- A member of Group A asks about Program B, and is told at the front lines that “it's not for you.”
- Someone from Program A hears that members of Group A are being excluded from Program B, and charges discrimination.
- Program B offers an irrelevant response, having no idea where the charges came from.
- Long, very angry emails start flying. Personal grudges are given airtime under cover of the latest conflict. Sinister agendas are imputed.
- Nobody can exactly pin down just what the hell happened.
Yuck.
This meeting wasn't intended to address that, but the issue came up, and people from both A and B were there. And in one of those moments that people in my job live for...
People actually listened to each other, and pieced together what had happened. People admitted confusion, told their truths, and listened. And as the fragments of truth spilled out, we were able to put them together in a narrative that explained it all without ascribing bad intentions to anybody. After about forty minutes of discussion – much of it relatively animated – we realized that while there were clearly some communication mistakes, we didn't have to demonize anybody to explain them. Both programs were honestly trying to do the right thing. The issue was a lack of a shared context, which is fixable.
To normal people, this is probably about as exciting as toast. But to administrative types like me, this is what a clean win looks like. We all came out of that meeting with a clearer understanding of what had happened, able to both explain and discount the flame war, and able to take steps to prevent similar failings in the future. We were able to redirect our energies from internal politics to serving the students. And we experienced a meeting that actually worked.
I don't think that could have worked over email. When an entire group is bushwhacking together, there's an electricity in the room that just doesn't happen asynchronously. And the loss of interruptibility, intonation cues, and body language (among other things) in email can make it harder to convey a certain kind of productive confusion. (It can be done, but most people aren't terribly artful writers.) This group – spontaneously – took a chance on uncertainty, and won.
Sometimes it actually works. Even this jaded veteran of task force meetings had to smile.
Wednesday, June 10, 2009
Grade Inflation, Employee Edition
This time of year brings with it the annual flood of program reviews, employee evaluations, and end-of-year wrap-ups. (Between the academic year and the fiscal year, we hit the 'reset' button on July 1.) That means that the second half of June becomes an exercise in speed reading and diplomacy.
I'm noticing again a pronounced tendency towards internal grade inflation. In informal conversation, it's easy to get some fascinating three-dimensional portraits of employees. But in writing, almost everybody is practically perfect in every way. We've blasted right past Lake Wobegon and entered Mary Poppins territory.
When I've asked why people routinely give “walks on water” formal evaluations to nothing-special employees, I usually get one of the following:
- Less than stellar evaluations are bad for morale. Since you can't realistically fire them, you don't want to anger them. The only thing worse than a mediocre employee is a mediocre employee with an attitude.
- Hey, we don't pay very much, so let's offer praise. It's better than nothing.
- Okay, but let others go first. I don't want to be the asshole.
- I don't want to be accused of discrimination/favoritism/hypocrisy when I lower the boom. It's not worth it.
- We don't have the money for the professional development they'd need, so screw it.
These each have just enough truth in them to be annoying, but not nearly enough to carry the argument. And they all fail to address the very real long-term cost of not confronting issues when they arise.
Yes, people get cranky when they're told that they're falling short. Sometimes they sulk, sometimes they lash out, sometimes they retaliate. But when they aren't told, one of two things tends to happen. One group will secretly feel guilty that they're getting away with something, which leads to all kinds of weird psychodramas as they enact the punishment they kinda know they deserve on others. (Over the long term, this is death to morale anyway.) The rest will just go merrily on their way, blithely and falsely convinced that all is well. Repeat that cycle for several years, and you get some long-entrenched low performers you can't tell anything. Organizationally, this is a disaster.
(Besides, at a really basic level, dealing with occasional crankiness is simply part of the job. If you're unwilling to have the occasional difficult conversation, you have no business managing anybody.)
In the land of 'progressive discipline,' moving out a low performer becomes incredibly expensive and time-consuming when the paper trail from the past shows nothing but conflict-avoidant happy-face evaluations. In the meantime, disaster unfolds.
Addressing the issues in a timely way is fairer on both sides. It gives the employee fair warning that something is wrong, and in enough time to try to fix it. (Alternatively, it gives the employee fair warning that s/he is being misinterpreted; I've seen that, too.) It gives the organization at least the possibility of improved performance. And it gives the manager the beginning of the necessary documentation for eventually moving somebody out, if the needed improvement isn't forthcoming.
The resource-based objections – either salary or professional development – are properly separate issues. If you aren't satisfied with your pay, the way to deal with that is not to do a half-assed job; it's to find another job. (One of the mottoes that got me through my adjuncting days was “For what they pay me, they're lucky if I show up sober!” But I always did.) In my observation, the people who complain the loudest about salaries usually aren't the ones who most deserve more. The most deserving usually move up or move on. And while it's conceptually possible that a little more professional development money could make the occasional difference, in practice it usually makes the most impact on people who are already good. It can make the good better, but I've never seen it make the marginal good.
The 'you first' objection is the most frustrating. Since it's a tall order to shift an entire culture at once, it tends to happen in fits and starts, which is a nice way of saying 'unevenly.' The folks whose managers adopt the new candor will object, with some justification, that they're being singled out. It's the right long-term move, but the transition is bumpy and difficult. Any suggestions from wise and worldly readers who've seen this transition happen successfully would be appreciated.
So I dive into the pile, prepared to read things that I know simply aren't true, and having to decide which battles are actually worth picking. July 1 can't come fast enough.
Tuesday, June 09, 2009
Growing Your Own
The Chronicle and IHE each have articles about succession planning for college administrators. They're both relatively supportive of hiring internal candidates, and for many of the same reasons. Internal candidates know their institutions and are known in return; national searches are expensive and often unsuccessful; and in the age of plummeting house values and two-career couples, the classic two-body problem is also a two-house problem, so even successful candidates may not actually take the offer.
Those are all true, as far as they go, but they strike me as missing a lot. While I'm not quite as 'anti' as I was a few years ago, I'd still offer some caveats on internal succession planning.
First and most obviously, sticking with internal people guarantees inbreeding. (That's the flip side of 'continuity.') That can take the form of the old boys' network, certainly, and the usual diversity-based objections to closed networks all apply. But it isn't just about protected classes; it's about new perspectives. Someone who has been a part of a particular campus culture for a long time just can't see it with fresh eyes. I've been in meetings in which a newbie asked about a particular longstanding practice; when told that it couldn't be done, s/he responded that it had been done at her/his previous college. That kind of reality check may or may not be worth the risk at low levels, but it's incredibly valuable at high levels. A close variation on that is that the newbie brings the benefit of having lived through mistakes elsewhere that haven't been made yet at the new place.
Second, it's nearly never the case that there's a single person internally who's clearly right. Usually, there are several people who each think they're right. Competing for the role of 'heir apparent' can lead to really toxic and awful internal politics, diverting effort from the real work of the college. It can also foster unfounded senses of entitlement that lead to misplaced anger when things don't pan out.
Third, even when the internal candidates are relatively strong on their own merits, there can be times when the local culture is so poisoned with crosscutting histories that anybody from within will automatically be perceived – rightly or wrongly – as a champion of one faction against others. That's nobody's fault, but sometimes you just need someone completely new to cut through the clutter. While it's true that outsiders tend to have steeper learning curves, they have the relative advantage of not denying that they have learning curves. Unlike some internal candidates, they know that they don't know. I've seen plenty of local 'experts' find themselves shellshocked when they move up a level and discover that knowing everything there is to know about one department doesn't prepare them to be dean of several. At least with the new outsider, there's usually a period in which s/he's allowed to admit ignorance, which can lead to some remarkably productive clarifying conversations.
None of this is to deny that outsiders sometimes crash and burn, or that national searches are expensive, or that some internal candidates are entirely wonderful. It's just to say that moving to a presumed preference for internal candidates is probably much costlier than either article seems to assume.
All of that said, though, there's certainly an argument for developing the skills of talented internal people. Some of that may ultimately redound to the benefit of other places, as the newly-hot candidates take their skills elsewhere, but that's a cost of doing business. The alternative is to keep everybody ignorant, the better to control them. I'd rather build my people, and then take my chances that some of them will decide to move up at times that make sense for them, rather than for my college. It's more consistent with the ethos of an educational institution, and in the meantime you get amazing performance. When they go, you get connections at other places.
Some of that will involve 'professional development' as it's usually defined, but much of it (in my observation) involves rolling the dice on smart and curious people stepping into new roles. In fact, I'd argue that the root of the lack of good candidates for many administrative positions is precisely the lack of full-time faculty hiring over the last few decades. When the farm team shuts down, sooner or later the big club will run out of rookies. In the short term, they can go the free-agent route, but the entire pool is aging. While I'm skeptical of a hard application of succession planning, I do think there's a good long-term argument from 'succession' for hiring more full-time faculty. And there's certainly an argument for taking the occasional chance on, say, a promising but relatively untested rookie.
Wise and worldly readers – what do you think? Have you seen succession planning done well in a higher ed setting? If you did, how did it work?
Monday, June 08, 2009
Monday Musings
- In discussing a nearby park, TG asked me “why did they put a hill there when they designed the world?”
- We saw Up on Saturday, and I have to admit, it's one of the best movies I've ever seen. Pixar has a high batting average anyway – Cars and Finding Nemo were nifty, and The Incredibles was flat-out great – but this one had a sweetness to it that the others lacked. There's a short almost-silent mini-movie in the beginning that traces a couple's lifetime together that almost stands as a movie in itself. At the end of that set piece, an entire theater full of kids and candy wrappers was silent. But the movie also had plenty of jokes for the various age levels, wonderful voice acting, and a perfect ending. Very, very impressed.
- Vacation hopscotch has started. Although we admin types have 12-month calendars, we need breaks too, and we can't really take them during regular semesters. So when summer hits, people take their days. It makes sense, but it also means that scheduling meetings when you can get a full complement of attendees becomes much more challenging. I've already had several conversations along the lines of “I'm out that week, and she's out the next week, and then we're all here for a day, but then so-and-so will be out for a week, and then...” Things still get done, but it's slower and much more catch-as-catch-can.
- In a nod to web 2.0, and inspired mostly by Clancy Ratliff (culturecat), I've started a twitter feed (twitter.com/deandad). The first few tweets were more conceptual than event-based (“continued balding”), but that's hard to sustain, so it's defaulting to events. As with blogging, there's a rhythm that takes a little time to learn. From following a few other feeds, I'm noticing that the major difference from blogging is in the contours of storytelling. In a blog, each post is largely self-contained, either telling its own story or starting a discussion in the comments. On Twitter, each post is like a single musical note; the melody comes in the sequence of notes over time. It's a different style, but it's fun to try. (Documenting the quotidian while maintaining pseudonymity is a unique challenge; I'm not entirely sure how that will work.) I'll admit also enjoying the theme-of-the-day contests they have in the sidebar. Last week they did “three words after sex.” Somebody submitted “that cost what?”
- Last week on our library run, TB sat in the kids' section reading Diary of a Wimpy Kid and laughing to himself. I noticed two girls who looked a year or two older than TB watching him and whispering to each other. It didn't seem to be mockery; it looked more like goading. Eight years old, and he's already got it. Where he got it from is mysterious, but good for him. He has no idea, which I think is why it works.
- A contact at a private university mentioned to me that the students there are postponing graduation specifically so they won't have to start repaying their student loans while unemployed. Locally, our retention rate is climbing, even with students who just graduated sticking around to take a few more credits before transferring. (Some of the local destination colleges for transfers are willing to take the degree plus fifteen credits; we're having unprecedented numbers of students actually take them up on it.) The Great Recession is playing out in unanticipated ways.
Friday, June 05, 2009
Assessing Professional Development
Most colleges set aside at least some 'professional development' money for faculty and staff. The idea behind it is that fields of expertise don't remain static, so for people to remain current, they sometimes have to be exposed to the latest developments. That can mean workshops, or conferences, or webinars, or subscriptions, or whatever, but the goal is to make sure that people don't rust in peace.
When there's a budget crunch, professional development is usually one of the first things to get cut. Part of that is because it's inherently variable anyway, unlike salaries or utilities. (In that sense, it's more like the 'snow removal' budget line than the 'salary' line. Any given winter can be more or less snowy than the one before it, so everybody understands that that particular line item is written in pencil.) And part of that is because the costs of cheaping out on professional development take a while to show, but the savings are immediate. When you're in free fall, the objection that “people might be a little less engaged five years from now” isn't terribly compelling. It's like telling a gunshot victim to quit smoking.
Since it's looking like we'll be short of funds for some time to come, I can foresee some pitched battles coming over what little professional development funding we haven't cut yet. And I can guess that those battles won't be terribly enlightening, because we haven't really figured out what makes some professional development activities better than others.
The traditional version, at least on the faculty side, goes something like this: you have x dollars to spend for the year on subscriptions, memberships, conference travel, and the like. If you get a paper accepted somewhere and go there to present it, you'll get a little more. The idea is that faculty are the experts in their respective fields, so they're likelier to know what they need professionally. Set a few basic criteria, make them show receipts, don't pay for alcohol or pay-per-view, and call it good.
That works tolerably well when money is plentiful. But when it's scarce, and you get requests totaling several multiples of what's available, “I want it because I want it and I'm the expert” doesn't work.
I recently heard someone ask what it would look like if we applied outcomes assessment to professional development. What if we somehow measured which expenditures generated the most bang for the buck, and prioritized accordingly?
I can answer that in two words: define 'bang.'
Although I've heard the phrase 'professional development' for years, I've never really heard a coherent theory behind it. In order to define 'bang,' we'd need to specify the purpose of PD. Instead, we're running on the old “I know it when I see it” model.
I'll make it concrete. Which conference would it make more sense for the college to support: the regional conference on teaching in a given field, or the regional conference of the major disciplinary organization for that same field?
I can envision arguments for either. The teaching conference is clearly more relevant to the college, since it's a teaching institution. And the disciplinary conference is clearly more relevant, since teachers need to know what they're teaching, and need to remain excited. So who wins?
The really evil social scientist in me says this can be resolved empirically. Give one group teaching-focused development, and another discipline-focused. Do that for a few years. Then compare their course completion rates, graduation rates, student evaluations, and any other measures you normally use to gauge teaching effectiveness. If one group clearly beats the other, then you have your policy.
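For the statistically inclined, the comparison itself would be the easy part. Here's a toy sketch in Python (the completion rates, group sizes, and group assignments below are invented, purely to show the shape of the test, not any real finding):

```python
# Toy version of the evil-social-scientist experiment. All numbers are invented.
# Group A got teaching-focused PD; Group B got discipline-focused PD.
from statistics import mean
from scipy import stats

teaching_focused = [0.78, 0.81, 0.74, 0.80, 0.77, 0.83]    # hypothetical course
discipline_focused = [0.75, 0.79, 0.72, 0.76, 0.74, 0.78]  # completion rates, one
                                                           # value per instructor

# Two-sample t-test: is one group's mean completion rate clearly higher?
t_stat, p_value = stats.ttest_ind(teaching_focused, discipline_focused)

print(f"teaching-focused mean:   {mean(teaching_focused):.3f}")
print(f"discipline-focused mean: {mean(discipline_focused):.3f}")
print(f"p-value: {p_value:.3f}")  # small p-value = a clear winner; otherwise, no policy
```

The arithmetic is the trivial part; the fight would be over which columns count as 'bang' in the first place.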
Of course, you'd also have a political firestorm. Because at a really basic level, there's a tension between the view of faculty as employees and faculty as disciplinary ambassadors. The former suggests that PD is really another word for 'training,' and the latter suggests that it should be almost entirely self-directed. (In practice, that usually means that it's another word for 'travel.')
Locally, I've tried to delegate these decisions to a faculty committee, but they're running into the same conceptual brick wall I used to run into. The failure is more democratic, but it's still a failure.
Wise and worldly readers – have you seen a reasonably elegant and fair way to allocate scarce PD money? If so, what's it based on?
Thursday, June 04, 2009
Teaching and Sorting
Via Cold Spring Shops, I ran across this quote from conservative commentator David Frum:
Why are the wages of the college-educated declining? A big part of the answer is that the pool of college graduates is rapidly expanding. It’s not surprising that as college becomes more universal, the return on a college education falls.
As the number of job applicants with degrees rises, employers become more sophisticated in assessing the value of any particular degree. The degree itself matters less than the institution that granted it, the subject areas of concentration, and the grade point average earned. A 4.0 math degree from Cal Tech is a very different thing from a 2.8 communications degree from San Francisco State University.
In a way, it encapsulates a basic philosophical quandary for higher ed. Should our focus be on sorting the strong from the weak, or on making everybody strong?
Frum implies, correctly, that at least some of the wage premium attaching to college degrees comes from their relative scarcity. To the extent that seeing a degree program through to completion bespeaks, say, above-average tenacity and/or intelligence, it serves as a signal to prospective employers.
From that perspective, improving pass rates in developmental classes is actually counterproductive. Frum's position assumes that scarcity is the primary market value of a degree, so it follows logically that making degrees more common makes them less valuable. Anybody who pays attention to the rise of the professional adjunct has to concede that there's at least some truth to this.
From an educator's perspective, though, I'm struck that if Frum is right, then the actual content of what we teach doesn't matter much. (He appends the standard harrumphing about The Classics, but it's really ancillary to his argument.) The real work of higher ed happens at the Admissions office. By that standard, community colleges aren't higher ed at all, since we don't exclude. Exclusion, rather than education, is the point. We could make students run through rows of tires on the ground if we wanted to; as long as fewer finish than start, we've done our job. As Thorstein Veblen noted a century ago, the 'signaling' achieved by a liberal arts education is that you're elite enough that you don't actually have to learn anything useful. Any schmuck can make a living with a skill; only the elite can afford to major in philosophy. Get in, get through, and get yours; the actual content of what you study is quite beside the point.
In the cc world, by contrast, the animating assumption is that the content of what we teach is both good in itself and likely to lead to economic growth. Even if degrees lose a certain exclusivity, the social and economic benefits of a more educated citizenry and workforce are likely to outweigh any losses from relative ubiquity. In other words, more educated workers are more productive workers over time. If the first two years of college become more common, this position implies, then we should expect to see more economic growth over time, since people will be more capable of doing more productive things. Content matters. Education, rather than exclusion, is the point. There may be some dislocations on the micro level -- what conservatives in other contexts like to call 'creative destruction' -- but there will be prosperity on the macro level. Put enough skilled and educated people together long enough, and sooner or later, you'll get sparks.
And of course, there are enough triumphs of underdogs to keep us going. Just because your parents aren't loaded doesn't mean you're stupid or without potential. Community colleges are the only realistic starting point for many people, some of whom parlay their hard work here into impressive careers. I'm at a loss to explain why that's a bad thing.
Much follows from which side you're on. If you believe that exclusivity is the point, then colleges built on second chances are debasing the currency. They're cheating. If you believe that education is the point, then giving people second (and third...) chances to bring up their games is an obvious public good, worthy of substantial public support.
From the perspective of exclusivity, something like 'outcomes assessment' just looks like misplaced priorities; it takes content entirely too seriously. From the perspective of education, it's absurd that we haven't been doing a better job of it as a matter of course. If teaching is our core function, why the hell wouldn't we try to improve how we do it? The indifference to the content of education, I think, is behind both the research university model and tenure. Both of those are built on an implied hostility to actual teaching, which makes sense if you assume that actual teaching is beside the point. Teach well or badly, whatever -- the kids will sort themselves out, and the cream will rise to the top. Meanwhile, there's prestige/fame/grant money to chase! Teaching is for adjuncts. We speak of research 'opportunities,' but of teaching 'loads' -- the language tells you what you need to know.
I can't deny that the 'exclusivity' perspective has a long history, a certain internal coherence, and a kind of intuitive appeal for those of us who navigated the system well. It explains market saturation in certain fields, and gives a handy excuse for cutting taxes on the wealthy. But it's wrong, and it's wrong all the way down. At a really fundamental level, either you believe that content matters, or you don't. Either you believe that everybody deserves a real shot, or you don't. Either you believe that education is a common good, or you believe that it's a private good. The rest follows.
Wednesday, June 03, 2009
Ask the Administrator: Decoding Mixed Signals
A regular correspondent writes:
What do you do when the combination of a dean with a tentative style and an associate dean with a bullheaded style leads colleagues to draw the wrong conclusion about institutional politics?
Over a few years, a new dean has tried to dance along the line between pushing a few critical priorities and not wanting to waste his political capital on noncritical issues. Because I think he truly is aware that all a dean's words are coercive to some degree, he has been reluctant to respond to internal conflicts that forcefully, even though the conflicts go to the heart of some of the college's problems. They're just not as urgent as HIS critical priorities.
Last year, the dean hired an associate dean from elsewhere in the state, and the new associate dean is about as subtle and indirect as Antonin Scalia. The style might be a breath of fresh air in the current climate, but that hasn't been how all faculty have received it. To some, they have a dean who avoids some things and a new associate dean who's a bull in a china shop. So they've concluded that the associate dean is the newly-hired henchman for a passive-aggressive leader. Some things that should have been smoothed over within days (about teaching assignments, tenure standards, chairs unwilling to be flexible on individual issues, and the like) have simmered, boiled over, and burned on the stove.
The result of what I think is miscommunication (though I may be wrong and everyone may be evil spies from Ruritania)? A group of faculty who often distrusts half of what their dean does (and distrusts when he is not imposing things by fiat), and a set of college administrators who are focusing like a laser on predetermined priorities... and may be ignoring important problems under their nose. Any ideas for jolting either the distrusting faculty or the dean's office into changing their perspectives?
First, congratulations on your level-headed assessment of the situation. It would be easy to assume that one person (or side) is the entire problem. That's sometimes true, but it's unhelpful to start from that. Let's assume that each party to the situation means well on his own terms.
Why would an aloof/politic dean choose a pit bull Associate Dean? A few possibilities leap to mind:
- Pit bulls are often detail-oriented, and therefore could shore up the dean's weak flank. When I was at Proprietary U, I chose a detail-oriented AD for exactly that reason. (Luckily for me, he was also a decent human being.) Sometimes the downside of detail-orientation can be pettiness or lack of finesse.
- The need for someone to be the bad guy. Many high-level people can't bring themselves to deal with difficult situations, so they hire a designated pit bull to do the dirty work for them. (For example, I see some of that in Barack Obama hiring Rahm Emanuel.) I've found it's usually better just to face the issue yourself, since pit bulls never quite get it right, but that's me. This is a popular, if flawed, strategy.
- Lack of good candidates. Sometimes the answer to “why was someone with this glaring flaw hired?” is “the other candidates were even worse.”
- Lapse in judgment. It happens. Or, related, the pit bull interviewed misleadingly well.
The larger issue, though, seems to be that you find the dean inscrutable. He seems to be focusing on some predetermined set of priorities, but you don't know what it is, or why he's doing it. In the absence of a clear understanding, it's easy for projected fears to fill the vacuum. And when the AD acts like a 'henchman' (love that term!), that doesn't exactly soothe the fears.
My guess, fwiw, is that if you can get the dean to explain his priorities in some sort of reasonably public and interactive setting – one in which people can ask questions – some of what currently seems bizarre may become more reasonable, and therefore less threatening. (Alternatively, you may discover that he's actually Dr. Evil, and you aren't projecting. I tend to doubt that, but one never can tell. If, in fact, he is Dr. Evil, then at least you know what you have to do.) The trick will be to develop an ad hoc venue in which the discussion is exploratory, rather than accusatory.
I've been in situations similar to this. I was once the go-between for a VP who defined the term 'entrepreneurial' to mean 'inventive,' and a faculty who defined it to mean 'for-profit.' When he urged them to be 'entrepreneurial,' they recoiled in horror. What he said, and what they heard, were wildly different, but it took a while to figure that out. I became a sort of translator, and later an informal speechwriter, trying to get him to convey his (substantively positive) message in a way that wouldn't generate unnecessary hostility. It was a frustrating role, and never a terribly successful one after the initial damage was done, but someone had to do it.
I wouldn't be surprised to see that something similar is happening here. The dean is acting according to goals he considers reasonable. Whether they are or not, I don't know. The AD is acting in peremptory ways in the service of inscrutable goals – in the vernacular, he's being a dick. But he (probably) doesn't know that. He thinks he's acting efficiently in the service of the dean's goals, and in a sense, that might even be true. The key is to get those goals understood.
In the best case, the initial clarification might be followed by mutual refining of those goals. It may be that some of what he's trying to do doesn't fit the context, but he's too far removed to know that. Or it may be that even if the goals are reasonable, the ways he's pursuing them aren't. And nobody buys into goals they don't know.
Admittedly, many admin types have too much ego to react other-than-defensively to questions like that. But if this one is wise enough to pick his battles, he may just be wise enough to recognize an opportunity to gain trust – and therefore effectiveness – when it presents itself.
(Another lesson of academic administration that took me a while to figure out: your main currency is trust. If you can build that, you can get things done. If not, well, good luck with that...)
Good luck! I'd love to hear if this approach gains any traction.
Wise and worldly readers – what do you think?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Tuesday, June 02, 2009
Ask the Administrator: Cheap Shots in Student Course Evaluations
A Canadian correspondent writes:
The CC has dramatically different standards and practices from the other schools [at which I've taught], but the one that concerns me at this time is this. An anonymous student made a libelous comment about me on their evaluation. It is a serious one. This person wrote that I 'am a racist'. I want this remark removed from the evaluations. (It is not true, of course, and there were no racial incidents in my class at all.) Even though we have a union which told me that the evaluations may not be used by our supervisors to determine our status, my supervisor has in fact quoted a previous student comment in a review session. I have not heard anything from her about this one, which was forwarded to me late last week. I don't think she has viewed it.
Since the statement was gratuitous, mentioned no incident, and was only posted by one student, would it be better to let it blow over and say nothing, or is it necessary for me to ensure that it is removed from my file? (I have no idea whether there are letters from other students in the same file.) The college's standards and practices have very specific penalties for frivolous and irrelevant complaints, including taking action against the complainer. The culture of this country is different from what I am used to; students who told me they liked and loved the class wrote nothing at all in the evaluation. Only the complainers wrote anything. My evaluation percentages are nearly equal to those of the other professors in the department in all but one category, though a little lower than last semester's (which were higher than the departmental average). It was my first year here and this was to be expected.
Some caveats, then a few thoughts.
First, I don't know how Canadian labor law differs from American, so I can only take this case as illustrative in this context. Whatever translation needs to occur needs to occur.
Second, different collective bargaining agreements stipulate different rules for how student course evaluations should be handled. I've literally never heard of one that actually banned their use in evaluation altogether, since that would seem to defeat their utility, but anything's possible. Rather than taking a union rep's word for it (or a dean's word, for that matter), get a hold of the contract and look up the relevant language yourself. I wouldn't be surprised to discover that what's being presented as a blanket policy is actually more circumscribed.
Now that those are out of the way...
If student course evaluations there are anything like they are in every context I've ever seen, they're anonymous. By itself, that would seem to rule out “taking action against the complainer.” Besides, the student grapevine is fast and skeptical; if word leaked out that 'anonymous' course evaluations are, in fact, not anonymous, I'd expect to see serious hellfire and brimstone. As with the occasional anonymous troll on a blog, some people will use anonymity to spew bile for reasons of their own. It comes with the territory.
In your shoes, I'd alert your union rep that a student made the comment, and that you're concerned that your supervisor will take it at face value. If your supervisor brings it up, express mystification, and ask for substantiation. You may not be able to bring action against the accuser, but you're certainly well within your rights to ask for anything resembling evidence. If none is forthcoming, then a supervisor with half a brain will write it off to the usual student kvetching.
If the comment cited an actual incident, things would be very different. We're obligated to follow up on specific allegations of biased actions against members of protected classes. The key word there is 'action.' If a student mentions that, say, the professor singled out members of a given group for harsher scrutiny, that would automatically trigger an investigation. (We don't have the discretion, legally, to blow it off, even if it seems patently absurd.) But there's a difference between an actual incident and a blanket allegation of bias. In the absence of a specified incident, there's nothing to investigate, and therefore no cause of action. If the student's comment was limited to “he's a racist,” and offered literally no examples of the alleged racism in action, then it's of no more value than “he's a jerk” or “I don't like him.” (It's the difference between “he's a thief” and “he stole this car on this day.” The latter triggers an investigation; the former doesn't.)
Now it's always possible that a given supervisor could be looking for an excuse to fire you, or could have a hair trigger generally, or could just be paranoid beyond belief. People aren't always reasonable. But if that's true, then that would surface eventually anyway. Putting your union rep on alert early could help insulate you, since the union would know to jump on anything abrupt.
From my side of the desk, an isolated comment like that without even a hint of specificity would fall under the same category as “he's mean” or “he assigns too much homework.” In other words, by itself, it's just surface noise. If it were part of a pattern, I'd be curious, but a single comment is just that.
What I absolutely would not recommend is trying to figure out who the student was, and going after him. This is where you get to be the grownup.
Good luck!
Wise and worldly readers, how have you seen issues like these handled? How do you think they should be handled?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Monday, June 01, 2009
Changing Barometers
Back in the 80's and early 90's, the way you could tell if a speaker was losing the audience was the coughing index. The louder and more frequent the coughing, the more bored the audience. (Newspaper crinkling was another good index. The drearier the presentation, the greater the proportion of the audience doing crossword puzzles.)
Sometime in the 90's, electronics slowly supplanted the coughing index. I attribute this to the confluence of portable consumer electronics with smoking bans. Before the general social consensus coalesced around the (correct) notion that anyone whose ringtone went off in a quiet public setting was either an emergency surgeon or a colossal douchebag, it was relatively commonplace to have phones go off, and sometimes even to have people take the calls. (I remember once watching a movie in a theater with maybe a dozen people total. Some guy's cell went off, and he loudly took the call. His first line was “nothing, what are you doing?” A lesser man than I would have committed justifiable homicide.) Now it's more common for people to look embarrassed and sheepishly fumble for what seems like hours to try to turn the flippin' thing off. Why they put it where it would take them hours is beyond me, but it happens a lot.
Judging by the last several public events I've attended, the new barometer is the Crying Baby index. How long are the parents willing to let their babies shriek before finally taking them outside?
Crying babies are uniquely challenging for everyone else. It's easy to get mad at the idiot whose cell phone starts blaring “My Humps” in the middle of a solemn ceremony. But getting mad at a harried young parent just seems mean. And having been that parent, I know that there's a tension between hope that the tantrum is almost over, desire not to cave to bad behavior, and trying to estimate just how much you're annoying everybody else.
In fairness, sometimes babies enliven occasions. Nine or ten years ago TW and I saw a comedian open a public event in a gym. A baby girl started laughing loudly and randomly, presumably for reasons of her own, but her laugh was irresistible. Eventually, even the comedian started commenting on it. She got more laughs than he did, but she certainly helped the evening along.
It may be a function of the events I attend, but it seems like babies are far more common at public events than they used to be. They're fun to look at, and they have a way of brightening the moods of almost everyone around them. For a parent, they're incredibly high-risk; I shudder at the memory of administering the sniff test in public, only to find that the kid failed it. (If you haven't sniffed your kid's butt in public at least once, you aren't really a parent.) And the sheer volume of paraphernalia they require is amazing. But for the rest of us, other people's babies can be low-stress mood brighteners.
At graduation, they're everywhere, and utterly charming. Multiple generations show up together, in remarkable plumage, beaming with pride. I still melt a little every time I see a new grad in the crowd after the ceremony, hugging her kid. I got my degrees when I was young and childless, mostly out of a sense that it would be exponentially harder as a parent. As a parent now, I'm convinced that it's true. How these students get through working, studying, and parenting at the same time, I honestly don't know. But it's an honor and a hoot to watch them celebrate after the graduation ceremony. The grandparents smile, the students smile, and the babies laugh and smile because everyone else does.
Getting to watch that every single year is a perk of the job. Let the speakers worry about the crying babies during the speeches -- I like watching the laughing families afterwards. As barometers go, it's a good one.