Thursday, August 31, 2006

 

Cousin Oliver

Over at acade(me), there's a fascinating discussion of the paradoxical position of a new, young professor walking into a department dominated by a much older cohort. She was hired, in part, to bring new energy and perspectives to the department, but upon arrival, got the message in many unsubtle ways that she is not to rock the boat. She is to be just more of the same, albeit with a younger face.

Been there, done that, got the t-shirt.

When I was hired to my current position, the then-VP made it very clear that I was to be an agent of change, the person to bring the next generation of faculty to the college. Then we didn't hire anybody for a year and a half, during which time that VP left, to be replaced by an interim VP who didn't want to make any decisions to bind her successor. So this (relatively) young Dean, hired to be an agent of change, had to sit on his hands for a year and a half and whistle a happy tune.

It was a fascinating sociological experience, in that my previous college had a much younger cohort of employees, so I went from average-age to conspicuously-young overnight. Although I was unable to effect much meaningful change at all in the first year and a half, that didn't stop some of the old guard from seeing me as a mortal threat. So I got the worst of both worlds – all the enmity that a change agent normally generates, without any of the actual accomplishment. “Frustrating” doesn't begin to capture it. I was to be the new young packaging on some very old product. A fresh new spokesmodel for a creaky regime. Cousin Oliver in the final season of The Brady Bunch. Not a good thing.

Since then, it has been a bumpy ride, but at least it has been moving. A few departures and retirements in key positions removed some major obstacles. (I made damn sure to have input on the selection of the new permanent VP.) A serious fiscal crisis has brought some clarity to the need for certain kinds of change. Some well-chosen faculty hires (fewer than have left, but still...) have, cumulatively, changed the tone in several departments. A few victories led to some credibility, which made subsequent victories easier. Some of the folks who, initially, wouldn't give me the time of day, have made peace with the fact of my presence. (Now that I've been here for several years and multiple VP's, it's getting harder to sustain the illusion that I'm some sort of temp.) Over time, sheer exposure makes it harder to paint me as some sort of demon (or, alternately, some sort of lightweight).

I think it was Hegel who said something to the effect that “there is nothing in the essence of an object that does not manifest itself in the series of its appearances.” In other words, over time, what you are will become obvious. The comfort level of the old guard with each other, even with those they can't stand, derives from familiarity. After many years, it's possible to read Bob's outburst as “that's just Bob being Bob again,” and not to get overly bothered by it. As the new kid, they don't have a good read on you, and the devil you know beats the devil you don't. But over time, the newness wears off, the real strengths start to surface, and some folks retire or leave. When the sands of time move, lines drawn in the sand move, too.

Some well-meaning folks give newbies the advice to “keep your head down.” I disagree. Don't be needlessly provocative, certainly, but hold your head up high. Nothing evaporates the idiotic fantasies that the old guard will project onto you like exposure to reality. Show your strengths without apology, and win, slowly, by taking the high road. It won't work quickly, necessarily, but over time the naysayers just start to look out-of-touch and slightly pathetic. Once you've become a fact on the ground, which doesn't happen overnight, even those whose hearts and minds are still fighting the battles of the 1970's will have to deal with you.

The beauty of taking the high road is that it's incredibly hard to defeat. The danger is that the payoff is slow, and you need at least a modicum of security (in whatever form) to be willing to invest the time. It's also incredibly difficult to fake, which is exactly why it works.

The most interesting and successful new faculty here have made their marks without actually going toe-to-toe with anybody. Rather than fighting old battles, they've changed the subject. It can work, and it has the heartening effect of disarming some of the less pleasant combatants altogether.

I once heard a new hire to an old department compared to a new lamp in an old room: it shines light into some previously-dark corners, showing where dust gathered without anybody noticing. Sometimes what gets exposed isn't flattering or pleasant, and the human tendency to shoot the messenger will surface. But it's bullshit, and over time, if that light keeps shining, that bullshit will get progressively harder to sustain.

Cousin Oliver had no substance and didn't last. If you're confident that you have substance, take the high road. Bring the new perspectives in a classy way, but bring them. Over time, with patience, you'll be surprised at what can shift.

Wednesday, August 30, 2006

 

Open Enrollment

Every so often, I indulge my libertarian side. This is one of those times.

Not a week goes by that I don't get a 'course substitution' request, in which a student asks to substitute one course for another as a graduation requirement, on the basis of claimed 'misadvisement.' Since we, as a college, take a relatively invasive role in student advisement, we inadvertently cultivate a sort of learned helplessness among the students when it comes to navigating requirements.

Placement exams are even worse. We're constantly having debates over the proper cut scores to get into various levels of English, math, or foreign languages, how much remediation to require, at what point a student can move from ESL to English, and the like. Students are always asking for exceptions, finding work-arounds, and generally trying to evade our various checkpoints. Some students manage to slip through. When they do, one of two things happens: either they fail and blame us for letting them slip through our systems, or they pass and tell their friends that our safeguards are meant to be broken. Either way, we're wrong.

This isn't unique to my current school; I saw it at my previous one, too. Curricular gatekeeping (in the sense of blocking access to courses) is labor intensive, hit-and-miss, and frustrating to all involved. This is especially true at the entry levels. (I've never seen a mad stampede to get into Differential Equations or Social Psychology. The issues are most frequent at the bridge from remedial English to composition 1, or remedial math to college algebra.)

So, my radical libertarian proposal: make academic advisement, placement exams, and tutoring optional for the student, and let students sign up for what they want (at least at the 100 level and below).* If a student overestimates his own ability, takes a too-difficult course, and crashes and burns, that's the student's problem. Ban the word 'misadvisement' from the college.

Savvy students will seek out the advisement they need anyway. (Really savvy students will figure most of the answers out by – gasp – reading the catalog. RTFM, as the computer geeks used to say.) But a great many students will make their own choices, and I say, that's fine. As a community college, we're supposed to be open-enrollment. I say, let's be open enrollment, and let the students sink or swim based on their own performance.

Before I'm written off as a burned-out quasi-social-Darwinist, I'll clarify that I'm not proposing abolishing remedial courses, academic advisement, or placement tests. The college should continue to offer all of these. I just don't think it should require them. Offer, yes; require, no. A student who seeks out academic advisement of her own volition should be able to find it relatively easily, and a student who seeks out tutoring help in writing or math should be able to find it. I just don't think that forcing the issue is worth it.

Will some students make stupid choices and fail a whole bunch of classes? Yup. But that's part of the learning process, too. A kid who will resist and resent required advisement under the current system might actively seek it out after getting his ass handed to him a few times in classes he couldn't begin to comprehend.

Would this disproportionately impact the least-advantaged students? Maybe, but that's not a given. If we give a placement test that gives a falsely-low impression of a student's abilities, and we mistakenly require that student to go into remediation she doesn't really need, that extra time and tuition come at a real cost to the student. In my experience, this happens more than most of us care to admit.

Would retention go down? Maybe, at first, but I don't see that as obvious, either. Students often walk away when they're told they need remediation. Maybe some of them know something we don't. I'm just not confident enough in our placement exams to say that they should be binding. Advisory, sure; if a kid took a little Spanish in high school and isn't sure which level to enroll in here, a quick barometer could do some good. A returning adult student who hasn't done math since graduating high school twenty years ago might need a reality check on her math skills, and a voluntary test can provide that.

Would the quality of classroom interaction drop, as the badly-unqualified slip in? To me, this is the only truly troubling objection. That said, enough students slip through now that I'm not sure how much more damage could actually be done.

What do you think?


*I can see space constraints justifying pre-reqs for certain upper-level courses that require specialized facilities, like organic chemistry. And there are good reasons to require, say, Anatomy and Physiology before a student enrolls in Nursing. I'm talking here about the ubiquitous, foundational disciplines.

Tuesday, August 29, 2006

 

Strategic Planning

Have you ever been involved in a 'strategic planning' process that actually worked?

I've been through it now at two colleges, and from several different angles. It usually runs something like this:

1. Debate airy abstractions so platitudinous that nobody can even fake caring
2. Break out those abstractions into measurable 'metrics'
3. Reduce those metrics to variations on what you're already doing
4. Pile on the gerunds
5. Assign 'point people'
6. File and ignore

They're awful to read, since they're always written in outline form, as if to imply some sort of deductive logic. In some cases, any proposals from below have to refer to actual numbered goals in the strategic plan (“bringing my graduate advisor to campus as a speaker will help the college achieve goal #4, fostering an active intellectual community”). Savvy practitioners of internal politics get good at defining terms to mean what they want them to mean, entirely independently of any planning process. And serious institutional self-criticism is verboten in a strategic plan, so ideas conceived as remedies for particular problems outlive the problems they were supposed to solve, taking on weird lives of their own. Nature abhors a vacuum, so folks will invent reasons for policies that lack them. Those reasons, in turn, will lead to new strategic plans.

Yuck.

I'm becoming more convinced that strategy from the top should be simple and clear – set no more than two or three priorities, dedicate some resources to those priorities, and set the internal incentives accordingly. Let the organization spend its time getting the implementation right, since that's where things usually fall apart. If the incentives are right and the goals are clear, those who are capable of learning will respond, eventually, and those who aren't will become less relevant over time. If you spend your time parsing six missions and twelve visions, and the internal incentives don't move, you're wasting your breath.

In academia, of course, tenure (and chronic cash-poverty) makes setting incentives uniquely difficult. But that's no reason not to try. Even in a tenured and unionized setting, I've found that some people will respond to simple and clear messages, especially when the few goodies that are available are lined up accordingly. (Most won't, of course, and some take a perverse pleasure in using their bulletproof status to claim the moral high ground against any change whatsoever. I can understand loafing, but claiming the moral high ground while loafing still pisses me off. Sigh.) The temptation to default to unthinking conflict-avoidance mode – that is, to pacify the loudest whiners – will always be there, but once you go down that road, there's no end to it. A little intestinal fortitude upfront will pay off over time.

When discussion moves to the actual execution of the plan, the next inevitable move is blame-shifting. Why is enrollment sliding? It's obviously the fault of admissions/financial aid/proprietary schools/sunspots/tectonic plates/unforgiving demographics/market trends/kids today, so they have to change. Once they fit what we consider proper, then we can talk.

Uh, no.

It's about adapting. Which, I think, is what's so frustrating about strategic plans as they're usually done. With their superficial rigor, they aspire to a level of control of reality that they'll never have. A simple plan, well-executed in changing ways, is far superior to a long and complicated plan with subsections and three different synonyms for 'goals.' The more complicated the plan, the more time will be spent parsing its verbiage, rather than paying attention to the outside world.

Here's a measurable goal: no college's strategic plan should be longer than the Constitution of the United States. Colleges can be complicated, but sheesh.

Have you seen a planning process that really worked? I'm honestly asking. If there's a method out there that has actually worked, I'd love to steal from it.

Monday, August 28, 2006

 

Shocking Revelation

It's hard to keep secrets from a spouse. Marriage, especially with young children, involves so much exhausted time together that sooner or later one's guard slips, and anything rattling around in the back of your mind will slip out.

Such a slip happened recently. The Wife confessed to harboring an attraction to...

wait for it...

Tucker Carlson.

(shudder)

so cold, so very cold...

I'm an adult. I've had relationships before. I know that eyes wander, that shiny objects crossing your field of vision will briefly register, but Tucker Carlson?

Trying to control the damage, she explained that she likes him more since he lost the bowtie (so how long has this been going on, anyway?...), and that the attraction is entirely physical. She ignores what he actually says. He's just a piece of man-candy, a Republican stud-horse roaming the plains of basic cable, desperately in search of a haircut.

I projectile-vomit just thinking about it.

She has confessed crushes before, but none of them was anywhere near so objectionable. In childhood, she admits, there was a crush on Randolph Mantooth, from Emergency! (My answer to that would be Elizabeth Montgomery, from Bewitched.) Harrison Ford, Kiefer Sutherland, and Jake Gyllenhaal have all elicited her approval, but none of those bothered me, either. And she tolerates my crushes on Chris Jansing, Winona Ryder, and Maggie Gyllenhaal, so that's cool. (We have to figure out a way to sneak into a Gyllenhaal family reunion. Hmm.)

(We just heard that this season, Tucker will be on Dancing With the Stars. TW says she will watch through her fingers. I'm planning to spend lots of time in other rooms.)

Ever the WASP, my first instinct is just to sit in silent judgment of her, letting the daggers from my eyes and the icicles from my word balloons (“if you like that sort of thing”) do a number on her self-esteem. But I don't want to be divorced, and it's hard to maintain that kind of distance when a five-year-old and a two-year-old are running around.

Clearly, a retaliatory crush is in order. But on whom? The Wife doesn't care about politics, so I can't just watch Fox News, pick some blonde plutocrat, and elicit the same reaction. (“Honey, have you ever noticed how cute Monica Crowley is when she talks about Nixon?” Nope. Can't do it.) Lindsay Lohan or Nicky Hilton would only elicit mild disappointment, rather than the intestinal convulsions that are so clearly called for. Angelina Jolie is simply assumed. No, I need one that would strike her as really gross.

Cher would work, except I'm heterosexual. Courtney Love would gross her out, but she'd gross me out, too. No, I need someone really disturbing, someone who would generate the “I don't know you at all” response to mirror my own.

Hmm.

Martha Stewart? No. Katie Couric? Huh-uh. Hillary Clinton? Getting warmer, but no.

Wait, I've got it!

Jessica Simpson.

The divorced/virgin, Christian/slut, celebrity-for-no-particular-reason herself. That'll work. And I hear she's available!

Revenge shall be mine...

Friday, August 25, 2006

 

Memo to the Scientific Community

To: Scientists

From: Dean Dad

Re: Pluto

I have been fond of you folks for a long time. I remember sitting through Cosmos, watching Carl Sagan juxtapose himself against the infinity of the universe, and wondering which was more tedious, but not holding it against you. At times, in grad school, when attempting an especially tight corner in my 1989 Tercel hatchback, I would reflect on my high school physics teacher's claim that centrifugal force was 'fictitious,' and decide that she must have been smoking some of Sagan's stash. That, too, I could forgive.

You folks have given us the internet, without which blogging would be a bit more tedious. (Anyone else out there remember 'zines?) You've given us central air, which I consider an advance on the order of a major vaccine. Hell, you've given us vaccines.

But now you go and take away an entire *(#%# PLANET?

I don't think so.

The Boy has nine (count 'em!) glow-in-the-dark planets hanging from his ceiling, arranged in order around the overhead light, which doubles as the sun. I ain't takin' Pluto down. You people can just stick that in your telescopes and smoke it, just like that Sagan guy. And look what happened to him!

That is all.

 

Ask the Administrator: Do They Need Doctorates?

A fellow blogger writes:

I'm the new director of graduate studies for our humble MA program, and a number of graduate students in our program are convinced that you can make a career of teaching composition or literature in a CC with only an MA. My understanding is that you can, at best, get part-time or adjunct positions with only an MA, that these days, CC's want Ph.D.'s too, because the market is glutted with them. Now, for some of them – the ones who are geographically limited and have spouses with good-paying jobs – adjuncting would be fine. But I want to make sure that the ones who want full-time, career-track jobs (tenure-track or the equivalent) get good advice. Should I be advising them that CC comp and lit teachers still need the Ph.D.? Or are they right? Can they make it with the MA alone?


Yes.

Sorry.

Yes, it's possible to get a full-time tenure track lit or comp position at a cc with only an MA. Possible, but not bloody likely.

I'll have to ask the blogosphere how this plays out in different parts of the country. I suspect that in certain parts – say, the rural Midwest – the degree glut is less pronounced than it is on the coasts. And certainly every college has its individual quirks.

Speaking from what I've seen in Northeastern suburbia, I'd say that it's unusual for someone with only an MA to get hired in a glutted field like English or history. (In fields like Nursing or business, it's much more common.) We've hired some folks at ABD status, which shows up as 'MA' on the website, but don't let that fool you. In those cases, successful attainment of the doctorate is usually a condition for tenure. We make a distinction between ABD's and terminal MA's.*

Many of our senior faculty have terminal MA's, but they were hired in a different time. The unfortunate upshot is that we have 'credential compression' while we have to avoid 'salary compression,' with the unintended consequence that new folks with doctorates get paid at the level that new folks with terminal MA's were paid, back in the day. It's not fair, at some level, but it's what the market will bear and what our budgets will bear.**

I'm guessing that some of them are hoping to parlay 'loyal adjunct' status into 'tenure track' status. Again, it's possible, but not likely. Given the choice between a newly-minted doctorate and a terminal MA who has been adjuncting for years and not publishing, what would you do? In very glutted fields like English, it would be extraordinary for someone with a terminal MA to break through.

For those whose ambitions top out at adjuncting, a terminal MA is perfectly fine. Alternately, some of those might want to look at some kind of 'alternate route' teaching certification and become high school English teachers, where lacking a doctorate is the normal state of things. Thirty years ago, a terminal MA probably wouldn't have precluded a tenure-track position at a cc. Now, it pretty much does, at least in the parts of the country with plenty of doctorally-qualified candidates hanging around.

Faithful readers: does this hold true in your neck of the woods?

Have a question? Ask the Administrator at ccdean (at) myway (dot) com.

*The faculty union contract actually contains a clause defining a 'doctoral equivalent' for pay bonus purposes. X number of graduate credits beyond the Master's earns you 'doctoral equivalent' status. I have a major philosophical issue with that – to me, either you've wrestled the bear or you haven't – but there it is. We also grant 'doctoral equivalent' status for JD's and MFA's, on the theory that they're terminal degrees in their respective fields.

**To the libertarians out there: yes, I know, what the market will bear is fair by definition. I've heard the argument, I've read the argument, I know. I just don't buy it. In the real world, options are limited, and choosing the 'least bad' option doesn't make it good.

Thursday, August 24, 2006

 

Dibs

IHE did a piece last week on an adjunct alleging age discrimination because she was passed over for a series of full-time positions that eventually went to younger candidates. Several commenters called this piece to my attention, apparently to make the point that my thesis about discrimination against the young was misplaced.

I don't know enough about the specific case referenced in the article to say whether the allegations in that particular case are true. What I will say is that the article speaks to a different, if related, issue.

Many colleges have a cohort of long-term adjuncts. These are people qualified to teach, who have taught classes at a given school for years or even decades. Many of them are outstanding teachers, and they've demonstrated loyalty to their departments. No argument there. In my experience, many long-term adjuncts are the spouses of executives, so low pay is not an issue for them. Some have full-time jobs elsewhere, and pick up a course or two each semester as a sort of hobby that happens to give them some walking-around money. (We have one of those – a high school teacher by day – who has taught one evening course per semester here since the early 1970's. He loves what he does, and we're lucky to have him.) And some are dedicated academics who are desperately looking for full-time teaching jobs. I worry about the last group, especially since I was once a member of it.

Underlying the article and several comments has been an assumption that long-term adjuncts earn 'dibs' on full-time positions when they become available. I strenuously disagree.

Assuming that a department exercises meaningful quality control when re-hiring adjuncts – and that's a HUGE assumption – I agree that someone who has stuck around over time has earned respect as a teacher. That's not the same as agreeing that he has earned first refusal on a full-time job.

The best way to explain my objection is to play out the 'dibs' scenario. What if we adopted a 'take a number' system, like a deli?

First, and most obviously, we would exclude anybody whose research is recent. We'd guarantee that undergraduates would never get full-time professors whose graduate training is current. By drawing only from the immediate geographic area, we'd prevent people with new contacts and different kinds of training from making the cut. So from a purely academic/substantive point of view, it's a terrible idea.

Second, we would, in effect, indefinitely extend the poverty of graduate school. The hard sciences have 'postdocs,' in which recent Ph.D.'s earn low adult salaries while looking for professorships. In the humanities and social sciences, and at teaching institutions generally, this is not the case. If seven years of doctoral training were to be followed in all cases by 5-15 years of mandatory adjuncting, circling the airport while waiting for a runway to clear, the human impact on the people involved would be devastating. Right now, the folks who spend years waiting for a runway to clear are doing so by choice, even if the options from which they have to choose suck. Adopting a 'take a number' system would make it mandatory. The most economically vulnerable – young parents, people from working-class backgrounds without parental wealth to fall back on – would drop out, regardless of the quality of their work. We would be the poorer for losing those perspectives.

Third, we would effectively allocate teaching positions based on either wealth of spouse or tolerance of poverty, rather than on ability. The first would be a drastic and offensive historical step backwards, and the second even more so. If we want professors to be respected as professionals, they should be able to make adult livings.

Fourth, we would freeze institutions into hiring decisions made with different criteria, many years ago. Needs change. For example, when a given adjunct was hired twenty years ago, the main desiderata were ability to teach in a classroom and ability to show up for the timeslots nobody else wanted. Now, we need someone who is fluent in online teaching. Is it automatically the case that somebody hired in 1986 to teach a couple of evening sections is the best candidate? I say, if she is, she'll win in a fair fight. (And if the fight truly isn't fair, then by all means, use the relevant legal channels.) But if she isn't, the college should be free to hire someone more capable of doing the job that needs to be done now.

Fifth, and this is what I always get flamed for but it's still true, there's something to be said for having generational diversity on your faculty (just as there is for racial or gender diversity). If we build in so many hoops that nobody under 40 gets hired for anything ever, the blind spots will simply get worse. If you prefer arguments from race or gender, then we'd simply be extending the reign of the old white guys and executive wives for another generation. Departmental inbreeding, already a serious problem, would get even worse.

When a college has a full-time opening, the fair and reasonable thing to do is to post it publicly and have an open search. That means nobody has first dibs. If the loyal long-term adjunct is the best candidate, great – I've hired a few of them myself. But if not, the college needs to be free to pick the best candidate. Choosing somebody with new and interesting research, a new set of professional contacts, and training in the latest methodological innovations is not prima facie evidence of ageism. It just isn't. It's doing what is supposed to be done.

Am I being a cruel and heartless bastard to the long-suffering adjuncts out there? No. I say it's much more cruel and heartless to feed people false hope, the better to keep exploiting them. If people would drop the 'dibs' fantasy and face the reality of the marketplace, chances are, some would find other lines of work. In the long term, that's probably the best outcome for some people. Others would stop beating themselves up and make peace with their situations, and that's fine, too. The hard and objectionable fact is that there are fewer full-time jobs than people who want them, so some people are going to be shut out. Should that be decided based on who brings the most to the table, or based on who can tolerate a life of mac-and-cheese the longest?

Wednesday, August 23, 2006

 

Ask the Administrator: References for Admin Job Searches

A gratifyingly-frequent correspondent, currently a department chair, writes:


I've been having some problems with the entirely closed environment at [current college], which means that new people and outsiders are regularly stonewalled and sand-bagged. So I decided to serve out the rest of the academic year, and then move on.

As I prepare to go on the job market again, I'm wondering what to do about recommendation letters. There are a number of administrative positions, such as directorships and chairships, for which I feel highly qualified. Should I have a recommendation letter that speaks to my administrative skills? The only people who can speak directly to my ability to lead a department are my faculty. But how weird is it to have a recommendation from essentially one of my employees saying "She's a great leader!"? Will search committees think that I threatened or cajoled them into writing a letter for me? I do have two friends who are chairs of their respective departments, but they don't know much about my discipline. Should I go to them instead? Finally, what skills should I highlight in my job letters for administrative positions?



It's a pet peeve of mine that criteria and conventions that make sense in faculty searches are often taken as normative for all job searches. Letters of recommendation strike me as problematic generally, but they're especially out-of-place for administrative positions. (I did an earlier rant against letters of recommendation generally.)

For graduate students, newly-minted Ph.D.'s, or people in 'visiting' positions, there's nothing weird in looking for a tenure-track faculty job. It's what you're supposed to be doing. Asking your dissertation advisor and any other muckety-mucks you can corral to write you letters is expected, role-appropriate, and The Way Things Are Done. Whether it makes sense as a common practice is another issue, but it doesn't put the candidate in a weird position.

Once you're in a tenure-track or administrative position, though, asking for letters from people you work with is letting them in on a secret, which is that you're thinking of leaving. This information can, and sometimes will, be used against you. (It can work in your favor only once you actually have an offer in hand. Then you can play the 'counter-offer' game. Until you have an offer in hand, you're acutely vulnerable.)

Many administrative positions will ask instead for a list of references, usually with a caveat to the effect that references will be contacted only for candidates at the 'finalist' stage. This is better, but still risky; you could, theoretically, keep your search effectively secret as long as you aren't really in the running anyway. Once someone decides you're viable, though, the calls will be made, and your secret will be revealed. If it results in an offer, great. If you make it to the finalist stage but don't 'win,' though, you're actually compromised at your current job.

When I was on the admin market at my previous college (the search that brought me here), I had a few colleagues with whom I was particularly close, and we all shared a sort of 'siege' mentality about working there. Since several of us were looking for different jobs at the same time, we formed a sort of circle of recommendations, founded on a sort of honor among thieves. It worked, obviously, and I've been a reliable recommender for folks who are still stuck there. Here, though, I don't have a clue how it would work.

Making matters worse, many places require that at least one recommender be someone to whom you report directly. (I've been told that the standard expectation of three letters should be distributed as follows: one supervisor, one peer, one underling.) I was spared that in this search, simply by blind luck. Asking your Dean to write you a letter sends a pretty unambiguous message that you have one foot out the door. If the other foot doesn't follow in short order, your remaining time at your current college could be quite tricky.

I don't know an elegant solution to this on the individual level. In your particular case, if you still have good friends at your previous college, it wouldn't be unreasonable for one of the three references to come from there. Systemically, the obvious solution is to do away with letters of recommendation above the assistant-professor level (except for basic verification of employment, which could be done confidentially through HR). I would strongly encourage anybody on a search committee for an admin position to consider dropping the recommendation requirement, to see if it generates a larger and better pool of applications, much as dropping the SAT requirement frequently results in larger and better applicant pools for entering classes. (I'm guessing it would.)

In terms of what to emphasize in your application, keep in mind that being hired to an admin position is very different from being hired to a faculty position. Many excellent professors make lousy managers. The single-mindedness that can make for a brilliant scholar can also make for the manager from hell. As long as you can satisfy the relevant 'faculty experience' requirement, I'd focus mostly on your talents as a manager. Have you dealt successfully with difficult people? Have you seen projects through to fruition, and were those projects successful? Did you manage change well? Have you handled crises? What budget-management experience have you had? What do you do well that others don't?

I wouldn't worry about being discipline-specific; other chairs, if you can trust them, would be good candidates as references. I don't much care whether the chair of the Art department is the best artist in the department, or even the best teacher. I care that s/he is the best multitasker, the cool head in the crisis who is patient with the annoying little details of running a department (is there enough modeling clay? Did the lab assistants get the right-to-know training on time? Is the kiln properly vented, and were the relevant papers filed on time? Are student issues handled promptly and intelligently?). Those skills are position-specific, rather than discipline-specific.

Wise and thoughtful readers: do you know a graceful way to handle the recommendations issue if you're in a job that you aren't 'supposed' to be leaving?

Have a question? Ask the Administrator at ccdean (at) myway (dot) com.

Tuesday, August 22, 2006

 

Upgrades and Loss

The Wife is getting a little restless with the stay-at-home thing, so she's starting to put out feelers for part-time jobs. I think it's a great idea, since I'm ready to go postal after just a few days at home. A few years, and things would get ugly.

Her resume and cover letter are on a 3.5 inch floppy disk. Our computer doesn't have a floppy drive (memory key, check; CD writer, check; floppy, nope). My work computer doesn't have a floppy drive. Library computers don't have floppy drives. It's remarkably hard to find a floppy drive.

Since the format isn't really all that old, I'm sure I'll dig up something, somewhere. But I can't help but wonder about a larger issue.

I have hardcover books from the 1920's. They're perfectly readable now, if a bit dusty. I've got an embarrassingly large collection of nonfiction paperbacks dating from the 1960's and 1970's, all still perfectly readable. But a disk from 2003 necessitates a freakin' scavenger hunt to be read.

Given that the point of information technology is managing information, this strikes me as perverse.

As the rate of format change accelerates, it's becoming remarkably easy to lose information in the gaps between generations of technology. Each memory upgrade requires a little more forgetting. If 2003 is ancient, what of stuff from the 1990's? 1980's?

Think of all the information effectively lost due to format changes: Betamax tapes, 8-tracks, LP's, reel-to-reel tapes (remember Ampex?), 78's, kinescopes, Super 8 mm film. Soon, photographic film will be consigned to the dustbin of history, along with cassette tapes and floppy disks. (As someone who went to college in the 1980's, the idea that cassette tapes are fading from view is hard to process. Sure, they're fragile and cumbersome, they sound like shit, and they're never rewound where you want them, but they marked an era! Anyone else remember that annoying 'bloop-bloop-bloop' at the beginning of prerecorded tapes? Remember 'metal' tapes?) Readers of a certain age: remember the floppy disks that were actually floppy? Big, black, 5 ¼ inch suckers that made a satisfying 'whup whup' sound when you waved them in the air? They held about twenty pages of prose, when they held anything at all. (I had a professor in college who couldn't understand why his disks never worked, until he mentioned that he kept them stuck to his refrigerator with a magnet.) Hell, I remember dual-floppy IBM PC's in the college computer center wherein I'd insert the 'program disk' in one drive and the 'document disk' in the other. Of course, I also remember President Reagan. Kids today...

But I digress.

Last year, in an attempt to satisfy my jones for weird-ass music, I signed up for one of those online music subscription services. (It rhymes with 'wahoo.') For a monthly or yearly fee, I could download as much music as my mp3 player could hold. The music came with DRM encoding, which is basically a digital self-destruct mechanism that would kick in if I hadn't kept my subscription up to date. I was psyched about it at the time, since it allowed me to try out the latest offerings from whoever struck my fancy. I couldn't burn CD's, but since I mostly listen in the car, that didn't really bother me.

A few months into my annual subscription, the *$%^(&#%# folks at 'wahoo' decided to double the price. So I'm cancelling, which means losing access to everything I've been listening to for the last year. It's not catastrophic, certainly, but it's slightly unnerving. I knew it was essentially a rental service, but since I didn't anticipate the price summarily doubling, I didn't envision dropping it. Psychologically, I hadn't quite made the switch. Meanwhile, CD's I bought in the early 1990's are still perfectly listenable.

Blogging is a disposable medium too, though I take some comfort in the 'archives' section. At least with blogging, I can always print it out and keep it, if the mood strikes. (It hasn't, but it's nice to know it's possible.) Still, I think of many other blogs as part of my repertoire, and when one of them is taken offline, I'm helpless. There's no blog repository, as far as I know. (Techno-savvy readers are invited to correct me on this one.) If Bitch were to take down her blog, I'd lose a major cultural touchstone. With magazines or newspapers, we at least have libraries. With blogs and websites, not so much.

Future historians will have their hands full. Reading letters from 19th century figures isn't that hard, once you track them down. But how will future historians recover emails from long-lost IT systems? Two or three 'migrations' down the road, how will anything be recoverable? How will they listen to podcasts or audiobooks when the technology used to 'read' them is, itself, long gone? If someone decides, in 2015, to do a history of blogging, will much of the early (current) material even be accessible?

The Wife will get her resume and cover letter back, one way or another. Luckily, it's only going back three years. Much more than that, and we'd have been better off if they had just been typed.

Monday, August 21, 2006

 

A Quart of Liberty

At my previous college, I was one of the very few liberal-arts types to make it into administration. That, and my selective pickiness about grammar, made me the go-to guy for administrators when there was a grammatical or syntactical issue. I remember the then-registrar calling me from out of the blue to ask what an “abstract noun” was. I gave the example of liberty. It exists, certainly, but it's not like you can measure a quart of liberty.

The recent move to impose national standardized tests on higher ed strikes me as trying to package liberty in quarts.

An alert reader sent me a link to this article from the Washington Monthly about measuring student learning at various colleges. Taking a consumerist view as given, the article blames complacent pointy-headed intellectuals for trying to hide their rent-seeking behavior behind an ink cloud of evasions. If we were to subject colleges to rigorous measures of student learning, the article claims, soon the last would be first, the smug ivy leaguers put in their place, costs cut, and enlightenment achieved across the land.

Well, no.

A few top-of-my-head objections:

- If you're measuring outcomes at graduation, you're largely measuring selectivity of admissions. Is it more admirable to take a kid from 'remedial' to 'educated,' or to take a kid who's already brilliant and simply not mess her up? If you only look at outputs, you're largely measuring inputs. Lo and behold, colleges that admit lots of smart rich kids from tony prep schools will do the best job. You heard it here first.

- Different majors build different skills. A chemical engineering major will probably develop a different skill set than a drama major. (If not, at least one of those programs is doing something wrong.) Getting around the wild diversity of majors would require focusing on 'lowest common denominator' skills, much as our K-12 system does now. This strikes me as backwards. Our higher ed system is the envy of the world. Our K-12 system is, at best, limping. If anything, the emulation should go the other way 'round.

- Is there really a consensus on what students should learn in college? Did we have that conversation when I was in the bathroom? I don't remember that. Should we teach students how to make money, how to be critical citizens, or how to find themselves? Should we measure success by starting salaries, philosophical sophistication, sensitivity to diversity, public speaking skills, or math tests? Before we figure out how to measure, shouldn't we first figure out what to measure?

- (A quick aside on starting salaries: these fluctuate far more than the quality of education delivered. For example, IT grads in 1999 could write their own tickets. In 2003, they couldn't get arrested. Neither has much to do with the quality of instruction in college IT programs.)

- Doesn't the GRE already exist? I mean, if we really just want to look at lowest-common-denominator skills, isn't the instrument already out there? Just set a few scores across the categories as graduation requirements, and presto! No Undergraduate Left Behind.

- Since when did we all agree that educating undergraduates is the sole purpose of American higher education? I didn't get that memo. Research universities perform important roles in pursuing breakthroughs in many key fields of human endeavor. Community colleges help displaced workers retrain for other careers, sometimes eschewing academic degrees for vocational certificates. As different as they are, both of these functions serve the public. Are we suddenly to just discount or ignore these functions? If so, why?

- Since when did we all agree that students all want the same thing? Simply put, they don't. Some want to learn everything they can in a given field. Some want to learn enough to get a job that pays well, but no more than that. Some see college as primarily a social experience with a secondary credentialing function; classes are, at best, an ancillary nuisance. The best party school may not be the best teaching school, which may not be the best research school or football school or whatever else.

- How would religiously-affiliated colleges fit with this? If a college sets 'leading a spiritual life' as one of its primary missions, how do we measure that? (“Students at BYU are 15% more spiritual than students at LSU.”) Given the prominence of religiously-affiliated colleges and universities in America, and the diversity of expressions of faith, this is not a trivial concern.

Besides, you'd have to be living in a cave not to discern that the real agenda behind this movement is cost-cutting. It's punitive, and would be executed accordingly.

None of that is to deny some of the central charges animating this movement. Yes, tenure protects some egregiously ridiculous people. Yes, large lecture halls are crappy learning environments. Yes, tuition at some colleges is going up much faster than family income. (To the people most concerned about that, I say, HELLO! CC'S HERE! HAALLLOOO!!!!) Yes, at many colleges, people are hired to teach, but fired for not doing research. Yes, much of the research that is produced is absurd, or annoyingly esoteric, or even just wrong. Yes, the adjunct trend is offensive on a great many levels. Yes, the internal politics of colleges and universities often stymie productive reforms. (Astute readers of this blog may have noticed me spouting off on that every now and then.) Yes, college reputations are often hard to trace to anything resembling a 'cause.' All of that, granted.

But to respond with a call for a mandatory systemwide lobotomy just doesn't help, except possibly as a bogeyman. A single blunt instrument is inappropriate for such a diverse field. It's a pseudo-populist gesture designed to elicit knee-jerk affirmation from people who know a little but feel a lot.

I'd much rather engage in the (considerably harder) work of re-engineering our ways of doing business to achieve stronger outcomes appropriate to each kind of institution. That may well involve looking closely at what we reward (and whether anybody should be made bulletproof), at how we recruit, at what and how we teach. Regular readers know my impatience with the status quo on many of these. The way to do that, it seems to me, is first to accept that different colleges have different missions. Until we can sell the public on that, we'll be stuck playing defense against one-size-fits-all solutions like this. It's easier to measure a quart of snake oil than a quart of education.

Friday, August 18, 2006

 

Academic Rubbernecking

You know how, when you're passing a nasty accident on the highway, you just have to look? How even when you know the milk is sour, you have to smell it anyway? How you get an itch, and you know that scratching it will only make it itchier, but you scratch it anyway? How you put your face two inches from your toddler's butt to see if she's carrying a load, when you have every reason to believe that she is? (That great marital moment: “Eewww, she's stinky! Here honey, you smell her!”)

You know that awful, masochistic part of your brain? The part that you'd think evolution would have corrected by now, and intelligent design wouldn't have put there in the first place?

Mine acted up recently, so against every better judgment, I took a gander at the major journals in my scholarly discipline. I do this every so often.

Bad idea.

Without giving too much away, I'll just say that my field has a way of chasing down new ideas, cornering them, and beating them with shovels until all that's left is a bad smell. The kind of field where you could swap this year's conference program with one from 1994 and not notice the difference, except for a few dead people.

Back in my grad student days, I faithfully attended every conference my meager funds would allow. I worried about what to wear, until I noticed that absolutely everybody there looked like they'd worn their clothes during the spin cycle. I shared rides, slept on floors, and eschewed official 'conference hotels' for less expensive and much seedier environs. (On one memorable trip, I slept on the floor of the house belonging to my roommate's bud, who later achieved a sort of notoriety as a pioneering web pornographer. I choose not to devote too much thought to this.) Helpful traveling tip: although they're geographically close, the 'tenderloin' district in San Francisco has a very different flavor than does Market Street. Trust me on this one.

What I remember most about the conferences was being nervous all the time. The presentations were always, without exception, dreadful beyond belief. The hallway-nametag-dance, merely amusing in grad school, became grotesquely demoralizing during my days at Proprietary U. The book fair was the only refuge, but even there, I was constantly aware of both my limited funds and the grotesque mating dances of (prospective) authors and publishers going on around me. The contrast between grandiose titles and mundane content became a sort of running joke. (“Lesbians! Monkeys! Flaming Cheese!: A methodological critique of Anderson-Hysgaard's neo-Foucauldian problematic,” $29.95 in paperback from Up Against the Wall Motherfucker! Press, an imprint of Ballantine. Desk copies free with course adoption.)

The mandatory 'networking,' of course, was the worst. All the same faces every year, the junior ones angling and the senior ones conspicuously drunk. I learned that my advisor favored Scotch or Bourbon, but the Charismatic Leader preferred martinis. Grad students went with beer, trying to be cosmopolitan by getting the darkest possible brew. The ability to tolerate liquid pumpernickel was taken as a sign of sophistication. I once silenced a table by ordering a Yuengling.* You'd think I had suggested a round of dwarf-tossing.

The last time I attended the major national conference for my field was my first year at the cc. (That was before the crackdown on out-of-state travel.) The reactions to a cc nametag were so toxic that I haven't returned, and don't plan to. Life is too short to be treated like the turd in the punchbowl. Still, out of a vestigial sense of obligation (I'm big on those), I check the program each year to see what's going on. I feel the familiar spike in my blood pressure, remind myself that I don't have to do that anymore, and get back to work.

Until next year around this time...

*For my money, still the finest four-dollar six-pack on the market. Of course, if I really wanted to make a point, I should have ordered a Genny Cream. Culturally, that would have amounted to standing on the table and favoring the group with an a cappella version of “Cotton-Eyed Joe” punctuated only by the 'ting' of tobacco juice hitting the spittoon. Maybe next year...

Thursday, August 17, 2006

 

SUV's Should Be Transparent

Guest-blogging over at Bitch, Orange has a lovely rant against SUV's. Among the charges leveled against the behemoths are:

- They get lousy mileage and deplete our dwindling oil supply.
- They pollute horribly.
- They contribute to global warming, both directly (emissions) and indirectly (gas mileage).
- They're tall and opaque, causing severe visibility problems for everyone else on the road.
- They're often driven by overentitled yuppie assholes.
- Minivans are better for carrying passengers and cargo, and they're safer.
- They don't help the safety of their drivers, and they endanger everyone else.

Other than the yuppie assholes part, I couldn't agree more. (I'm actually worried that as more used SUV's come on the market, we'll see them driven by 17 year olds. I'd prefer the yuppies.) In fact, I'll pile on some more.

- They cause severe visibility hazards in parking lots. I drive a compact car. When I come out of a store to find my car flanked by SUV's, I know that backing out of the parking space will be an adventure. Short of a periscope, there's no safe way to do it. I say, make the damn things transparent, or don't make them at all.

- They instill false confidence. Do you know how much 4WD helps with braking on snow or ice? None. Zero. Zip. 4WD helps with acceleration in mud. It does absolutely nothing to help you stop. That's why you see so many SUV's in ditches by the side of the road every time it snows. I have passed many a stranded Grand Cherokee in my little sedan over the years.

- As Keith Bradsher noted in his excellent High and Mighty, SUV's are generally the worst-built, least-reliable vehicles on the road. (This year's Consumer Reports auto issue reveals that this is still true.) They are the bastard children of regulatory loopholes and anthropologists gone bad.

- By becoming the Manly Alternative to minivans and station wagons, they've stigmatized minivans and station wagons. Now if you have more than two kids or you carry lots of stuff, you have to choose between an SUV and feeling like Pretty Polly Sunshine in her flowered dress.

One of the highlights of recent months occurred when I stopped for gas. When I pulled in, there was a Ford Excursion fueling up at the pump in front of mine. As I drove away, it was still fueling up. At three dollars a gallon, that has to hurt. Am I proud of chuckling “you dumb bastard” as I drove off? A little.

SUV's are textbook examples of what Thorstein Veblen called “honorific waste.” Their profligacy is precisely their appeal. Driving an Escalade sends the message that you're so ridiculously successful and important that you can afford to drive what amounts to a financial tapeworm. Frugality is for losers. Your vanity is far more important than, say, other people's ability to see. Let others worry about global warming and world peace and their fellow man; you're King of the Road, above such petty concerns. They're pop Nietzscheanism, which is almost the epitome of Bad Ideas.

I've never been much of a fan of pickup trucks, either, since I'm old enough to remember when they were the vanity vehicles of choice. But at least pickup trucks are easier to see around when you're pulling out of a parking space, and they're inarguably useful for certain applications (like hauling stuff). If you live in a mostly-paved part of the country, I can't think of a single application (other than phallic symbol) for which SUV's are better suited than anything else.

Hummers are the expandio ad absurdum of SUV's. They're almost comical. I honestly worry about the psychological states of their drivers. If you aren't barreling over sand dunes with a howitzer, you have no business driving one of these. I call them Compensators. They scream to the world, “I have issues!” And they suck gas like it's going out of style, which, in a way, it is.

A very wise person once shared something with me. She said that houses appreciate, and cars depreciate. Therefore, it's okay to stretch when buying a house, but you don't want to spend too much on a car. (Exceptions exist at the extremes, but I'm talking about the middle class. This was also before the latest housing bubble, which is finally deflating.) The older I get, the more convinced I am that this is a good rule. When I see forty-thousand-dollar SUV's in apartment complex parking lots, I can't help but wonder.

Ford makes a hybrid SUV, which, to me, is like a lowfat twinkie. It's a gesture that says “if you really cared, you wouldn't buy this in the first place.” Honda makes a hybrid Civic. Ford is losing money and market share precipitously. Honda is gaining ground. It's almost as if 'cause' is somehow connected to 'effect.' I know that Detroit has never been great about connecting the dots, but sheesh.

Thanks, Orange, for the chance to second a fine rant.

Wednesday, August 16, 2006

 

Home Again

The Wife took The Boy and The Girl to a major amusement park out-of-state on Monday, returning yesterday. They went with two other Moms and four other kids, so it was a full-on assault.

By all reports, they had a great time at the park. TB was on his very best behavior, and TG was mostly a good sport.* TW brought the camera, and were it not for the burden of pseudonymity, this would be a photo essay. They spent the night at someone's house, and returned yesterday thoroughly exhausted.

It also meant that I was a bachelor for an evening.

The very rare bachelor evenings have a discernible rhythm. For the first hour or so when I get home from work, it's kind of nice. I can change clothes in peace, read the mail as soon as I bring it in, sneak a snack, and exhale.

Dinner is weird, since it's solo. I don't care much for that.

For the rest of the evening, I'm just off-balance. Nothing quite feels right. I'm constantly thinking about what I 'should' do to take advantage of the time. TB hates the sound when I grind coffee, so I use his absence to do that. I read the mail at the dinner table. By about 8, I realize just how much I've come to depend on the family for a kind of discipline. Even when it makes me grumpy, the routines we have keep me grounded. In the absence of those routines, I feel unmoored. Not sad, exactly, just lost.

I lived alone for about a year and a half after starting my first full-time teaching job, but before getting married. But that was before I had ever been married, so it was different. I didn't have them to miss.

They all greeted me with smiles and hugs when I got home from work yesterday. I was amazed at how much I missed them in a single day. The Girl was moody, The Boy was clingy, The Wife was exhausted, and it was great.

There's nothing like having the whole clan together. I may be tired, but I'm not lost.

*The exception came at night, as TB shared a 'tent' in the kitchen with his friend, whom I'll call Beelzebub (not his real name). Beelzebub and TB spent hours discussing farts, demonstrating farts, simulating farts, laughing at farts, and bragging about how loudly their Dads fart. Nice to know that I'm still on his mind, even when he's away...

Tuesday, August 15, 2006

 

Required Reading

It's a little, um, indiscreet, but it's also DFOT.

Kung Fu Monkey

Highlights:

I am absolutely buffaloed by the people who insist I man up and take it in the teeth for the great Clash of Civilizations -- "Come ON, people, this is the EPIC LAST WAR!! You just don't have the stones to face that fact head-on!" -- who at the whiff of an actual terror plot will, with no apparent sense of irony, transform and run around shrieking, eyes rolling and Hello Kitty panties flashing like Japanese schoolgirls who have just realized that the call is coming from inside the house!

I may have shared too much there.

To be honest, it's not like I'm a brave man. I'm not. At all. It just, well, it doesn't take that much strength of will not to be scared. Who the hell am I supposed to be scared of? Joseph Padilla, dirty bomber who didn't actually know how to build a bomb, had no allies or supplies, and against whom the government case is so weak they're now shuffling him from court to court to avoid the public embarrassment of a trial? The fuckwits who were going to take down the Brooklyn Bridge with blowtorches? Richard Reid, the Zeppo of suicide bombers? The great Canadian plot that had organized over the internet, was penetrated by the Mounties on day one, and we were told had a TRUCK FULL OF EXPLOSIVES ... which they had bought from the Mounties in a sting operation but hey let's skip right over that. Or how about the "compound" of Christian cultists in Florida who were planning on blowing up the Sears Tower with ... kung fu?

And now these guys. As the initial "OH SWEET MOTHER OF GOD THEY CAN BLOW US UP WITH SNAPPLE BOTTLES!!" hysteria subsides, we discover that these guys had been under surveillance, completely penetrated, by no less than three major intelligence agencies. That they were planning on cell phones, and some of them openly travelled to Pakistan (way to keep the cover, Reilly, Ace of Spies). Hell, Chertoff knew about this two weeks ago, and the only reason that some people can scream this headline:

"The London Bombers were within DAYS of trying a dry run!!!"

-- was because MI-5, MI-6, and Scotland Yard let them get that close, so they could suck in the largest number of contacts (again, very spiffy police work). The fact that these wingnuts could have been rolled up, at will, at any time, seems to have completely escaped the media buzz.

This is terrorism's A-game? Sack up, people.

Again, this is not to do anything less than marvel as cool, well-trained, ruthless law-enforcement professionals -- who spent decades honing their craft chasing my IRA cousins -- execute their job magnificently. Should we take this seriously? DAMN STRAIGHT we take this seriously. Left unchecked, these terror-fanboy bastards would have gone down in history. These cretins' intent was monstrous; they should, and will, all go to jail for a very long time. This is the part where we all breathe a sigh of relief that there are some actual professionals working the job in some countries.


(skip)

Maybe it's just, I cast my eyes back on the last century ...

FDR: Oh, I'm sorry, was wiping out our entire Pacific fleet supposed to intimidate us? We have nothing to fear but fear itself, and right now we're coming to kick your ass with brand new destroyers riveted by waitresses. How's that going to feel?

CHURCHILL: Yeah, you keep bombing us. We'll be in the pub, flipping you off. I'm slapping Rolls-Royce engines into untested flying coffins to knock you out of the skies, and then I'm sending angry Welshmen to burn your country from the Rhine to the Polish border.

US. NOW: BE AFRAID!! Oh God, the Brown Bad people could strike any moment! They could strike ... NOW!! AHHHH. Okay, how about .. NOW!! AAGAGAHAHAHHAG! Quick, do whatever we tell you, and believe whatever we tell you, or YOU WILL BE KILLED BY BROWN PEOPLE!! PUT DOWN THAT SIPPY CUP!!

... and I'm just a little tired of being on the wrong side of that historical arc.


Yup.

 

The Gray Ceiling

A commenter to yesterday's post alerted me to this article about “the gray ceiling.” It's a newish term used to describe workplaces in which the promotion and/or employment prospects of Gen X'ers are stymied by a huge cohort of Boomers (and pre-Boomers) who just won't leave.

It's a nice piece. It makes the cogent point that, unlike other glass ceilings, the gray ceiling is particularly immune to legislative or judicial remedy. If a reasonably large company promotes only men or only whites, it will have a hard time defending itself against a discrimination claim. On the other hand, if it never puts anybody under 50 in a position of real responsibility, the courts wouldn't even recognize a claim.

Under federal law, age discrimination is actionable only when it hurts the old. When it hurts the young (say, through excessive and/or specious requirements for experience), it's hunky-dory. Hiring for diversity in race or gender is smiled-upon; hiring for diversity in age is actionable.*

I saw this firsthand last year, when I served on (but didn't chair) a search committee for a very senior position. The chair of the committee, one of my very favorite people here, drew up a template for awarding points to the c.v.'s we received, so we could decide which candidates to invite to campus for interviews. So far, so good. The job posting listed a few requirements and several preferences, so we had a pretty good idea of what criteria to use for the first cut.

The eye-opener for me was how the 'experience' category was coded. Experience at the desired level was valued linearly, so the five years between 15 and 20 counted for as much as the first five years. (The template itself has been relegated to the back files, so I'm working from memory.)

To my mind, this is blatant age discrimination. I'm willing to concede that the first few years of experience at a given level count for something; certainly I made some judgment calls in my first year of deaning that I wouldn't make now. But it strains belief to say that the marginal gain in proficiency at a job from year 11 to year 15 is as dramatic as the gain from year 1 to year 5. After a certain point, I'd wager, people have pretty much learned what they're going to learn at a given position. After that, they're mostly repeating themselves. I'll take it farther. There's a point somewhere along the line – I won't venture to say where – at which performance probably actually starts to drop. Not to the same degree for everybody, and probably not always at the same point, but it's there. We all have blind spots. As those blind spots get ignored for longer and longer, nasty stuff can fester in them. The occasional change at least changes the blind spots, so stuff long-ignored can be addressed, and the stuff newly ignored will probably be able to coast on capital for at least a little while.
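For the quantitatively inclined: here's a minimal sketch of the difference between the two scoring philosophies. The function names and weights are all invented for illustration; the actual template is long gone.

```python
import math

def linear_score(years, per_year=1.0):
    # Linear scoring, as on the committee's template: every year of
    # experience is worth the same number of points, so years 16-20
    # count exactly as much as years 1-5.
    return years * per_year

def diminishing_score(years, scale=5.0):
    # A diminishing-returns alternative (hypothetical): early years
    # count heavily, later years add less and less. log1p(x) = ln(1+x)
    # rises steeply near zero and flattens out afterward.
    return scale * math.log1p(years)

# Under linear scoring, the gain from year 15 to year 20 equals the
# gain from year 0 to year 5 -- which is the objection in the text.
early = linear_score(5) - linear_score(0)    # 5.0
late = linear_score(20) - linear_score(15)   # 5.0

# Under diminishing-returns scoring, the late-career gain is a small
# fraction of the early-career gain.
early_d = diminishing_score(5) - diminishing_score(0)
late_d = diminishing_score(20) - diminishing_score(15)
```

The shape of the curve is the whole argument: a logarithm (or any concave function, or a simple cap at, say, ten years) rewards the learning-curve years without pretending that year 19 teaches as much as year 3.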

The thirty-something hotshots in the article, by and large, got around the gray ceiling by leaving their companies and starting new ones. In much of the private sector, that's an option. Not so much for academia. As much as I'd enjoy setting up Dean Dad University, and running it according to my twisted ideology and devilish whims, it ain't gonna happen. (There's a movie coming out about a bunch of 18-year-olds who set up their own college, so they can spend their parents' money on beer for four years. Because certainly no such thing ever happens at established colleges.) The barriers to entry are astonishing, and the price competition with the public sector suicidal. Absent the support of some major sponsor (a religious denomination, the government, or profit-seeking investors), I just don't see it.**

(The other great exception for academia is tenure. Lifetime tenure, combined with no mandatory retirement, is a recipe for a gray ceiling. I've written on that before, so all I'll say is that a reasonable proposal might be to have tenure expire at 70. People can still be full-time faculty after 70, but only if they win a fair fight for the job. If that strikes you as unreasonable, ask yourself why.)

I can see the appeal of requiring excessive amounts of experience. It's easily quantified, there's a presumption of relevance, and nobody ever got in trouble for doing it. Since the class most hurt by it lacks the legal standing to argue a case, there's a certain “no harm, no foul” appeal. But that still doesn't make it right.


*The exception to this is when the gray ceiling also happens to coincide with disallowed versions of discrimination. You have to pick the right kinds of bias.

** I am open to being proved wrong on this. Messrs. Gates and Buffett are welcome to contact me with offers of pornographic amounts of money. I'll be happy to name buildings after them. Perhaps the Bill Gates entomology lab, in honor of all those Microsoft bugs? The Gates and Buffett memorial center for Anti-Trust Studies?

Monday, August 14, 2006

 

Little Things Mean a Lot

The college has an employee recognition dinner every year, to honor employees who have worked here for ten, twenty, or thirty years. (No forties this year.) A quick breakdown of the proportion of each group that is faculty:

30 years: 6 out of 8 are faculty

20 years: 4 out of 11 are faculty

10 years: 1 out of 12 is faculty

The decline is in absolute numbers, as well as percentage. Any speculation as to the cultural consequences to the college of such a top-heavy distribution is probably correct.

On the home front…

This weekend, The Boy successfully executed the “pull my finger” joke! I’m so proud! He is SO ready for Kindergarten.

He also told his first successful joke. “This guy goes to the doctor. He has a carrot stuck in his ear and a banana in his nose. The doctor says, ‘you’re not eating properly.’” A perfectly fine joke, and well told. That’s my boy!

Friday, August 11, 2006

 

Musings on 'Affordability'

The other day I heard one of my favorite economic commentators, Amelia Tyagi, say that about one-third of Americans live in housing they can’t afford. It gave me pause.

If they can’t afford it, how do they live there?

I think the key issue is the definition of ‘affordable.’

‘Affordable’ has multiple meanings. The most basic, literal meaning is the absolute one: do you have enough money to pay for something or not? By that definition, almost nobody lives in unaffordable housing (and those who do are likely to get evicted in the very near future). But that’s an extreme definition, one that can be stretched so far that it’s effectively meaningless. If I make 40k a year after taxes, can I spend 35k on housing? By the most literal definition, yes. By any reasonable definition, of course not.

A more commonplace definition is a price that someone can pay and still have enough to live at a satisfactory level. Obviously, what counts as a ‘satisfactory level’ will vary from person to person. (Usually, for housing, it’s defined as something like not spending more than 28-35% of income. On the coasts, many of us are in the 40-50% range, because the alternative is living in slums.) That’s where a term like ‘affordable’ becomes both contentious and helpful.
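As a quick sketch, the share-of-income definition reduces to a ratio test. The band boundaries below are the commonly cited rule-of-thumb figures, not anything official:

```python
def affordability(annual_income, annual_housing_cost,
                  comfortable=0.28, stretched=0.35):
    # Classify housing cost by the conventional share-of-income bands.
    ratio = annual_housing_cost / annual_income
    if ratio <= comfortable:
        return "affordable"
    if ratio <= stretched:
        return "tight"
    return "unaffordable"

# The literal-definition case from above: 35k of housing on 40k of income,
# or 87.5% of take-home pay.
print(affordability(40_000, 35_000))
```

By the absolute definition, the 40k/35k case is technically payable; by the ratio definition, it fails badly, which is the gap a figure like "one-third of Americans" is pointing at.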

There are many reasons I’m not a libertarian. One of those is that it’s simply not true that other people’s consumption patterns are none of my concern. Of course they are.

Take housing. When other people are willing to take out (or extend) interest-only and negative-amortization mortgages, they push up the price of houses. If I understand math well enough to know the stupidity of taking out that kind of loan, I will find my prudence punished: I'll be relegated to neighborhoods I want no part of. Since other people have abandoned traditional (or long-standing legal) restraints, I am forced either to live (comparatively) low on the increasingly-polarized economic scale or to take outsized risk. Neither is reasonable. We used to have strict requirements about credit-worthiness, down payments, and amortization, precisely to prevent the kind of run-up in precarious lending (and house prices) that has happened over the last few years.

This is not a new concept. “Sumptuary laws” used to prescribe acceptable maximum prices for all manner of goods and services, on the theory that there is a generally-accepted correct price for any given item. Those have largely gone by the boards over the years, on the (generally) correct theory that the market is smarter than any given legislature. Now, the theory goes, excessive prices will be punished by consumers who will either shop elsewhere or substitute other purchases.

The theory breaks down at the ‘macro’ level, where we use the Fed to control interest rates and money supply – a sort of supply-side sumptuary law. We need the Fed to do that, because left to its own devices, the market tends to go unproductively haywire.

When there was something resembling a viable Democratic party, the grand compromise was to do away with most sumptuary laws, but to enact public policies that tended to lead to a relatively football-shaped income distribution and to provide pretty good public schools in most areas. The compromise, and it was a good one, was that certain basic necessities of life were within the reach of most people, but ‘frills’ were left to the open market. (In some relatively egregious cases, frills were even subject to ‘luxury taxes.’ These have been replaced mostly by ‘sin taxes,’ which are taxes on the luxuries available to the working and middle class.) Since the implosion of the Democratic party (late 1970’s, give or take), we’ve fallen into a pattern of removing just about all restraints on income, consumption, and debt. (Interest rates charged by credit card companies now would have fallen afoul of ‘usury’ laws not long ago. Minimum equity or down-payment requirements for mortgages have been lifted. The minimum wage has been left flat for nearly a decade. The financial-services sector has been deregulated to an unprecedented degree. Guaranteed pensions are going the way of the dodo. Unsurprisingly, the Gini index, which measures economic inequality, is at Gilded Age levels.)

(The exceptions to the deregulation trend – agricultural subsidies and defense contracting – both happen to flow overwhelmingly to red states. An astonishing coincidence. Apparently, risk is only for Democrats.)

If people were the fully rational economic actors that economic theory takes us to be, the trend towards deregulation would be mostly positive. But we’re not. An entire scholarly genre (“behavioral economics”) has developed to explain the various ways that people act counter to the ways economic theory says we’ll act. Salesmen have known this stuff for years: “No Payments for 90 Days!” works because many people falsely assume that the pain of spending restraint in the future will be much less than the pain of spending restraint now. Participation rates in 401(k) plans increase markedly when participation, rather than non-participation, is the ‘default’ setting. People are likelier to attend events when they bought tickets than when they were given tickets, even if the event itself is no more appealing. (That’s called the ‘sunk cost’ fallacy.) These have all been documented in the scholarly literature, but they’re also familiar to anybody who has ever dealt with salesmen. Regulations used to exist to prevent the worst abuses that the predictable human failings would allow. Now, it’s devil take the hindmost.

So we’re remaking a society along the lines of a theory that we know, objectively and empirically, to be false. We’re relying on a rationality that we know not to exist. Why the hell would we do that?

Hmm.

Thursday, August 10, 2006

 

The Tattletale Taboo

How do you explain the tattletale taboo in terms that a 5 year old can understand?

The Boy asked me recently why he shouldn’t tell us or his teacher when his friends are doing something bad. I tried to distinguish between ‘really’ bad (hurting someone) and ‘kinda’ bad (making faces), but he didn’t get it and I didn’t want to undermine any hope of ever enforcing any rules ever again.

Tattling is in the eye of the beholder. Yesterday, while I was at work, The Wife took TB and The Girl to a nearby lake, along with another Mom and some friends of TB’s. TB spat some lake water at his friend, who complained to her mother and TW. TW was glad that the friend complained, so she could reprimand TB. The friend’s mother reprimanded the friend for tattling. But if she hadn’t tattled, TB would have gotten away with unacceptable behavior. What the friend’s Mom took as tattling, we took as useful reportage.

The funny thing about the tattletale taboo is that it gets harder to explain, the more you look at it. It isn’t just about group loyalty, since the taboo applies even in very transient, anonymous groups. It isn’t just about friend loyalty, since I know I’ve withheld saying anything when witnessing total strangers do things they shouldn’t.

I don’t want TB (or TG, when she gets old enough) to be a whiner, incapable of handling the normal stresses of everyday life. But I don’t want him to tolerate truly awful stuff, either, since if you do that long enough, you start to think it’s because you deserve it. How do you teach “deal with it” and “don’t stand for it!” at the same time?

Wednesday, August 09, 2006

 

Good Thing I Got That Doctorate!

A peek behind the curtain, at yesterday’s activities:

- Get call from department chair, saying that the storage room we’re going to use while moving a program from one room to another has miscellaneous stuff in it. He wants to move it out. I get a verbal description of the stuff in question.

- I call the director of facilities, to see if he knows the origin of the stuff. He doesn’t, but suggests I call Security to make sure there’s no fire hazard.

- One does not deal with Security if one does not have to.

- I call the budget director, who doubles as unofficial guru of all things to do with rooms, to see if he knows the origin of the stuff in question.

- He wants a thicker description of the offending items.

- I track down the department chair, who is busily boxing items for moving. He gamely escorts me to the storage room, which is more of a closet. We examine the stuff for about ten minutes, while I take notes. The stuff includes a five-foot-by-four-foot block of white Styrofoam, with a label helpfully saying “large foam.” It also includes a stray piano bench, and some carpet-pad remnants that smell vaguely of death. I ask how long the stuff has been there. The chair assures me that in the several years they’ve used that area, it has always been there.

- Alertly, I wander over to the Music department, on the theory that it’s the likeliest home for a piano bench. They confirm that there was once a reason for a piano bench down there, but they’d like it back. They have no idea about the large foam or the carpet pad remnants.

- I call back the budget director, explaining the large foam, the piano bench, and the carpet pad remnants. He tells me to fill out the requisite work orders to have the piano bench moved to Music, and the foam and pad discarded.

- We also have to fill out disposal forms, with four levels of signatures, for the large foam and the pad remnants, even though they appear on nobody’s inventory.

- Since the Chair of Music is on vacation, we have to wait before having the piano bench transferred.

And that’s why it’s important that deans have faculty experience and earned doctorates.

Tuesday, August 08, 2006

 

Ask the Administrator: A Bird in the Hand...

A place-bound correspondent writes:

I applied for a tenure-track position at a CC this spring, and was told that I was the top candidate, but the TT search was cancelled, and they are now authorized to offer me a one-year position. There is a possibility the one-year position will be renewed, or that they will conduct the TT search again, in which case the department chair told me I'd be the top candidate. Here's my dilemma: I'm currently in a full-time administrative position (with teaching duties) at the director level. I've never taught at a CC before. If I took the one-year position, I'd be giving up a certain amount of job security (and the familiarity of a four-year institution). But I'd really like to move into faculty (I just earned my Ph.D.), and I think I would like a CC, although that remains to be seen. I'm willing to continue my research if I must, but my heart's not really in it. I'm trying to figure out if the risk is worth it. I wish I knew the likelihood that either the one-year would get renewed or that the department would get authorized to do the TT search again (and that I'd remain the top candidate). Any insights or advice?


In a subsequent email, she clarified that the cc is relatively local, and that she’s place-bound for the foreseeable future.

I’ve been doing this long enough to know that when a college changes a tenure-track line to a one-year line late in the game, it’s a sign of trouble (usually financial). The college wants you to believe that the switch is little more than a formality, but from an admin perspective, I can tell you that those switches aren’t made lightly. Red flag number 1.

Red flag number 2 is the assumption by the chair (maybe even in naïve good faith) that you’d automatically be the top candidate if the line goes tenure-track in a year. It’s possible, of course, but every public college I know would be obligated to do a new full search for what would amount to a new position. You came out on top this time; there’s no guarantee that you would next time. That’s assuming that the line gets converted at all. (Renewal to a second year is another story; that wouldn’t require a new search, but again, there’s no guarantee it would happen at all.)

Right now, you hold the cards. You have a job, so you don’t need the cc position to feed yourself. The department wants you, even if the college overall is lukewarm on the position (which is almost certainly NOT a judgment about you). You have options.

If you take the cc job, though, you suddenly become the powerless one. You lose the secure gig, and suddenly getting renewed or converted is a bread-and-butter issue for you. Your bargaining power is shot, and the fact that you’re place-bound means that, if the cc doesn’t come through after the first year, you could well be left without anything desirable next year. (Some locations have more opportunities than others, so your mileage may vary.) Even if it comes through with a renewal or t-t conversion, your bargaining power for salary is essentially zero, since they’ve got you right where they want you.

I’m also concerned that you’ve never taught at a cc before. Depending on the four-year school at which you’re currently working, you may find the teaching environment surprisingly different. (You’ll also find the teaching load surprisingly high.) If you discover that it isn’t to your taste, next year, you’re stuck.

You’re in the enviable position right now of being able to play hardball. Use it. My two cents: tell the cc it’s the tenure-track position they advertised, or nothing. The beauty of that is that you can actually follow through on it either way. You have a good job now, so if the cc just can’t commit, you’re no worse off than you are now. Next year, you can use your current job as a perch from which to find something else you really want.

Teaching is a great job, but it’s a job. It’s not a religious calling, and you shouldn’t feel the need to wear the hair shirt. If the college can’t commit to not firing you next Spring, I wouldn’t leave a solid job for it.

Have a question? Ask the Administrator at ccdean (at) myway (dot) com.

Monday, August 07, 2006

 

Bad Trip, Good Vacation

It seemed like a good idea: pile the kids into the car, and head for the ocean. Spend some time on the beach, playing in the sand and frolicking in the water, with brief breaks for ice cream and/or pizza. What could possibly go wrong?

Ugh.

In brief: the hotel’s air conditioning was ineffective even when the electricity wasn’t cut off (!), the breeze came off the mainland towards the ocean (thereby destroying any cooling effect), the sand flies were thicker and more aggressive than I had ever seen, The Girl was attacked by a wasp/bee/dragonfly/big honkin’ insect that got itself caught in her hair, The Boy and The Girl were utterly spooked by the waves and refused to go near the water, the beach was hotter and muggier than any I’ve ever experienced, the shower stall in the hotel room was small enough that I thwacked my elbow on the sliding door, the room was hotter than the outdoors even with the a/c running, and the power went out.

We came home the same day we left. The original plan was to stay for several days.

Ridiculous.

We’ve stayed at cheap and cheesy beach motels before, so our aesthetic expectations were fairly low. But for it to be hotter inside than outside when it’s 100 degrees outside just ain’t right. (Celsius conversion – think body temperature, plus one.) We were worried that The Girl would get heatstroke if we stayed.

I’ve mentioned before that I don’t do heat, as a rule, and I consider central air one of the great scientific advances of human history. This is even more true now.

My new rule: ‘window units’ are not air conditioning. They are gestures toward air conditioning. They’re a way of saying “we agree, in concept, that air conditioning would be useful here.” We’ve decided that future vacations will be at hotels with national names and central air. If any independent types want to get my attention, they are free to install and advertise central air. Otherwise, I’m not havin’ it.

Once we got home, the rest of the vacation was better. A few vignettes from the last few days:

The Boy: Can you say ‘no’?

The Girl: No!

The Boy: Can you say ‘yes’?

The Girl: Uh-huh!

This weekend I took The Boy’s training wheels off his bike and we took him to the parking lot of the nearby elementary school. Verbalizing to a five-year-old how to stay balanced on a moving bike is harder than you’d think. This is especially true when the five-year-old in question is something of a selective listener. My lower back hasn’t quite recovered yet, but these are the things parents do.

We’ve also been working on reading. TB and I sat down on Sunday with Danny and the Dinosaur, and he read me the first thirty pages, working his phonics just as hard as he could. He needed help with a few words (“night,” “knock”), and the speed was low, but hey. Bit by bit. And his satisfied cackle when he finished a page was totally worth it.

The Girl is becoming pickier at dinnertime, often refusing to eat unless dinner comes with a show. So I put some food on her spoon and hold it up to my ear, pretending to react as it tells me it wants to be in her belly. Then I do what we’ve called “the belly dance,” in which I do my distinctive vocalese rendering (in every sense of the word) of Thelonious Monk’s “Well You Needn’t” while dancing the spoon around until it reaches her mouth. (I base my version on Randy Weston’s, from his Portraits of Monk.) This usually works for about a half-dozen bites.

We had one meal free of the belly dance. Through the generous help of The Wife’s parents, she and I actually had a grownups-only night out. Words cannot convey the glory of sitting in a real restaurant with the most attractive woman in the place, eating something unpronounceable and not cutting anybody else’s food. We even went to a play, which was great fun and surprisingly good. It felt like we were regressing to our twenties, but in a good way.

Now it’s back to the salt mines. Here’s hoping the a/c works…

Wednesday, August 02, 2006

 

Testing the Limits of Sunscreen

For the next few days, the gang will be focusing on little more than sun, sand, and gloriously awful cheap food. Since I won't be posting again until Monday, here are a few of my faves from earlier this year (sort of a "Best of Dean Dad"):

Elephants, A Play in One Scene

Men in Hats (Compare and Contrast Curious George and Brokeback Mountain)

Shards, or Life in a Northern Town

See ya Monday.

Tuesday, August 01, 2006

 

Wisdom in a Booster Seat

The Boy, in the back seat on the way to grocery shopping, upon hearing a song on XM Kids about self-esteem:

“I do like myself. I just like other people more.”

In a way, it’s wisdom.

 

Ask the Administrator: Observing Outside Your Discipline

A gratifyingly prolific correspondent asks:


At a CC, how are teaching observations vs. student feedback used in tenure evaluations? How do you (if you do) handle observations of folks not in your discipline?

I'm thinking of my dean observing me teaching statistics (I also taught Spanish, but no one ever observed me doing that). She would come in and take notes, which would, somehow, be used someway... She didn't know stats to save her life. Two semesters later I was tutoring her through the same-level course that she was taking elsewhere.

Basically: most of us (y'all) were faculty in a discipline. In your academic training you probably got a 1-credit seminar on how to teach your discipline. You got some on-the-job mentoring, but most of us (y'all) never read education research. How would you know what's good practice and what's not? And how do you determine if your faculty are enacting it?

I left the CC before ever having a tenure meeting, so I never did find out how that file was used.

I've never taken seriously those little course-eval forms. I only know a few students who ever did. Oddly, though, I prided myself on getting high marks.


You’re overly generous in your assumptions about teacher training. I still remember the sum total of my training before being thrown in front of an intimidatingly large class: the prof for whom I was t.a.’ing, an extremely famous personage many people have heard of, declared in his typical Olympian fashion, “you’ll be fine.”

Thanks. I wouldn’t have thought of that, since I am a complete idiot.

Even a one-credit seminar would have been more substantive than that.

The heart of your question, though, is about observing classes outside one’s own discipline. I do this all the time, since my own discipline is a very small percentage of my administrative jurisdiction. Some of the other fields are close enough that I don’t feel out of my depth, especially since the classes are all at the 100 or 200 level, but some of the fields are simply beyond anything I’ve done before. (Example: I’ve observed Intro to Italian. My only Italian comes from menus and the florid cursing of friends’ grandparents.) And no, I don’t have formal training in education research. That’s not unusual for someone in my position. In fact, the higher up the food chain you go, the more it will be true that you’ll be responsible for disciplines of which you know little or nothing. No college president is master of every discipline at the college, and it would be unreasonable to expect otherwise.

In a way, a certain lack of familiarity can help. Since I don’t know enough to fill in the gaps myself, I can approximate the perspective of a student fairly well. Can I follow what’s going on? If yes, and if the students are clearly getting it, then I assume that all is well. In my own discipline, I have the handicap of being able to follow what’s going on even if the teacher is flopping, so it might be harder to notice a partial or confusing explanation.

(It’s the same reason that the best players don’t often make the best coaches. Ted Williams was a lousy manager, since he couldn’t understand how anybody couldn’t hit. Someone who knows what it is to struggle can make a better instructor.)

A former colleague of mine used to say of teaching, “stand on your head and spit wooden nickels if it helps.” There’s something to that; either a class works or it doesn’t, whether it’s done in my style or someone else’s. I’ve seen instructors use approaches that I would never use, but they’ve worked, and that’s what matters. I don’t pretend to have a monopoly on pedagogical wisdom.

I’ve used my own observations as reality checks. In my experience, student evaluations are usually closer to the mark than most of us would like to admit, but they can be swayed by accents, humor, and grading. (Students will forgive almost anything other than an accent.) Department chair evaluations are always, without exception, glowing, which makes them useless. My own visit to a class helps me place the other inputs into perspective. At my previous school, I twice had the experience of visiting classes that always got low scores from students, only to come away highly impressed with the teaching: in one case, he was a hard grader, and in the other, she had an accent.

Has anybody been denied tenure based solely on my observation? Nope. It’s one input among others, which is probably about right. I’d hate to give it up, since then I’d be entirely at the mercy of people who have clear self-interest in particular outcomes, but it’s certainly not decisive in itself.

Have a question? Ask the Administrator at ccdean (at) myway (dot) com.
