Wednesday, October 31, 2012
The Girl dressed as a Tootsie Roll, and The Boy as a Jawa from Star Wars.
They were endearing, but not scary. I had front door duty.
Fears change, with age. If I were to dress up as something really scary, I might go as:
- The Program That Wouldn’t Die. I’d be a zombie with low enrollments, high fixed costs, a powerful ally, and a political minefield. “Funds! Eat Funds!”
- MOOCman. 90 percent of my costume would be missing by the end of the night, since that’s their attrition rate.
- Hatchet Harry, the Human Budget Cut. Picture a really angry accountant wearing a tricorner hat, like the Tea Partiers. Or maybe Santa Claus with a suit on backwards, to symbolize a midyear budget cut.
- The Politician with a Brilliant Idea. I’d have a lightbulb suspended over a dunce cap.
- An Extended Power Outage. Dress all in black. It’s a New York reference twice.
- A standardized test. I could wrap myself in bubble wrap, popping one out of every four bubbles randomly.
- A glob of cholesterol. It might put a damper on the whole ‘candy’ thing, though.
- A home contractor. I’d show up, then leave unexplained for weeks, then show up again, then vanish again, leaving an awful mess in my wake.
- An “Explanation of Benefits” from an HMO. I’d wear a twisted glob of spaghetti that doesn’t smell quite right.
- Comcast! I could dress, and walk, like Mr. Magoo.
- My hairline. But I wouldn’t want to get arrested for indecent exposure.
Wise and worldly readers, what costume would you find truly scary?
Monday, October 29, 2012
And Then, The Scramble
As Sandy continues to rage, I’m already anticipating some messy rescheduling issues as people stream back.
Bureaucratically, the cleanest form of natural disaster is the kind that hits everyone at the same time, and from which everyone emerges at the same time. The messiest ones are the ones from which some people are up and running the next day, while others are unable to show up for several days running. (Anything involving downed trees tends to play out that way; some neighborhoods are barely affected, while others take days to dig out. Some places keep electricity the entire time, while others are out for a week.) It’s hard to penalize students for living on the wrong street, but it’s also hard to extend infinite flexibility while still upholding the integrity of the course.
Of course, students aren’t the only people affected. When professors can’t make it to campus -- again, through no fault of their own -- students lose time. In disciplines with labs or studios, the points of vulnerability multiply: if the professor makes it in but the lab tech doesn’t, then there are real limits on what the class can do. In classes with group work -- particularly presentations -- penalizing the students who showed up for the one who didn’t just violates common sense.
For some sorts of classes and some sorts of disasters, the internet is a savior. If a fairly traditional class has an online component -- which is becoming more common -- then a given week’s lessons can be adapted to online delivery to avoid losing time. This works especially well in January, when the typical disaster is a snowstorm and most people still have power.
But when power is spotty, the internet doesn’t help.
Every time something like this happens, there’s a call for a Policy That Will Solve Everything. I understand the impulse, but it’s hard to imagine what that would look like. “Don’t penalize students for missing class this week” would be pretty heavyhanded, and would set the kind of precedent that even a levelheaded sort would find alarming. “Treat these absences as you would any other” is heavyhanded in the other direction, and is still much more directive about how faculty teach than I think an administration ought to be. There’s quite a gap between what I personally think would be a good idea, and what I’d be comfortable having The Administration announce as a policy. Academic freedom covers a lot of ground.
We’ll probably fall back on “use your best judgment” by default. That doesn’t provide the clarity or consistency that would be ideal, but it’s hard to imagine an alternative that wouldn’t be overly directive.
Thinking out loud, this may be a good topic for a future professional development workshop. If everyone is allowed to make their own calls, it’s probably a good idea to at least have some open discussion before the next disaster about the ideas to consider when making those calls.
Wise and worldly readers, have you found or seen graceful ways to deal with students fairly in the wake of an unevenly-distributed recovery from a disaster? If so, how did it work?
Sunday, October 28, 2012
Time Travel
Like about 70 million other people, we’re in the path of Hurricane Sandy. As of this writing, we still have power, but after last year’s catastrophe, we’re expecting to lose it for a while. (If this week’s blogging gets spotty, that’s why.) Given some warning, we spent the weekend preparing.
It has been an exercise in time travel.
When TW and I were kids, the only reason that schools closed was snow. A good blizzard, or maybe a stray ice storm, would do it; otherwise, we went. The Boy and The Girl didn’t believe me when I told them that; bless their short time horizons, they think annual hurricanes are normal. They don’t see anything odd in “frankenstorms” or “thundersnow” or the other weird weather hybrids that have been popping up with unnerving frequency. They think annual extended blackouts are normal. The reliable power that I remember as a kid has become an historical artifact.
When the power goes out, most of the recent technological advances quickly become irrelevant. Anything internet-based is inaccessible without electricity, and batteries drain pretty quickly. (Last year, even the local cell towers went dead, so I couldn’t use tethering to compensate for dead wifi.) The old copper landline went away years ago, replaced by the cable version that goes down when the electricity does. Even television is out.
Last year, our lifeline was radio. We had enough batteries to keep the radio going as needed; since then, we’ve picked up a hand-cranked one. If you ever want to feel really old-fashioned, crank a radio. It’s one step above churning butter.
Which brings me to refrigeration, or the lack thereof. Last year we were caught off-guard, so just finding unspoiled food became a full-time focus. This year, with warning, we were able to stockpile peanut butter, granola, bagels, cereal, and even the juice-box sized milk boxes that don’t require refrigeration. Luckily we don’t have well water, so at least we don’t lose water.
Without electricity, there’s nothing to power the blower that makes the furnace relevant, so the house gets cold fast. The fireplace keeps one room relatively warm, but “high-maintenance” doesn’t begin to cover it. The occasional ornamental fire is one thing; actually using the thing for heat is something else altogether.
Even light is an issue. When it gets dark before dinner, and your battery supply is finite, and you don’t know how long it’ll be before the power comes back, you have to ration light.
Last year, the power came back in a geographic patchwork, rather than all at once. (That makes sense, given that the issue was downed lines.) That meant a sort of foraging, as we looked for places with heat and, ideally, cooked food. We were lucky then to have gotten gas the night before everything went dead, so we didn’t have to wait in the gas lines we saw. This time, we made sure to get gas and cash. There’s something vaguely Mad Max about it, but there it is.
On Sunday the projected path of Sandy showed it moving north through western New York, crossing Lake Ontario northward towards Toronto. That may not mean much to many people, but to those of us who grew up along Lake Ontario, the idea of a storm moving north across the Lake is deeply weird. They don’t do that. They move either south or east. I don’t remember ever seeing one move north. It’s such a given that it never occurred to me that it was a given until I saw it violated. Toronto will get lake effect rain from Rochester? That. Is. Not. Normal.
As folks who know me can attest, I like my gadgets. I’m a fan of technological progress, and I have little patience for those who try to argue that, say, ditto machines were superior to photocopiers. But as grid failures become more common -- whether through climate change, deregulation-driven neglect, increased demand, or some combination thereof -- I find myself relying more often on newspapers, radio, cash, and firewood.
In a way, we’ve mastered backwards time travel technology. We use it every time the grid goes down. The kids think it has always been that way. It’s up to TW and me, as ambassadors from the past, to explain that no, it wasn’t.
Thursday, October 25, 2012
Friday Fragments
This piece on the implications for higher ed in the election is well worth a read. Among other things, it helps to explain the thinking behind the abrupt cut in student lifetime Pell grant eligibility from 18 semesters to 12. Apparently, Republicans wanted to cut funding for the program, and Democrats wanted to preserve the maximum value of a grant, so the compromise was to keep the dollar value of the maximum grant but reduce the number of semesters students are eligible.
-----------
The Wife: You can be anything you want to be.
The Girl: Except a bird!
-----------
Workforce training matters, but the first order of business is getting the economy rolling. When people go through programs with no job waiting for them on the other end, it doesn’t help much.
In my darker moments, I wonder if some of the political consensus on training as the answer is a function of a sort of Venn diagram; it’s one of the few areas of overlap between the parties. The conservatives don’t want to stimulate aggregate demand, since that might involve a short-term sacrifice by their base. And the liberals are too timid to push the full Keynesian treatment. So we pretend that the employment freefall from 2007 to 2009 was a function of tectonic shifts in job skills, rather than the inevitable collapse of what amounted to a Ponzi scheme. If the problem is unskilled workers, then we all know, more or less, what needs to be done. If the problem is unregulated finance capital, then the political consensus evaporates.
I’m glad to support training programs in fields with genuine prospects for graduates. But in fields with limited or negative demand, it’s hard to argue that the obstacle to prosperity is a lack of workers. Any long-suffering adjunct knows that.
----------
Since The Wife started working in the local elementary school, she has been coming home with stories about the things the kids there say and do.
Yesterday at dinner she shared with us the way she says the Pledge of Allegiance in class.
I pledge allegiance to the flag
Ryan, look at the flag!
of the United States of America
Look at the flag! Over there!
And to the republic for which it stands
Get your hands out of your pants!
One nation, under God
Put that down!
Indivisible, with liberty
Madison! Hailey! MADISON!!
And justice for all
Sit down, Tyler. Sit down, Tyler. Tyler? Tyler!
There’s the pledge as written, and there’s the pledge as performed by a roomful of squirmy first graders.
Somehow, the latter is more reassuring. Any lit critters looking for a “reader response” exercise in its rawest form should watch some first graders in action sometime.
Wednesday, October 24, 2012
Overheard in the Locker Room
One of the consolations of middle age is that it brings the power of invisibility. That comes with a certain amount of unintentional eavesdropping.
Earlier this week, as I was getting changed in the locker room before work, I overheard a retiree -- I’d put him around 70 -- talking to a student who I’d put around 19. The exchange:
Retiree: Enjoy yourself now, young man. Once you start working and join the real world, the party’s over, yes, sir.
Student: Actually, I work about 45 hours a week now. I have afternoon shifts at (local employer).
Retiree: You do? When do you do your homework?
Student: (laughs) It’s hard.
The exchange, as short as it was, gave me pause. From the tone of it, I don’t think either man was kidding or pranking; it sounded pretty straightforward. But the assumption gap between the two was glaring.
The older man seemed to assume that college was a relatively carefree time in which a young man could spend most of his time, well, being young. That’s a popular image of college, and there’s ample historical precedent for it.
But the younger man is living in a very different world. For him, college is one set of time commitments among others, and his days are all about time management. Just the fact that he was in the gym at dark o’clock in the morning suggested a certain density to his day; at that age, I was dead to the world at that hour. (I assume he has some “being young” time in there somewhere; some things don’t change.)
The perception gap between them matters, I think, because the older man’s cohort has far more political power than the younger man’s. Among the people who actually make the decisions that impact everyone, the idea of college as a sort of sybaritic retreat is still the default assumption. And they make decisions based on that. Cut the Pell lifetime limit by a third? Sure, why not? They’re just goofing off anyway...
But they’re not. They’re working harder than most of us did at that age, at greater cost and greater risk.
For some reason, that message still comes as a surprise to many, even to those of us in positions to know better. It’s easy to fall into “kids today...” laments if you don’t look very hard, or if you look backwards with rose-colored glasses. (I went to college before the era of handheld internet devices. Kids did crosswords in class. Distraction is not new.) That’s harmless enough when it’s confined to cranky observations about, say, pajama pants in class. But when we base public policy on it, it’s destructive.
Is there a better narrative out there to describe the world as current students actually experience it? Preferably one that doesn’t involve eavesdropping in locker rooms?
Tuesday, October 23, 2012
“That’s an Implementation Issue”
Back in my feminist theory days -- yes, I had feminist theory days -- I remember learning that strict body/mind distinctions were suspect. In the halcyon days of postmodernism, we learned that clear fact/value distinctions were mystifications, that public/private splits were far more problematic than usually supposed, and that subject/object distinctions were almost entirely perspectival.
I was reminded of that this week in a discussion about a proposed program. When I raised a series of questions about the practicality of it, I was hit with the concept/implementation distinction. And I realized that from the perspective of someone responsible for budgeting and staffing, the distinction is false. A concept that can’t be implemented is a flawed concept.
That cuts against the grain of a certain kind of idealism. (Postmodernism did the same thing.) It suggests that the popular move of attacking from a position of presumed perfection is inherently suspect. The “critique from imagined perfection” erases the embodied reality of an institution and wishes away the messy realities of resource constraints, other perspectives, and hard-won laws and habits. At base, the critique from perfection is narcissistic; it presumes that the perspective of the critic is the only one free of contingency, or messy particulars. Or, what amounts to the same thing, that other people’s needs just don’t count.
That’s a difficult point to convey to the idealist, who thinks he’s being selfless. He thinks that the shining truth of the idea transcends any individual perspective, and that he’s just being clearsighted about it. The lefty version of that perspective assumes that justice involves approaching the ideal asymptotically; the conservative version assumes a falling away from it, and only hopes to slow the decline. But either way, the truth of the idea is presumed to exist independent of the people holding it.
But the presumption of an entitlement to bulldoze messy reality to fit a personally held idea is nothing if not selfish. Ideas are embodied, and bodies exist in contingent networks of power, resource, flaw, and need. Nobody is above that.
As anyone who spent time in the weeds of postmodernism knows, it’s possible to get lost and paralyzed in an infinite regression of what’s already implicated in what. But that, too, strikes me as a form of selfishness. It takes for granted the work of social construction, and attacks those constructs parasitically.
And here’s where I fled postmodernism for its American cousin, pragmatism. At some point, you have to make a decision if you actually want to get anything done. That doesn’t mean either denying contingencies or surrendering to them; it means accepting the reality of them and owning the decision to move anyway. It means rejecting both the “critique from perfection” and fatalism, and, not incidentally, noticing that the former is often just a dressed-up version of the latter.
From this perspective, the way to attack an existing practice or idea is to propose -- or, preferably, to develop -- a better one. An idea that relies on people to be superhuman is bound to fail, and therefore of little interest; I’d much rather hear about something that could actually work. That means doing the hard work of tending to the details. Do we really have that many classrooms available at 3:00? Would that class actually transfer? What are the financial aid implications? What about staffing? How would we sustain it when the grant runs out? How does this fit with students’ plans? Who would run it? How would it fit in the curriculum? What would we have to displace to make room for it?
Those aren’t technicalities to be waved away by the heroic leader. They’re the guts of the organization, each with its own history and reasons, and they matter. The folks who win my respect are the ones who come to grips with those issues and continue to move forward anyway. I’ve seen it done. Done well, it makes a tremendous difference. It’s harder than just opining from on high and passing dismissive judgment on mere mortals, but it carries the prospect of real, sustainable, positive results.
Pure, unadulterated certainty can be intoxicating and addictive. In small doses, in the right moments, it can provide some motivation. But the high comes at a cost, and the addict is every bit as selfish as any other addict. The feminist theorists had a great point when they noted that we’re all embodied, and flawed, and, in some sense, blinkered. The lesson I drew from that was a need for humility in the face of complicated, messy realities. But the humility isn’t in the service of fatalism or a flight to innocence and virtue. It’s in the service of making changes that aren’t doomed from the outset. The “beautiful loser” may be romantic, but I prefer wins we can actually implement.
Monday, October 22, 2012
Telling the Right Story
Some movies don’t impress me much in the moment I’m watching them, but age well in the recollection. (“Fargo” was like that.) They typically have more going on than meets the eye, and the first impression doesn’t do them justice.
The CASE conference was like that for me. I enjoyed the conference, but one lesson from it has stubbornly stuck in my mind ever since. I don’t think I fully appreciated it in the moment.
It’s about telling the right story.
It’s hardly news that public higher education is under unprecedented scrutiny. Years of a rough job market for new graduates, combined with tuition increases, combined with a lingering sense that colleges are job programs for aging hippies, have put public colleges and universities in an unaccustomed spot. And I’m embarrassed to admit that the shift caught many of us off-guard.
The sector has fought political battles before, but they were different, and the scripts we developed back then don’t work now. In the 70’s, I’m told, the issues were about hippies and protests generally. In the 90’s, they were about diversity and multiculturalism. (Anyone remember the “culture wars?” Back when conservatives believed that the humanities mattered enough to fight about? Good times...) Now they’re about cost.
As a sector, we’re having a hard time finding the right script for this one.
The stories we told in past conflicts don’t help. “Free speech” is a fine defense when you’re accused of harboring liberals, but it doesn’t do much to address tuition increases. “Teach the conflict” may have been a useful way around the definition of the literary canon, but it’s pretty off-point when discussing budget cuts.
The first impulse is usually some variation on denial. “We’re just making up for state cuts” is true in the short term, but only partially true over the long term, and not helpful for students facing increased loan burdens and a tough job market. And given the reputational nature of higher ed, there’s a limit to how much bragging you want to do about austerity. (“Come to Compass Direction State. We’ve reduced the humanities to an online video!”)
We’ve used the “lifetime payoff” argument for a long time, generally to good effect. But that argument gets less convincing when the cost to the student goes up and entry-level opportunities go down. Yes, you may be better off in ten years, but if you need to pay the rent now, that’s of little comfort.
“Inspiring stories” are always good; the fundraisers are especially fond of them. They put a human face on success, they make abstractions accessible, and they give warm fuzzies all around. But the last few years suggest limits to the strategy, and it can inadvertently play into the myth that superpeople don’t need institutions in the first place. It can also inadvertently feed some pretty negative stereotypes about public colleges, especially community colleges. In the American political imagination, institutions that are closely identified with the poor quickly become poor themselves. Let’s not paint ourselves into a corner here.
President Obama is fond of the “educated workforce” argument, which is compelling to people who major in public policy. So we’ve locked up that vote. But it reduces education to training, and it makes us even more vulnerable to blame when a graduate crashes into a recession. I’d like to see much more focus on the “transfer” story, but we haven’t developed a good hook for that yet. And stories like “the second chance reverse transfer” are much too complicated to sell to a skeptical public.
Wise and worldly readers, have you seen or heard a better story for demonstrating the value of public higher ed to the public? Ideally something pithy, clear, true, and unlikely to bite back?
Sunday, October 21, 2012
Mad Scientists and Marshmallows
Last week I had the chance to talk to a group of new full-time faculty. Someone in the group asked me what I considered my goal as an administrator, especially regarding faculty.
It was a nifty question, and I probably should have expected it. But since the question came out of the blue, my answer did, too.
I’d love to see a culture in which faculty use their academic freedom to experiment. In my ideal setting, they’d be working together -- and separately, as appropriate -- to keep trying different approaches to helping students succeed. That could mean different teaching techniques, different scheduling ideas, different course content, novel uses of technology, or whatever; the one thing it absolutely would not mean is doing the exact same thing year after year. I would love to see faculty as a group of mad scientists, innovating gleefully.
This story about the marshmallow experiment came out at about the same time, and I think it offers a useful nuance. Most of us know the classic marshmallow study in which young children were left alone for several minutes in a room with a marshmallow. They were told that they could eat the marshmallow, or, if they managed not to until the adult returned, they could have two. The kids who exhibited enough self-discipline to hold out for the second marshmallow wound up having better lives by a host of measures.
Apparently, researchers at the University of Rochester replicated the study, but with a twist. They had some adults come through with the second marshmallow, and others seemingly forget. Then they ran the experiment again with the same kids. Unsurprisingly, kids whose trust had been violated the first time were much less likely to defer gratification the second time. It’s one thing to wait for a payoff; it’s quite another to wait for a broken promise. The study suggested that kids whose home lives are chaotic will have a harder time in school, since they will have a harder time believing that delaying gratification will result in a payoff. At home, the promised marshmallow never comes. Why would school be different?
It occurred to me that, in a sense, I’m hoping that faculty will wait for the second marshmallow. I’m hoping that they’ll use their autonomy and academic freedom to experiment, rather than to coast (or fulminate). Which requires a certain faith on their part that there will be some sort of payoff, and that they won’t be punished if an experiment fails.
With people who are relatively new to the college, it’s easier to set a certain expectation. But with those who’ve been here longer, through various administrations, it can be hard to get past old, forgotten marshmallows. Habits learned early are hard to shake. That’s why the marshmallow study matters.
In the meantime, here’s hoping that enough security will lead to gleeful experimentation, rather than just digging in. The marshmallow parallel works in two directions, after all.
Thursday, October 18, 2012
Friday Fragments
The $249 Chromebook is the best idea I’ve heard all week. It seems like the Chromebook is finally moving from “proof of concept” to “something actual people would actually buy.” Finally, decent size and specs at a community college price. This could fulfill the promise that netbooks made, but crapped out on, back in 2009.
----------
Minnesota is banning Coursera? Say what you want about MOOCs, but this is catastrophically stupid. 1001 varieties of internet porn? No problem! But using the web for unauthorized learning? Scandalous!
For those who aren’t fans of MOOCs, the way to defeat them is to offer something better. Relying on state-level protectionism is not going to cut it. Anyone with a VPN can make a mockery of this, and rightly so. Honestly, when I think about all of the things that people can, and will, do on the internet, following free academic classes is the least of my concerns.
--------
It will surprise nobody that I plan to vote for President Obama, but I have to admit being annoyed at him. During the second debate, he continued to use “community colleges” and “job training centers” interchangeably. They aren’t. Community colleges are important job training and workforce development sites, but they’re also -- and I use this word deliberately -- colleges. For many students, taking the first two years of a four-year degree at a community college is a viable way to get an education while keeping costs down. Given that student loan burdens are a major issue, it would be nice for someone in public life to connect those dots.
---------
The Girl is starting to decipher genre. We’ve watched a few episodes of “Gilligan’s Island” over the last few weeks; it’s a gobsmacking nostalgia trip for me, and she enjoys the candy-colored slapstick. As with the old “Star Trek” episodes, I have to do some serious deprogramming of the casual sexism, lest she get too much of it, but with enough parental counterpoint, it still seems worthwhile.
After a recent episode, she turned to me and said “I get it! Gilligan is like SpongeBob, and the Skipper is like Squidward!”
I hadn’t thought of it that way, but she was basically right. What made it gratifying, though, was that she was able to recognize genre. The goofy, carefree underling who flusters the voluble but basically harmless boss -- that could be Gilligan, or it could be SpongeBob.
Pretty good for a third grader, I think.
--------
This story made me smile, albeit wistfully. Some public universities are going to their legislatures with a proposition: restore subsidies, and we’ll hold the line on tuition.
In a more perfect world, legislatures would jump at the deal. But I have no illusions that the current crop will.
The great virtue of this strategy is that it connects cause and effect. (More cynically, it provides a palatable excuse for a university to do what it was going to do anyway.) I’m a fan of reality-based decisions, so I like the idea of pointing out explicitly that much of the recent spike in tuition increases is a function of cost-shifting, rather than a lack of discipline. If you want to flatten the spike, stop cost-shifting.
Unfortunately, I can imagine a fairly smart argument from the other side: in the absence of a squeeze, higher education isn’t known for cost discipline. So I’ll suggest a different idea:
Ask the legislatures to fund experiments. Make money available, conditional on trying something different. And I don’t mean yet another workforce program. I mean something that addresses the underlying cost disease of higher education, something that gets at the credit hour and the various structural issues that push up costs at every institution, regardless of local quirks. If you want a system fix, pony up resources for people to try some.
Otherwise, we’ll be stuck in annual games of budgetary chicken, with diminishing returns. The for-profits are already suffering; if we don’t change, we’ll be next. And asking the legislature to keep Coursera out of town is not a serious answer.
Wednesday, October 17, 2012
Phoenix or Canary?
The University of Phoenix, the largest for-profit higher education provider in the country, is closing over a hundred sites. That’s over half of its physical locations. Part of the move is driven by enrollment decline, and part by an increased emphasis on online course delivery.
Although many in traditional higher ed may feel a certain schadenfreude, I was actually saddened by the news. This is hardly an unalloyed good.
Admittedly, part of my perspective comes from having worked in another for-profit, early in my career. At a time when the "virtuous" non-profits offered only adjunct work, a local for-profit offered me a full-time job with a living wage and health insurance. And I wasn't the only one; I landed in a department with a cluster of young Ph.D.s who had never intended to land there. For most of us, it functioned as a port in a storm. Many have since moved on to other places -- mostly nonprofit -- but would not have had the opportunity if not for the first big break.
That isn't as idiosyncratic as it may sound. Just as for-profits accounted for most of the enrollment growth over the last decade, they also accounted for a disproportionate share of the employment growth. For all of their flaws -- and I'm not disputing those -- they hired good new people when nobody else did. That matters.
I'd strongly counsel hiring committees at community colleges not to turn up their noses at applicants who've worked in for-profits. Some terrific people landed there, just as some terrific people have landed in part-time or adjunct positions. And much of the day-to-day work is less different than you might imagine. Some Phoenix castoffs may be well worth taking seriously.
For a while, the for-profits grew like kudzu. They had the considerable advantage of a business model in which enrollment growth more than paid for itself. (Publics run at a loss, by design.) Unlike their public counterparts, they didn’t have to reduce their offerings when demand increased. And unlike private nonprofits, they weren’t wedded to, say, summer breaks. They could scale up quickly, and they did.
But a model built on tuition alone is inherently unstable. Small drops in income require significant cuts in spending. As tired as those of us on the public side are of dealing with cuts, at least we have some sort of (admittedly shrinking) cushion in the operating budget to offset losses of tuition. When states respect that cushion, and it’s large enough, it becomes possible to make (and live up to) longish term plans.
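To make that leverage concrete, here is a minimal sketch with invented numbers -- none of them come from any real budget -- showing how a modest enrollment dip forces a drastic cut in the small slice of spending that can actually move within a single year.

```python
# Invented numbers, purely for illustration: why a tuition-only budget
# amplifies enrollment shocks.
tuition_revenue = 100.0   # all revenue comes from tuition
fixed_costs = 80.0        # salaries, leases, debt service -- hard to cut quickly
flexible_costs = 20.0     # the part of the budget that can be trimmed this year

enrollment_drop = 0.10                                   # a 10% dip in enrollment
new_revenue = tuition_revenue * (1 - enrollment_drop)    # 90.0

shortfall = (fixed_costs + flexible_costs) - new_revenue  # 10.0
print(f"Cut needed from flexible spending: {shortfall / flexible_costs:.0%}")  # 50%
```

A 10 percent revenue dip translates into a 50 percent cut in everything that isn't nailed down; a public college with even a shrinking appropriation cushion has somewhere else to absorb part of that hit.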
The fatal flaw of the for-profits, in my mind, isn't that they're fundamentally different from traditional colleges; it's that they're fundamentally the same. They use the same measures of student achievement, the same sequences of courses, and many of the same assumptions as everyone else. Since they have many of the same cost drivers, and they lack the tax exemptions and public subsidies of the nonprofits, they have to be clever to stay ahead. That worked for a while, but a combination of a more hostile political climate and some gradual learning among the nonprofits has changed the equation. Their lack of cushion means they experience shocks faster and harder than we do, but the shocks themselves aren’t really different.
I understand the impulse to chortle at Phoenix's misfortune. But let's not assume that the same issues that have plagued it won't plague us. I see the news less as confirmation that the critics of for-profits were right than as a warning sign that we could be next. When Phoenix rose from the ashes, it signaled a new wave in higher education. Now it may be the canary in the coal mine. We ignore the signs at our peril.
Tuesday, October 16, 2012
A College Tax?
It’s “Bad Idea Week” over at the Chronicle. They’ve solicited “out of the box” ideas for changing higher education. Some of them -- hey, what if community colleges hired faculty to teach? -- are just banal. (What, exactly, do you think we’ve been doing?) But others are interesting failures.
One proposal was for a dedicated tax specifically to fund higher education. The idea is that higher ed is a public good, like highways, and so it deserves a dedicated funding stream, just like the gas tax.
(Forehead slap.)
Admittedly, there’s a surface appeal. When legislators divert money from higher ed to, say, prisons, it’s easy to be seduced by a mechanism that would take that choice out of their hands.
But it isn’t as simple as that.
Some states already have a variation on this, called “millages.” A millage, as I understand it, is a property tax expressed in “mills,” where a mill is a tenth of a cent -- a dollar of tax per thousand dollars of assessed value. A community college (or its “district”) will put a millage on the ballot, and it will win or lose.
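For a rough sense of the arithmetic, here’s a toy calculation; the rate and the assessed value below are made up for illustration.

```python
# Hypothetical illustration: converting a millage rate into annual dollars.
# One mill = $1 of tax per $1,000 of assessed value.
def millage_tax(assessed_value, mills):
    """Annual property tax owed for a given millage rate."""
    return assessed_value * mills / 1000

# A 1.5-mill community college levy on a home assessed at $180,000:
print(millage_tax(180_000, 1.5))  # -> 270.0 dollars per year
```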
Entire sessions at the CASE conference were devoted to tactics for winning millages. The public perceives millages as tax increases -- which, to be fair, they are -- and often votes against them. Entering the political process while remaining nonpolitical requires a certain finesse, and a cultural tailwind.
California has taken the concept farther. There, they govern almost entirely by referendum, reducing the legislature to vestigial status, like an appendix. Let’s just say that since the state went to that system, higher education has not fared well.
The college tax manages to combine several bad ideas into one. It isolates colleges politically, making them conspicuous targets. It increases the year-to-year instability of funding, making intelligent planning harder. It completely divorces funding from performance, creating a powerful incentive for colleges to divert funding from education to marketing. Depending on the level of government that assessed the tax -- federal, state, or local -- it could fall prey to any number of political flaws. A federal tax would shortchange the blue states and enrich the red ones, as federal taxes do. A state tax would be vulnerable to a race to the bottom, as states try to lure businesses. A local tax could fall prey to the race to the bottom, but even worse, would quickly reflect existing disparities of wealth; rich areas could have low rates and still be fine, while poor areas could have high rates and still suffer low quality. A quick look at the K-12 system is proof enough of that.
But worst of all, it lays bare for the world to see our single greatest flaw: stagnant productivity.
As a labor-intensive industry with high fixed costs and a time-bound measurement of performance, higher ed’s costs will increase more quickly than most of the rest of the economy. (See this post for details.) There is simply no way that the revenue from the dedicated tax would increase anywhere near as quickly as costs. Over time, we’d be caught in an inexorable pincer movement. For a sense of how that works, look at the California system.
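As a back-of-the-envelope sketch -- the growth rates below are invented, not measured -- even a small gap between cost growth and revenue growth compounds into a serious structural hole within a decade.

```python
# Toy model of the "pincer": costs growing faster than a dedicated revenue stream.
costs, revenue = 100.0, 100.0             # start in balance, arbitrary units
cost_growth, revenue_growth = 0.05, 0.02  # assumed 5% vs. 2% annual growth

for year in range(1, 11):
    costs *= 1 + cost_growth
    revenue *= 1 + revenue_growth
    print(f"Year {year:2d}: shortfall = {costs - revenue:5.1f}")
# By year 10 the shortfall is roughly 40% of the original budget.
```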
I concede without argument that many of the frustrations we have with state legislatures have some basis in reality. But this idea is so much worse. The frying pan is no fun, but it beats the fire every time. Let’s hope this idea fades along with Bad Idea Week.
Monday, October 15, 2012
Ads on Campus
Should colleges use on-campus advertising as a revenue source?
The question came up this weekend when I noted on Twitter how spoiled I've become by watching tv almost exclusively on the DVR. Sunday night I caught part of the Cardinals-Giants game, and couldn't help but notice how many ads there were. An alert reader tweeted back that some colleges are doing the same thing on campus, selling ad space everywhere from the sides of campus buses to the walls of student centers.
It's easy to retreat to either extreme, but I wouldn't recommend it. The purist position that colleges should be havens from commercialism is hard to sustain when many of the buildings are named after donors. (At many private colleges, the entire institution is named after a donor. Mr. Rockefeller and Mr. Stanford weren't known for their scholarly contributions.) Naming rights are major sources of capital funding. Similarly, many scholarships and -- in some settings -- endowed chairs are named after donors whose money may not have derived from the life of the mind.
The purist position also bears an uneasy relationship with freedom of speech. It's easy to overstate the "money is speech" position that the Supreme Court seems to endorse, but it's also true that advertising is, among other things, directed information. If a campus tries banning a certain class of information, the precedent could lead to places we don't want to go. Besides, to the extent that students need to be prepared for the real world, learning to navigate commercial information strikes me as part of the deal.
The profit motive is hardly absent from campus anyway. Food and bookstore services are typically run on a for-profit basis, and colleges have long profited by licensing names for sweatshirts, mugs, and the various paraphernalia found in college bookstores.
But I'm still uneasy with going from an acknowledgement of commercial speech to embracing it as salvation.
To the extent that a college derives benefits from its non-profit status -- property tax exemptions, say, and the ability to accept donations -- I can't help but think there's a public trust to uphold. (Public institutions take that a step farther, adding direct subsidies to the equation.) For a college to have an official soft drink, say, strikes me as exploiting its public trust. Depending on how far the endorsement or identification goes, there's a real risk of diluting the college's brand.
I'd also worry about influence. This is a chronic concern with named chairs, for example: would the Donald Trump Chair of Economics really be free to publish on the details of corruption among real estate developers?
Finally, I'd worry about stability. Market fortunes ebb and flow much more quickly than academic needs do. If the endorsement money goes to capital, or is basically ancillary, then that doesn't matter much. But if it starts to fund operations, then the college is even more exposed to market swings than it already is.
I've heard of major endorsement deals at major universities, but not at community colleges. Wise and worldly readers, have you seen (or heard of) major commercialization deals in the community college sector? If so, are there any lessons learned for the rest of us?
Sunday, October 14, 2012
The Intersession Drive-By
I’m interested in hearing back from any of my wise and worldly readers who’ve seen or figured out ways to handle this situation elegantly.
My college offers a January intersession. The idea is that students take a single class in a compressed timeframe. For the last few years, it has worked remarkably well for students who are already here. They can either make up for a slip in the Fall or start making headway on the Spring. Course completion rates have floated around the 90 percent range, since such a short timeframe doesn’t give much opportunity for life to get in the way. And anecdotal feedback from the faculty who have taught it has been glowing; they report that there’s an intensity that comes from “owning” the student entirely for a short time that lends itself well to certain types of classes.
Admittedly, it’s not a panacea; I don’t know how a composition class could be done that quickly, for example. There just wouldn’t be the time to grade. But it seems to work really well for certain lab courses, some intro gen eds, and even statistics. I remain convinced that it would make a great “boot camp” opportunity for a refresher course for adult students to skip developmental math.
The goal of the January session was twofold: to help “our” students maintain continuity, and to recruit “visiting” students who are pursuing degrees elsewhere, but who are home for the break. A student at a pricier university who has nothing to do at home for a month might pick up a transferable gen ed class on the cheap and transfer it back. The student would benefit from the cost savings, we’d benefit from the enrollment, and all would be well.
But we’re having some trouble marketing to and enrolling the drive-by student.
Individual financial aid is one issue. By Federal guidelines, a student can only receive financial aid at one college at a time. So a kid who’s enrolled at Regional U can’t transfer part of his award to Local CC to take Intro to Sociology in January. I don’t really understand the rationale behind that -- it seems like a valid expense to me -- but there it is.
Institutional eligibility for financial aid is a much hairier issue. In order to maintain institutional eligibility, we can only admit “regular” students to our credit-bearing classes. That means we need to verify high school graduation (or the presence of a GED) before allowing the student to enroll. There’s no quick-and-easy way to do that for the student who decides after Christmas that it might be a good idea to take a class in January. We also need to honor all the usual prerequisites, which requires a transcript evaluation; again, that’s easy with enough warning, but it doesn’t work well with the abrupt arrival in January. Depending on the class, there may be placement tests to administer and grade, adding another level of time and expense.
This seems a bit silly to me. In my naive mind, a student enrolling on a non-matriculated basis (that is, not pursuing a degree at that college) shouldn’t be required to undergo the same level of scrutiny as a student who is pursuing a degree. Ashley is home from Northwestern for a few weeks, and wants to try her hand at Intro to Psych at the local cc. Why do we need to make this difficult? I can understand if Northwestern chooses not to take the credits -- that’s their call -- but requiring Ashley to show up weeks in advance to get all the testing and paperwork processed for a single class just seems like overkill.
I’m hoping that we’re making this harder than it needs to be, and that there’s actually a reasonably elegant work-around to make it easier to catch the drive-by student. Wise and worldly readers, have you seen one? How did it work?
Thursday, October 11, 2012
Friday Fragments
- With much of selective higher ed focused on the Supreme Court and its impending declaration on affirmative action in admissions, I’m grateful again to be at a community college. Here, affirmative action in admissions is a non-issue; we take all comers. We have our own legal and political challenges, heaven knows, but not that one.
- The Girl: “Why do some people ‘reckon’ when they believe?” I didn’t have an answer for that.
- The company that owns Red Lobster and the Olive Garden is reducing the hours of its part-time employees to avoid responsibility for health insurance under Obamacare. Some folks on the interwebs are pronouncing themselves shocked, and others are declaring a failure of Obamacare. It’s neither shocking nor a sign of failure; it’s perfectly predictable self-interest.
As long as health care is tied to employment, and the employment in question has a clear cutoff in terms of hours, employers will skirt that line to avoid paying. Those who don’t will fall behind those who do. In higher ed, the explosion of adjunct faculty positions was based on the same idea. But it isn’t confined to adjuncts; the same principle applies to part-time staff below a certain threshold of hours. In the corporate world, the explosion of “temps” and unpaid interns reflects the same premise.
Go ahead and vilify the Olive Garden if it makes you feel better. But it’s only playing by the rules. If you want real change, change the rules. Decouple health care from employment. Make it a basic citizenship right, paid for collectively and controlled democratically. And let people who would really prefer part-time work take it, without having to worry about what happens when they get sick.
- This week, we had the third catastrophic hard drive failure in a year. (It’s the fourth laptop disaster; the other one was a cracked screen.) The pattern seems to be that once the kids get access to a laptop, the hard drive’s days are numbered.
The Wife has pronounced herself sick of technology, but she still needs access to email and Facebook, and I’ve still got obligations of my own. I’m considering something with a solid state drive, on the theory that it would be sturdier, but I’m having a hard time finding anything other than tablets -- which I’m not sure would work well for our purposes -- or chromebooks, which I think of as tablets with keyboards.
Is there a hard drive gremlin on the loose? Is there a kid-proof laptop out there? Are solid state drives actually sturdier? Would a tablet actually work for, say, uploading photos to Picasa? I’m stumped. All I know for certain is that I’m done with Toshiba.
- I haven’t been able to shake this story all week. Apparently, the number of words to which children are exposed before age six is the single strongest predictor of later academic success. Kids with educated parents who spend time with them accrue such a powerful advantage over other kids that the deck is stacked by the time they get to first grade.
(As parents, we stacked the deck early; we have a picture of me reading The Runaway Bunny to The Boy in the hospital, the day after he was born. By age two, he was such a fan of books that we had to hide them under the sofa just to get him to do anything else.)
It seems like the painfully obvious solution to the class gap is to pay preschool and early childhood teachers well enough to attract professionals to the job. As long as daycare workers are paid something close to the minimum wage, kids who don’t get exposure to educated language at home won’t get it in class, either.
Working in higher ed, though, the implications seem defeatist. We get students long beyond the early childhood years. I have to believe that 18 year olds -- and 38 year olds, for that matter -- are still reachable. If I didn’t, I’d have to find another line of work.
Wednesday, October 10, 2012
Connecticut
For once, I’m not going to pick on California.
Connecticut’s new centralized higher education system office has apparently been making either offers or threats -- there’s some dispute, and I have no inside information on it -- to community college presidents. As I understand it, the legislature passed a law last year limiting remedial coursework to a single semester. Apparently, some campuses have balked, so the system office has let the presidents know that if they feel unable to comply, they are welcome to leave.
I’m less interested in the semantics of the offer/threat or in the law about remediation than I am in the way the presidents are being treated.
Whether they’re being threatened or simply offered a graceful exit, it’s pretty clear that the central state authority sees them less as leaders of individual campuses than as branch office managers of a statewide chain. In the eyes of the central board, the job of the president is to carry out statewide mandates.
That’s at odds with the way that people on campus tend to see presidents, and probably at odds with the way many presidents see themselves. It’s not the job they thought they had signed up for; that shift, I’m guessing, is behind the offers to leave. Whether you want to call the severance offers parachutes or planks strikes me as a secondary concern.
Underlying that conflict, I think, is a disagreement about either power -- if you like your theories dark -- or innovation, if you prefer them lighter. Do you get more innovation with a bunch of relatively independent actors trying different things, or do you get more with a single authority in the middle calling the shots? If the statewide goal is, say, to increase the number of college graduates, are you likelier to achieve that goal through centralization or decentralization?
My own theory, for what it’s worth, is that the job of the central authority is to be utterly clear about goals, but to be agnostic on means. The mission should be set centrally, but the individual campuses need the room to experiment with ways to fulfill it.
Let’s say that the state sets a goal of more college graduates. Fair enough; Connecticut can’t compete on natural resources, sunny climate, or cheap land, so it might as well compete on quality of workforce. There are certainly worse ideas.
If the central office sets methods as well as goals, then only one method will be tried. That’s fine, if they get the right one the first time, but it’s reducing the epistemological harvest. (“Epistemological Harvest” would be a great name for a band. But I digress.) Having different campuses try different approaches at the same time, all towards the same general goal, offers a much more wide-ranging set of data. The old “laboratories of democracy” model offers a much better opportunity to discern what works.
More fundamentally, though, it’s hard to get the best out of creative and intelligent people when giving them orders. That’s true of faculty, and it’s true of presidents. Both do their best work when they have a sense of the overall direction, some actual resources, and enough autonomy to be able to make judgment calls as needed. If presidents are reduced to martinets -- and seen accordingly by their faculty -- they won’t be as effective as they could be.
I don’t deny the direction-setting role of the legislature, and therefore of the central office. If the state decides that it wants more transfers to UConn, or more welders, or more exclusivity, that’s its call to make. Someone who just can’t abide the mission should, in fact, be asked to leave. But it’s one thing to set a goal, and quite another to dictate how to get there.
My guess is that they’d get much better results by being a lot less directive. If a given campus fails continuously and refuses to change, then sure, have at it. But the state has a tremendous resource in the organized intelligence of its campuses. Reducing them to the single intelligence of one office is a serious mistake.
Tuesday, October 09, 2012
Ask the Administrator: Discerning Culture from the Outside
I love this question. A new correspondent writes:
Do you (or any of your wise and worldly readers) have any advice about looking for or finding clues to a college's culture, before you actually work there? For example, everyone says they are "family friendly," but the idea of work/life balance means different things to different people (and Gen X seems to be part of the turning of the tide -- go Gen X!). I am a mid/high level administrator, and just getting a chance to start my family (one little one, another on the way), and have already, in the past few weeks, encountered both direct and indirect comments about wanting balance (apparently not a good thing), how I will get my work done (which was just fine last time I was out), and even the idea that one must always say yes at work -- no matter the effect on any other aspect of your work or home life (this last one was said to a group of people -- rather demoralizing).
My good friend is on the job market right now as well, and we struggle with how to identify a good, communal, family-oriented culture, and we are both all about cc's. The closest we have come is taking into account the interactions with general staff. My friend has called a number of HR offices asking questions about application procedures, and has received quite a variety of responses, both in terms of helpfulness as well as just general courtesy. Any suggestions you might have would be appreciated!
First, congratulations on the impending little one!
This is a great question, because it’s both important and difficult.
I’d start by narrowing it down. Within a single college or organization, there can be dozens of microclimates in various offices. What you care about most is the microclimate where you’ll be working. For example, it wouldn’t be at all weird to discover that the unwritten rules in, say, Admissions, are different from the unwritten rules in Biology. Outside of really small colleges, you’ll often find very different climates in different corners of a single place.
That’s why I wouldn’t necessarily focus on HR as a source. It can give you the official policies and forms, but it won’t often tell you that, say, the boss you’d work for has a habit of calling you at home at 9:00 p.m. and expecting you to drop everything. Some places work by the book, and others vaguely recall that there’s a book somewhere.
I’d start with focusing on what you can control, which is your self-awareness. What aspects of “work-life balance” mean the most to you? I mean that in the most concrete, banal sense possible. Is it being home at the same time every day? Is it being allowed to not answer the phone at home? Is it the ability to work from home, as needed? Job-sharing? Unusual hours? Avoiding travel? Try to be as specific as you can in your own mind. Some places that are perfectly fine with, say, having a consistent time to leave, may not be fine with working at home. If you ask a general question about “family friendliness” thinking that it refers to flextime, but they hear it as referring to leaving by 5:00, you may get a truthful but misleading answer.
(For example, given the kids’ schedule, it’s fine for me to arrive at work at the crack of dawn, but I need to be home for dinner. That means that I arrive before almost everybody, but I leave at a consistent time. In my bachelor days, I preferred the polar opposite. Parenthood has forced me to impersonate a morning person, of all things.)
Then I’d look at what is actually practiced, to the extent possible. During interviews, ask the people at the table the questions that matter the most to you, in the most concrete terms possible. If you care about leaving at a set time, ask the people there if they do. If you care about not getting calls at home, ask the people there how often that happens. People know the “right” answer to most abstractions, but often get more truthful when you get more specific. If they recoil in horror or disbelief at your question, then you know what you need to know.
In my more optimistic moments, I like to believe that Gen X types, as they move into management, will be better about these things. As the generation that watched its parents divorce and lived the reality of “joint custody” from a kid’s point of view, I hope that we’ll be more mindful of the three-dimensional realities of people’s lives. Besides, from a management perspective, burning out employees is a stupid waste of resources; over time, you can get better performance from people who aren’t having work-induced personal crises.
Good luck!
Wise and worldly readers, I hope and believe that some of you have found or figured out other ways to suss out the true family-friendliness of a workplace before joining it. Is there a reliable indicator that an outsider could use?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.