Friday, January 30, 2009
A few days ago we got word of the latest round of state funding cuts. I've literally lost track of the number of cuts we've taken this year, but the cumulative impact is drastic. Worse, we got the first inkling of the likely cut for next fiscal year, which starts July 1, and it will make this year's cuts look minor.
Given a reluctance to pass the brunt of the cuts on to the students in the form of tuition/fee increases, we've cut spending dramatically, and are readying to cut even more. What makes this interesting is that we've hit the inflection point at which a difference of degree becomes a difference of kind.
Every part of the college has taken hits: athletics, academics, student life, marketing, everybody. On the academic side, until now we've mostly been able to get by with the usual playbook: cut travel, replace departing full-time faculty with adjuncts, and subject any purchase requests to Inquisition-level questioning. We've put out the call for voluntary unpaid leaves, efficiency improvement suggestions, and the rest of the usual tricks. The brunt of the cuts so far has fallen on the evergreen disciplines -- English, history, etc. -- since it's easier to find adjuncts there than in other parts of the curriculum.
We've hit the limits of that approach, and now we're gearing up for some not-very-much-fun conversations about program eliminations. Over the next few months, we'll be looking at whose programs to cut, and therefore, who to lay off. Instead of giving everybody in the boat smaller rations, we'll start throwing some people overboard to save the rest.
Perversely enough, this actually bodes well for the evergreens. Every degree program has an English comp requirement, so there's no way we'd eliminate the English department. Certain disciplines are ubiquitous throughout the curricula, cheap to teach, and popular with students: psychology, history, sociology. Math, like English, is universally required, and relatively cheap to teach (though it's not terribly popular). These areas can be watered down, to a point, but they simply can't be eliminated.
But the occupational programs with one or two full-time faculty, few or no adjuncts, low enrollments, and significant capital costs are in serious trouble. Those areas can't really be watered down, since they're already pretty much running at skeleton crew level (and adjuncts would be hard to find anyway). There, either we do the program or we don't. And if we don't, even tenure won't save you.
In the popular imagination, hard-headed reality dictates that occupational programs are more worthwhile than the 'fluffy' academic stuff. The popular imagination is wrong. Actually, the occupational programs are far more expensive for us to run. The only way we can sustain them, to the extent we do, is by cross-subsidizing them with profits from the 'fluffy' academic courses. History subsidizes Nursing. When we come under extreme economic pressure, we go back to basics, and that means the liberal arts. They're the only parts of the college that pay for themselves. That may seem like a betrayal of public purpose, but if we don't survive, we won't serve any public purpose at all. If you want the boutique-y stuff, I say to the taxpaying public, feel free to pay for it. In the meantime, we'll do what we have to do.
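To put some hypothetical numbers on the cross-subsidy, here's a back-of-the-envelope sketch in Python. Nothing in it comes from an actual budget; the enrollments, tuition, and costs are invented purely to show the shape of the arithmetic.

```python
# A back-of-the-envelope sketch of the cross-subsidy. All figures are
# invented for illustration; real tuition, salary, and equipment costs
# vary widely by institution.

def section_margin(students, tuition_per_course, instructor_cost, other_costs):
    """Net revenue for a single course section."""
    return students * tuition_per_course - instructor_cost - other_costs

# A history section: full room, adjunct-taught, no special equipment.
history = section_margin(students=30, tuition_per_course=400,
                         instructor_cost=3000, other_costs=500)

# A nursing section: capped enrollment, full-time faculty, clinical/lab costs.
nursing = section_margin(students=10, tuition_per_course=400,
                         instructor_cost=6000, other_costs=4000)

print(f"History section: {history:+}")            # +8500
print(f"Nursing section: {nursing:+}")            # -6000
print(f"Combined:        {history + nursing:+}")  # +2500: history carries nursing
```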
Of course, we'd rather avoid the problem altogether by improving revenues, which necessarily involves increasing enrollments. (I don't see aid improving anytime soon, and the philanthropic sector isn't recession-proof, either.) Increasing enrollments happens in two ways: increased admissions and better retention of those who are already there. The recession is giving us increased admissions, and that helps. (When the job market tanks, the opportunity cost of going back to school drops.) But improving retention is harder.
Simply put, each additional retention gain is harder and more resource-intensive than the one before it, as you move from the simple stuff (getting the course schedule right) to the harder stuff (improving financial aid and the bookstore) to the really expensive stuff (tutoring, academic support, increased counseling). At a certain point, additional retention isn't financially worthwhile.
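Here's the same logic as a toy calculation, again with invented figures: each successive intervention costs more and retains fewer additional students, so the cost per retained student climbs until it overtakes whatever revenue a retained student brings in.

```python
# Toy model of diminishing returns on retention spending.
# Interventions, costs, and yields are all hypothetical.

REVENUE_PER_RETAINED_STUDENT = 3000  # invented per-head tuition + state aid

interventions = [
    # (name, total cost, additional students retained)
    ("fix the course schedule",           20_000, 40),
    ("improve financial aid / bookstore", 60_000, 30),
    ("tutoring, support, counseling",    150_000, 20),
]

for name, cost, retained in interventions:
    per_student = cost / retained
    net = retained * REVENUE_PER_RETAINED_STUDENT - cost
    verdict = "pays for itself" if net > 0 else "loses money"
    print(f"{name}: ${per_student:,.0f} per retained student ({verdict})")
```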
As with programs, so with students. At a certain point, the lifeboat is full.
The numbers on this are pretty clear, and the short-term logic is pretty tough to counter. But there's that matter of 'mission.' The point of a community college, first and foremost, is to serve people who don't have other options. We don't turn away people with other options, of course -- the whole point of open admissions is that you don't turn away anybody with a demonstrated ability to benefit -- but the primary reason we're here is to help the folks who most need it.
By definition, though, the needy are inefficient. A student who shows up prepared for college-level work, passes everything the first time without tutoring, and has his personal life together is remarkably cheap to educate, especially in the liberal arts. A student who has academic skills deficits, who needs counseling, and who attends part-time for several years is much higher-maintenance, and therefore more expensive.
When times are relatively flush, we can do some justice to both efficiency and mission. Now, we're being forced to choose efficiency. Fortuitously enough, the lowest-maintenance students also tend to be the ones most likely to take the traditional transfer-oriented liberal arts classes, so we're being pushed in the same direction by different forces. The stars are aligning for a back-to-basics movement, and an upscaling of our student body.
I'm just concerned that too much efficiency compromises our reason to exist.
Thursday, January 29, 2009
Things I've Learned from The Boy
A few days ago, we did a post-dinner Home Depot run. It wasn't terribly successful, and it was cold, and late, and we were all a little cranky.
In the car on the way back, we were uncharacteristically silent for some time. Then,
The Boy (in perfect 'Yoda' voice): Farted, I did.
And that was that. TW and I laughed the rest of the way home. Comic relief goes a long way.
---------------
Each week, he has ten spelling words for which he has to write sentences. Last week, one of the words was antonym. He wrote:
The antonym of synonym is antonym.
The teacher gave him a smiley face for that one. That's my boy!
----------------
Snow days are occasions for pure, unbridled glee. And Nintendo DS.
Wednesday, January 28, 2009
The Bookstore Conundrum
A returning correspondent writes:
As a Ph.D. student who actually purchases most of the books on the required lists, I'm becoming more price-sensitive than I was when I was an undergraduate (and, incidentally, funded more generously!) and a spendthrift M.A. student (when I only had to purchase a few books). Consequently, I do most of my shopping on Amazon now, which is both faster and more convenient than trudging to and from the bookstore half a dozen times as the required texts trickle in over the semester.
My question is this. Why haven't colleges given up the bookstore ghost altogether and simply set up a link on their home page to an Amazon site listing all of the texts that students should buy? (Or Alibris, or Powells, or whatever.) Surely these bookstores don't earn money, and their nontextbook revenue streams (shot glasses, beer steins, corkscrews, and sweatshirts) could be housed in smaller and even more profitable-per-square-foot stores.
Students will always complain that textbooks are too expensive, of course, but surely this would eliminate some of the intermediary costs while also doing away with the hassles of textbook return policies and so forth.
As it stands, it just doesn't appear that the hassles -- from understocked books to testy salespeople to blocks-long queues -- are worth giving up what is always prime university real estate to, essentially, a store people only use twice a year.
If only it were that simple...
“Surely these bookstores don't earn money.” Actually, they do. In some cases, quite a lot. And the college gets a cut, either directly or indirectly.
At my cc, the college actually owns the bookstore. Bookstore profits are funneled directly into the college's operating budget. (Nationally, the trend has been to outsource the bookstore to a national company like Follett's. In those cases, the revenue stream to the college is based on rent, rather than sales, but if the sales dried up, it's a safe bet the rent would, too.)
Now, if one were so inclined, one could call this a conflict of interest. College hires faculty, faculty choose books, books enrich college. That's true as far as it goes, but there's more to it than that. Unless they choose books they've written themselves – which happens – the faculty don't get any actual kickbacks directly. The bookstores usually make higher profit margins on used books than on new ones, so they're often joining students in the crusade to get faculty not to change books too often. Publishers know this, so they 'bundle' all manner of stuff with textbooks and change editions every hour on the hour to try to suck the air out of the used book market, with which they compete.
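To see why used books can be the better business, a quick hypothetical; the percentages below are illustrative, not any actual bookstore's terms.

```python
# Why a used copy can be the higher-margin item (illustrative numbers only).
new_price, new_wholesale = 100.00, 75.00  # new text: store keeps ~25%
used_price, buyback      = 75.00, 25.00   # used: buy back at 25%, resell at 75%

print(f"Margin on a new copy:  ${new_price - new_wholesale:.2f}")  # $25.00
print(f"Margin on a used copy: ${used_price - buyback:.2f}")       # $50.00
# The used copy nets the store more, and costs the student less.
```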
From my desk, I'm happy to encourage faculty to allow paperbacks, or used editions, whenever it makes pedagogical sense. (That tends to work better in American literature than in computer science, for obvious reasons.) Used editions are higher profit items, and still cheaper for students, so I get to feel good about helping the students while also helping the college's budget. To the extent that we can outsource our shortfalls to publishers, I'm happy to do it. But there are obvious limits to this, and I've never pressed the point when faculty have insisted that a particular new book was simply better.
Back in the day, campus bookstores had effective monopolies, since most required texts were specialized enough that other bookstores within realistic student distance wouldn't have them. Now that students have access to online booksellers, it's possible in many cases for students to do end-runs around campus bookstores. Yet, judging by sales figures, very few do.
Some of that is probably inertia, and I've heard anecdotally that some of it is based on financial aid. (If your book voucher is only good at the campus bookstore, then the question of where to shop has been pretty much settled.) Some is based on speed; if you need the book for a class tomorrow, buying it in person is the best bet. And if you don't have access to the list of necessary books until you get your hands on the syllabus on the first day of class, then the 'speed' variable becomes harder to evade. (A really savvy student could purchase only the first book at the bookstore, and order the rest online, but that doesn't tend to happen.) Depending on what happens with e-book readers, I guess it's possible that this issue could become moot, but I suspect that's at least several years away on any meaningful scale.
Finally, of course, there's the issue of returns if you drop the class. College bookstores usually have policies that are tied, if vaguely, to the local academic calendar. Online bookstores typically don't.
When I was in grad school at Flagship State, the university had a primary bookstore, but faculty also freely used several other bookstores in town. It didn't seem to help much with prices; I recall being struck even then that no matter where I bought books, I paid too much. The official store overcharged; the seedy store overcharged; the painfully trendy store overcharged. Once I got really ambitious and drove to the university bookstore at another university; it, too, overcharged. And once the courts got all finicky about 'fair use' in the 90's, the old “Kinko's discount” became harder to pull off. Naturally, this made overcharging even easier, since the safety valve of samizdat had been largely closed.
So the short answer to the question is, colleges keep bookstores because they're profitable. The secondary question, which is a little harder, is why the student grapevine is still relatively ineffective at circumventing the system.
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Tuesday, January 27, 2009
Ask the Administrator: What the Fish?
A new correspondent writes:
After recently reading Stanley Fish's NY Times blog on education, I felt moved to write in. I recently attended a talk about curriculum and program design where a large university decided to roll out a new undergraduate program (let's call it "computer science lite") since enrollments were collapsing in a related discipline ("traditional computer science"). As part of the planning process at this university, the committee asked for consultations from professionals in the IT industry (and presumably other educators). The IT sector said that graduates were clearly weak in professional skills (defined to be skills such as communications, project management, etc.). Industry feedback seemingly played a major, possibly decisive, role in the design of this new undergraduate program. This focus on employer input as central strikes me as interesting and rather unusual in higher education.
Contrast this to Fish's views about post-secondary education. He asserts, "In previous columns and in a recent book I have argued that higher education, properly understood, is distinguished by the absence of a direct and designed relationship between its activities and measurable effects in the world." This strikes me as odd. What about public policy programs? What about medicine? What about almost all the social sciences (which often propose ways to address the world)? What do you make of Fish's "purist humanities" view of higher ed? I suspect you would categorize it as an elite R1 view, but I'm curious.
In devising new programs or re-evaluating existing ones (say, as part of a regular exercise to determine where to allocate funds), what counts? Cheers of delight from employers? Nods of approval from graduate schools admitting students from a given program? Student satisfaction, however measured? Or do the administration and faculty simply make an assessment based on what they see and feel and go with that?
As a general rule, I try to ignore Stanley Fish. It's the decent thing to do. As an old Kristin Hersh lyric puts it, “I don't judge people/I just try to look away. I want to look away now.”
That said, the guy is the Tony Danza of higher ed. For reasons that elude me entirely, he keeps popping up. How he continues to find sweet gigs, like New York Times columnist, is a complete mystery. I suspect that in an attic somewhere, there's a picture of him looking unpublished. But I digress.
I'll be generous, and assume that Fish is working from narcissism, rather than glaring incompetence. From the not-thinking-very-hard perspective of someone who made his professional bones writing about Milton, it may be plausible to say that irrelevance is a sustainable gig. And yes, there are a few well-upholstered corners of higher ed in which you can both declare your irrelevance and cash large checks. But to suggest that that's all there is to higher education, or all there should be, is just silly.
Having attended one of those well-upholstered corners of higher ed as an undergraduate, I can let the world in on a dirty little secret. Many of the students majoring in classic liberal arts disciplines aren't forgoing professional education; they're just postponing it. After the SLAC, they went on to law school, or med school, or business school, or graduate school. The savvier ones even understood their SLAC experience as distinctly 'pre-law' or 'pre-med' or whatever. Undergraduates at, say, Harvard aren't generally known for their lack of worldly ambition, either.
But leave that aside. Because Fish's position isn't really based on actual student behavior. It's based on a sort of declension narrative, a decline and fall from the Platonic ideal. Back in the day, the story goes, students were deferential and wise and pure and virtuous. Now they're vulgar and materialistic, unworthy of respect from those of us who remember the days of milk and honey.
Anybody who knows the history of American higher education knows that this is a load.
Harvard, for example, was established to train clergy. It was explicitly and unapologetically vocational. The land-grant universities were established to foster agriculture and the 'useful arts.' Many of the smaller colleges, both public and private, were established as “teacher's colleges,” with the clear vocational purpose of training teachers. (Sometimes they were called 'normal schools,' which is how Normal, Illinois got its name.) Community colleges were founded specifically to bring both 'pure' and vocational education to anybody who wanted either; that's why so many have the word 'comprehensive' in their mission statements. State college and university systems had workforce and economic development missions even before they called them that.
Most of American higher education started as clearly vocational, and mission-crept its way away from that over time. That's neither entirely good nor entirely bad, but it's the direct opposite of the 'fall from grace' narrative. The idea that paradise was lost might make sense to a Milton scholar, but it has nothing to do with reality.
I'll imagine an objection. “Ah, but what about the shift of majors? Students used to major in English or history; now they major in Business! What about that?”
There's some truth to that, but the idea that English is somehow pure is relatively recent. In the 19th century, the idea of studying literature in anything other than Latin or Greek was considered a form of selling out. And anybody who peruses the offerings of a typical English department now would be hard-pressed to say that it's all classic literature, all the time. It never was. (If you really want to be disabused of the idea of the study of language as pure, check the history of the term 'sophistry.' Even in Athenian times, 'rhetoric' was understood as primarily utilitarian.) And anybody who doesn't know that 'history' or 'poli sci' is usually the liberal arts equivalent of 'pre-law' hasn't been paying attention.
I'd also argue that some of the most interesting work in the social sciences today comes from the intersection of economics, psychology, and business. Behavioral economics is based on being 'impure' in the best possible way. The most interesting work in most fields – biology, engineering, medicine, political science, architecture -- comes from informed engagement with some sort of problem in the world, rather than some sort of cloistered musings. Proust is the exception, not the rule.
If I had to give a definition of higher education, it would probably involve something like “learning to bring analytical rigor to bear on the world.” Necessarily, that involves picking a particular slice of the world and focusing on that. That narrow slice may be your navel, but most of the time, it won't be.
Whew. Now to the second part of the question: where do new programs come from?
In my experience, they can come from any of several sources.
Sometimes they come from student initiatives and/or political crises. (This was usually the source of women's studies or other identity-based programs.) Sometimes they come from pure faculty interest. Sometimes they come from employers, or from what we expect employers will want in the near future. Sometimes scholarly fields just develop in ways that require secession from their home disciplines. Sometimes they come from a sort of 'emulation,' in which schools lower on the prestige pole imitate schools higher up, both to give students opportunities and out of a sense that that's just how it's done. (Operationally, that's the source of most mission creep. The approved euphemism is “raising our academic profile.”) Sometimes they come from efforts to chase external money. (Over the last few years, there has been a profusion of “homeland security” majors. You tell me.) And yes, sometimes they just come from some administrator with a bee in his bonnet.
Usually, for a new major to succeed, it has to solve somebody's problem. That problem may be a lack of employable graduates in a given field, or a lack of enrollments in a given department, or a consistent hole in the existing curriculum that swallows up otherwise-worthwhile projects. A purely-vanity major won't 'take,' since it only solves one person's problem.
Judging by the conclusion to his piece, Stanley Fish has already solved his own problems. Good for him. For the rest of us, though, there's serious work to be done.
Wise and worldly readers – have you seen majors develop in odd or unusual ways?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Monday, January 26, 2009
Ask the Administrator: Stereotypes of the For-Profit World
A new correspondent writes:
After six years of teaching and academic administration at two proprietary schools (neither one being your Proprietary U unless you've disguised it incredibly well), I am applying for a position at a local community college. As someone who has made the transition, what concerns do you think I should be prepared to address on the off chance I get an interview?
I know I have no experience working with a faculty union, but I have worked at an R1, so I am at least familiar with the concepts and structures of faculty governance. I'm a little more concerned about what assumptions, groundless or otherwise, they'll have about my background.
As ever, advice from wise and worldly readers is welcome.
Having done this myself, I agree that there are both fair and unfair barriers you'll need to be prepared to address.
On the 'fair' side, it's true that you probably don't have experience dealing as an administrator with tenure and/or unions. Those aren't small considerations. When I made the leap, I was struck by the persistence of memory in the tenured world; slights endured fifteen years ago were still felt as fresh wounds. Whether that's 'vigilance' or 'the need to get a life' varies by case, but it was noticeable.
(Weirdly, some of the folks most punctilious about noting every questionable act “The Administration” has ever done, or contemplated doing, are remarkably loose about distinguishing one member of The Administration from the rest. True example: as an academic dean, I've been confronted by a half-dozen angry secretaries and blamed personally for the college not calling a snow day. I couldn't call a snow day if I wanted to, but to some people, any administrator is interchangeable with any other.)
In dealing with unions, the biggest lesson I had to learn was the 'second' contract. The first contract is the official, written one, which I strongly encourage you to read closely and repeatedly. The second one is the quasi-official unwritten one, which goes by the name of 'past practice.' The shock for me was that 'past practice' actually carried legal force, in addition to cultural force.
Past practice is an inexcusably murky area, since, by definition, it's unwritten and subject to conflicting memory. It's also hard to define. The good folks in HR tell me that it refers to the “terms and conditions of employment,” but that's about as helpful as referring to “academic freedom.” In practice, what counts as a term and condition of employment can be incredibly elastic either way. For example, say there's a construction project on campus, and some of the staff who used to park in lot Q now have to park in lot F, along with the faculty. Some faculty now have a harder time finding spaces, especially since most of them arrive later in the day than most of the staff. In some cases, they've had to park in another, less desirable lot, forcing a longer walk. Have the terms and conditions of faculty employment been altered? (Answer: it depends.)
I'm one of those literal-minded people who would like to believe that 'thou shalt nots' should have to be written down somewhere before I could be nailed for violating them. Alas, this is not how it's done. Figuring out what actual past practice has been is your problem, since it isn't written down. And different people's interpretations are almost inevitably colored by their self-interest. Of course, if you guess wrong, you're up against it. Why this is even legal is beyond me, but there it is.
The only reasonably effective way I've found for dealing with the spectre of past practice has been to give the unions heads-up before I try anything that I think could raise hackles. Since adopting this strategy, I've noticed a palpable decline in the number of indignant “how dare you's” I have to endure. It's slower, but spending a little more time upfront saves a hell of a lot of drama later. I've had cases in which a past practice was based on a past circumstance that has since changed, but the practice hasn't. Having the conversation upfront about a possible change, and acknowledging the changed circumstances, works a lot better than just announcing the change and trusting that everyone will discern its wisdom after the fact. They won't.
(I have to tip my cap to Andrew Young, the former UN Ambassador and Mayor of Atlanta, for helping me figure this out. He did an interview on The Colbert Report during the writers' strike, discussing a hospital strike several decades earlier. He said that when it comes down to it, every strike is about the same thing: respect. If the union membership feels respected, it will go along with all kinds of things. If it doesn't, it won't. Maybe it's me, but framing it in those terms helped quite a bit. Pre-decision consultation, if done openly and with a willingness to rethink things based on what you hear, is a concrete sign of respect.)
The last 'fair' barrier to be prepared for is grantsmanship. Depending on the position you're applying for, there may be an expectation that you will take an active role in helping faculty (or the college) secure grants. For-profits, for the most part, aren't eligible for most grants, and simply don't 'do' philanthropy. My advice on this one is that if it comes up, note that your institution was ineligible from the start, and that you're willing to put in the time and energy to learn. It's not ideal, but it is what it is.
In terms of unfair (or mostly unfair) stereotypes, I'd expect to run into some people who will assume that you carry with you the values (or lack thereof) of for-profit education. They'll be skeptical of you from the start, thinking that anyone who came up through that system must be some sort of trojan horse for standardization, or corporatization, or whatever the current bogeyman is. My response to this, which I heard early and often, was to point out that I was leaving the for-profit sector. If I were truly a profit-obsessed troglodyte, I pointed out, I would have stayed there. The whole point of leaving was to get away from that value system. That didn't satisfy everybody, but it shifted the conversation, and had the added virtue of being true. Over time, I've heard that less and less, since I've built my own record.
The transition is challenging, and if you get the job, you'll spend the first year learning a new playbook. But that's okay. The great advantage you'll bring is a comparative perspective. Most people on your new campus won't have that. In my more optimistic moments, I like to think that people who have seen multiple organizational models can draw on the best of each, and even help forge sustainable new ones. It's a little Pollyannaish, I admit, but not entirely without truth.
Good luck!
Wise and worldly readers – how might you respond to a candidate like this? Have you made a similar leap?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Friday, January 23, 2009
Ask My Readers: Called Out on Retention
A regular correspondent makes an interesting point about “retention” in a different sense of the word:
Every time you write about remedial classes, retention, or K-12 preparation, your comments seem to get thread-jacked with folks who basically say that the K-12 system sucks...
Maybe it would be worth trying to host a slightly different conversation on your public forum...
The nugget of my question: "What do you do to ensure that students retain their knowledge and skills from a class?"
Students place into remedial classes for a number of reasons, but the most persistent assumption on the part of college faculty seems to be that the K-12 system is failing students by either not teaching appropriate content or allowing them to pass without demonstrating that they have mastered the appropriate skills.
Let's do a small thought experiment. Let's assume for the moment that your local school system is staffed with competent people who know their content, teach appropriate content, and that students who earn passing grades actually demonstrate mastery of that content. (I'd suggest that, for the most part, this is a reasonable assumption, given my 8 years of work with local school districts up and down the eastern seaboard.)
If students show up at college and are unable to demonstrate the appropriate skill-set to avoid remedial classes, what should we then assume about these students?
Clearly, that they have failed to retain the concepts and skills that they were taught.
So, they place into a remedial class... My question, in longer form, is: suppose that they complete the remedial sequence in one year. When they return to school the following August, after being away from school since early May, how much do they retain?
Heck, ask this question about non-remedial courses... How much do students retain?
Four months after the course ends, if you give students the exact same final exam that they took at the end of the course, how well should they do? How well would they really do?
What do college-level institutions do in order to help students better retain what they've been taught?
Figuring out this question, and sharing it with the K-12 folks, could do far more to reduce remedial enrollments than having college faculty endlessly repeating "K-12 [needs] to do their job," as suggested in your comment threads...
There's a lot here, so I'll just add a few thoughts and ask my wise and worldly readers for their reactions.
I remember a moment at PU in which I was trying to help a student build his schedule just a few days before classes started. He was supposed to take the second course of a sequence. When I told him that, he demurred, saying it would be too hard. I pointed out that he had taken the first course in the sequence the previous semester, and had passed it, so he should be ready. His response, which I remember to this day: “But that was over a month ago!”
Alrighty then. I guess the moral is never to have a doctor more than a week or two out of medical school.
Surely, we have all taken classes – and passed them, maybe even with decent grades – from which we don't remember much. Most of the foreign languages I've studied are gone. I haven't the foggiest recollection of how to do derivatives. Heck, it gets worse than that. I'm still fuzzy on an embarrassing number of state capitals.
I'm not sure if that's really the issue here, though.
Remediation typically addresses basic reading, writing, and math. (By 'basic math,' I mean up through high school algebra. We don't even test geometry or trig.) These are mostly skills, as opposed to specific facts, and they're cumulative. They build over time, and can be reinforced (or not) outside of school. People who read get good at it; people who don't, don't.
That's why I'm not sure that the analogy to specific course content holds. These skills aren't confined to single courses. They're built, or not, over years.
That said, I agree that merely bashing the K-12 system doesn't solve the problem, and may even make it worse. The K-12 system is tasked with an imposing, and ever-growing, list of goals. Dedicated teachers run into the standard bureaucratic obstacles, plus adolescent hormones, helicopter parents, absent parents, standardized testing, unequal funding, the cult of athletics, and local politics, among other things. Having higher ed pile on isn't helpful, and isn't likely to generate constructive conversation.
And it's certainly fair to ask professors to reflect on what they want students to take away from their classes years later. Many specific facts will simply be lost to the sands of time; there's no way around that. If your course is a gen ed class, or the kind of class that non-majors take, then your class may be the one time the students will ever be exposed to serious inquiry in that discipline. Given that not everybody will become an expert in your subject, what do you want them to take away from it?
I'll admit that it took a couple years of teaching for me to start thinking in those terms. Early on, I made the rookie mistake of trying to 'cover' everything. When I got back bizarrely disjointed versions of the material in papers, I gradually realized the error and started trying to focus more on the big picture. After a while, I decided that what I really wanted the students to develop was a combination of aggressive curiosity and some sense of how to frame questions. If they got that, I figured they were capable of following up on their own. Less 'covering,' more 'uncovering.'
(One of my most gratifying moments as a teacher came when a colleague mentioned to me that one of her students had spoken to her about my class, which she had taken the previous semester. The student said that she had never cared about the subject before, but now couldn't stop thinking about it. I considered that a victory.)
In a discussion last year with a local high school, whose graduates routinely crashed and burned on our essay test, it became clear that something like this was really at issue. The high school taught writing as 'error avoidance,' so the students wrote very simple prose in very simple ways. The college test evaluated the ability to make an argument, which necessarily involves some level of complexity. A student who did reasonably well by the high school rules could flop by the college rules and not know why. We both evaluated 'writing,' but we defined the term in importantly different ways. Once we had that epiphany, the conversation got easier.
There's certainly a lot to chew on here, and I've done my share. Wise and worldly readers, what say you?
Have a question (or challenge)? Ask the Administrator at deandad (at) gmail (dot) com.
Thursday, January 22, 2009
Ask the Administrator: The Gang that Couldn't Shoot Straight
A new correspondent writes:
The letter about the horrible adjunct struck a chord for me, but for a very different reason. I am an adjunct at a local community college, and while I have enjoyed it, and learned a lot about what works and what doesn't in the CC classroom, I can't help but wonder if there aren't more 'horrible adjuncts' out there. I can imagine there are, because although I believe I am competent and capable, I have never had an official evaluation (in fact, no one has ever come to watch me teach), nor are there official student evaluations of courses. And that doesn't even begin to address the issues with the dean, who has told instructors that students shouldn't be called out for texting in class and has accused others of racism for questioning the removal of basic English language competency requirements, or for failing students who stop showing up to class.
So I guess my question is, where does one go when it seems the whole college is one giant lump of incompetence? And yes, this is partly selfish, because the school I'm teaching at is on the brink of losing its accreditation, and how does that look on a CV? But more than that, I worry about the students who pay good money, and think that they are getting an education, when what they are getting may or may not be one.
Been there.
The short answer is, you go someplace else.
Back at Proprietary U, at least toward the end of my time there, there was a single-minded focus on finding excuses to pass students. Since the place was tuition-driven and enrollment was dropping, the idea was that anything that encouraged attrition – like, say, failing students – was bad for business.
(In fairness, that attitude wasn't there when enrollments were growing. It was a stupid response to a crisis, rather than a stupid philosophical position.)
For a while, I tried fighting the good fight from within. I argued up the chain that graduating incompetent people would permanently devalue the degree, thereby precluding the possibility of recovery. I tried to shift the focus from 'punishing faculty' to 'supporting students,' even going so far as to do a PowerPoint presentation (and I hate PowerPoint presentations) to senior management about the effects of inappropriate 'cut scores' on student success. And I grabbed any extenuating nugget I could, and used it until it just couldn't be used any more.
And I lost. The direction was set from on high, and the direction was to retain by any means necessary.
When I got wind of some particularly objectionable directives that, had I stayed, I would have had to implement, I knew it was time to go. The organization was a lot bigger than I was, and its leadership had a clear, if mistaken, sense of what it wanted. So I sent out c.v.'s, and took the first reasonable offer I received.
Put differently, this is what a bad 'fit' looks like from the employee side.
The top brass at PU was wrong, in my view, in some pretty fundamental ways. But it had the right to be wrong. Those calls fell within its purview. If it wanted to hollow out the organization's reason to exist, it could. I just didn't want to be a part of that.
Adjuncting is a lousy gig in any number of well-documented ways, but at least it's an easy gig to leave without having to have some awkward conversations.
Depending on how bad the place is, though, you might be able to salvage some useful nuggets before you go. If you can find a thoughtful (or at least reasonable) person there with some kind of title, you might be able to swing a decent letter of recommendation. I've had adjuncts request class observations specifically for that purpose, and I've gone along with the requests I've received. If you're leaving, others probably are, too, and some of them may land in interesting and/or useful places. Maintain the positive contacts you've built, if any.
I wouldn't worry overly much about resume stain from having adjuncted there. In this market, it's widely understood that academics in evergreen disciplines generally take what they can get.
For the record, hearing of deans who treat faculty this way really grinds my gears. The stereotypes of empty-headed administrators are bad enough without providing empirical confirmation. And memories of bad behavior linger much longer, and more strongly, than memories of good. If we had a deans' union, I'd want these folks kicked out of it.
The good news is that not all community colleges are run this way. The grass really is greener.
Alas.
Good luck with your situation. I don't envy you.
Wise and worldly readers – how have you handled situations in which it seemed that everybody else drank the Kool-Aid?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Wednesday, January 21, 2009
Remedial Sequences
A hat-tip to Sherman Dorn for pointing out this story in IHE that I had missed.
According to a report from the Community College Research Center, looking at remedial courses as a sequence, rather than as discrete courses, leads to some disturbing conclusions. As reported by IHE, fewer than 4 out of every 10 students who start a remedial sequence actually finish it, with most of the attrition occurring during, or even before, the first course. The report recommends that colleges pay special attention to advisement and counseling between courses, to keep students from falling through the cracks.
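To see why sequence-level numbers look so much grimmer than course-level ones, it helps to make the compounding explicit. Here's a minimal sketch in Python; the per-step rates are invented for illustration, not figures from the CCRC report:

```python
# Why whole-sequence completion looks so much worse than any single
# course's pass rate: the rates compound. The per-step rates below are
# invented for illustration; they are not the CCRC report's figures.

def sequence_completion(per_step_rates):
    """Chance of finishing the whole sequence, treating each step
    (enrolling in the next course and passing it) as independent."""
    completion = 1.0
    for rate in per_step_rates:
        completion *= rate
    return completion

# A three-course developmental math sequence:
# arithmetic -> basic algebra -> intermediate algebra.
steps = [0.70, 0.75, 0.75]
print(f"Sequence completion: {sequence_completion(steps):.0%}")  # ~39%
```

Each step can look tolerable in isolation; it's the chain that does the damage, which is exactly why the report pushes advisement between courses.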
This is a HUGE deal for community colleges, and it's terribly complicated. At my cc, which certainly isn't among the really struggling ones, a majority of entering students place into developmental coursework (we prefer that term to 'remedial') in at least one area, usually math. The developmental math sequence starts all the way back with arithmetic, and builds through basic and intermediate algebra before the students can take credit-bearing courses. Developmental English includes both reading and writing, and there's an ongoing debate as to whether ESL should be considered developmental.
The paradox of developmental courses is that the more basic the material, the lower the pass rate. We have a higher pass rate in calculus than in arithmetic, just as we have a higher pass rate in World Literature than in basic reading. Of course, the only students who take calculus are those who sailed through the lower-level math courses – usually in high school – so they're presumably capable. The paradox dissolves once you factor in self-selection.
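A toy model makes the self-selection point concrete. All the numbers below are invented; the only thing doing the work is that calculus enrolls only the students who already cleared every earlier hurdle:

```python
# Toy model of self-selection in pass rates. All numbers are invented
# for illustration. Pass rates reflect who is in the room, not just
# how hard the course is.

# 100 entering students, with preparation scores from 0 (weakest) to 99.
students = list(range(100))

# Arithmetic enrolls the students who placed lowest; calculus enrolls
# only those who already cleared every earlier course.
arithmetic_roster = [s for s in students if s < 60]   # the weakest 60
calculus_roster = [s for s in students if s >= 85]    # the strongest 15

def pass_rate(roster, difficulty):
    # A student passes when preparation exceeds the course's difficulty.
    return sum(s > difficulty for s in roster) / len(roster)

# The harder course posts the higher pass rate, because of who takes it.
print(f"Arithmetic (difficulty 30): {pass_rate(arithmetic_roster, 30):.0%}")  # 48%
print(f"Calculus (difficulty 88): {pass_rate(calculus_roster, 88):.0%}")      # 73%
```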
The issues around developmental ed are legion.
Politically, it's radioactive. A fair number of taxpayers blanch at the idea of paying a community college to teach material that they've already paid the K-12 system to cover. (Some of them persist in the outdated notion that 'trades' are easily distinguishable from 'degree programs,' and suggest that you don't need basic reading or math skills in the trades. For those keeping score at home, the largest providers of 'vocational' education in America are community colleges, and the feedback we get from employers – consistently and without fail – is that they need employees who can communicate, and who don't freeze up when doing simple math.) Among supporters of developmental ed, conversations about obstacles faced by students have an elaborate etiquette, since they can very easily shade into student-blaming. (And honesty compels me to admit that some students are actually their own worst enemies.) It's also easy to veer into unhelpful quasi-socialist tirades, as if corporations made a profit from non-profit public sector colleges teaching basic algebra. They don't.
Philanthropists generally like to support successful outcomes, usually defined as graduation and job placement. That's understandable, but it usually means that the developmental stuff has to come entirely from our dwindling operating budget.
Developmental courses are resource-intensive. They have to be kept small, since these students need personal attention. Most of the tutoring in our academic support center – free of charge to the students, but hugely expensive to the college – is in the developmental sequences. The attrition rate in these courses is significantly higher than in the credit-bearing courses, so between small starting sizes and high attrition, we wind up with relatively little tuition revenue to pay for them. (Yes, we use far too many adjuncts in developmental courses to try to make up some of the difference, but there are limits to that, too.) And with apologies to Tolstoy, 'good' students are mostly the same, but every struggling student struggles in his own way.
To make matters worse, students often recoil when told that they need to take (and pay for) courses that “don't count” towards graduation. These courses stretch out the time and money to complete the degree, and some students see them as conspiracies to separate them from their money. Combine shaky preparation with a suspicious attitude, and the odds of success aren't high.
I've seen different philosophies of remediation. One school says that you need to break everything down into the tiniest possible units, and proceed “step-by-step.” Another says that remediation should be compressed into the shortest time possible. One says that it should be taught 'contextually,' with examples drawn from intended majors; another says that it should all be 'self-paced,' with computers and tutors; another says that it's all about back-to-basics. (For the record, my position can be boiled down to that great line from the movie WarGames, with Matthew Broderick and Ally Sheedy: “Hell, I'd piss on a sparkplug if I thought it'd help!”)
And then there's the school that denies that developmental education should exist at all. If you haven't learned it by 18, this school says, that's your problem.
I'm not a fan of that last one.
In a more perfect world, we'd have the resources to run a whole bunch of experiments locally and see what happens. What happens if we compress three semesters of developmental math into one? What happens if we just throw everybody into freshman comp? (That was the de facto policy at Proprietary U when I was there. It resulted in very low grading standards for freshman comp.) What happens if we go all self-paced, all the time?
We're not there. At best, we can try to glean successes from other schools, as well as our own, and fix what we can, on the fly. We're trying that, and we're participating in a national program that would be entirely too revealing to name, but so far the improvements have been small at best. It's a major issue, and with our K-12 and immigration systems being what they are, it will continue to be a major issue for the foreseeable future.
I'm glad to see that some people with the resources to do comparative work are looking seriously at this. We need all the help we can get.
Tuesday, January 20, 2009
President Obama
When Barack Obama was born, his parents' marriage would have been illegal in Virginia. This Fall, he carried Virginia.
I've been accused of being just a little cynical. (TW mentioned that this marks the end of the eight-year-long relationship between her butt and President Bush's face. No more annual state-of-the-mooning.) Today, I'm glad to put that aside.
We need hope, and we need it now.
In some ways, it's already flickering. A President who believes in science, and who appoints science advisors who actually know what they're talking about, is already a refreshing change. A President who isn't afraid to be publicly and conspicuously literate, who wears his intelligence with quiet assurance, inspires confidence. And a President who isn't afraid to surround himself with smart, experienced people – without getting subsumed by them – does this academic's heart good.
It's a hell of a time to take over. The economy is in free-fall, we're fighting multiple wars, and there's no obvious way out of either. The healthcare system is a bad joke, we're addicted to oil, and the boomers are on the cusp of retiring. We've spent the last eight years systematically isolating ourselves from our allies. The job won't be easy.
Good luck, President Obama. You've got your work cut out for you. Some of us are rooting for you enough that we're even surprised at ourselves.
It's a busy day, with the semester just starting. But I'll take a moment at noon and watch history happen. It's time.
Friday, January 16, 2009
Ask the Administrator: Teaching Writing in the Social Sciences
A new correspondent writes:
I have a question to ask you and your "wise and worldly readers." :) I'm a PhD candidate in an evergreen social science, and I just taught for the first time last semester. While I loved many things about teaching, the biggest surprise for me was how much I loved teaching writing. I loved marking student papers, trying to teach them about how to structure an argument, working with them on how to craft a better piece of writing and thinking.
I know most writing is taught in English Comp classes, which I'm obviously not properly placed, disciplinarily, to teach. But, at different sorts of schools, what opportunities are there for social scientists to teach writing? I know the elite SLAC my wife attended had "writing-intensive" courses across the disciplines; how common are those? Is wanting to teach writing an asset in the job market? How might I position myself (beyond saying "I love teaching writing!" in a cover letter) to show this interest?
I'm pretty sure there's a law against social scientists teaching anybody how to write. I once had an article rejected because the reviewer found my prose too “breezy.” Compared to most of what gets published in my field, he had a point: you could actually discern my argument without Advil. If we started saying clearly what we meant, well, then how would we intimidate anybody?
Okay, now that I've cleared my throat...
Some colleges still have “Writing Across the Curriculum” programs, in which departments outside of English designate a couple of courses (or sections) as “writing-intensive,” with assignments that are explicitly about both process and result. For example, a particular section of Intro to Sociology would use sociology as the fodder for what amounted to a writing class.
The WAC movement waned, I think, because it tried to do too much with too little. Faculty in the disciplines resisted the extra grading and the barrage of criticism from English departments that they were doing it wrong. English departments resisted it on the grounds that the departments were doing it wrong, and if they weren't doing it wrong, what was the expertise of the English department? Having tried to do this sort of thing myself in my teaching days, I can attest that teaching both process and content at the same time is harder than teaching either alone. When all that extra work comes with student complaints and no new resources, it's easy to predict the outcome.
That said, the WAC movement has held on in some places, and you'd be a natural candidate to pick up those sections that nobody else wants.
You also might want to look for schools with interdisciplinary freshman seminars. Most of the time, that means small liberal arts colleges. (Cc's usually don't do interdisciplinary freshman seminars, since they tend not to transfer cleanly.)
Less obviously, you might want to try your hand at teaching online. Since online teaching necessarily involves a great deal of written communication, you could ply your dual trades there and get credit on the hiring side for being ready to jump in wherever you're needed. At many teaching-oriented places, the candidate who is comfortable teaching both in class and online has an edge over the candidate who can only do one or the other.
At the interview stage, when the discussion turns to teaching, I'd recommend discussing the ways you structure some of your student assignments. At interviews for positions at teaching-intensive colleges, most candidates (not all, admittedly) are savvy enough to say that they like teaching. You can set yourself apart by actually showing it. Do you require students to turn in drafts of papers? Do you have them turn in separate narratives describing how they did what they did? (This works pretty well as a plagiarism deterrent, btw.) How, exactly, do you give feedback that manages to be neither too prescriptive nor too demoralizing? Thoughtful discussion of points like these is relatively rare (cough) on the social science side of the house.
Some will discount your interest as irrelevant, and some will probably view it as prima facie evidence that you aren't a hardcore social scientist. But some of us think that helping students learn to make arguments, clearly, with evidence, about actual goings-on in the world, is actually a good thing. It's a minority view, but a good one.
Good luck!
Wise and worldly readers – any hints you could offer would be appreciated.
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Thursday, January 15, 2009
Ask the Administrator: Getting the Boss Fired
A returning correspondent writes:
Here's the situation: I worked as a TA for an intro level survey course for a truly awful adjunct. She was condescending, vague about my role inside and outside the classroom, unclear about how strict/lenient grading should be, and frequently imposed impractical deadlines. With the students in the class, she was vague about expectations, a truly harsh grader, thematically all over the place, and in particular, refused to explain to the students what she meant by "good writing" (probably just wasn't capable of, is more like it). She also was terrible about answering student emails/keeping the students informed about changes to the syllabus. All in all, pretty much your standard nightmare with a PhD.
As her TA, I struggled pretty much daily with what my role was, both in- and outside of the classroom. My suggestions for how to improve the class (like a suggestion for a session on improving student writing, which I even volunteered to organize and run outside of class time) were met with hostility and disgust. I helped the students as best I could, but a lot of the time, there wasn't much I could do (since it was unclear what this woman even wanted from her students, outside of a textbook recitation of facts, etc)...
So, my question is actually two-fold:
1) What do you (and your readers) feel a TA's role, both inside and outside the classroom, should be? How/Should a professor communicate responsibilities with their TA?
and
2) How can you get a truly awful adjunct fired without making a god-awful mess of things for yourself? Our department is kind of all over the place in terms of knowing who to talk to about anything, but someone needs to know truly how awful this woman is. I'm afraid, however, that this will just look like bitching on my part. I know that this professor is a serious gossip hound, and she talks about EVERYBODY, including her TA's, and I know for a fact that she's had some not-so-nice things to say about me. I don't think she should be removed for my sake; I KNOW that there are hundreds of other smart, qualified people that would happily take a position at our institution.
There's a lot here, but I'll focus on what I consider the key point. (I won't really address #1, since we don't have TA's at the cc level. I'll leave that one for folks who work with TA's on a regular basis.)
It's not your job to get her fired.
Let's assume that everything you write is correct, and that she's a terrible teacher and supervisor. Not criminal, and not in violation of any of the basic canons of behavior (sleeping with students, accepting money for grades), but just a really lousy teacher.
Mere badness – as opposed to violation of law or canons of ethical behavior – falls under the category of 'professional judgment.' The professional judgment in question belongs to the hiring manager, typically a department chair. It does not belong to you.
To the extent that you can inform that judgment with relevant and verifiable facts, presented calmly, that may or may not be worthwhile, depending on personalities and local culture. But if you go on a crusade to get her fired, you will be perceived – rightly or wrongly – as part of the problem.
One of the really frustrating lessons I've had to learn in administration is that you never want to get into a point-by-point argument with a crazy person. They don't fight fair, and you'll get dragged into their mud. The way to handle them is to take the high road, stick to facts, and to trust that, over time, their nuttiness will discredit them. (If the entire culture of the college is nutty, you're probably best off finding another place to work.)
If you fight a gossip-monger by gossip-mongering, it's hard to imagine a positive outcome. At best, maybe you battle to a draw, reducing your own credibility to her level. At worst, you lose, since she has more practice at that game. Don't do it.
And that's without even addressing the cost of the battle in terms of both time and emotional energy. Both are finite, and both could be better spent doing almost anything else. You'd be much better off focusing on things you can actually control. For example, if it helps you sleep at night, you could go to the chair to deny her allegations, and merely take a classy and conspicuous silence regarding her. I'm not usually a fan of this approach – it's hard to defend yourself without sounding defensive – but it can make sense in some cases. Then learn what you can from the experience and move on.
(For the record, my advice would be very different if she were doing something clearly illegal or immoral. In those cases, I see an ethical obligation to blow the whistle. But that's not this case.)
If it's any consolation, it's possible to learn from lousy mentors and bosses, just as it's possible to learn from good ones. Reflect carefully on all that you've seen and experienced. To the extent that you can translate visceral responses into conscious ideas, you may be able to make yourself a more effective instructor.
Good luck. This isn't a pretty situation, but it doesn't have to get any uglier.
Wise and worldly readers – any thoughts on this one?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Wednesday, January 14, 2009
When a Workplace Skips a Generation
There's a fairly wide, if shallow, literature out there on different generations in the workplace. It's often fun to read, if of limited usefulness. This week I finally realized what's missing.
What are the effects when a workplace skips a generation?
In the community college world, this seems distressingly common. In discussion with a contact at another cc this week, where the same phenomenon holds, I realized that the reasons for the gap are more complicated than I had initially thought.
At its core, of course, is the huge burst of hiring in the late sixties and early seventies, followed by decades of severely restricted hiring. Combine that with a tenure system, salaries based on seniority, pension benefits based on seniority, and a relative lack of alternatives for many employees after a given number of years, and you get serious stasis.
Perversely, the explosion of antidiscrimination law and litigation made it even worse. In order to avoid litigation, many public (and presumably some private) institutions started hiring according to pretty rigid 'point' systems. Applicants get so many points for degree level, years of experience, and the like, with interviews going to the top point-getters. When experience is valued linearly – when the marginal point value between years fifteen and twenty is the same as the marginal point value between years zero and five – this amounts to a perfectly legal form of reverse age discrimination. My contact mentioned that her college actually did some hiring about ten years ago, but nearly everybody hired at that point was in their late forties or older. They scored the most points.
(My proposal for valuing experience: value the first few years quite a bit, then apply the law of diminishing returns. This is consistent with peer-reviewed studies of the effects of experience on effectiveness.)
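For the curious, here's a quick sketch of the difference between the two scoring rules. The weights are invented for illustration; they're not anybody's actual HR rubric:

```python
import math

# Two ways of scoring experience in a hiring 'point' system.
# The weights below are invented for illustration only.

def linear_points(years, per_year=2.0):
    """The status quo: every year of experience is worth the same."""
    return per_year * years

def diminishing_points(years, scale=10.0):
    """The alternative: value the first few years heavily, then let
    each additional year's marginal value shrink."""
    return scale * math.log1p(years)  # log(1 + years)

for years in (2, 5, 10, 15, 20, 30):
    print(f"{years:>2} yrs  linear: {linear_points(years):5.1f}   "
          f"diminishing: {diminishing_points(years):5.1f}")
```

Under the linear rule, the thirty-year candidate laps the field on points alone; under the diminishing rule, years fifteen through twenty add almost nothing, so recent skills and fit can carry more of the decision.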
In the last three years – ending quite abruptly this Fall – there was a hiring boomlet, with a cohort evenly divided between the fiftysomethings and the twentysomethings. Now with the latest freeze, I don't expect another cohort to come in for several years at least.
So on both campuses, there's a huge cohort of fifty-and-up, and a small cohort in its twenties. The thirty- and forty-somethings are rare birds. Thirty- and forty-somethings with children are even rarer.
I've heard of similar patterns happening in certain boom-and-bust industries, like energy. When hiring happens in bursts, with long troughs in between, it's easy for a generation to get skipped.
The effects strike me as generally negative. (And that's without even addressing the issue of fairness to the cohort for whom opportunities were few and far between.)
For one, it wreaks havoc with any serious effort at succession planning. When everybody in a department is counting days until retirement, except the one new 26-year-old hire who is focused solely on getting tenure, it's easy to see a leadership crisis in the offing. Even hiring from the outside is tough when the generation skipped was skipped by an entire industry.
It also seems to have negative effects on employee retention. Among my friends from high school, college, and grad school who went on to get doctorates – we're into double digits here – I can only think of two who are still full-time professors. One is looking into administration, and the other is looking actively for an industry job. (Several moved into industry, a few into administration, and the rest just sort of fell off the planet.) At my cc, the retention rate for the few folks of my generation is conspicuously lower than for the group before it and the group after it. The leadership of the campus has noticed it, but doesn't seem to have any serious idea what to do about it. And now that the conversation has shifted from 'hires' to 'layoffs,' this isn't exactly a burning issue. If anything, this group will be among the first to get laid off, sacrificed yet again to the bitch goddess of seniority.
The lack of a peer group makes that sense of 'belonging' much harder to sustain, especially when the huge and immovable group above you has so much history. (I'm told to expect the pace of retirements to slow even more, now that the returns on retirement accounts have turned negative. Swell.) I've even become a sort of unofficial translator for some of my colleagues, making the utterances of two groups forty years apart from each other mutually intelligible. (True example: I had to explain at a recent meeting that current 19 year olds regard email as obsolete. That elicited audible groans from some of the senior folk, who still insist on receiving anything official as hardcopy.) I have literally been stopped in the hallway by senior faculty, asking me to translate something a student said. Nobody in their departments is young enough to do it, so it goes to me by default. Being the Ambassador from Mars doesn't do much to solidify that sense of identification.
Hiring to fill in the gaps is explicitly illegal, given the assumptions embedded in the age discrimination laws. That doesn't help, either.
Wise and worldly readers in similar demographic blind spots – have you found ways of dealing with this? Has your college? I know this isn't a crisis, yet, but I can see one coming down the pike. And the loneliness can get a little wearing.
Tuesday, January 13, 2009
Ask the Administrator: Perceptions of Online Graduate Degrees
A new correspondent writes:
I am two years into the four year tenure process at my community college after having been a lowly lecturer at a research institution for five years. The seven years I have been teaching professionally has taken its toll on my dissertation. Well, on both my dissertations...my original lit theory one (2 chapters) and the composition pedagogy one (1 1/2 chapters) I devised after having taught composition exclusively for two years. In any event, both dissertations are defunct, as is my time extension to complete the dissertation process. While I might be able to revive my prior degree with more begging and promises, I cannot bring myself to do that again. Besides, to salvage some sort of professional ethos, I finally came to a point there a few years ago where I had to tell myself, "Not completing is ok. I am a teacher, not a researcher."
Which is, of course, not completely the truth. At 40, I am finally settling into some financial security, a marriage, home ownership, a job with which I could retire, and fatherhood, and I have (mostly) given up video games. I am beginning to think about what else I need to do. I have never actually stopped thinking of writing or researching and I usually present at one or two conferences a year. A part of me still longs to produce interesting and vital scholarship and to have the letters behind my name that give that scholarship more scholarly heft. And, some day, I think it might be nice to teach a grad seminar or two at a local university.
Though I am dubious of the academic value of the online classes I teach, I am considering pursuing a new doctorate in an online setting. In particular, I am thinking about going for an EdD or for a cross-disciplinary degree in human organizations, because on a daily basis I grow more interested in how we at CCs deal with students who are at pre-college levels. Because of my schedule, though (and my need to keep working to pay off my prior college loans), I cannot see any way to go for a degree except online. My questions for you, then, are as follows: As an administrator at a CC, what would you think of one of your faculty members pursuing an advanced degree online? As an academic, what do you think is the tenor of the academy regarding online graduate work? And as someone who is, presumably, familiar with the tensions between community college and university faculties, how do you think a community college instructor who gets doctored up online might be assessed by a university hiring committee?
Ah, the joys of turning 40. Preach it, brother. Sandra Tsing Loh claims that 40 is when the wheels fall off. I'm experiencing it as the age when you discover that nobody really knows very much, yourself absolutely included. But that's another post.
The real question was about graduate online degree programs, and how they're perceived.
First and most obviously, not every online program is the same (just like not every traditional program is the same). Some are more respected than others. Having said that, what I've observed has been that online degrees work fine for certain kinds of administrative jobs, but are still usually looked upon askance by faculty.
That may seem paradoxical, but it isn't if you look at administrative jobs outside the academic line of department chair – dean – provost. For example, in the student services areas (admissions, counseling, financial aid, etc.), I've seen people do a great deal with online degrees. In those areas, what really gets you ahead is actual job performance; the degree is usually seen as getting your hand stamped. As long as the stamper is properly accredited, all is well.
With your background, I could imagine you doing very well running tutoring services, and making your way up from there. You'd be dealing with the very students who have captured your heart, and would be staying close to the academic mission, but you wouldn't have to worry about impressing anybody with a highfalutin degree or a dense publication dossier.
That said, you'd probably need to stop thinking in terms of 'how do I break into the university world,' and start thinking about whether you could be happy staying in the community college (or lower-tier four-year public college) world. The Harvards of the world maintain their prestige through rigorous inbreeding, often to predictable effect. Jumping strata like that is unusual and uniquely difficult. It can be done, but I wouldn't base a life plan on it.
At some level, it's about figuring out what matters to you. In my own case, despite having done time in some of the snootier corners of academe, I've decided that clarity of mission trumps prestige. Community colleges have a clarity of mission, and a genuine public purpose, that I find appealing. I feel like I have something real to contribute here. In the snootier corners, there are plenty of folk more adept at certain kinds of gamesmanship than I, and I'm okay with that.
In other words, if you see the online degree as a way back into the ivory tower, I'd be skeptical. But if you see it as a way to apply what you've learned through teaching, to help students who really need it, and to feed your kids and pay your mortgage, go for it.
Good luck!
Wise and worldly readers – what do you think? How are online graduate degrees received where you work?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Monday, January 12, 2009
The Bright Side of Economic Freefall
As the soufflé of an overleveraged economy collapses, it’s easy to focus on the negative.
I spent most of December writing variations on “now maybe we can finally start to rethink how we do business” posts, so I won’t rehash that. I’ll just note that it would be criminal to let a perfectly good crisis go to waste, and leave it at that.
Instead, this one’s about psychology.
With faculty (and administrative) searches being cancelled left and right, I’m thinking that this dropoff might be the final nail in the coffin of the idea of ‘meritocracy.’ Simply put, the searches being cancelled now are no reflection of the quality of candidates, any more than the boom market of the sixties was a reflection of the quality of candidates then. The market is dramatically tougher now than it was just two or three years ago; to suggest that the candidate pool worsened in that time by several orders of magnitude is simply silly. The disconnect between ‘candidate quality’ and ‘market quality’ is so dramatic, at this point, that the ‘merit’ narrative is simply unsustainable. And that may actually be a good thing.
Although it certainly beats some of the historical alternatives, the ‘merit’ narrative strikes me as not just false, but actually damaging.
First, and most obviously, it suggests that the folks who don't land the positions they want are somehow damaged goods. Some of them may be, but in this market, they can't all be. I can't help but wonder to what degree the otherwise-puzzling persistence of long-term adjuncts who just keep on plugging, looking for the big break, is driven by a felt need to redeem themselves in this value system. It's not economically rational, but there must be something, or there wouldn't be so many people doing it. To the extent that we can start to distinguish 'pay' from 'worth,' maybe some people will finally feel like it's okay to try something else.
Second, the effects on the 'winners' are ambiguous, at best. Some seem to internalize the ranking, resulting in a career wasted in bitterness that they aren't ensconced at some higher-level institution. Those who do manage to land at the really prestigious places seem to fall prey to 'impostor syndrome' at fairly high levels. In either case, the self-doubt is both corrosive and unnecessary. Even the winners lose.
Over the break, I had the chance to read Malcolm Gladwell's Outliers, which is great fun. Among other things, the book spends some time on the idea of a 'threshold' of ability. As he tells it, between the limits of knowledge and the realities of life, it's often silly to pretend that minute differences in ability are actually meaningful. For many tasks -- including very high-end ones -- there's basically a 'threshold' of ability. Either you can do it, or you can't. Among those who can, single-measure differences are essentially arbitrary.
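To make the threshold idea concrete, here's a minimal simulation sketch (my own illustration, not Gladwell's; every number in it is an assumption): if the real ability differences among people who clear the bar are smaller than the noise in any single evaluation, then the rank order a search committee produces is mostly luck.

```python
import random

random.seed(0)  # reproducible illustration

N_CANDIDATES = 100
THRESHOLD = 0.7           # assumed bar: everyone above it can do the job
ABILITY_SPREAD = 0.05     # assumed true differences among the qualified
MEASUREMENT_NOISE = 0.15  # assumed noise in any single evaluation

# True abilities of candidates who already cleared the bar:
# tightly clustered just above the threshold.
true_ability = [THRESHOLD + random.uniform(0, ABILITY_SPREAD)
                for _ in range(N_CANDIDATES)]

def search_committee_ranking(abilities, noise):
    """Rank candidates by one noisy observation each, best first."""
    scores = [(a + random.gauss(0, noise), i) for i, a in enumerate(abilities)]
    return [i for _, i in sorted(scores, reverse=True)]

# Two independent 'searches' over the identical pool.
first = search_committee_ranking(true_ability, MEASUREMENT_NOISE)
second = search_committee_ranking(true_ability, MEASUREMENT_NOISE)

overlap = len(set(first[:10]) & set(second[:10]))
print(f"Top-10 overlap between two searches of the same pool: {overlap}/10")
```

In typical runs the two committees' top tens share only a name or two; above the threshold, the fine-grained ranking measures the noise, not the person.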
Yet we academics persist in believing that the Great Chain of Prestige, starting at Harvard and working its way on down, is founded in basic truth. And because it's an objective reflection of merit, being anyplace other than the tippity-top must reflect a personal failing.
The ghosts of cancelled searches this year suggest otherwise. If some of us start to realize that, this round of panic will actually have done some good.
Friday, January 09, 2009
Ask the Administrator: The Book or the Grant?
A new correspondent writes:
I'm now three semesters into a permanent position at a lower-mid-range research institution with aspirations to become something better. Said institution has no strength in my specialty, but gives me an absurdly low teaching load (2-1), generously supports research travel, and is even located in a nice town. One can always find something to kvetch about, but I'm basically delighted.
That said, I'm curious to hear your take on my mixed feelings as I fill out a grant application. The powers that be are very anxious for faculty to apply for large, government-funded research grants (say, $80,000), because they bring money into the university. There's a full-time staff member whose only responsibility is to help faculty fill out the applications. My department has some dead wood in it, and as the bright young thing with a shiny "recently on the job market" publication record, I'm under a lot of pressure to fill out an application.
Well, I have a lot of trouble thinking of something to spend this kind of money on! My research is not collaborative, and anyway requires knowledge of languages that aren't widely spoken at my institution: I can't really hire research assistants. I don't want to buy out my teaching: I worked hard to get this job, take an interest in pedagogy, and, if anything, would like to teach a bit more. I can think of some pluses to winning a grant, of course. I'd enjoy the prestige, it would help my promotion prospects, I could hire a grad student to grade my first-year papers, and I could buy plane tickets for summer research trips (though this could also be done through less-competitive university travel grants). Yet there are opportunity costs to filling out the grant: new paperwork conventions to master, electronic forms to fill out, and the applications, I'm told, are only successful 5% of the time. All in all, I think I would rather be finishing my book.
What's your take on this? Am I being a lazy faculty member, grumbling about pulling his weight for the team, or am I a greenhorn sucker with no backbone who should learn to stand up for his own research priorities?
(In a subsequent email, he noted that his is a humanistic discipline.)
My first thought is, this is a good problem to have. I know people who would kill to have this problem.
Having said that, though, your reference to opportunity cost is spot-on. Time spent on halfhearted grant applications is time not spent doing other things, like finishing your book. So there is a choice to be made.
Given that you're in a humanistic discipline, successful large-scale grantsmanship is relatively rare. Yes, it would impress everybody if you were to pull in some major cash, but the reason it would impress them is that it rarely happens, especially in the early years of a career. It would be great, but it isn't necessary or expected.
(In the social sciences, I've noticed a distinct trend among granting agencies to favor quantitative approaches over qualitative ones. Over the decades, this has led to a catastrophic distortion of scholarship in untold ways. But that's somebody else's fight. And I'm told that in the natural sciences, books count for almost nothing and grantsmanship is far more important. Context matters.)
A book, on the other hand, is probably both necessary and expected. Teaching loads as light as yours almost always come with publication requirements, whether formal or informal, and you ignore those at your peril. Taking care of first things first – doing the necessary before the nice-to-have – will give you the freedom later to take those 5% shots. Missing the longshot now could put you in a badly disadvantaged position at your job, and there's no need for that.
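Just to put rough numbers on the opportunity-cost intuition -- the figures below are my own illustrative assumptions, not anything from the letter beyond the 5% success rate -- a longshot has to be discounted accordingly:

```python
# Back-of-envelope expected-value comparison. Every number here is an
# illustrative assumption; only the 5% success rate comes from the letter.
P_SUCCESS = 0.05            # stated odds of winning the grant
HOURS_PER_APPLICATION = 80  # assumed time to master the forms and write it
VALUE_OF_WINNING = 10_000   # assumed personal value: prestige, grader, travel

ev_per_hour = P_SUCCESS * VALUE_OF_WINNING / HOURS_PER_APPLICATION
print(f"Expected return per hour of grant writing: ${ev_per_hour:.2f}")
# If an hour spent on the book is worth more than that toward tenure
# and promotion, the book wins.
```

The exact dollar figure is arbitrary; the point is the comparison. At a 5% hit rate, even a large grant has to be weighed against what eighty-odd hours of book writing would buy.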
Better, a book under your belt will likely make you a stronger candidate for whatever grants you do eventually pursue, if any. The rich tend to get richer, so getting your hand stamped as a Recognized Scholar can only help. First things first.
Good luck!
Wise and worldly readers – how would you read this one?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Thursday, January 08, 2009
When Professors Vanish
This is one of the parts of the job for which you're never really trained.
Every so often, usually around this time of year, an adjunct who knows he's not coming back next semester simply vanishes. No grades turned in for the Fall class, no responses to emails or phone calls, just 'poof.' (I've never seen this happen with someone who had classes lined up for the following semester.) With no realistic prospect of continued employment, our short-term leverage for getting grades turned in is pretty weak. ("Give us the grades or...uh...just give us the grades!") Most people are professional enough that even if they don't like the pay, or the non-renewal, they still make the distinction between the students and the institution. But 'most' isn't 'all,' and the damage is real.
Most obviously, students are denied the timely credit they've earned. Sometimes that doesn't matter, as long as the credit eventually comes through, but sometimes they need it right away. Falling below the threshold of "successful academic progress" can have consequences for financial aid, academic probation/dismissal, and employer reimbursements, among other things. Delays in posting grades also hurt students who are sending out transfer applications, since deadlines are unforgiving and receiving schools generally assume that any glitch in the application reflects on the student. And it wreaks havoc with any course in a chain of prerequisites, since nobody knows who passed and who didn't.
In terms of getting back at the college, the extra work generated by late grades usually doesn't fall on the intended targets. The registrar's office does the on-time grades through 'batch' processing; anything late has to be done manually. Financial aid works much the same way. (Financial aid has it worse, since pots of money can be exhausted by the time the late grades are changed.) Neither office has anything to do with setting adjunct pay scales, but they're the ones that do the heavy lifting when this happens.
The post-finals 'poof' is the worst kind. Back at my first administrative gig, I saw adjuncts walk away mid-semester. That was bad, but at least at that point it was possible to address the students as a group, find a sub, and 'look for points.' (Whenever something along these lines happens, I've taken the position that we should do our best to hold the students as harmless as possible.) After final exams, though, there's no clean and painless way to address the students as a group. At that point, too, some fairly substantial components of the final grade are typically missing -- the final exam and/or final paper or project -- so it's tough to assign any sort of reasonable value to what they can show.
I don't make any grand claims for the kind of teacher I was, but I can honestly say that doing something like this never occurred to me. Even in the worst adjunct gigs, when I soured on entire institutions, I never left the students hanging. So when someone did, I was initially dumbstruck. It was so far past reasonable that I couldn't even piece together a coherent response.
(Before the flaming, I'll just stipulate that I'm not talking about an organized work stoppage. I consider that a different issue. This is action by a single person.)
If we get through January without this happening, I'll consider it a great start to the year.
Wednesday, January 07, 2009
Ask the Administrator: Union Work as CV Stain?
A slightly nervous correspondent writes:
I'm a grad student in a humanities discipline at a public university, and I'm set to graduate with my MA in Spring 09. I had planned to graduate Fall 08, but it didn't work out that way. My ambition is a tenure-track position at one of the many fine community colleges in this area. I came up from community college myself, and I am a true believer in the CC mission. But because my degree won't be posted until May, I am stuck for a job until Summer 09 at the earliest, and I need some money coming in. I worked the last three semesters as a TA, and as a TA, I was represented by the UAW. I've been moderately active with the union, and now they've offered me a job as an organizer. The money is better than I'd get working at private ESL or test-prep schools, which seem to be pretty much my only other options right now, and I'd like to help get exploited grad students into the union. Here's the thing: my dad was a union organizer (non-academic), and he suffered some pretty serious retribution in his workplace, including being denied promotions and advancement opportunities. I wouldn't be handling grievances or anything like that, just getting people signed up, running elections, etc., but I'm concerned that if I get labeled as an activist or an organizer, I won't be able to get a tenure-track position. So I guess my questions are these: Are my fears founded? If so, would I have to reveal my union work during the interview process? If I don't reveal this work and I am hired, would that come back to bite me? I feel like I don't have enough information to make this decision. All advice is appreciated, and I'd like to remain anonymous.
My first thought is that both regional and local variables will come into play.
Where I am now, I can't imagine union work being held against you. If anything, in some departments, it might help you get past the department's search committee. But I also know that there are parts of the country in which labor activism would raise eyebrows, if not hackles. (I'm not entirely sure what a 'hackle' is, but I know it's not supposed to be raised.)
That said, there are also dramatic variations among institutions in the same region, and even among administrators within a single institution. The blue state/red state divide may give you a pretty good sense of the aggregate, but in any given case, it's not much help. I'd venture a guess that it would be most toxic at colleges that were battling unionization drives at the time. (Many years ago, when trying to escape Proprietary U, I had an interview for a deanship at a small private university. When I asked another dean there whether the faculty were unionized, he responded "not yet." That spoke volumes.) Oddly enough, I've noticed that colleges without unions get all worked up about them, but colleges with unions tend to accept them as facts of life. Having managed in a collective-bargaining environment for some time now, I can attest that contracts bind both sides, and that once you figure that out, a lot of the fear goes away.
In terms of what you reveal, you're free to leave things off the initial cv, but at most public institutions, there's also a standard 'job application' form that every applicant has to submit that includes questions about your last several jobs, in chronological order. Failure to disclose something on that, if it were found out, would be grounds for revocation of an offer, or for termination if it were discovered later. ('Failure to disclose' comes in handy when you find out that someone neglected to mention a criminal conviction. At that point, you don't have to establish direct relevance, or even evasive intent; all you have to do is show failure to disclose.) If you take the job, I think you'd have to be willing to disclose it and take the risks that come with that.
On a different note, your predicament calls to mind a persistent and terrible structural flaw in many graduate programs: the funding runs out before you're a viable candidate anyplace else. I've never fully understood why that happens with such frequency, but it does. Perhaps my colleagues at graduate programs could enlighten us.
In any event, best of luck on your search.
Wise and worldly readers -- have you seen any effect from disclosing union activism in your job searches? Alternately, from the hiring side, have you seen the issue arise? How did it play out?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Tuesday, January 06, 2009
The Uses of Students
A couple of weeks ago I had a conversation with a contact at a respected private university. We discussed the different effects the recession is having at the cc level, as opposed to the private university level, and compared notes on enrollment trends. Then she mentioned something that explained a lot.
She said that while cc grads who transfer to her university do just as well academically as native students, they don't donate as much back to the university as alums. They only spent two years there, instead of four, so they don't feel the same level of attachment. The university knows that, so it puts a pretty tight lid on transfer admissions. It admits a few students to fill out the numbers in some upper-level courses, but that's it. It doesn't want to jeopardize the future funding stream from donations.
I have to admit, I hadn't thought of that.
In the cc world, we don't really talk about the usefulness of different types of students. (Since we have open-door admissions, there wouldn't be much point anyway.) That's not because we don't need contributions from alums -- heaven knows we do -- but it just isn't consistent with our mission. We take everybody, whether they're good risks for future philanthropy or not.
But it did help to explain the weirdly bifurcated responses we usually get when we try to send students to certain private universities.
The lower-tier ones are almost entirely tuition-driven, and our grads pay tuition like anybody else, so they welcome our folk with open arms. The very elite places only take small numbers, but they only take small numbers of anybody, so we're not singled out. But certain mid-level schools will take just a few students -- often taking every credit they transfer -- and then shut the door quickly.
It would be tempting to take the moral high ground here and decry certain places for using students as cash cows, but honesty compels me to admit that we often use student success stories in both our advertising and our philanthropic appeals. I don't have a problem with that; a college that doesn't have success stories to show probably has some tough questions to answer. Of course, we also use student tuition to pay bills, and strong enrollment numbers help us make the case to legislators (when state budgets allow) to improve our funding. So yes, we use students to keep the place running; after all, if we didn't have students, I'd be hard-pressed to justify our existence. It's just that, at this level, we don't connect 'admissions' to 'development.' Other places do.
I'm not sure what to do with this information, beyond sharing it with everyone in internet land. Should we start coaching our sophomores to talk about their future philanthropic prospects? Maybe use this as an angle to pursue "joint admissions" programs with some of the local schools, to get students identifying with them early?
Oddly enough, the two corners of higher ed with reliably 'need-blind' admissions are the super-elites, who can afford anything, and community colleges. The folks in between see prospective students not only as tuition payers, but as future donors, and judge them accordingly.
Wise and worldly readers, I need your help. Is there a way to use this information to help cc grads transfer more successfully?
Monday, January 05, 2009
Back in the Saddle Again
Christmas break was glorious, if a bit too short. After about a week of waking up at a civilized hour, my brain started to snap back to its original shape and I started to feel human again. (Idea for the betterment of human civilization: move the start of the workday to, oh, ten-ish. You're welcome.) The kids were astonishingly well-behaved, TW made the house look fantastic, and we did lots of family time. Even the blogging break was welcome, as the idea well was running dangerously low. (And thanks to everyone for their gracious holiday emails!)
We skipped the lutefisk again this year, to everybody's secret relief. Some traditions live because they continue to answer a felt need. Some live because they're just too cute, or too easy, or too evocative of memory to discard. And some live out of sheer perversity. Lutefisk is that last kind. If you haven't had the pleasure, just imagine that gelatinous stuff that surrounds the slab of spam in the can, injected with a vague fish-y flavor. Better yet, don't.
As part of our “Okay, we live in the Northeast, we'll deal with it” resolution, we've become much more focused on cold-weather outdoor activities. We did several sledding outings, since there's a fairly impressive hill nearby. An etiquette tip from Dean Dad: watch your language as you hurtle to your certain doom. It occurred to me a bit late, as I careened down a tree-lined ravine without benefit of a steering mechanism, that I didn't want my last word on this earth to be “fuuuuuuuuuuck!” Luckily, our cold-weather bundling of the kids – start with a layer, add another, then another, and continue until they're pretty much immobilized – works both thermally and acoustically. I think.
TW also organized a successful ice-skating outing in a nearby hockey rink. I hadn't been to a hockey rink since probably the 1980's, when my Dad took me to some minor-league hockey games in Northern Town. The atmosphere hasn't changed at all. There's something comforting and familiar about cold concrete, echo-y acoustics, and the sight of children pointing and laughing at the Zamboni. The men's room didn't have the trough – don't ask – but otherwise, it was a dead ringer. I could almost smell the stale beer, and hear the florid, alcohol-fueled cursing of the crowd.
(As I remember it, minor-league hockey has a different feel from minor-league baseball. Minor-league baseball has a shambolic grace to it. Half the crowd isn't really there for the game, and the team owners know it, so they go out of their way to provide entertainment between half-innings. Children are everywhere, and the stands are full of families.
Minor-league hockey, in my experience, has more of a 'blood sport' feel to it. It's much more blue-collar, with fewer women and children, and a much higher profanity-per-sentence average. (Sample hockey cheer: “*&%#@*$%!”) I don't know quite how Canadian culture, which is otherwise so polite, gave rise to a sport that falls somewhere between 'bar brawl' and 'rugby, plus sticks,' but such are the mysteries of sport. To get the flavor of minor-league hockey at home, just walk into a closed, unheated garage in January, drink cheap beer from a Big Gulp cup, shout the worst obscenities you can for a couple of hours, and urinate against the wall. For extra credit, crash into something. Good times.)
We even watched some movies, which almost never happens during the year. On the highbrow side, we saw Milk, which I strongly recommend. The period detail in that movie was amazing, and Sean Penn managed to vanish into a character very different from the ones he usually plays. On the lowbrow side, we saw Hamlet 2, Step Brothers, and Get Smart. Get Smart doesn't really have much going for it, other than Anne Hathaway. Hamlet 2 was a little disappointing. I was hoping for some shockingly profane brilliance, like in the South Park movie, but most of it was a little flat. But Step Brothers was far better than it had any right to be. I was particularly taken by Kathryn Hahn, as the wife of the annoying younger brother. She combined a refined manner with a go-for-broke comic determination; imagine Laura Linney channeling Chris Farley. Not an easy mix, but she made it work. The scene in which she accosts John C. Reilly in the men's room is brilliant, bizarre, and laugh-out-loud funny. Someone give this woman a lead role!
Finally, of course, was the dreaded roadtrip. Stuffing four people, three days' worth of luggage, and an extended family's pile of Christmas presents into a small car, then hitting the road for several hours, is not for the faint of heart. It was worthwhile, though if you had asked me at about the three-hour mark on the way back, I might not have agreed. Handheld video games are what got us through, just like in pioneer times. (“I blasteth thee, vile creature!”) When the kids are distracted, all things are possible.
Here's hoping the state budget caught a bit of that Christmas spirit...