Tuesday, January 31, 2012


College in High Schools

This one is really looking for advice from those among my wise and worldly readers who’ve found reasonably elegant ways to handle a particular situation.

Like many community colleges, mine offers some credit-bearing courses onsite in some local high schools that are just far enough away that it would be difficult for the students to commute.  In some cases, we’re just renting space in the high school and teaching at night.  Those cases are relatively straightforward; we pay a room fee and otherwise do what we would normally do.

But sometimes the school district wants a 100-level class offered to its students, on its premises, during its normal school day.  The logic, which makes sense to me, is that rather than simulating college with an AP or IB class, why not just teach the real thing?  Transcripted credits often do better in transfer than do, say, AP scores, which many colleges accept for placement but not credit.  Even better, when they bring in a real college professor, they bring in college level expectations for the students.  And the choices tend to be greater, since we offer classes in subjects for which AP tests don’t exist (as far as I know).

We’ve run into some logistical issues, though, and this is where I’m hoping some folks have found elegant solutions.

We knew, going in, that the semesters didn’t align cleanly.  (For example, our classes end in mid-May, but theirs run well into June.)  That’s an issue, but hardly a deal-breaker.  High schools also generally prefer to run classes five days a week in bite-size chunks of time; again, not our preferred method -- especially from a staffing perspective -- but not a surprise, either.

Textbooks take some diplomacy.  Students (and parents) in high school are accustomed to getting their books for free.  Colleges are accustomed to referring students to bookstores to buy their own.  When you’re running a college course in a high school, you need to address the book purchasing issue upfront.  Will the district pay, or will the students?  Do they have to go somewhere, will they be provided in class, or can they order online?

What we didn’t anticipate as much as we should have was the issue of placement tests.  Many of our 100-level courses require eligibility for English 101 -- that is to say, the ability to place out of developmental English.  A surprising number of the high school seniors who are motivated enough to sign up for college courses don’t clear that hurdle.  I say “surprising” in part because of the merits, but in part because of the timing; if the prospective students don’t get their results until shortly before the course begins, and find themselves academically ineligible, then we can find ourselves in the awkward spot of having too few students to run the class.

I’ve floated the idea of just setting aside some seats in some online sections of classes we’re running anyway.  That way, I thought, we’d get around both the ‘travel’ issue and the minimum size issue.  If, say, six students out of twenty-five in a given Intro to Psych class are high school seniors, the class can run just fine.  I’d even argue that they’re getting a more authentic college experience, to the extent that their classmates are primarily 18 and older.

But that doesn’t always meet the needs of the high schools.  For reasons of their own, they need to have students in prescribed places at prescribed times, with someone who is paid to teach/supervise them.  Turning students loose for a while, with the expectation that they’ll eventually find their way to the course’s site, doesn’t meet the institution’s needs.

Finally, there’s the awkward fact that when high schools close, they close.  Colleges typically have admissions staff, registration staff, and the like available for probably fifty weeks a year.  That means that there’s nothing unusual about, say, administering placement tests in July and signing students up for classes in August.  That’s just not the case in many high schools, so even if we can align (or get around the non-alignment of) teaching schedules, all the support services frequently crash into each other.

I’m wondering if any of my wise and worldly readers have seen this kind of arrangement -- a college teaching college classes in a high school -- done smoothly.  What’s the trick?  Is there something we’re missing?

Monday, January 30, 2012



President Obama used the term “value” in outlining the criteria he’d use, if he had his druthers, in allocating Federal funding to higher ed.

First, the obligatory disclaimer: he doesn’t get his druthers very often.  He doesn’t quite seem to get that the Republicans have no intention of letting him succeed at pretty much anything.

That said, though, the idea of looking at “value” is suggestive.  

I’d like to look at the “value” of continued wealth polarization.  Let’s also look at the “value” of the highest incarceration rates in the developed world.  While we’re at it, let’s take a long, hard look at the “value” of wars of choice, the carried interest tax deduction, and HMO’s.  Higher education has its issues, but any objective barometer of “value” would suggest that there’s far more surplus value to be squeezed out of any -- let alone all -- of those than you’ll ever get out of educating people.  

That said, though, I suspect some sectors of higher ed should be more worried than others.

I’ve suggested before that there are really four categories of colleges.  Allowing for the obvious oversimplification, they are:

high prestige, high cost -- Harvard, MIT, Swarthmore

high prestige, low cost -- UC Berkeley, Michigan, UVA

low prestige, low cost -- community colleges, state colleges

low prestige, high cost -- innumerable little private colleges scattered hither and yon, for-profits

Even granting that these are very broad strokes -- I like to think that my own cc punches well above its weight in academic quality, and I have the transfer stats to prove it -- the first three categories strike me as passing the basic “value” test.  The fourth, not so much.  

A standard cynical response would argue that “hither and yon” could be rephrased as “in a host of Congressional districts,” and would suggest that localist political concerns would prevent any real harm from befalling the college in so-and-so’s district.  There’s some truth to that, but it neglects the fact that most Federal financial aid is channeled through students, rather than sent directly to institutions.  While that has predictable and sometimes pernicious effects, it does mean that a blanket change to Federal policy can happen without singling out any particular place.  Lowering the cap on federal student loans, for instance, would hit any college or university that charges above the cap; students would have to decide whether the particular college was worth it.

Over time, I expect that the most viable survival strategy for the endangered sector is to start playing a different game.  Instead of offering an undistinguished bachelor’s degree for a premium price, start experimenting with different kinds of credentials.  (I’m intrigued by the “badges” craze, for instance, and I suspect that it’s just the beginning.)  If I were high up at a struggling private college, I’d seriously consider starting some conversations with the Feds about alternative credentialing and financial aid.  Yes, it’s a break from tradition, but if sticking with tradition involves a fast slide into oblivion, the argument for experimentation is easier to make.  Add value where you can, even if it isn’t where you initially had in mind.

With cost becoming an ever-more-serious consideration, the only way I see that fourth sector surviving is by changing its value proposition.  That may not be the intent of the “reforms,” but I’ll take a happy unintended consequence.  If the veiled threat of oblivion spurs innovation in one sector, and that innovation spreads as it catches on, so much the better.  It wouldn’t be the first time that a problematic idea had a happy outcome.

In the meantime, let’s talk about the “value” of those tax cuts for the one percent...

Sunday, January 29, 2012


Accidental Productivity

Work doesn’t always look like work.

Over the last week, I’ve had a couple of long, meandering conversations with professors as they’ve returned for the Spring.  One was completely spontaneous, and the other was a focused discussion that quickly and thoroughly overran its purpose.  They were the kinds of discussions that can only happen before the crush of classes gets fully under way -- deadlines aren’t looming yet, students aren’t hunting them yet, and everybody is still relatively well-rested.  It’s a brief window.

In the moment, they both pretty much felt like goofing off.  And I won’t deny that some of the discussion was basically shooting the breeze.  

But they both helped me understand some issues that a more purposeful inquiry wouldn’t have.

I won’t violate any confidences by spelling out the issues, so I’ll give a parallel.  For many, many years, I used to wonder why the Catholic church insists on teaching rules that it knows perfectly well don’t work, like the bans on contraception and homosexuality.  Like the Unitarian liberal that I basically am, I could rattle off the perverse consequences of those prohibitions, and have wondered with frustration why the church continued to stick to its guns even in the face of mountains of evidence.

Then, a few years ago, someone explained to me that failure is the point.  Failure brings guilt, which requires forgiveness and inspires a need for validation.  (“Don’t make the same mistakes I made...”)  The impossibility of actually following the rules was a feature, not a bug.  In that light, my uses of mountains of evidence were either beside the point or actually counterproductive.  

In both of the recent discussions, I came to understand that some issues on campus that struck me as obvious were only obvious if you took them literally.  If you instead took them as “statements” indicative of “identity,” then suddenly my well-reasoned and empirically accurate objections didn’t quite stick.  They weren’t “wrong,” any more than the evidence-based objections to banning birth control were “wrong,” but they answered a different question.  

Getting to that realization, though, had to happen indirectly.  People don’t always have fully developed explanations for why they think the things they do; sometimes you can only get at it indirectly and even accidentally.

That kind of planning for accidents -- making yourself accident-prone -- is hard to reduce to a schedule or a cost-benefit analysis.  Some conversations lead nowhere, and it’s certainly possible to veer off into rants, hobbyhorses, or Grand Unified Theories of Everything.  In practice, it’s often a fine line between “productive wandering” and “goofing off.”  These useful moments came surrounded by moments discussing Tim Tebow, the ethics of the honey badger (“Honey badger don’t care!  He takes what he wants!”), and what it must be like to work in the HR department on the Death Star.  

Yes, I’m a little nerdy.

Moments like these strike me as both necessary and difficult to encourage.  They happen when people have time, but are still physically around.  They require a level of personal comfort, and a willingness to put in a chunk of time without any specific agenda or hope of payoff.  They require enough slack in the system for people to be human.

As budgets tighten and accountability measures proliferate, I hope we’re able to keep enough slack in the system to allow smart people to have actual conversations without “action items” or “strategic goals” or “measurable outcomes.”  Sometimes, the most productive moments happen in the gaps, by accident, precisely because nobody’s trying.

Thursday, January 26, 2012


Friday Follow-ups

A few follow-ups to issues that came up this week:

In response to the “rejection” post earlier this week, one commenter suggested that ageism is rampant in faculty hiring.  S/he offered no particular evidence for the claim, but stated it as a sort of “everybody knows.”

As with many “everybody knows” claims, I haven’t seen evidence of it myself.  Over the past several years, my college has hired faculty in their twenties and faculty in their sixties, with many in between.  The processes for screening applicants are rigorous enough that it’s hard to imagine that kind of bias making itself felt very often.  At my previous college, the same was true.

That’s not to say that my own experience is universal, obviously, but it does make me wonder about the pre-emptive certainty of the claim, and the purpose served by asserting it.  Presumably, if the practice were ubiquitous, I would have seen it by now.  Instead, the charge gets thrown around, but without proof.  It’s hard to prove a negative, of course, but the charge feels a little like working the referee.  

If anything, I’d think the valid concern would go the other way.  With colleges adopting “tiered” benefits packages based on date of hire, newer employees will never get the level of benefits of their elders.  That sure smells like age discrimination to me...


In reference to yesterday’s post about cost (among other things), a commenter asked how I could assert ever-rising costs for colleges in the face of flat salaries for faculty.

That’s an easy one.  Costs include much more than salaries.

The elephant in the room for any discussion of labor costs is health insurance.  When the cost of employer-provided insurance goes up, then labor costs go up, even if salaries remain flat.  The employee might not feel it, but the employer absolutely does.  From the employer’s perspective, an increase in the cost of benefits is no different than a raise.  

This is why I pull out what little hair I still have whenever I read the New Faculty Majority’s advocacy of the “Vancouver model” for paying adjuncts.  Vancouver is in Canada.  In Canada, health insurance is not attached to employment.  If you don’t account for that, then you miss the point.  Establish single-payer health insurance in America, and we can get a handle on the adjunct compensation issues.  Until then, we use adjunct compensation to get a handle on health insurance.  


The IHE survey of provosts made for fascinating reading.  The two findings that jumped out at me were the Lake Wobegon effect and the provosts’ views of unions.

The Lake Wobegon effect -- “where every child is above average” -- applied both to academic standards and the effects of funding cuts.  Substantial numbers of provosts reported that grade inflation and declining academic performance are major issues, but only at other places.  And funding cuts will reduce quality on their campuses, even though they haven’t yet.

Many commenters took those responses as evidence of denial.  That may be, but there’s also a sense in which they have to give those answers.  A chief academic officer with a conscience would resign if s/he thought that academic standards had declined on his or her watch.  But if you don’t deploy the threat of decline in the future, it becomes hard to argue against further cuts.  So you adopt a seemingly contradictory view.

The responses on unions were portrayed as negative, though I read that as a function of the question asked.  If you assume that unions exist primarily to benefit their members, then the answers given necessarily follow.  That doesn’t necessarily imply that unions are objectionable; it just means that their first priority is their membership.  (The same argument holds about the claim that corporations exist primarily to make money.  Of course they do.  The relevant question is whether they accomplish a broader social good anyway.)  I don’t see the contradiction between saying that unions exist primarily to serve their members, and that they’re generally positive anyway.  Are they helpful, or self-interested?  Yes.  Just like corporations.  

Wednesday, January 25, 2012


On Notice?

President Obama has put higher education “on notice” that if we keep raising tuition, we’ll get our public funding cut.

To which I say, huh?

We’ve had our public funding cut already.  Since 2008, an uninterrupted series of cuts has been the direct cause of severe tuition increases for public higher ed.  If you want to stop the tuition increases, the first thing to do is to require the states to restore and then maintain realistic funding levels.  (When referring to a point in time, the usual term is a “maintenance of effort” requirement.  Otherwise, it can be set as a “grant in aid.”)  When the states have cut back, colleges have turned to the Feds through the indirect means of raising tuition, much of which is funded by Federal financial aid.

That would be the Federal financial aid that is coming under attack, by the way.  This summer the six-month post-graduation deferment on loan payments is supposed to go away, the interest rate on Federal student loans is supposed to double, and students who place into college through the “ability to benefit” rules won’t be eligible for aid at all.  Why on earth you’d want to start dunning new grads for loan payments in the midst of a recession is beyond me, but there it is.

So the “on notice” thing strikes me as a bit late.  

If they want to do something intelligent and effective about cost control, on the other hand, I have a few suggestions.  Plagiarize at will, folks.

First, don’t make the classic mistake of looking at tuition increases as percentages.  Loan payments are dollar amounts, so we should look at tuition increases in dollar amounts.  For example, let’s compare Rich Kid U with Eastern Podunk Community College.  RKU charges forty thousand a year, and EPCC charges four thousand a year.  RKU announces that it feels the pain of its graduates struggling with student loan debt, so it holds the tuition increase to three percent this year.  EPCC continues to reel from state cuts, so it raises tuition six percent.  In percentage terms, it looks like EPCC is the problem.

But it isn’t.  That three percent at RKU amounts to $1200.  The six percent at EPCC amounts to $240.  RKU is the problem.  EPCC is part of the solution.
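The comparison above can be checked with a few lines of Python; “RKU” and “EPCC” are the post’s own hypothetical schools, and the figures come straight from the text.

```python
# A minimal arithmetic sketch of the percent-vs-dollar point; the
# colleges and tuition figures are the post's hypotheticals.

def dollar_increase(tuition, pct):
    """Dollar amount of a percentage tuition increase."""
    return tuition * pct / 100

rku = dollar_increase(40_000, 3)   # 3% of $40,000
epcc = dollar_increase(4_000, 6)   # 6% of $4,000

print(f"RKU:  ${rku:,.0f}")   # $1,200
print(f"EPCC: ${epcc:,.0f}")  # $240
```

The smaller percentage is the larger dollar burden, which is exactly what a loan payment measures.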

Second, kill the credit hour dead.  I’ve addressed this many times before, so I’ll keep it short here.  If you define your product in units of time, then you will never -- by definition -- increase productivity.  Ever.  Without productivity gains, cost increases in excess of inflation aren’t just predictable; they’re mathematically inevitable.  (Economists call it “Baumol’s cost disease.”)
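Baumol’s point can be made concrete with a toy simulation.  The growth rates below are invented for illustration, not measured: wages grow 3% a year everywhere, productivity grows 2% a year in a factory, but 0% in a product defined in units of time.

```python
# A toy model of Baumol's cost disease on hypothetical numbers: when
# output is defined as time (the credit hour), productivity growth is
# zero by definition, so unit cost compounds at the full wage rate.

years = 30
wage_growth = 0.03
factory_productivity = 0.02
credit_hour_productivity = 0.00  # output == time, by definition

factory_cost = 1.0
credit_hour_cost = 1.0
for _ in range(years):
    factory_cost *= (1 + wage_growth) / (1 + factory_productivity)
    credit_hour_cost *= (1 + wage_growth) / (1 + credit_hour_productivity)

print(f"factory unit cost after {years} years:     {factory_cost:.2f}")      # ~1.34
print(f"credit-hour unit cost after {years} years: {credit_hour_cost:.2f}")  # ~2.43
```

Same wage growth, wildly different cost curves; the only difference is whether productivity gains are even possible.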

Third, resist the temptation to replace tuition with tithes.  Out in California, Mark Yudof has generated some sympathetic press by proposing that students pay after the fact, using a set percentage of their post-graduation salaries.  The humanitarian appeal is based on the fact that new grads usually don’t make much money, so using a percentage cuts some slack in the early years.

But the tithe model has serious flaws.  First, and most basically, it decouples the amount of aid received from the amount paid back.  Over time, that’ll start to smell like welfare, and we know what happens to welfare programs.  Second, it assumes -- falsely -- that everyone who gets loans actually graduates.  As annoying as loan payments are a decade down the line, they’re that much worse if you don’t even have a degree to show for them.  Finally, Americans’ credit card habits suggest that “buy now, pay later” leads to overspending now.  In the moment, the tithe model can resemble a blank check.  This does not bode well for cost control.
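The decoupling objection can be sketched numerically; every salary and aid figure below is invented for illustration, and interest is ignored for brevity.

```python
# Hypothetical numbers: two grads who received the same aid repay very
# different amounts under a percent-of-salary "tithe," while a flat
# schedule (interest ignored) tracks the aid actually received.

def tithe_payment(salary, rate_pct=5):
    """Yearly payment under a percent-of-salary 'tithe' model."""
    return salary * rate_pct / 100

def fixed_payment(balance, years=10):
    """Yearly payment on a flat repayment schedule, ignoring interest."""
    return balance / years

print(tithe_payment(30_000))   # modest first job: 1500.0 per year
print(tithe_payment(120_000))  # high earner, same aid received: 6000.0 per year
print(fixed_payment(30_000))   # flat schedule on $30,000 of aid: 3000.0 per year
```

Under the tithe, what you pay depends on what you earn, not on what you received -- the decoupling described above.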

The way to control costs is to cap aid upfront, and to require colleges that charge above that to make up the difference themselves whenever there’s demonstrated need.  If aid is capped at, say, ten thousand a year for a full-time student, then EPCC can do its six percent increase without much issue, but RKU has to ask itself some hard questions.  
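The capped-aid arithmetic looks like this; the $10,000 cap and the post-increase tuition figures are the hypotheticals from the surrounding paragraphs.

```python
# A sketch of the proposed upfront aid cap, using the post's
# hypothetical $10,000 cap and the RKU/EPCC figures after their
# respective increases.

AID_CAP = 10_000  # maximum aid per full-time student per year

def institutional_gap(tuition):
    """What the college itself must cover for a student with full need."""
    return max(0, tuition - AID_CAP)

print(institutional_gap(4_240))   # EPCC after its 6% increase -> 0
print(institutional_gap(41_200))  # RKU after its 3% increase -> 31200
```

EPCC’s increase stays comfortably under the cap; RKU has to find $31,200 per needy student, which is where the hard questions start.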

Finally, we need to make a decision as a polity.  Do we want to direct resources to students to use as free agents, accreting credits wherever and however they see fit?  Or do we want high-quality, low-cost institutions?  Because it’s one or the other.

The consumerist, “DIY U” model suggests funneling aid through students, and letting them choose what they want.  As a consequence, institutions lose their own funding.  They make up the difference through a combination of service cuts and cost increases.  

If we want high-quality, low-cost institutions, we have to subsidize them.  This is just a basic fact of life.  

For evidence, look at the for-profits.  They charge considerably more than the publics, and offer a good bit less.  That’s what the absence of a subsidy forces.  The ideologically-driven fantasy that the market will magically make everything better simply doesn’t work for public goods, like having an educated workforce and citizenry.  That’s because providers capture too little of the positive externalities those goods generate.

You want quality colleges at affordable tuition levels?  Me, too.  It’s not that hard.  Break the credit hour, abandon the fantasies of market-driven bounty and/or tithing, and commit to subsidies that actually meet the need.  Yes, there are some basic structural reforms within higher ed that would help -- longtime readers know a few of my favorites -- but first things first.  If you want affordability, you have to spread costs around.

What you absolutely do NOT do is move to a consumer model.  The last thirty years provide ample evidence of that.  I know this isn’t popular in the culture right now, but institutions matter.  Resorting to the quasi-libertarian fantasy of the almighty empowered consumer will result, inevitably, in systematically underfunded public goods.  There’s a philosophical position that says “and a good thing, too...” -- let’s have that debate.  But let’s have the debate that acknowledges the costs, rather than one presuming some sort of magical harmony of interest.  If you tell public colleges to act like businesses, don’t complain when they act like the colleges that actually are businesses.

Tuesday, January 24, 2012


Thoughts on Workforce Development

What does workforce development look like?

The question is becoming more important as the term is gaining political steam.  

Politicians like to offer workforce development as an answer to the recession.  The idea is that if the folks without jobs had the skills to get the jobs that are going unfilled, then everybody would win.  Which is true, as far as it goes.

But it’s dangerous to expect too much of the “train the unemployed” strategy.

Too narrow a focus on workforce development usually has the effect of neglecting the ‘transfer’ function of community colleges.  Yes, community colleges can provide short-term training, but they can also provide the first two years of a four-year degree.  For students who are concerned about their debt loads, this can be a very attractive option.  The second word in “community college” is “college,” which is easy to forget when the political discourse reduces community colleges to training centers.

Some of us like to think that the transfer role is, in fact, a form of workforce development.  When a student from a shaky background finds her footing at the community college and eventually transfers for the four-year degree -- and sometimes more than that -- she becomes eligible for jobs she never could have attained without the degree.  That’s a potent, if slow, form of workforce development.  Conceptually, there’s no reason that transfer couldn’t be considered a part of the workforce development role, but in practical terms it’s usually considered separately.  That’s a mistake.

Still, even if everyone agreed on a broad definition, we’d still face some pretty serious issues.

-- The catastrophic increases in incarceration rates since the 1980s have done more than just starve public higher education of resources, though they have certainly done that.  They’ve also generated a tremendous number of low-skilled adults with criminal records.  Many of the jobs that pay a decent wage aren’t open to people with criminal records, even if they’ve completed a training program.  While I certainly support a more discerning approach to criminal justice, it’s hard to know what to offer the folks who’ve already been snagged in the current system.  Training them for jobs they can’t get doesn’t strike me as the answer, though.

-- Picking market winners isn’t easy.  This year’s hot field is next year’s cold one; knowing in advance what will be hot is usually educated guesswork.  I’m fairly sure that neurosurgeons and Ph.D. computer scientists will still do well, but that’s not terribly helpful at this level.  The most predictable lower-level workforce needs are actually the skills we expect students to pick up in their general education courses: effective communication, the ability to see the big picture, enough quantitative skill to know when an answer doesn’t sound right.  Those skills are evergreens, and like evergreens, they take time to grow.  

-- Sheer numbers.  Yes, there are fields in which a few local employers need some people quickly, and the work lends itself to relatively fast training.  But most of the time, the first or second group through exhausts the available openings.  

-- Adult Basic Education.  In many cases, the workforce development that’s actually needed isn’t so much training on this machine or that process, but instruction in English for Speakers of Other Languages or Adult Literacy.  This kind of instruction is usually separate from a college’s developmental track, since it isn’t necessarily geared towards getting the students into a degree program.  Unfortunately, ABE programs are often run on a shoestring, and are even more precarious economically than community colleges are.  If we really want to reach some of the hardest-to-place people, let’s start with the basics.

My free advice to any politicians reading this is simple: don’t let the fantasy of the simple, classic version of workforce development overshadow the big picture.  If you want to improve the prospects of the local workforce, start with adult basic education, add short-term training programs, and beef up the classic academic offerings at community colleges for transfer.  (While you’re at it, you might want to think about all that incarceration...)  If you want the full range of jobs, you need the full range of preparation.  Otherwise, you’ll just keep cycling people through training programs every few years, every time the economic winds shift.

And if you can come up with something practical for folks with criminal records, all the better.

Monday, January 23, 2012



This piece in the Chronicle occasioned quite a few comments, and for good reason.  Non-superstar academics under the age of about 60 typically have plenty of good (and bad) rejection stories.  This post is an attempt to look at rejection from the other side.

(That’s not to deny that I’ve had my own fair share of rejections, with varying levels of grace.  But those stories are legion on the blogosphere; I’m hoping to contribute some clarity as to why they sometimes happen the way they do.)

As many places have, my college has purchased and used an online applicant screening system for the first cut.  Applicants enter some basic information, and have to certify (or not) that they meet each of the minimum requirements stated in the posting.  They also have the chance to self-identify as a member of one of the specified underrepresented groups, if applicable.  

Applicants who don’t have the minima are immediately disqualified.  If the ad says “Master’s degree or higher in xxx discipline,” and you have a bachelor’s, you’re out.  Those notices are prewritten templates, and they’re quite impersonal.  

At the next level, applicants who seem to meet the minima have their applications read by the search committee.  Depending on the position, the level of selectivity at this point can vary tremendously.  For faculty positions in, say, Nursing, the struggle is just to find a decently sized pool.  For faculty positions in the evergreen disciplines, it’s much more about winnowing the pile down.  The task at this point is to decide who to invite to campus for a first interview.

The role of affirmative action, at this point, is to ensure that any members of underrepresented groups who meet the minimum qualifications get first-round interviews.  (That requirement gets refined when the numbers become unwieldy; we’re not doing forty first-round interviews for anything.)  Those who don’t meet the minima don’t get interviewed.

Applicants whose packages get read, but who don’t get interviews, get impersonal rejections.  That’s largely a function of time.  For a typical faculty search, we’ll have 80 to 100 applications, of which probably 50 meet the first-level screen.  (For English and certain humanities fields, double those.)  There’s no reasonable way to craft personalized rejections for fifty different people.  And it’s not at all clear, at this point, what the incentive to do that would be.  

I know I’ll get flamed for this, but it’s the damn truth: there are people in this world who will attempt to use litigation as a weapon.  A generic rejection offers no ammunition; a personalized one fairly screams to be used in court.  It’s much safer to say something like “we received many excellent applications...best of luck in your future endeavors” than to say something like “other people had more relevant experience than you.”  

The game changes slightly with the applicants who get interviewed.  First-round interviews usually involve eight to ten candidates, of whom three or four will make it to the second round.  For faculty, the first round is where teaching demonstrations happen.  At this point, rejections can be slower to come, since some candidates will back out at the last minute and you want to have a full slate of choices.  In essence, some candidates are forwarded, and others are held in reserve as fallback options.  At this stage, fast rejections only happen when someone’s interview led to a “hell, no” response.  Which sometimes happens.

Second round interviews involve higher administrators, including the hiring manager and the chair of the first-round committee.  (For faculty, they include the VPAA and the relevant dean; for staff, they’ll typically include the relevant VP and director or dean.)  This is the decision-making stage, and it can involve its own unique set of delays.  Scheduling a new round takes a little while, as does reference checking.  There’s also the wild card of the time the first-choice candidate takes to think it over.  That can range anywhere from “I’ll take it!” to “can I have a couple of weeks?”  If the first-choice person winds up turning it down, which happens from time to time, then we start again with the second-choice candidate.  Repeat as necessary.

At least here, candidates who make it to the last round can expect a phone call one way or the other.  But even there, the calls are -- and have to be -- pretty terse.  The more said, the more sued, and that’s not because the truth is nefarious; it’s because some people are willing to use whatever is at hand to get what they want.  

In a perfect world, it would be lovely to have the option of candor with rejected candidates, at least when they ask for it.  (“Can you tell me what I could have done differently?”)  But even if that were possible, most of them would find it pretty unhelpful.  With exceptions, second round interviews aren’t usually decided by glaring mistakes.  They really come down to casting.  Given the folks we already have, who would add the most?  Inevitably, some of that comes down to professional judgment.  There’s no way it can’t.  Hearing something like “you didn’t do anything wrong; someone else just had wider range than you” isn’t terribly helpful, even though it’s often true.

Some methods of rejection are certainly worse than others, but at some level, there’s no way to make rejection not suck.  With many qualified applicants for each position, it has to happen.  And with the legal climate we have today, meaningful candor from the institution isn’t going to happen.  That leaves boilerplate.  I don’t like it either, but it’s a rational response to the incentives that actually exist.  The best I can offer is that none of it is designed to be offensive or demeaning, even when it feels that way.  And when it comes down to it, it isn’t about you.  It’s about the institution.  Best to read it accordingly.  

Sunday, January 22, 2012


"Middle Skills"

If you know what “middle skills” are, you’re nearly as nerdy as I am.  They’re the hot new thing in discussions of both economic development and community colleges.

Broadly speaking, “middle skills” refer to workers who have some formal post-high school education, but who don’t have a four-year degree.  That could mean a two-year degree and/or some sort of certificate.  Jobs that require specific sets of “middle skills” include all manner of technicians, allied health positions, certain kinds of office work, and even -- in some jurisdictions -- law enforcement.  

This New York Times story about why the iPhone isn’t built in America (even though it was designed here) takes the popular position that the deal-breaker for Apple was a lack of workers among the ranks of the middle-skilled.  If only we had a ready supply of technicians, it suggests, Apple might have stuck around.

Well, yes and no.

Yes, “middle skill” positions have been largely ignored in the popular and political discussions of the economy, and that’s a mistake.  It’s great fun to highlight the story of the up-from-poverty kid who became a cardiologist and established a charity to help others do the same, but the far more common story is of the up-from-poverty kid who got an associate’s degree in criminal justice and became a cop.  And there’s nothing wrong with that; we need police (and nurses, and technicians...), and a solid working-class life beats the crap out of poverty.  It’s an attainable goal, and one that community colleges in particular have helped millions fulfill.  It’s reality, it’s welcome, it’s underappreciated, and it’s a real contribution to the public good.

But no, it’s not the entire story.

The Apple story refers to middle skills, but it’s really about the ability to make major changes quickly.  In China, the entire logistics chain is already there, as are tremendous numbers of qualified employees.  Perhaps more importantly, those workers have no meaningful workplace rights.  They’re at the mercy of the employer at a level that Americans would find inconceivable.   The wages are lower, but the real issue is the ability of the employer to stop on a dime and change direction as the market dictates.  That’s just not culturally possible here, at least for now.  

For a counterexample, take Kodak.  It was so tied up in the life of its home city that it couldn’t bring itself to make difficult changes.  Last week it finally filed for bankruptcy.

What killed Kodak wasn’t a lack of middle-skilled workers.  It had plenty of those.  What it didn’t have was the ruthlessness to say ‘no’ to internal constituencies as the market shifted from under its feet.  

I’m happy to see community colleges get some recognition for the valuable work they do in helping prepare some vital parts of the workforce, and in providing a realistic and non-exploitative way for students from economically challenged backgrounds to get a foothold in the middle class.  And it shouldn’t surprise anyone that I’d like to see more support for that role, especially as older options for entry into the middle class -- unionized factory work, for example -- fade away.  Besides, some of these jobs need to be done here -- law enforcement, say -- and really can’t be sent to China.  

But at the end of the day, as important as education is, it’s only part of the picture.  Asking it to do more than it ever could is just setting it up for failure.  I don’t want to see it get punished for not being able to do more than it ever should have been asked to do.  Kodak didn’t fail because it lacked an educated workforce.  For that matter, the economy didn’t fail because it lacked an educated workforce.  The issues, and failings, were far more complex than that.  By all means, let’s give the developers of middle skills the respect they (we) deserve, but let’s not mistake one good idea for the solution to the economy.  It just isn’t.

Thursday, January 19, 2012


Apple for the Professor?

Apple’s latest foray into the education market caught my eye.  It’s promising, but I can’t get past some sticking points.

As I understand it -- and I don’t claim to fully get it -- Apple is making several moves.  It’s releasing a software package for prospective authors, to make it easier to format books to sell on iBooks.  It’s partnering with several of the major textbook publishers to issue iPad-only versions of textbooks in several basic courses, complete with interactive bells and whistles.  And it’s making available about 100 courses from name-brand universities, though it’s not entirely clear just what “making available” means yet.  It sounds like more than just podcasts of lectures, but how much more isn’t obvious.

First, the good stuff.  Textbooks age fairly rapidly, depending on the field.  Scientific ones need to be up to speed with the latest discoveries, technical ones need to be conversant with the latest iterations, and political ones need to keep pace with changes in governments.  (At my last college, well into the 2000s, the pull-down maps in several classrooms still featured the USSR.)  That’s hard to do with “dead tree” books; once they’re published, they’re published.  But e-books should, in principle, be easy to update.  They could easily contain “quiz yourself” widgets, touch-activated glossaries, active links to relevant sources, and the like.  

That’s in addition to some of the cool and useful things that iPads can already do.  They’re great for journalism classes, since they can record interviews, hold notes, and even scan police radios.  The allied health folk are tripping over themselves to get iPads for students in clinicals, since that’s the route hospitals have already gone.  And they’re kinda fun.

But I’m not there yet.  Before supporting widespread adoption across the college, there’s a host of issues to address.

The most obvious is cost, which I suspect is a deal-breaker for K-12.  The cheapest iPads start at about five hundred dollars, and they go up quickly from there.  Apple touted the low price of e-textbooks, but if you need to first spend five hundred bucks before getting any savings, I don’t see that happening.  Then you have to assume some level of loss, breakage, water damage, and the like.  Unlike glass screens, textbooks don’t shatter when you drop them.  And any parent of young children can tell you they’d get dropped.

At the college level, the argument might be a bit more convincing, if the cost of the iPad would be covered by financial aid.  Even here, though, the savings only happen if the student is saving money on a whole bunch of courses.  (The current offerings are few and far between, though I expect they’ll grow.)  If you use the e-text for, say, Intro to Biology, but then switch to dead-tree versions after that, you come out behind.  
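For the numerically inclined, the break-even logic here can be sketched in a few lines.  All the figures below are made up for illustration -- the post doesn’t cite actual textbook or e-text prices beyond the rough five-hundred-dollar device cost:

```python
# Rough break-even sketch with hypothetical prices: the device only
# pays off once the per-course e-text savings, summed over enough
# courses, exceed the upfront cost of the tablet.

def courses_to_break_even(device_cost, print_price, etext_price):
    """How many courses before e-texts beat print, counting the device."""
    savings_per_course = print_price - etext_price
    if savings_per_course <= 0:
        return None  # e-texts never pay off
    # Ceiling division: you can't take a fraction of a course
    return -(-device_cost // savings_per_course)

# Hypothetical figures: $500 device, $120 print text, $15 e-text
print(courses_to_break_even(500, 120, 15))  # 5 courses
```

On numbers like these, a student would need e-texts for roughly five courses just to recoup the device, which is why switching back to print after one intro course leaves the student behind.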

E-texts also defeat the used book market, which has historically been the way for savvy students to take the edge off textbook costs.  If you have to buy ‘new’ every time, then I understand why publishers are on board, but the advantage for the student diminishes further.

I have to admit being really bothered by the platform exclusivity.  If e-books were available as websites with logins, then it wouldn’t matter (much) how you got to the website; iPads would be great, but laptops or Android tablets or even desktops would get the job done in a pinch.  But for a college to force all of its students and faculty to work with a single vendor puts a hell of a lot of trust in one vendor.  Some of us have moved away from Blackboard and towards open-source solutions for the LMS precisely to get away from the single-vendor problem.  I’d hate to fall back into it on an even larger scale.

And on a really basic level, iPads lack keyboards.  Students who’ve gotten around that by buying MacBook Airs or netbooks or low-end laptops would suddenly be on the hook for yet another expensive device, and would have to have both of them at the ready for various tasks.  (I’d hate to write a five-page paper on the onscreen keyboard.)  Even within the single vendor, I’d expect Apple to at least make versions of the texts available on Macs.  If I had just bought a MacBook Air for school, I would be pissed.

The “courses” raise a host of other issues, but I don’t understand them well enough yet to comment.  As they take shape, I’ll certainly be curious to see what they include.  As potential study aids, they might provide helpful alternatives to the Gates Foundation/Khan Academy model.  (Even now, it comes down to Gates vs. Jobs...)  But it sounds like they’re aiming higher than that.  We shall see.

Wise and worldly readers, what do you think?  Do you foresee assigning iPad textbooks next year?

Wednesday, January 18, 2012


The Girl, Amateur Chemist

“Daddy, I figured it out!”

I never get tired of hearing that.

The Girl got a couple of chemistry sets for Christmas, and we broke them out and started tinkering with them over the long weekend.  They have the basics you would expect: a few test tubes, some rubber pipettes with squeezable bulbs, a measuring spoon, and -- most important of all -- safety goggles.  

With the goggles on, she looked like Snoopy in his World War One Flying Ace ensemble.  She loved them.

Chemistry sets for kids are a little more sophisticated now than the ones I dimly remember.   I recall a great deal of improvisation when I got a set somewhere around age nine, especially once my idiot friend and I figured out that putting chemicals on paper napkins and setting them on fire in the basement -- where the concrete floor wouldn’t burn -- was kinda fun.  The experiments the set offered just didn’t seem all that interesting in comparison.

In my defense, I believed that if you already knew the outcome, then it wasn’t really an experiment; it was a demonstration.  An experiment means you don’t know the outcome.  We didn’t know squat, so we experimented.  Admittedly, we were a little loose on details like “reasons” and “procedures” and “basic safety,” but hey, it was the seventies.  Back then they put gas tanks right behind bumpers.  We were just in tune with the zeitgeist.

My high school lab partner -- a faithful reader of the blog -- can attest that by the time I got to high school, any interest in chemistry was long gone.  I treated it as a distasteful obligation to be dispensed with.  I’m hoping not to pass that on to The Girl.  

Since kids have a habit of doing what you do rather than what you say, I had to throw myself back into a discipline I hadn’t engaged in any serious way since the Reagan administration.

Luckily, The Girl was there to rescue me.

We set everything up carefully.  We had newspaper on the kitchen table, paper towels at the ready, and safety goggles affixed.  TG even briefly put her goggles on The Dog, just to see how they’d look.  The Dog demurred.  Science requires sacrifices.

Then we got down to business.

A word to the people who write the instructions for chemistry sets: clarity matters.  One set had a series of experiments in a pretty rigid sequence: experiment 7 required that you use the products of experiment 6, for example, and experiment 8 drew on both 7 and 6.  Logically, then, a mistake in experiment 6 born of an ambiguous phrase would render experiments 7 and 8 dismal failures.  

Which, in fact, happened.

We went back to experiment 6 and tried again.  TG conscientiously filled the test tubes with water to the correct height, added the right chemicals in the right amounts, applied the rubber stopper, shook gently, and watched as the liquid turned sort of pinkish.  (The instructions suggested that it would be a deep red.)  We walked through the instructions step by step, wondering why the next step that was supposed to result in the liquid turning blue resulted in, well, nothing.  So we tried it again; still nothing.

At that point, I had to pick up The Boy from his friend’s house.  I left, with TW still around in case anything exploded.

When I got back, TB in tow, The Girl beamed proudly and announced “Daddy, I figured it out!”  TW denied any involvement.

TG had meticulously retraced each step, and interpreted each next step herself.  As it happened, her sense of how it worked was more accurate than her fortysomething, PhD-bearing Dad’s.  The test tubes beamed their bright, unambiguous primary colors.

She was proud.  I was even prouder.

Rock on, TG.  And pay no attention to the paper napkins on the table.

Tuesday, January 17, 2012


Handling Good News

“How to Handle Good News” should be a handout given to every new administrator.  It’s remarkably easy to handle it wrong.

Happily, I’ve had occasion to reflect on this recently.  A couple of key projects are starting to bear fruit.  These are projects that we’ve done the right way: specifics designed by faculty, assessment mechanisms built in from the outset, time and resources dedicated.  If not for the requirements of pseudonymity, I’d devote serial posts to celebrating them.  Pseudonyms being what they are, I’ll just say that it looks like we’re finally starting to make real progress on some longstanding and serious issues.

That said, it would be way too easy to kill them in the crib by celebrating them the wrong way.

The most obvious mistake is stealing credit.  Anyone who has had the experience of the boss (or advisor) taking credit for their work knows how demoralizing that can be.  It’s a pretty effective way of killing initiative, making the hard workers feel like suckers, and poisoning the well for years to come.  I’d hope that anyone with brains and at least some sense of empathy, if not of ethics, would know that.

A more common one is reframing the project retrospectively to fit into an alien agenda.  (“Alien Agenda” would be a great name for a band.  But I digress.)  If the folks who did the hard work feel like their efforts were hijacked for some other purpose, they’ll be wary of stepping up in the future.  The trick here is knowing where the boundaries are.  (This is where I’d expect admins who come from outside academia to run into issues.  They wouldn’t have as clear a sense of the boundaries.)  For example, trumpeting these projects as proof of the importance of outcomes assessment would probably strike many of the participants as betrayal.  They’ve used assessment well, and that’s to their credit, but it wasn’t really the point.  Turning it into the point after the fact would be bad faith.

I’ve also seen success celebrated in a really passive/aggressive way.  “These folks did something terrific, unlike some people...”  Don’t.  Just don’t.  Anointing some people as favorites leads to awful internal politics, perverse incentives, and tremendous misdirected energy.  Praise the work, not the people who did it.

Good news calls for celebration, but it needs to be constructive.  Notice success, ask questions, encourage more, provide resources, and for the love of all that’s good, don’t steal credit.  Especially in public, praise goes to the work more than the worker.  If Professor Jones did an amazing job on a course redesign, talk about how wonderful the course redesign is, not how wonderful Professor Jones is.  Others can also do great work, but nobody else can ever be Professor Jones.  Highlight verbs, not nouns.

The exact mechanisms vary by personality and context, but the principles shouldn’t.  When success comes along, celebrate it in ways that might actually encourage more success.  Let everyone get the message that they can be celebrated too, if they just step up.

And for the administrator, learn to celebrate vicariously.  It takes some self-discipline, but it’s for the best.

Wise and worldly readers, have you seen success turned into bitterness by being celebrated the wrong way?  Alternately, have you seen an encouragement that really struck a chord?

Monday, January 16, 2012


Basing It All on Graduation Rates

The presidents of the Chicago-area community colleges will keep or lose their jobs based on the graduation rates at their respective colleges.  

This is an awful and great idea.  I’d hate to be in their shoes, though.

The greatness of the idea is that it moves necessary changes from “gee, we really should...” to “we have to do this NOW.”  The culture of higher ed is good at foot-dragging and terrible at saying “no” to incumbents.  Some level of urgency is probably required if those cultural defaults are to be overridden.

That said, though, it could go wrong very easily.

It wouldn’t take much.  Colleges could start outsourcing the most difficult students into Adult Basic Ed programs, cutting off second chances, and placing none-too-subtle pressure on faculty to grade generously.  They could recruit from different (more affluent) areas, redefine ‘graduation’ by slicing degrees into cascading certificates, and give credit for life experience.  Those would all result in relatively fast “gains,” though at considerable cost to the mission.  

Getting good results the honorable way, though, will take years and resources.  I don’t know how politically realistic that is, but it’s true.

Doing it the right way would involve beefing up full-time staffing among faculty, student support staff, and financial aid.  (Delays in financial aid processing can be devastating.)  This all comes at considerable upfront cost.  On the curricular side, they’d have to take full advantage of the findings coming from the recent literature on shortening developmental sequences.  (The CCRC website is a great place to start.)  

Academic advising would have to become much more intrusive and consistent, with students sticking with the same advisor as they move forward.  If experience is any guide, this may involve serious (and expensive) upgrades to their ERP system.  It may require considerable staffing upgrades for advising, and depending on the current faculty role (and contract), there may be some contractual issues to address.  

Then there’s the tricky issue of climate.  Sustainable gains will require finding new ways to do things, which will require experiments.  Experiments run the risk of not working; the idea is to run enough of them, with enough forethought put into design and assessment, that they don’t all have to work.  If you have enough of them that you can afford to be candid in assessing results, then over time, you can build on successes and pare away failures.  But that requires a few things upfront: resources for faculty and staff time; resources for assessment, IR, and cohort tracking; and enough internal trust that people won’t either flee the experiments or bury weak results in CYA obfuscation to avoid being identified with failure.  If they fear that bad results will be held against them, you won’t get the candor you need to make real progress.

And that’s where intelligent management crashes headfirst into politics.  The graduation measure they’re using is the 150 percent time IPEDS cohort; in other words, the percentage of first-time, full-time students who graduate within three years of starting.  Assuming that any given intervention takes a year to get up and running, and then three years to show the first results, it would be a minimum of four years before the very first set of post-ultimatum data rolls in.  In politics, that’s an eternity.  And if you assume that some initial experiments won’t work, then it could be six to eight years before you get the kind of results on which it would be reasonable to base decisions.

If they’re serious -- which in the context of Illinois politics has to be considered a huge “if” -- they need to pony up some serious cash for the next several years and appoint some freestanding body to monitor progress over the next ten years or so.  The kind of changes they’re asking for would be wonderful, but if they’re real they won’t be easy.  (The old saying about home improvements leaps to mind.  Good, fast, and cheap: pick any two.)  My guess is that the impulse behind the new standards is a desire to cut funding, which doesn’t bode well for the results, but I’d be happy to be wrong.

Good luck, Chicago.  If you take the high road on this, you could set a national example.  If you take the low road, you will do the kind of damage that takes generations to fix, if it gets fixed at all.  
