Thursday, September 28, 2017


A True Confession

Last night I spoke briefly at the Phi Theta Kappa orientation, congratulating the students on being smart and owning it.  

Then I forgot where I parked, and wandered around the parking lot for ten minutes.

Well played, universe.  Well played.

Wednesday, September 27, 2017


Debtor’s Prison and Performance Funding

In ninth grade, I remember hearing of debtor’s prison.  The idea immediately struck me as insane.  How do you pay off your debt if you can’t get to your job?  I wasn’t the only one to connect those dots; nearly everyone in the class did.  Yes, debtor’s prison was an incentive to pay up, but if you couldn’t, it functioned less as an incentive than as a death spiral.  The logic of incentives is powerful, but it only works to the extent that the desired behavior is actually under your control.  

As Slate noted in an uncommonly good article this week, we have a modern version of debtor’s prison.  It’s called the “suspended license.”  Outside of a few urban enclaves, if you can’t drive, you can’t get to work.  If you can’t get to work, you can’t pay your fines.  So people cheat and drive without licenses, at which point they incur more fines.  Or they don’t, they lose their jobs, and they can’t pay the fines they already have.  As the article notes, a disturbing percentage of suspended licenses are for behavior that isn’t dangerous.  It’s often parking tickets, or failing to show up for a hearing, or not paying a fine.  What sounds in the abstract like a slap on the wrist -- “driving is a privilege, not a right” -- ignores the material reality that in most places, if you can’t drive, you can’t work.  

It would be wonderful if we had enough, and good enough, mass transit to make a suspended license a mere nuisance.  But in most places, we don’t.  Bicycles can help sometimes, but they aren’t great in the rain, or in winter, or at night, or on the vast majority of roads that are built in ways that make bicycling dangerous.  For now, in most of the country, a car is a necessity.  That’s a problem, but it’s a different problem.  The article does a nice job of outlining the real-life consequences to some folks from having their licenses suspended; they seem wildly out of proportion to any ‘incentive’ value.

Which brings me to the latest peer-reviewed study on the effects of performance funding on public colleges.  The theory behind performance funding is that colleges respond to incentives, and have it in their power to dramatically improve their student outcomes; they just need to be prodded.  Kick ‘em in the budget, the theory goes, and they’ll find ways to improve.

Except that they mostly don’t.  In this case, I think the misunderstanding runs even deeper than debtor’s prison or suspended licenses.

In most cases that I’ve seen (and worked under), performance funding isn’t new money; it’s a reallocation of existing money or a little bit less.  In other words, the mandate for improvement is unfunded at the outset.  Given the high fixed costs and thin margins of most public colleges, especially in the two-year sector, it’s rare to find colleges making dramatic gains without significant external grants.  So there’s a bit of the debtor’s prison issue: if you can’t afford to improve, your funding will be cut, making you less able to afford to improve.  What was marketed as a prod becomes a death spiral.

The motivation is less direct, too.  The people who have to make the changes in what they do don’t get paid more.  If the college gets, say, a three percent increase in its operating subsidy, it’s immediately swallowed by that year’s health insurance increase.  The employees on the ground who made it happen don’t get any more than if they didn’t.  (There may be fewer layoffs, but that’s abstract until it isn’t.)  And public colleges aren’t for-profit; that’s not their motivation.  Yes, they have budgets to balance, but they don’t exist for their budgets.  

As Bailey, Jaggars, and Jenkins noted in Redesigning America’s Community Colleges, smart reforms can decrease the cost per graduate, but they increase the cost per student.  For colleges that are already barely above water financially, increasing the cost per student is a heavy lift.  In other words, the distance between the desired behavior and the ostensible reward is too great, and the reward feels too abstract to many employees to matter.  And that’s without even getting into the myriad factors beyond a college’s quality that affect student completion.
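A toy calculation, with made-up numbers rather than anything from the book, shows how both claims can be true at once: spend more per student, graduate a larger share of them, and the cost per graduate still falls.

```python
# Toy numbers (hypothetical, not from the book): spending more per student on
# reforms can raise completion enough that cost per graduate falls anyway.

def per_capita_costs(students, budget, grad_rate):
    """Return (cost per student, cost per graduate) for a cohort."""
    graduates = students * grad_rate
    return budget / students, budget / graduates

# Before reform: 10,000 students, a $100M budget, 25 percent completion.
before = per_capita_costs(10_000, 100_000_000, 0.25)
# After reform: same students, a $110M budget (reforms cost money), 35 percent completion.
after = per_capita_costs(10_000, 110_000_000, 0.35)

print(f"cost per student:  ${before[0]:,.0f} -> ${after[0]:,.0f}")   # goes up
print(f"cost per graduate: ${before[1]:,.0f} -> ${after[1]:,.0f}")   # goes down
```

The catch, of course, is that the college has to come up with the extra spending per student before any of the downstream savings materialize.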

The comparison to grants is revealing.  Grants can generate wonderful results because the means to achieve those results are supplied upfront.  You don’t have to raid other parts of the operation.  And colleges have some control over which grants they apply for, so they’re likely to pass on the ones they know wouldn’t make sense for them.  Grants spare you having to rob Peter to pay Paul, allowing you to devote more resources to improvement and fewer to infighting.  There’s value in that.  Significant new operating funding could achieve the same thing.

This argument is a difficult sell because it requires going beyond the first step.  Yes, improvement is possible.  Yes, incentives matter.  But so does investment.  When performance funding becomes effectively punitive, it reinvents a mistake hundreds of years old.  My ninth grade class could see the flaw in that.  I hope we still can.

Tuesday, September 26, 2017


Reminding Myself to be Naive

Sometimes I have to remember to forget.

Experience has its undeniable virtues.  It allows you to anticipate things that might not otherwise be obvious.  It can alert you to landmines.  It helps you sift through the irrelevant to get to the stuff that matters.  It can save a lot of time.

But if you’re not careful, it can also get in the way.  I fell into that trap recently.

Without going into too much detail, for obvious reasons, I’ll just say that over the years, I’ve learned that certain sorts of objections are usually pretexts for other things.  They’re used ritualistically, rather than for their own content.  “It’s not what she said, it’s how she said it,” usually actually means “it’s what she said.”  I’ve heard people say, of layoffs, that they should have been carried out with more care, as if the primary objection had to do with etiquette.  And does anyone actually believe that the fuss about Hillary Clinton’s emails had anything to do with emails?  At this point, it’s hard to keep a straight face and make that argument.

But occasionally -- not often, but more than never -- the pretexts aren’t actually pretexts.  And being too quick to remember how they’re typically used can get in the way of seeing how they’re being used now.

Management, teaching, and parenting all involve choices to forget.  In each case, the point is to help the employee, student, or child improve over time.  That can mean letting some grievances go when they aren’t productive anymore.  Sometimes, growth requires a period of awkward struggle.  Pretending not to notice the awkwardness, or being quick to look away from it, can be helpful.  It allows for the possibility of doing better.  

(The universe showed me real mercy by waiting until I was out of the teen years before inventing YouTube.  Some moments are best left deep down the memory hole.)

In the moment, it can feel a bit disingenuous, but it’s really a version of playing the long game.  

I remember being struck the first time I heard the concept of “suspension of disbelief.”  If you’re unwilling to suspend disbelief, the world must look awfully grim.  When presented with, say, a proposal for a new program, it’s easy to poke holes.  It’s easy to refer knowingly to other programs that never got rolling, that got rolling but fell short, or that had unintended consequences.  Every single one of those objections can be true, but they don’t mean the new one will necessarily meet the same fate.  

My greatest fear with long-term employees isn’t that they fail at something.  It’s that they stop trying.  It’s that they lose the ability to suspend disbelief, to generate that naivete that allows breathing room for real progress.

I made this mistake myself recently.  Based on very real experience, I was too quick to discount something that I should have taken at face value.  I forgot to forget, and looked with jaded eyes instead of fresh ones.  That was my mistake.  Happily for me, an observer of goodwill pointed it out to me before too much harm was done.  Lesson learned.

Wise and worldly readers, how do you remember to forget?

Monday, September 25, 2017


My Recurring Nightmare

I’ll admit to some raised eyebrows reading about the lecturer at NJIT who was recorded apparently praising Hitler in class.  He claims he was taken out of context.

As someone who used to teach political philosophy, I consider a scenario like this my recurring nightmare.

Among other things, I taught the Greatest Hits of the Western canon of political thought, or, as we called it, “From Plato to NATO.”  I assigned actual texts -- in translation, but still -- and spent class time helping students decipher them.  Some of it involved reading comprehension, but much of it involved trying to get the overall perspective of each thinker.  A middle-class American 18-year-old may not find, say, Locke’s Second Treatise terribly relatable at first blush, so part of my task involved painting word-pictures and trying to provide context.

Sometimes that meant role-playing, or playing the devil’s advocate.  At various times over the years, in class, I would role-play a monarchist, an anarchist, a Marxist, a utilitarian, a libertarian, a Hobbesian, an Aristotelian, a Burkean, a feminist, or a Platonist.  Fascism was a tough one, but sometimes I’d try to ventriloquize Nietzsche, which can be great fun in very small, carefully selected doses.  

This was before smartphones and YouTube.  Back then, a single student might misunderstand something I said, but the odds of that student recording it and distributing it instantly to the world were close to zero.  There were times when I would play a character for ten or twenty minutes at a pop, trying to help students understand how a given thinker or school of thought connected the dots.  

If some student had recorded, say, five minutes of the anarchist role-play and posted it to YouTube, shorn of context, I would have been in a bad spot.  But I wouldn’t have been doing anything wrong.

This, to me, is why it matters to have presidents and vice presidents who have actually taught.  If some ideologically-driven student or organization starts pulling this kind of stuff and trying to shut down real inquiry, you want to have people high up who understand both what’s at stake and what was really going on.  If I couldn’t try to present each thinker’s most compelling claims in the most compelling way I could, I wouldn’t have been doing my job very well.  Getting students to grapple with difficult questions can involve some uncomfortable moments.  

The threat that those uncomfortable moments could be taken entirely out of context and sent to the world as evidence of something sinister is deeply scary.  It cuts to the heart of the teaching role.  The panopticon-from-below is such a severe threat because it’s so easy to pull off.  The original panopticon took actual effort to build.  Now anyone with a midrange phone can do both surveillance and mass distribution.  

I don’t know whether the NJIT case involved thoughtful pedagogical role-playing, unhinged ranting, inappropriate recruiting, or what.  It could have been any of those, or some combination of them.  But on general principle, I’d be deeply wary of drawing conclusions from a single recorded clip.  It’s just too easy to mislead.

Sunday, September 24, 2017



Don’t just watch the ball.

My Dad took me to my first professional baseball game when I was nine.  I had seen some on tv, but the Rochester Red Wings at Silver Stadium -- back before pigeon droppings consigned it to the dustbin of history -- offered a different experience.  At one point he pointed to a player shifting position in anticipation of what the batter would do.  He was just trying to help me understand what was going on.  I don’t remember the specifics, but I vividly remember the feeling that you could absorb a lot by looking where the ball wasn’t.  On tv, the camera pretty much followed the ball, but in person, you could look wherever.  Sometimes the most important or interesting stuff is happening where the ball isn’t.

I think of it as the “meanwhile…” principle.  While you’re focused on shiny object A, unnoticed events B and C are unfolding, and they may wind up mattering a lot more.  

In higher education, we have plenty of shiny objects to look at.  There’s the graduation rate, concerns about student debt, fears about job placement, and the omnipresent concerns that someone, somewhere, might have a political opinion.  Most of those are valid, at some level, but there’s a lot of ignored “meanwhile…” going on.

So, here’s one.  While we’re focused on a gradual enrollment drop, flat aid, and incremental improvements to student success, our health insurance premiums continue to climb at a double-digit rate.  This year at my college they’re up 13 percent.

That’s the kind of increase that inexorably reshapes the contours of the possible.  It barely counts as news, because it has been going up at roughly 10 percent per year for a while now.  But cumulatively, the impact is devastating.

Within higher education, we don’t talk about it much, because it isn’t unique to us.  Outside of higher education, health insurance draws plenty of attention, but rarely with any connection to a labor-intensive industry filled with highly educated people.  The one area where the connection routinely gets made is in discussion of adjuncts, but there, the focus is usually just on the cost to the adjunct of going without.  That’s a valid concern -- hell, it’s a scandal -- but it misses the “why.”  

With every passing year, total tuition revenue goes up a percent or two; state aid remains flat; local aid remains flat; and health insurance climbs by double digits.  Play that out for, say, five years, and see what happens.  
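To actually play it out, here’s a minimal sketch with illustrative numbers (not my college’s real budget): tuition revenue up 2 percent a year, both aid lines flat, health insurance up 10 percent a year.

```python
# Illustrative five-year projection, not a real budget: modest revenue growth,
# flat aid, and a compounding health-insurance line.

tuition, state_aid, local_aid = 40.0, 30.0, 20.0   # revenue lines, in $M
health_insurance = 10.0                             # cost line, in $M

for year in range(1, 6):
    tuition *= 1.02            # total tuition revenue up ~2% per year
    health_insurance *= 1.10   # premiums up ~10% per year; both aid lines stay flat
    left_over = (tuition + state_aid + local_aid) - health_insurance
    print(f"year {year}: revenue {tuition + state_aid + local_aid:.1f}M, "
          f"insurance {health_insurance:.1f}M, left for everything else {left_over:.1f}M")
```

Even though total revenue rises every single year, the amount left for everything else shrinks; the insurance line eats the growth and then some.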

It isn’t pretty.  It makes union contract negotiations that much harder, because both sides of the table are squeezed ever harder by a third party that simply dictates what it will take.  It makes balancing the budget a fresh challenge every year, because the heroic measures of last year are swamped by yet another increase this year.  In the context of local planning, it operates like a natural disaster, except that there’s nothing natural about it.

A couple of days ago I was doing a volunteer shift at a local museum, and got to chatting with the receptionist in a slow moment.  She said that she’s a music graduate looking to break into the arts; for now, she’s working two part-time jobs in the arts.  As she put it, “I have until 26.  Then, I’ll need a job with benefits.”  A part of the Affordable Care Act that was intended to take pressure off of young people just starting out -- and does, to some extent -- also works as a Sword of Damocles.  She’s acutely aware of her time running out.  I couldn’t blame her.

When the ground rules get less forgiving every single year, it’s easy to blame people for the choices they make within those rules.  But when we do that, we miss the fact that the rules themselves weren’t handed down from the mountaintop on stone tablets.  We’re so busy following the ball that we don’t notice the outfield walls moving farther out after each inning.

For example, it’s possible to decouple health care from employment.  Most advanced countries do.  Do that, and many of the dilemmas we’re struggling with now become moot.  

But we don’t.  Instead, we blame individuals for the choices they make under circumstances that get a little more constrained every single year.   

Yes, pay attention to the usual indicators.  But the “meanwhile” story is shaping those indicators far more than we usually recognize.  Don’t just look at the ball.  Look at the rule book.

Friday, September 22, 2017


Note to Readers

I hate to do it, but the comment spam has gotten out of control.  I've had to shut down comments on this site.

Comments can be posted to the site at

Wednesday, September 20, 2017


Whatever Happened to French? And German? And Arabic?

This one isn’t my field of expertise, so I’m hoping folks who know it at a deeper level than I do will chime in.

At the three community colleges at which I’ve worked, I’ve seen the same trend in language departments.  Spanish dominates the field, and American Sign Language is picking up strength.  Every other language is niche, declining, or dead.

It wasn't always so.  There was a time in my memory when French was vital.  At many colleges, undergraduate German was, too.  Now, we can’t run enough sections to justify a hire.  (If you follow Rebecca Schuman’s darkly comic series about job postings in German, it’ll become clear quickly that this isn’t just a quirk of a few places.)  At various points, Japanese, Arabic, Portuguese, Russian, Italian, and even Latin have had flashes of interest, but none has lasted.  The jury is still out on Chinese; we haven’t been able to get steady instructors to really find out one way or the other.

From an administrative standpoint, the challenge with languages is twofold.  First, they’re sequential, so unless you have lots of well-populated sections at the 101 level, it’s unlikely you’ll even be able to run 200 level courses.  Educationally, I suspect that the payoff from language learning increases as you go along, so we’re running lots of the high-effort, low-payoff stuff, and very little of the more interesting stuff.  Second, languages aren’t interchangeable, so if student demand shifts from German to Spanish, I can’t just shift someone’s load from German to Spanish.  Some pairings are more common than others -- Spanish and French are commonly found together -- but if, say, Japanese comes in low, I can’t reasonably ask the professor to pick up a section of Arabic to make up for it.  That’s not how languages work.

I’m glad to see Spanish doing well, and ASL has been a pleasant surprise.  But what happened to the rest of the world?  Where did the interest go?

Some of it may be a function of high schools.  The Boy took French in junior high and early high school, but when he wanted to take the IB program, they only offered Spanish and Latin.  So he dropped French, and started taking Latin.  If high schools phase out teaching French, or German, or whatever, it’s not shocking that the pipeline for college classes starts to dry up.  Though at this level, most language enrollment is at the 101 level; we don’t get a lot of students coming in at the 200 level.  

There’s probably a demographic component, too; the Latinx population is increasing, and some of the kids from those families come in with colloquial Spanglish and a desire to learn the real thing.  Other kids notice the population shift and decide that Spanish is the most useful option.  (Growing up near Canada, as I did, French seemed more practical.)  But that doesn’t really explain the relative dearth of students for other languages.  It just seems odd to me that a college of 13,000 students can’t support a single section of second-year French.

The trend pre-dates the Trump administration, so whatever you want to say about Trump, I don’t think he’s the critical variable.  It’s something else.  

So, I’ll throw it open to wise and worldly readers who understand this at a much deeper level than I do.  What’s happening with all of the other languages?  

Tuesday, September 19, 2017


Standards and Standardization

Should every section of an online class be the same?

I was fascinated to read about the HLC raising concerns about online courses at Scottsdale Community College, because those same concerns are endemic to the industry.  From the IHE account -- which I’ll admit is a secondary source -- it sounds like the accreditor is concerned about inconsistency of delivery and quality across sections, and is calling for greater standardization.  The college responded by pointing to faculty contracts and academic freedom as constraints on how tightly it could control online classes.

In philosophy, these are called “incommensurate premises.”  If the two sides don’t agree on the ground rules, they won’t be able to come to agreement.  Or if they do, it will be fragile, because it will be based on a misunderstanding.

As an administrator, I had to sympathize.  It’s frustrating to be dinged for a lack of standardization when the local rules don’t let you standardize.  It’s a bit of a double bind.

To be fair, some of Scottsdale’s issues sound self-inflicted.  For instance, it’s apparently running multiple platforms at the same time.  There’s nothing unusual about a college picking a single LMS and requiring everyone to use it.  To my mind, that’s the equivalent of picking the classroom where a class will be held.  That’s a core, and necessary, administrative function.  Running multiple LMSs at the same time forces students to jump from platform to platform depending on who’s teaching, and it places a much greater burden on the helpdesk.  This one seems like an easy fix, at least conceptually.

Mandating training is more of a gray area.  I don’t see any reasonable “academic freedom” objection to requiring a basic “how-to” session for using a platform, any more than I see an academic freedom objection to observing fire drills or snow days.  Certain basic logistical functions require that everyone rows in the same direction.  But beyond the basics, the issue gets murkier.

I don’t expect every onsite section of a class to be interchangeable.  They should be similar enough in goals and standards that a student who passes Intro to Psych is ready for the next Psych class, regardless of who taught Intro.  But I assume some variation from course to course based on the style and pedagogical choices of the professor.  Student learning outcomes are goals that every section of a course should meet; how they get met is the domain of the professor, within general guidelines of ethical practice.  I don’t know why that same principle wouldn’t apply online.

We don’t typically require pedagogical training for onsite sections.  That doesn’t mean that anything goes; we still do class observations and solicit student feedback.  We also have peer-to-peer class observations that don’t go into the personnel file, but those are entirely voluntary.  The mandatory observations are a sort of quality control, done to ensure that everything is at least at a professional level.  The voluntary observations are for the sake of improvement, of going from “already good” to “even better.”  As such, they can only work when the observed are actually willing to hear it.  And they’re premised on the assumption that some variation from one professor to the next is normal, natural, and fine.

Particularly for online classes, the gray area in which many of us struggle is in something like response time.  How long is it reasonable for faculty to allow inquiries to go unanswered?  How often should they check in?  

Honestly, my greatest frustration with online teaching isn’t too much idiosyncrasy; it’s too little.  Many publishers provide “e-packs” that are essentially canned courses, and over my career, I’ve seen too many faculty use them as a sort of autopilot.*  That meets the desire for standardization, but at the expense of real engagement, and students can tell.  I’d rather have professors actually engage, even if that means that Smith’s section has a different feel than Jones’.  Their classroom classes do, and we’ve survived that.

Maybe it’s the political scientist in me, but the best solution I’ve seen is a sort of federalism.  Yes, some things need to be standardized, such as the LMS.  Others may be set at the department level.  The rest should be open to customization.  The boundaries can get a little fuzzy, but the general shape seems clear.  Faculty aren’t robots, and I don’t want them to be.  Academic freedom applies to online classes, too.  

*I’ve seen the movie “Airplane” enough times that I can’t use that word without smiling...  

Monday, September 18, 2017


Full “Dad” Mode

Middle age brings with it certain superpowers.  Invisibility is the most obvious, but I also have the power to embarrass my teenage children beyond measure, often without even trying.

I’m not above enjoying that.  What follows is a barely-edited transcript of last night’s dinner conversation.  For context, The Girl is 13.

The Wife: You can attend the (school event), right?

Me: Yup.

TW: Will you sit with The Girl, or will you be separate?

Me: I’ll try to sit with her, but if I can’t, I’ll still find a way to embarrass her.


Me: It’s easy!  I’ll just stand up in back and yell “I’m TG’s Dad!  Woooooo!!!!”

TG: nooo...

Me: Or I could start dancing.  You’ve seen me do the “cabbage patch” (demonstrates)

TW: Oh, do the overbite!

Me: Absolutely.  (adds the overbite)

TG: daaaaaaaaaaad…

Me: Or I could do the lawn sprinkler (demonstrates)

TG: There won’t be any music playing!

Me: I can wear headphones.  Or say I’m dancing to the sounds in my head.

TG: (sighs)

Me: They’ll be all “Mr. Reed, please sit down.”  I’ll say, “That’s Dr. Reed to you!” and then start vogueing.

TG: (shoots death stare)

Me: No?

TG: No.


Me: (exaggeratedly droopy) oooookayyyyyy…

She wins this round.  But next time...

Sunday, September 17, 2017


Reverse Calvinism: Thoughts on “The &*()& Survival Guide,” by Robert Sutton

Several years ago, Robert Sutton made waves when he published the book “The No Asshole Rule.”  The book detailed the damage that even high-performing jerks (I’ll use the term ‘jerks’ from this point forward) can do, whether to coworkers, customers, or a culture as a whole.  Now Sutton has published a survival guide so those of us who have to work with jerks can survive with our dignity and sanity intact.

It’s a quick read, and Sutton supplies enough (and vivid enough) anecdotes to make the cases painfully clear.  Most of us won’t have any trouble coming up with examples of our own.  

Sutton makes a few core assumptions.  First, he points out that jerkish behavior is contagious.  The longer it goes unchecked, the likelier it is to spread, whether out of habituation, self-defense, or perverse incentives.  Second, it has characteristic patterns.  Third, it’s counterproductive, at least in the long term.  Finally, in some cases it has roots much deeper than standard workplace interventions can hope to handle.  

That fourth one strikes me as the most interesting by far.  Sutton draws a distinction between an occasional or situational jerk, and what he calls a “certified” (or chronic) one.  The occasional or situational jerk could be anybody, and chances are, we’ve all been that at some point.  We’re likely to be at less than our best when we’re overtired, overextended, overstressed, or otherwise off-balance.  When someone who’s usually pretty congenial and composed is uncharacteristically snippy, it’s often a good idea to extend some benefit of the doubt.  Depending on the relationship, sometimes just taking them aside and letting them know how they’re coming off may be enough to set things right.

The certified jerk, though, is awful most of the time.  This is the person who’s always demeaning, undermining, or using others, often just for the sheer sport of it.  In these cases, Sutton’s many strategies tend to boil down to variations on escape.  Escape personally, by leaving the situation; escape psychologically, by investing less in it or otherwise using ‘framing’ to create distance; or, in rare cases, escape organizationally by gathering allies and staging a revolt.  Sometimes, the jerk simply has to be cast out.

It’s a sort of reverse Calvinism.  Rather than constantly scrutinizing ourselves (and others) for signs that we aren’t truly good, Sutton advises constantly scrutinizing others (and ourselves) for signs that we’re certifiably awful.  He suggests treating “certified jerk” status as a sort of residual diagnosis -- to be used only when all other options have been exhausted.  But once they have, it’s time to unleash drastic measures.

As with many how-to books, much of the content boils down to “it depends on context.”  That’s true, of course, but not terribly helpful.  Is it a good idea to take the high road?  Yes, unless it comes off as weakness or enabling.  Is it a good idea to hit back?  No, unless it is.  Does humor help?  Yes, except when it doesn’t.  Sutton’s honesty is admirable, but it tends to reduce the book’s usefulness.  (My favorite tip, which I learned from a former boss: when a student or parent starts berating you with “I pay your salary!,” respond with “Oh, that’s you?  I’ve been meaning to talk with you about that.”  It confuses them long enough to interrupt their momentum.)

Most of the book addresses corporate settings, where some level of turnover is relatively normal.  It never addresses higher education, which is a real oversight.  Sutton notes that “employees in Tepper’s study who were trapped -- who didn’t leave abusive bosses because it was too hard to find other work -- were less satisfied with their jobs and more depressed; they also suffered elevated emotional exhaustion and conflict between work and family.” (38)  For many faculty who have full-time jobs but aren’t national superstars, the relative lack of opportunities for lateral moves can effectively force unhappy people to stay.  Given Sutton’s insights about bad behavior being contagious, it’s unsurprising that academia provides a haven for certain kinds of jerkish behavior.  If you’re a full professor of, say, English at a community college with twenty years’ experience there, and you’re unhappy, the odds of you finding a comparable job at comparable pay elsewhere are slim.  The industry doesn’t work that way.  You’d have to move into administration, take a pay cut, or find another line of work altogether.  Instead, you’re likely to stick around, albeit unhappily, and gradually either check out or act out.  If enough others do the same, over time, the culture can go in some unhappy directions.  It may be situational jerkiness, but the situation can go on for a long time.

That’s the book I’d like to read.  When “exit” isn’t a viable option, either voluntarily or involuntarily, what’s the best long-term strategy?  What if you can’t cast out the sinner?  “It depends” is true, but unhelpful.  Some contexts are common enough to be worth specifying.

Thursday, September 14, 2017


Friday Fragments

Someone on Twitter asked this week about the usefulness of Twitter among students in a large lecture class.

I was intrigued.  We don’t have large lecture classes; we top out at 35, and even that is rare.  There’s nothing here comparable to the 300-student intro lectures at Rutgers.  (That strikes me as a selling point for community colleges, but that’s another post.)  All of which is to say, I’ve never seen it tried.

My guess is that some sort of group texting app would work better for classrooms, since they’re specific to a given group.  Twitter is public, which means that comments made in one context will often be read, and reacted to, in another.  Yes, Twitter has “lists,” so students could winnow down their feeds to a given class if they want, but what they put out there isn’t limited to the list.

That said, though, I’m guessing, and curious.  Most students have the ability to access Twitter at this point, and I’m a fan of stuff that’s free.  So I’ll ask my wise and worldly readers.  Have you seen Twitter used well in a large lecture classroom?


According to The Girl, who is wise in the way that 13 year olds are, the term “goth” has been replaced by the term “emo.”

Which immediately triggered thoughts of Emo the Muppet.  Picture Elmo, but paler, and wearing lots of black.

When I did the Elmo voice and said “Emo feels misunderstood,” TG laughed as hard as I’ve seen her laugh in a long, long time.  It did my heart good.

Naturally, I then had to google “muppet emo,” which led me to discover a Muppet emo band called...wait for it...Fragile Rock.  

As long as there are people out there who can create, and share, something as transcendently silly as Fragile Rock, I have hope for our culture.


This piece on comedians and the ages at which their stage personae make sense struck me as especially relevant in the wake of Jerry Lewis’ death.  

I was of the generation that knew Lewis, if at all, as a shiny-haired host of a Labor Day telethon.  When I got older, I saw some of his earlier stuff, and was shocked at the contrast.  In his younger days, he had a manic style that reminded this Gen X’er of an Ace Ventura-era Jim Carrey.  His humor wasn’t mine, either in his early days or his later ones, but the contrast between 50’s Jerry and even 70’s Jerry was genuinely jarring.  His humor only worked when he was a younger man; as an older man, it came off as pitiful.  (The same could be said of Jim Carrey, come to think of it.)  His only worthwhile acting work in my lifetime came when Martin Scorsese cast him basically to play himself, because Scorsese thought, correctly, that he’d make a fascinating grotesque.

Lewis and Miles Davis were born the same year -- 1926 -- and both took drug-addled five-year sabbaticals from show business in the late 70’s, each of which concluded with some of the worst work either one ever produced.  (The recent Don Cheadle movie about Davis focused on that period, which I considered a daring choice.)  Davis died much younger than Lewis, and nobody would have accused Davis of aging gracefully; by the end of his life, he kind of looked like a lizard.  But his persona made sense as he got older.  Even as his musical style evolved, and his fashion style, the “coolly distant” vibe still worked.  To this day, he’s the only musician I’ve seen (other than a conductor) turn his back to the audience in concert for longer than a spin.  I understood the reason, to some degree, but it was still a little surprising.

It’s hard to imagine a young Lewis trying to do the Martin-and-Lewis shtick with a young Davis.  I suspect it would have ended violently.  Davis was a pretty good boxer in his time, and wasn’t known for suffering fools gladly.  And neither man was a shrinking violet.

I’ve long been convinced that some personalities, and personas, make the most sense at particular ages.  Bad boys may be charismatic when young, but they don’t age well.  Nerds often improve with age, since they can’t remain quite as single-minded as life experiences accumulate.  Bodies have “set weights” that they want to be at given times; I can’t help but think that personalities have “set ages” at which they make the most sense.  Some of what looks like social awkwardness may be someone whose personality is out of sync.

I don’t know if there’s science on that, but I suspect some wise and worldly readers do...

Wednesday, September 13, 2017


The Not-Yet File

This week I had a chance to open up the not-yet file and pull something out.  You’ve probably done it, too.

The not-yet file is my shorthand for the mental spot that I put ideas that I’m pretty sure will come in handy at some point, but for which the stars haven’t aligned yet.  Like any good file, it has multiple subfiles:

Then, there are the trickier cases:

I’m sure there are more.  Wise and worldly readers, what does your not-yet file look like?

Tuesday, September 12, 2017


Design Thinking and Shared Governance

What does “fail fast” look like when folks have tenure?

The Chronicle had a piece a few days ago on “design thinking” as a way to innovate.  It’s based on the design thinking lab at Stanford, where they teach that you should have a “bias toward action,” “begin with the end in mind,” and “fail fast” so you can “iterate” more quickly.

In other words, try stuff, see what works, then try again.

The key element, though they don’t usually put it this way, is speed.  It’s about getting from idea to execution with minimal friction in between.  

Although its champions are not, as a group, humble, design thinking is based in a certain epistemological humility.  It assumes that there are issues we can’t anticipate prior to attempting things, so the way to learn is to jump in with both feet.  Let the results speak, and don’t be afraid to pivot as the results dictate.  If we assume that the world is a hugely complex system that defeats our attempts to know it -- a safe assumption, in my view -- then there’s an argument for giving up the fantasy of certainty and instead taking acceptable risks.  It’s a sort of anti-perfectionism.  

It makes sense that design thinking would find a home at Stanford.  In part, that’s because design thinking is popular in Silicon Valley, where speed beats perfectionism almost every time.  It fits the ethos of a small startup, in which a few people with a clever idea try to win the backing of a very small number of people with massive amounts of money.  “Failing fast” is a great way to cut losses, but the losses are still real; the game is playable only if you have lots of spare money.  If you can survive multiple face-plants en route to the big payoff, design thinking offers a chance to do great things.

Stanford has absurd amounts of money, and it’s populated by brilliant young people with lots of unstructured time.  The model fits.

In the context of community colleges, though, it’s a tough fit.

At the most basic level, we don’t have the resources to survive multiple face-plants.  Our per-student funding is a single-digit percentage of what Stanford has.  And we don’t filter out the students who need extra help, like Stanford can.  We take the students who need more, and we have less with which to do it.  

Even if we had far more resources, though, it would still be a tough fit.  Design thinking works really well with very small groups of people who can execute their own plan.  Teams of four people who convince one person with money can make it work.  But we have a shared governance tradition in which major changes aren’t supposed to occur without the advisory input of multiple constituencies across campus.  It isn’t a matter of winning over one skeptical investor; it’s a matter of winning over the faculty senate, the staff union, the trustees, and the local government.  And doing it without the promise of a huge financial payoff if the idea works.

In a shared governance setting, each constituency will prefer to make its own stamp visible on the idea.  I think that’s at the root of much resistance to either national ideas or data-driven ones; both of them have the emotional effect of outsourcing the brain work elsewhere.  Administrators like to complain about campuses that shoot down good ideas on the grounds of “not invented here,” and the complaint is often correct, but it’s rooted in a reluctance to give up the design role.  Adopting an idea from someplace else, no matter how good it is, can feel like surrendering agency; in labor-studies terms, it feels like deskilling.  That’s why otherwise-intelligent people can rattle off one half-baked objection after another to an idea that makes sense on the merits.  They aren’t actually (often) objecting to the content.  They’re objecting to the role that somebody else’s idea implies for them.  Contrarianism that’s independent of content is often based in anxiety about the prospect of a reduced role.  If the only way I can make my presence felt is to scream “NO!,” then scream I shall, whether it makes any sense or not.

That dynamic leads to all sorts of dysfunctional places and terrible decisions.  I’ve seen good ideas sacrificed on the altar of status anxiety often enough to recognize the signs early.  

Succeeding in this environment requires a very different kind of design.  It’s not about a few people hashing out an idea in September, launching it in October, and pivoting in November.  It’s about creating an environment in which large groups of people with very different emotional and material interests are both willing to acknowledge a problem and willing to bat solutions around.  The process will be slow, often frustrating, and vulnerable to all manner of provincialism and self-dealing.  But it’s fairly well-suited to a setting in which resources are scarce, results are slow, and the objects of many experiments -- students -- are vulnerable.  

I don’t want students to fail fast.  I don’t want them to fail at all.  And many of these ideas require widespread support if they’re going to work.  That’s not to say that the loudest voices are necessarily the most representative, or that a cultural ethos of “nah” deserves deference.  But it is to say that four guys in a room, no matter how smart the guys or how nice the room, can’t substitute for the cultural work we do on campus.  The challenge is doing that cultural work in the service of ideas drawn from something deeper than habit.
