Thursday, January 31, 2013

Friday Fragments, Chock Full of Linky Goodness


- The Boy continues to careen into tweenhood.  On Wednesday morning I dropped him off early at school for jazz band.  As we approached the front entrance, a girl started to cross, and I stopped to let her pass.  He got oddly quiet.  Later that night, he told me that she’s his crush, and that he was silently praying that she didn’t see him in the car.

Reader, I embarrassed him.

And so it begins.

- Walter Russell Mead can be maddening sometimes, but I read this piece a week ago and haven’t been able to shake it.  It’s basically Ortega y Gasset’s Revolt of the Masses turned upside-down.  If elite support for public higher education was historically based on a semi-fatalistic sense that the masses were rising economically -- and therefore politically -- anyway, and that it would be prudent to civilize them, then what happens when the masses seem to be sinking economically and politically?

- My kingdom for a printer that consistently works.  Printers seem weirdly prone to random malfunctions, far more so than just about any other electronics.  And they seem immune to progress.

- My current obsession is the Accelerated Learning Program at the Community College of Baltimore County.  (Check it out here.)  It’s a variation on just-in-time developmental English. Wise and worldly readers -- have any of you taught this way?  I’d love to get a first-person perspective on it.

- “What Writers Can Learn from Comedians.”  I love love love this post.  And not only because it name-checks Bob Newhart.  

- Super Bowl Sunday looms, with its usual conflicts.  I don’t watch football as much as I once did, partly due to life changes and partly due to increased awareness of what it does to players.  Also, the Bills haven’t been watchable since the Clinton administration.

The Super Bowl is usually fun to watch, since it combines spectacle, sport, and great commercials.  (Last year we all loved the Darth Vader kid who started the car with The Force.)  But the whole “bro” culture that it celebrates is not how we live our lives.  I want TB and TG to know that that culture exists, and to be conversant enough in/with it that they can navigate the world around them as it is.  But there’s a difference between acknowledging a presence and endorsing it.  

- Signs of hope...

- She got more than the usual amount of crap for it, but Lee Skallerup Bessette’s column from Wednesday, about hoping that her brilliant young daughter avoids academia, struck a chord.   

Read it generously.  This is the thoughtful reflection of an aware parent.  It’s vulnerable, but that’s what parents are.

Wednesday, January 30, 2013

Peeling the Onion


A couple years ago, The Onion ran a story headlined “Unemployment High Because People Keep Blowing Their Job Interviews.”

I was reminded of that in reading about the governor of North Carolina, Patrick McCrory, and his fusillades against the liberal arts in general and gender studies in particular.  He’s the latest in a string of governors to declare that the recession lingers because students keep studying the wrong things -- as he himself did.  (McCrory was a double major in education and political science.)  If only public colleges and universities would stop teaching the liberal arts and just focus on STEM, he implied, all would be well.

Um, no.  

The mistakes come in many layers, so I’ll just peel back a few of the more flagrant ones.

The job market for college graduates was pretty good from about 1997 to 2000; it fell apart in 2001.  Whoever invented the liberal arts in 2001 must have felt like a real jerk.  Then the market got pretty good again from about 2005 to early 2008 before falling off a cliff in late 2008.  Were 2008’s graduates markedly dumber or worse-trained than 2007’s?  Did philosophy suddenly outpace business as the most popular undergraduate major?

Questions like that are symptoms of “critical thinking.”  While some people may consider critical thinking suspect and even subversive, the business case for it is pretty strong.  It helps avoid stupid mistakes, and it helps prevent wasteful uses of resources.  It isn’t everything, of course -- creativity matters greatly, and awareness of the past is no small thing -- but without it, it’s awfully easy to fall prey to fools and fads.

Like Governor McCrory, I was a poli sci major.  But that wasn’t all I took.  I also took chemistry, and math, and music, and history, and literature, and sociology, and religion, and economics.  In the course of doing that, I learned plenty of facts that I’ll never use on the job -- did you know that there were five eclipses in 1678? -- but also some skills that I never stop using.

Unlike Governor McCrory, I have attended any number of employer advisory board meetings for various academic programs at several different colleges.  And the employer feedback is always the same, regardless of program: employers can train, but they’re counting on us to teach.  That means the basics: communication skills, work ethic, problem solving, and a basic sense of how the world works.  These can come from various places, but their traditional home -- and if Academically Adrift is correct, their most successful home -- is the liberal arts.  That’s why even our technical majors have “general education” requirements.  We want to ensure that future engineers, nurses, and chemists are capable of discerning meaning from complicated prose, of juggling multiple points of view, and of making themselves understood in writing.

The gender studies example is particularly bad.  To my mind, the hallmark of an educated mind is learning how to re-see something you thought you understood, using an entirely new perspective.  Gender studies is particularly good at that.  It’s the same skill set that the best technologists have.

Speaking of which, anyone who follows technology knows how quickly today’s cutting-edge skill becomes tomorrow’s afterthought.  I saw that firsthand at DeVry in 2001, when all those telecom majors abruptly became unemployable as Y2K came and went and the dot-com bubble burst.  If you don’t have the adaptability that comes from a deeper understanding, you can go from hot ticket to unemployable in short order.

But honestly, I don’t think even he believes what he’s saying.  Duke and Chapel Hill didn’t achieve national prominence on basketball alone, and I have to assume he knows that.  Recessions don’t happen because candidates blow interviews or because colleges teach history.  Let’s stop pretending that The Onion is the truth.

Tuesday, January 29, 2013

Thoughts on the New America Foundation Report


The New America Foundation released a wonderful and thought-provoking paper proposing a serious overhaul of Federal financial aid.  It’s a lot to digest, and I’ve only had the chance to give it a brief run-through.  That said, a few thoughts:

First, there’s plenty to like.  Turning Pell grants into entitlements -- that is, with predictable annual funding -- makes a world of sense, as does restoring summer Pell.  (The previous round of summer Pell was too brief for colleges to adjust as much as would have been helpful.)  Income-based repayments have a lot to be said for them, and direct lending is clearly and vastly preferable to running public loans through banks.  

I was heartened, too, to see an open call for restoring eligibility for students who enroll under “ability to benefit.”  Until last year, students who lacked a high school diploma or GED could take an exam and establish that they had the “ability to benefit” from higher education.  That came in handy for people in a host of different life circumstances.  With the GED becoming more expensive -- and possibly more rigorous -- losing that safety valve shuts a lot of people out.

Expanding “experimental site authority,” as the paper suggests, also strikes me as a must-do.  There are entirely too many perverse incentives in the current system, but folks on the ground are mostly powerless to do anything about them.  Letting more colleges try more innovations strikes me as a low-cost way of road-testing different approaches, and thereby generating data to show which approaches are worth broader adoption.

That said, though, I have several reservations that I’d love to see the NAF consider.

First, the proposal suggests capping financial aid eligibility at 125 percent of the “normative” time to degree.  (Right now the cap is 150 percent.)  A student would have two and a half years of aid for a two-year degree, or five years for a four-year degree.

That’s tricky on a number of levels.  Most basically, it recommits financial aid to time-based measures of learning.  If we need to get past the credit hour -- Amy Laitinen, one of the authors of the paper, has written brilliantly on that elsewhere -- then we need to allow financial aid to flow to programs that aren’t based on time. To the extent that we write normative measures of time into law, we make it that much harder for colleges to experiment with credit for prior learning, competency-based credit, and the like.  Given that one of the goals of the paper is cost containment, I would think that escaping the iron grip of Baumol’s cost disease would be a high priority.

Second, the proposal misconstrues developmental education, and inadvertently reinforces its worst features.  The paper excludes developmental students from the 125 percent limit, which sounds fine if you assume that students are either entirely developmental or entirely not, and that they take their developmental coursework all at once.  But that’s not usually the case.  Students who take developmental math, for example, frequently take college-level courses in other subjects at the same time.  Making progress towards a degree matters to students, and so does the ability to take courses that hold their interest.

At a more fundamental level, many colleges are experimenting with building developmental coursework into college-level courses in real time.  (The Carnegie Foundation has supported a variation on this through its “mathway” and “statway” projects.)  As I understand it, the mathway model involves placing most students directly into college-level courses from the outset, but then providing just-in-time extra review as they go along. (The class has extra hours to accommodate the extra help.)  The goal is to get students what they need, when they see that they need it, and in a format that allows them to perceive -- correctly -- that they’re actually progressing towards the degree.   A financial aid rule that relies on a bright line distinction between developmental and college level coursework would just get in the way.  I’d hate to see a promising academic innovation sacrificed to a financial aid rule, even if the rule were well-intended.

The 125 percent rule also makes little sense when applied to credit-bearing certificates, which are increasingly popular in community colleges.  “Stackable” certificates are all the rage now.  They work by building employable off-ramps into degree programs.  For example, on the way to a nursing degree, a student might be able to pick up a CNA certification.  That way, if the student has to stop out or drop out, she isn’t leaving empty-handed and debt-burdened.  She’s walking away with a credential that can help her get a job quickly.  Then when she returns, she’s already on her way to her nursing degree.

Stackable certificates can offer the relatively quick turnaround that appeals to adults who got laid off and need a job fast.  But unlike non-credit certificates or whatever it is that unaccredited for-profits offer, they also allow the student to make simultaneous and real progress towards a degree.

The catch is that many students who want stackable certificates have developmental needs, and those can’t be addressed if the window is too small.  The 150 percent rule is challenging enough; 125 percent would be that much worse.  

I also wonder about the political realism of the paper.  It outlines a host of proposals, but insists that they’re all connected, and that they don’t make sense if separated.  Color me nervous.  There’s a perfectly valid mathematical and ethical argument for capping student loans and eliminating tuition tax deductions to pay for increases in Pell grants.  But there’s also a strong political argument for making sure the middle class gets its own.  Once the middle class decides that a program is really just for the poor, that program tends to wither on the vine.  If the price of Pell is some tax expenditures for the middle class, that may be a price worth paying.  

Finally, I wonder if the premise of the paper is a shade too easy.  Simply put, it assumes that colleges can and will get their budgets under control if financial aid adjusts.

Two thoughts.  First, in the community college world, the idea that budgets have been anything other than tight for the last decade is just otherworldly.  For most of my administrative career, I’ve been managing austerity in one form or another.  Tuition/fee increases lately have been much more about flat or declining state support than about profligate spending.  If you don’t believe me, look at what adjuncts get paid.  It isn’t profligate by any stretch.

Second, at this point, many of the drivers of cost increases are either external or structural.  Assuming that we just need a little “discipline” is about as helpful as assuming that America could conquer its obesity problem if we all just had more will power.  It’s just not that simple.  Yes, the cost curve needs to be bent, but doing that will require a lot more room to move.  Getting more restrictive, when we should be getting more creative, isn’t the answer.

Anyway, those are some first thoughts.  Wise and worldly readers, I really recommend giving the paper a good once-over; if you have a chance, I’d love to hear your thoughts on it.  And if anyone from the NAF reads this, I’d love to hear from them, too.

Monday, January 28, 2013

Linked Online Courses?


Okay, I admit, this is crowdsourcing as a blatant attempt to save time.  Researching this formally would be quite an undertaking, but I’m hoping that some of my wise and worldly readers have seen something like this.

We’ve been experimenting with variations on “learning communities” and “linked courses.”  Different campuses define those terms differently, and not every nuance is relevant here.  For present purposes, I’m looking at two or three courses in which the students are the same but the professors and subjects change.  (Here, we call those “linked courses.”)  Everybody in Professor Freud’s Psych 101 is also in Professor Van Helsing’s Intro to Phlebotomy, say.  The idea is that the students are likelier to bond with each other, form support groups, and the like, if they see the same faces from class to class.

We’re running multiple variations on the format; right now the finding I’m comfortable sharing is that the model works best when it doesn’t take up every course in a student’s schedule.  A little variety goes a long way.  But when you offer, say, a block of three courses that students have to take together, the logistical conflicts multiply.  So many of our students work 30 or 40 hours a week for pay, often with variable hours, that the rigidity of a linked-course model may defeat some of its possible gains.

In trying to figure out how to balance the “group bonding” benefit of a cohort model with the obvious need for flexibility, someone suggested including an online course in the mix.  

I was intrigued.  On the one hand, it would obviously introduce some flexibility into the scheduling.  On the other, I’m not sure whether the group bonding would carry over from the classroom to the screen.

Has anyone out there seen or tried that?  Does it work?

Sunday, January 27, 2013

A Different Game


Are more expensive colleges better?

Working at a community college, I’d have to say “not necessarily.”  They could be, and sometimes they are, but it depends on how they use that money.  That’s why this piece -- about a study of liberal arts colleges showing little relation between cost and value -- didn’t especially surprise me.

In the community college sector, we focus intently on teaching the first two years of the undergraduate curriculum, along with some pre-undergraduate (“developmental”) material.  We also do a fair amount of non-credit workforce development and personal enrichment instruction.  But we don’t do junior and senior level courses, graduate education, high-profile sports, or cutting-edge scientific research outside of the scholarship of teaching and learning.  

Community colleges tend to spend the least of any sector, and to charge even less than that.  (Public subsidies make up the difference.)  That means there isn’t much to spend on non-essentials.  Budgets tend to be modest, and focused pretty clearly on instruction.  Even student services tend to be focused on helping students succeed academically, such as tutoring or providing accessibility technology.  We don’t have a climbing wall, let alone a football team.  (Though to be fair, that may be a Northeastern thing; I’ve heard of community colleges in other states with football teams.)  

Colleges that do those other things are playing a different game, in some ways.  In that sense, direct comparisons can be difficult.  

Does having a football team help first-year students learn?  

Directly, no.  But indirectly, it’s possible: if a successful football team leads to alumni giving, and that giving amounts to far more than is spent on football, then the payoff may have salutary effects on students in the first year, whether through scholarships, general operating support, or gifts to various programs.  If the football program is high-profile enough -- I’m thinking here of the University of Michigan -- it can draw strong students who want the football Saturday experience; their presence enriches the experience of other students who couldn’t care less about football.

And that’s where I see the more expensive schools playing a different game.  Community colleges (and most tuition-driven colleges) have to focus pretty intently on the core function.  Grants are helpful in developing experiments and taking the longer view, but even there, the experiments and view in question are pretty close to the core function.  For example, would shortening a developmental sequence improve student completion rates?

But the higher-cost places are doing what pool players call bank shots.  Instead of focusing more resources on the core function, many of them choose to pour their extra money into ancillary functions, in the hope of harvesting even greater incidental payoffs over time.  Those payoffs could come from technology transfer, or alumni giving, or political favor, or nearly anything else.  If we have the best rowing team, then the alums of our rowing team will give more!  If we become a national power in basketball, then we’ll attract the upper-middle-class kids who love sports!

Bank shots are great when they work.  But they don’t always work, and even when they do, the payoff is often delayed.  I endured the godawful Rutgers football teams of the ’90s when I was in grad school there, and it was hard to watch all that money being spent on a series of terrible teams while graduate students lived three to an apartment in New Brunswick.  I can safely say that the money the typical community college spends on a tutoring center does far more for first-year student learning than the money that Rutgers spent trying to pretend it was Ann Arbor.

The paper elicited plenty of critiques in the comments on IHE, including some pretty good ones.  Yes, it was based on self-reporting, which is notoriously sketchy.  Yes, different colleges categorize spending differently.  But I’m thinking that if you go cross-sectoral -- which, to be fair, the original paper didn’t -- you’d find that some of the gap comes from having different goals.  If your goal is to help the student right in front of you right now, you’ll get a much better bang for your buck at many community colleges than you would at some much more expensive places.  The more expensive places have something else in mind.  

Thursday, January 24, 2013

On Googling Job Candidates


I don’t know if I’m just the fluky exception or if this is indicative of a larger truth, but I don’t Google job candidates.  

I hadn’t really thought about it until the MLA conference.  In the course of discussion there, someone who’s on the market talked about what hiring committees find when they Google her.  I mentioned that I don’t Google candidates. Everyone acted like I had admitted still believing in the Tooth Fairy.  But my practice makes sense, and I suspect I’m not the only one.

Hiring managers, and people on search committees, have to go through some pretty specific training from HR about the kinds of questions we are and aren’t allowed to ask.  Certain topics are entirely off limits unless the candidate volunteers them; even then, you shouldn’t pursue them.

The idea behind those rules is twofold.  First, certain kinds of personal knowledge can form the basis of biased judgments, and we don’t want those to prevail.  That can happen inadvertently among people of conscious goodwill, so it’s better just to prevent the opportunity from arising.  Second, once you know something, it’s impossible to un-know it.  If I didn’t know that Jen was a Yankee fan, then my decision not to hire her had nothing to do with my low opinion of Yankee fans.  But if I did know, it’s hard to disprove the influence.

In an interview, you can craft your questions to avoid troublesome areas.  When checking references, you can do pretty much the same thing, adjusting for the third person.  Yes, some people slip through the interview and then disappoint when they get the job, but I’ve had pretty great results over the years with the people I’ve hired.  Not perfect, but the batting average is far better than, say, Jeter’s.  With experience and forethought, it’s possible to learn much of what you actually need to know through the applications and interviews.

But on Google, anything goes, and it goes with varying degrees of accuracy.  And I say that as an avowed fan of Google.

Some people have relatively common names.  (If you Google my name, you land first on a triathlete.  It’s not a striking resemblance.)  Even if you find the right person, you may or may not be able to trust the source on which you landed.  Or, worse, you may find out something you can’t un-know.  And then you have a real problem.

I just have a hard time squaring the relatively stringent rules for interview questions with the anything-goes information available online.  My personal solution is to separate the two.  There may be a better way, but I haven’t found it.

Yet for all this, job candidates seem to take it as given that they get Googled.  And maybe they do.

Wise and worldly readers, have you found or seen a way to square the increasingly strict rules about interview questions with the ubiquity of information on the web?  

Wednesday, January 23, 2013

But Wait! There’s More!


My Dad used to teach courses on advertising, which was how he justified his weird attention to commercials.  He used to pick up on odd turns of phrase from locally produced ads -- anyone who lived in Rochester circa 1980 will remember those awful/great Hill T.V. ads, with Dick and Linda Hill -- or anything by Ronco.  Three easy payments!  But wait, there’s more!  Now how much would you pay?  If you act now...

He may have been ahead of his time.

The kinds of payment gimmicks that used to be restricted to the appliance or pocket-fisherman biz are finding their way into higher education.

Last week, Union College (KY) announced that it would allow students to take their last semester for free if they got good grades and engaged in campus life.  The idea was to offer a tangible reward for sticking around and doing well.

This week, Cleveland State, Florida International, Lamar and Utah State Universities and the Universities of Arkansas, Cincinnati, Texas at Arlington and West Florida agreed to offer a first course free in the MOOC format, as a way of enticing students to stick around after the freebie is done.  

(If memory serves, a certain blogger suggested a deposit refundable upon graduation in these very pages back in 2010.  Just sayin’.)

Higher ed pricing is usually somewhere between obscure and opaque.  There’s a “sticker price,” sort of, that covers the basics.  (It’s usually called tuition.)  There are fees for various purposes -- lab courses, technology, high-cost programs, student life, or even, in the case of Worcester State, walking (!).  For residential colleges, add room and board.  Then start subtracting.  “High tuition/high aid” colleges live and die by the “discount rate,” the amount by which they’re willing to knock down the sticker price (“presidential scholarships”) for various students.  Financial aid in various forms, scholarships of all shapes and sizes, and both need- and merit-based programs affect how much a student actually has to pony up.

The advantage of the complexity of higher ed pricing is that it supports the kind of institution-based redistribution that allows students from economically modest backgrounds to get far more than they could ever pay for.  There’s real virtue in that.  It also tends to tamp down competition on price, directing competition instead towards perceived quality and various amenities.  While there’s plenty of noise in the ways people perceive quality, there’s a valid argument to the effect that maintaining some sort of meaningful standards for higher education is a real public good.  

But when complexity starts to become opacity, especially in the face of faster-than-inflation increases over time, some potential students probably get discouraged.  And that’s where the radical simplicity of the “free semester as a senior” model can cut through the noise.  

Admittedly, it’s a little jarring to hear “the first one is free” coming from people who sell intellectual growth rather than, say, crystal meth.  But there’s no principled reason that the “loss leader” model couldn’t work for colleges too.

My fear is that we’re inadvertently replaying the old airline model from the 1970’s.  When fares were regulated, airlines competed on amenities.  When price competition became viable, a sort of race to the bottom ensued; now airlines compete almost entirely on price (and hidden fees), and they pass the savings on to you by making flying as unpleasant as humanly possible.  Any pretense of luxury is long gone; now you’re lucky to get a bag of peanuts, and legroom for anyone over about five foot ten is a distant dream.

That’s not entirely bad, except for the legroom.  Flights are usually pretty brief, in the grand scheme of things, and some brief unpleasantness may be a fair trade for a lower price.  If you get from point A to point B safely and affordably, the flight has done what it set out to do; you can accept the take-off-your-shoes ritual as a cost of doing business.

But college isn’t like that, and shouldn’t be.  Real education takes time and investment.  I don’t think our current methods are the only possible methods, heaven knows, but I’d hate to see us race to the bottom.  

If we want to avoid falling into the same category as the inside-the-egg-scrambler, we need as a sector to give some conscious and deliberate thought to how to blend quality and economic sustainability.  That will probably require some pretty significant changes.  We can’t rely on the old Pan Am model forever.  The world is changing whether we give it permission or not.  Colleges starting to adopt the marketing tactics of cheesy television commercials may be understandable and even sort of refreshing, but it probably doesn’t lead anywhere we want to go.  

If we don’t do our homework in a serious way, that’s where we’re heading.  Now how much would you pay?

Tuesday, January 22, 2013

Wandering Eyes...


Nate Kreuter’s latest is well worth checking out.  It’s about the fear -- sometimes founded, sometimes not -- that looking for another job will be held against you by your current employer.

Academia has some pretty conflicted attitudes on this.  On one side, there’s a popular belief -- which may be true in some settings -- to the effect that “disloyalty” to an employer will be punished.  (This is one of the only areas in which graduate students have an advantage -- nobody takes umbrage when they go on the market.)  On the other side, many colleges and universities will give pay raises only as counteroffers.  If you buy the “loyalty” framework, then the only way to get ahead is to be disloyal, whether you mean it or you’re just looking to provoke a counteroffer.

Both strike me as dysfunctional.  

In my own unionized setting, there’s no such thing as a counteroffer for faculty positions.  Salaries are determined by a formula that’s spelled out in the contract.  While that can be frustrating at the point of recruitment, it does reduce the incentive to solicit offers just for the sake of soliciting them.  If you receive a better offer, taking it or not is your call; I couldn’t respond if I wanted to.  We’ve lost some great people to better offers elsewhere, and that always hurts, but that’s the nature of the system.  That’s hardly universal, though.

I’ll say upfront that anyone who blames an adjunct for looking for full-time work is a jerk.  Wanting a living wage and health insurance is entirely rational, and to the extent that employers like the flexibility of adjunct faculty, they should realize that flexibility cuts two ways.  On the occasions that adjuncts in my areas have found full-time jobs elsewhere, I’ve just congratulated them and wished them well.  Piece rates are bad enough without asserting some kind of ownership.

But I really don’t see the ethical violation in anybody looking for other positions, either, even if they already have a full-time position.  Some people have survivor guilt about having full-time jobs in this market, but if you don’t like where you are, I don’t know who you’re helping by staying there.  It’s not a crime to want to be happy.

I’m more conflicted about the period after receiving an offer.  It’s fine to ask for a set time period to think the offer over, discuss it with a significant other, and so forth.  That’s standard, and it’s generally accepted to bargain a bit over the length of the waiting period if you’re waiting on another prospect to come in.  But saying “yes” to an offer and then backing out later -- especially if it’s significantly later -- can do real harm.  By that point, the college has turned away other applicants and has started making plans; leaving it in the lurch can leave a bitter taste.  

On the other hand, though, I’d be hard-pressed to explain why a change of heart shortly before starting is bad, but a change of heart shortly after starting is okay.  Viscerally, the two feel different, but I’d have trouble defending that perspective.

I like to think that service to the profession dictates that when someone gets a better offer, the polite thing to do is to take it as a compliment.  If someone else thinks that the person I hired at a middling salary is worth much more, well, I must have great taste in hiring.  

Wise and worldly readers, does it ever make sense to punish people just for looking?  As long as they aren’t neglecting their current job, I’m having a hard time seeing it.

Monday, January 21, 2013

Syllabus Bloat?


An alert reader sent me this piece by a professor complaining about syllabus bloat.  I had to smile in recognition.

In college, I don’t remember spending much time worrying about syllabi.  They’d typically be about a page, and would include the professor’s name, office number, office hours, the course title and number, sometimes a brief description of the topic of the course, a list of major (graded) assignments, some due dates, a list of required books, and maybe a late paper policy.  Many of them looked like they had been copied from copies from years before, and they may well have been.

In grad school, syllabi were even more bare-bones.  I remember one that was a half-page long, and students who had taken that professor before marveled that one existed at all.

Now, of course, syllabi are usually multiple pages, with much more mandated content.  Some of the new stuff is pretty unobjectionable -- email addresses are now standard, which seems reasonable to me -- but the new content often goes far beyond new versions of contact information.  If you were happy with the old version, the new version may seem cumbersome and unwieldy.  

Unfortunately for those of us who prefer brevity, there are actually good reasons for some of the expansion.  In other words, I don’t see the trend reversing unless and until we devise other ways to address the valid concerns that expanded syllabi address now.

As students have become more willing to challenge the grading judgments of professors -- and courts have become more willing to hear them -- it has become harder to fall back on the old “appeal to authority” as the answer to any challenge.  “Because I said so” doesn’t hold up in court. If a student comes forward with an allegation of some sort of irregularity in grading -- Susie got an extension but Johnny didn’t, say -- the first line of inquiry is the syllabus.  What rules did the professor set out at the beginning of the term?

From an administrative perspective, it’s usually pretty easy to defend anything that’s clearly stated, and not completely insane, on the syllabus.  If your syllabus says that you give four exams and count three, then that’s that.  If it says that late papers are penalized one letter grade per week, or that absences beyond the first three will result in set deductions, or that you drop the two lowest quizzes, then so be it.  When the ground rules are clearly stated and evenly applied, they’re easily defended from challenges.  

Things get trickier when the written terms are ambiguous or absent.  I can easily defend a determination that a student had more than three, or five, or eight absences.  I have a harder time defending a determination that her absences were “excessive,” if no other clarification is given.  Are two missed classes excessive?  Four?  What if your colleague in the same department allows five?  If a syllabus says that a professor “will” deduct points for lateness, that’s fine; if it says she “may” deduct points, I get nervous.  

The real nightmares come from rules improvised on the fly.  When a professor changes the rules of the course halfway through the semester, those changes are much harder to defend, even if they’re reasonable in themselves.  Sometimes that’s unavoidable, as when the original professor gets sick and someone else has to take over, but that tends to be the exception.  In practice, students tend not to contest deletions of assignments or extensions of deadlines, but they do contest additions.

Of course, there’s a distinction between a syllabus for a given section and a generic syllabus for an entire course.  Ideally, the latter is devised by the entire department, and it gives room for customization.  The rule of thumb should be that the generic syllabus gives goals, and the specific syllabus gives means.  If the math department decides that fractions should be covered in the first developmental course, then the folks who teach the second one will assume that students have already had fractions.  If too many instructors at the first level decide to skip fractions, they’re setting up those students for failure.  Some level of agreement about content areas is necessary to make the sequence work.  (In my case, categories of what must be included in a syllabus are spelled out in the faculty union contract, but I know that’s not true everywhere.)

Where the boundary between generic and specific syllabi lies can be a bit murky in practice, but the principle seems clear.

Outcomes assessment adds another layer.  Again, here, the principle seems clear: students should know upfront what the goals of the course are.  How much specificity is needed to get the job done is a judgment call, but it’s hard to argue with the concept.  

I know it’s a pain to write them, but relatively detailed syllabi can save a world of time later.  If the policies about lateness, plagiarism, disruptive conduct, and the like are already spelled out, then backing them up is easy.  If the professor makes up policies on the fly, backing them up is a lot harder.  It may be inconvenient that we live in a world in which the old appeal to authority doesn’t work, but that ship has sailed.  Better to protect yourself upfront.

Thursday, January 17, 2013

Friday Fragments


The kids (and TW) had a snow day earlier this week, so she took them sledding.  At one point, when she was standing on a sled, the sled went one way and she went another, with her knee sort of splitting the difference.

Yesterday was devoted to the doctor’s office and the pharmacy, and to avoiding stairs as much as possible.  

She’ll be okay -- it’s a bad sprain -- but there’s nothing like an abrupt loss of ability to shine a new light on a house’s layout.  As a shift of perspective, there’s value in it.  Of course, I say that as the one who isn’t in pain...

----

The academic Twitterverse has been all about Moody’s “downgrade” of higher education.  There’s no shortage of snark -- this is the same Moody’s that applied AAA ratings to CDO’s that represented little more than nonsense on stilts -- but it’s hard to take issue with some of the underlying reasoning.  

As perverse as it may sound, though, the report actually gave me hope.  The report is a warning shot, and the danger of which we’re being warned is both real and severe.  In other words, it’s at least possible that the report will contribute to a badly-needed sense of urgency to make some fundamental changes.

There’s no shortage of clues that the current path is unsustainable.  Just this week, a small, tuition-driven private college in Minnesota -- the College of Visual Arts -- announced that it will close at the end of this semester.  The University of Phoenix is reducing campuses and staff.  Discount rates at many small privates are getting to the point where further tuition increases are simply futile.  The decades-long trend of adjunctification is nothing if not an effective admission that the underlying economic model is straining. Better to connect the dots now than to wait for the next round of casualties.

I just hope that in connecting those dots, we don’t forget to consider health care costs.  

----

Okay, Netflix, you win.  I wrote off Netflix about six months ago, and really hadn’t missed it since.  But then it offered Freaks and Geeks on streaming.

TW and I were among the six or seven Americans who caught the show on its first run.  Most people who remember it at all -- the few, the proud -- remember it as the breakthrough moment for Seth Rogen, Jason Segel, and James Franco.  But has there ever been a more fully realized young female character on television than Linda Cardellini’s Lindsay Weir?

Watching it now, the show holds up remarkably well.  I’m even more sympathetic to Lindsay’s Dad than I was the first time -- age has a way of sneaking up on you -- and the ambivalence of the characters still rings true.  Lindsay Weir is an utterly believable character: she’s exasperated by her parents, but she still loves them and doesn’t want to disappoint them.  She likes the bad boys, but not too bad, and she knows when they’re bluffing.  She’s both a quasi-burnout and a mathlete.  

After that show, I expected Linda Cardellini to be the breakout star.  She pops up in things from time to time, but she’s nowhere near the level of her male costars.  Watching the show now, I still think she should have been the breakout star.  

----

Governor Patrick revealed his budget proposals for next year earlier this week.  I’ll set aside the higher education part for now and focus on the part I consider the single best idea to come from the state in years: a passenger rail line running from Springfield to Boston, via Worcester.

For folks who don’t know Massachusetts well, Springfield is about 90 miles due west of Boston, with Worcester in between.  They are the three largest cities in the state.  Right now, if you wanted to take a train from Springfield to Boston, you’d have to go down to New Haven, Connecticut, and switch.  That’s roughly the equivalent of flying from New York to Chicago via Atlanta.  It’s silly.

A direct line has all manner of virtues, not the least of which is avoiding gridlock on the Mass Pike into Boston.  (The last couple of times I’ve driven it, the last five miles took almost as long as the first 85.)  If they’re really smart, they’ll have it terminate someplace where you could catch the T.  (That’s how New Jersey Transit handles going into New York City; you can switch to almost anything in Penn Station.)  Add a stop at Logan, and something close to Fenway, and you have a winner.  

Based on what I saw in Jersey, too, a rail line should have highly salutary effects on the economies of the places it stops.  This is no small thing.

An economic development idea that benefits the entire state, gets cars off the road, and makes it easier to get around?  Yes, please.

Wednesday, January 16, 2013

Safety Nets


American Career Institutes, a for-profit higher ed chain that specialized in computer-based career-oriented majors, just closed abruptly.  It had campuses in Maryland and Massachusetts, including one in Springfield, which is HCC’s largest feeder city.  

The non-credit side of the college is taking a look at its offerings to see what it can provide to the students who were stranded.  If there’s a way to “teach out” a program so that they can walk away with something to show for their efforts, that could minimize the damage.

We provide a safety net when experiments fail.  Nobody else does that.  In doing so, we make it possible for experiments to happen in the first place.  Only the public sector is capable of doing that reliably.

This is the glaring hole in American political discourse.  We talk about risk and safety as if they were somehow mutually exclusive.  We forget that safety can enable risk-taking.

I was reminded of that a few days ago, in a discussion with a Canadian colleague.  We have similar senses of humor, so we got to talking about The Kids In The Hall, SCTV, and national styles of humor.  (For my money, “Brain Candy” is a neglected classic of dark, dark, dark comedy.)  She offered the theory that Canada punches above its weight culturally because its social safety net -- health care most conspicuously -- makes it possible for people to take chances on creative careers.  As a result, they get Holly Cole, and we’re left with Adam Sandler.  

It’s not a perfect theory -- we aren’t exactly suffering from a shortage of actors -- but it does address something I’ve seen before.  I’ve known plenty of people who stuck with jobs they didn’t particularly like, specifically for the benefits, so their spouses could try startups.  Having one spouse ensure that nothing catastrophic happened made it possible for the other one to go out on a limb.  No benefits, no risk-taking.

Yes, there’s such a thing as too much security.  I had to smile at the article earlier this week that showed that college students whose parents pay their entire way get lower grades than students who work at least part-time.  If things are just a little too pat, it can be easy to get distracted.  

But even that security is often illusory.  Life tenure, for example, is only as secure as the institution that offers it.  Pensions are only as secure as the states that guarantee them want them to be.  At some point, risk as a fact of life will assert itself.  It can’t not.

So yes, we’ll help the students stranded by the latest for-profit to fold.  It’s what we do.  It may not be maximally efficient in isolation, but it’s the kind of thing that makes real progress possible.  Sometimes it’s worth connecting those dots explicitly.

Tuesday, January 15, 2013

Turning In to the Skid


Growing up in western New York, I learned to drive on snow.  Since I didn’t come from money, I learned to drive small cars on snow.  

Small cars are relatively light, and the snow and ice around Rochester could be impressive.  A small car braking on black ice is pretty much a hockey puck.  So to survive, I had to learn the counterintuitive truth that Northern drivers learn early: when you start to skid, turn in the direction of the skid.  You get control back much more quickly that way.  If you refuse to acknowledge the skid, or fight it, you lose control completely and crash.  

Correcting for climate, I was reminded of that when reading the latest from San Jose State.

Apparently, San Jose State University has contracted with Udacity to run credit-bearing basic algebra classes -- both developmental and college-level -- at a cost to students of $150.  

Some folks are already manning the battle stations.  My favorite, from the vice president of the San Jose State faculty union chapter:


“My personal opinion is that it’s not by accident that this is being announced at a time when most faculty are not on campus, but I have no evidence for that,” said Preston Rudy, a sociology professor at San Jose State who serves as vice president of the chapter.
(emphasis added)


It has to be sinister.  It just has to!  What other explanation could there possibly be?

And that’s where the conversation should start.  What other explanation could there be?  What’s the appeal?  

Cathy Davidson claimed earlier this week that “if we (profs) can be replaced by a computer screen, we should be.”  Her point was that it’s no longer plausible to argue that face-to-face instruction is clearly the only possible way to convey information.  If the best instruction that a college can offer is a sage on a stage lecturing to 300 freshmen, whom that sage will then duck to get back to writing, then it’s hard to argue that a video presentation would be markedly worse.  If anything, it may be better; at least with a video, you can play back the parts you missed the first time.  And the cost advantage is not to be ignored, particularly when tuition and student loan burdens are the highest they have ever been, even after adjusting for inflation.

Davidson is gracious enough not to say so, but the dirty little secret we all know is that the massive lecture was only ever an economic expedient; it was never a particularly effective way to teach.  Replacing one economic expedient with another, more effective one hardly constitutes an outrage.  

The limits of the traditional approach are particularly clear when we look at student pass rates in developmental and lower-level classes.  Nationally, there’s nothing unusual about a 50 percent fail rate for a developmental math class.  Early MOOCs have had even worse attrition rates, but that’s hardly an apples-to-apples comparison; most enrollees in the first wave of MOOCs had nothing at stake.  Motivation matters.  San Jose State already ran its “circuits and electronics” course as a blended MOOC, and found that pass rates were actually higher than in the traditional class.  Whether the same will be true on the “lower” end of the curriculum isn’t obvious, but it isn’t preposterous, either.  And if pass rates turn out to be higher, I’d like to hear the argument against it.

And here’s where I remembered what it felt like doing my first donut on an icy hill in Mom’s Ford Escort in 1985.  

If you read the earlier paragraph carefully, you’ll notice the word “blended.”  Students in the blended class did better than students in the traditional class.  They also did better than students in pure MOOCs.  

To the extent that results matter -- as opposed to tradition, politics, Luddism, or technophilia -- it looks like we get the best results when we turn into the skid.  When faculty use MOOCs as resources, rather than attack them as threats, students thrive.  MOOCs could offer one way to “flip the classroom,” moving exposition outside of class time so the people in the room can focus on understanding, applying, and questioning.  They can free up faculty to work with students on the more interesting (and idiosyncratic) process of internalizing knowledge, coming to grips with it, and sometimes even attacking it.

In a sense, I’m suggesting using MOOCs in ways similar to the ways professors have long used books.  They can be wonderful outside-of-class-time resources for introducing new material.  (They can also be wonderful in-class resources for closely guided analysis.)  But unlike books, they come with real-time data analytics, so they can be refined as they go.  And they’re a hell of a lot cheaper for students, which is no small thing.

TechCrunch opined yesterday that San Jose State’s move “spells the end of higher education as we know it.”  I suspect that higher education will outlast TechCrunch.  But the teaching side of higher education will only thrive if it’s able to turn into the skid and use the new resources to its advantage.  This is not the time to jam on the brakes.  If we do, we’ll crash, and be replaced -- rightly -- by Davidson’s screens.  This is the time to use some unexpected momentum to get back on the track we should have been on in the first place.

Monday, January 14, 2013

There’s Planning, and There’s Planning


I read somewhere that the mark of an educated mind is the ability to hold two contradictory thoughts at the same time.  If that’s true, then I’m feeling particularly educated of late.

In a discussion about planning for the next several years, I realized that I’m stuck believing two very different ideas.  

On the one side, I accept that good forward-looking plans are relatively concrete, with measurable goals and specific ways of getting there from here.  That’s what distinguishes them from daydreams.  In “strategic plans” as such, goals are usually written as “we will increase graduation rates by x percent by year y, by using the following interventions.”  The prose is ghastly, but the idea is to tie budgeting to some sort of conscious purpose in a deliberate way.  At that level, it’s hard to object.  As dreary as they are to read, plans like these can prevent good intentions from coming to grief on the shoals of unconscious incrementalism.  Tying strategic planning to budgeting offers the prospect of actually putting money where it needs to go.  This is no small thing.

On the other, though, I’m increasingly convinced that the real issue is less about improving this percentage or that one by a few points, and more about recognizing and coming to terms with much larger changes in higher education.  Given the reality of Baumol’s cost disease and increasing political friction around student debt, how should we revisit the ways we use various online resources?  What would a competency-based system look like, as opposed to a credit hour system?  How can we change the academic calendar to help students be more successful?  

The problem is that, in practice, the two ways of thinking tend to conflict.  (They don’t have to, definitionally, but they usually do.)  It’s hard to specify in advance concrete, measurable outcomes for such speculative questions.  You can’t nail down the future like that.  But time and energy spent on one set of questions typically takes time and energy away from the other.

I’ve heard that some tech companies -- 3M and Google, famously, but I’m sure there are others -- set aside time within certain employees’ workweeks for working on speculative projects.  (I think that’s where Gmail came from.)  I’d love to have some sort of venue on campus for something like that, but it would be easy for it to fall prey to hobbyhorses.  To work, the discussion would have to be both relatively constrained in terms of topics -- the first ground rule would be “no nostalgia” -- and relatively open and rigorous in terms of treatment.  In other words, no “brainstorming” in the traditional sense -- people would have to be able to raise objections, poke holes, and refine.

I’m not quite sure how to translate something like that to a community college setting.

Wise and worldly readers, has anyone seen this done well in a campus setting?  I’d love to get responses in the comments, but anyone who’d rather reply privately can email me at deandad (at) gmail (dot) com.  This seems like the right moment, but I’m having a hell of a time figuring out the mechanics.  Any wisdom born of experience would be welcome.
I’d hate for our plans to miss tectonic shifts because we were too focused on things we could define, and measure, in advance.

Sunday, January 13, 2013

Personification


There’s a great scene in the film version of The Grapes of Wrath that I don’t remember in the book.  A farmer is facing down a man on a bulldozer; the man on the bulldozer has been hired by the bank to repossess the farm, because the farmer has defaulted on a loan.  

The farmer pulls a rifle and points it at the man on the bulldozer, threatening to kill him if he gets any closer.  The man on the bulldozer objects that he’s just doing his job, he needs the job, and it’s not his fault that the loan is due; if the farmer shoots him, there will just be someone else the next day.  Deflated, the farmer asks plaintively, “Well, who do I shoot?”

I was reminded of that scene last week.  In the comments to the post about administrators as inkblot tests, someone mentioned that on campus, administrators are often taken as personifications of larger forces.  Since it’s impossible to attack those larger forces, some people use the nearest administrator as a sort of stand-in.  You can’t attack the recession, but you can take a dig at your dean.  

However long I’ve been doing this, I’m still not entirely used to that.

In my early days of administration, I used to sort of distance myself from my role.  It didn’t work; if anything, it felt like bad faith.  I had to learn quickly that certain comments that were entirely fine in my faculty role were not fine in a dean’s role.  At the end of the day, if you’re the one writing somebody’s evaluation, then you need to own that.  You can’t do an evaluation ironically.

(This isn’t unique to administration, of course.  Grad students teaching their first classes, especially if they’re young, sometimes make a similar mistake.  If you’re the one assigning the grade, then you’re the authority figure, whether you feel like an imposter or not.  At some point, you need to make peace with that if you’re going to do it well.  And you’ll find that some students will have issues with anyone they consider an authority figure, no matter what you do.)

Being taken as the personification of some larger abstraction can be disorienting.  Most people don’t think of themselves that way -- it would be weird if they did -- and they don’t make their choices as conscious representatives of abstractions.  The shock is particularly strong when moving directly from a faculty role, in which there’s so much more autonomy.  In graduate school and then on faculty, there’s a tremendous premium on fine-grained distinctions.  After many years of that, it’s hard not to be put off by reductionist tirades against “the adminisTRAtion.”  Administrations consist of many moving parts, each with its own imperatives, and a host of different personalities.  They aren’t the Borg.  But some people won’t make the distinction.

You’ll also find yourself blamed personally for decisions made (or allegedly made) by your predecessors.  The trick is in not getting defensive before you know what actually happened.  That can be tough when you’re under attack, but it comes with the job.

I’m posting this not in a spirit of whining, but in a spirit of warning.  New admins are often blindsided by the abrupt shift in the ways people act towards them.  Some people just need a villain to hold their story together.  Knowing that you’re the villain ex officio can help, even if you never completely get used to it.

Thursday, January 10, 2013

Friday Fragments


The book is out, and shipping!  

--

Fearless prediction time: The City College of San Francisco will get some sort of extension -- possibly presented as “probationary” -- on its deadline to show improvement or lose accreditation.  Given the size of the college, and the political impact of a loss of accreditation (which would almost certainly lead to closure), I just don’t see the hammer falling in March.

Unfortunately, it appears that many of the folks who would need to agree to drastic changes are making that same calculation, and dragging their feet as a result.

The saving grace, perversely enough, may be rapidly declining enrollments.  It’s one thing to argue with an accreditor; it’s quite another to argue with the public.  Here’s hoping that sane minds prevail on campus and they get their stuff together before the death spiral becomes unstoppable.

--

I don’t know if the “flex degrees” that Wisconsin is rolling out will work, but they strike me as a promising start.  It appears that they’ll be focused more on competencies than on credits, which is a prerequisite to any meaningful progress.

I’m hopeful that this option, and others like it, become sufficiently popular that the Department of Education starts to get a little more willing to move on financial aid rules.  Right now, those are the greatest barriers to campus innovation.  Charge the Department of Ed with improving results, rather than preventing change, and we can get somewhere.  On, Wisconsin!

--

Voice is data.  I don’t mind so much paying for monthly data use, but breaking out voice as its own separate item leaves a bad taste.  Why can’t it all just fall under data?  Verizon, I’m looking at youuu...

--

This story about Pell grant restrictions struck me as inevitable.  Among the effects of reducing Pell grant eligibility are reduced enrollments by low-income people, increased reliance on loans (especially private ones that aren’t subject to federal limits), and increased time spent on working for pay, rather than studying.  

Historically, social mobility has been an antidote to the political effects of economic polarization.  When that mobility goes away, it’s harder to explain to a have-not why she shouldn’t resent the haves.  There was a time when conservatives understood this.  Nelson Rockefeller didn’t pour money into SUNY because he was a closet socialist; he poured money into SUNY to offer people who might otherwise have been threatening a stake in the system.

And yes, degrees do still help with mobility.  I noted with gratitude the latest study showing that while a degree isn’t any sort of guarantee, in the aggregate, you’re much better off economically having one than not having one.  That’s why recessions increase enrollments in community colleges.  (That, and the lower opportunity cost of enrolling when other opportunities are scarce.)  Folks on the ground know that education matters.

I’d rather have students work fewer hours for pay, have time to study, and graduate with less debt.  I just don’t see what’s so radical or threatening about that.