Monday, July 31, 2017
On Monday I had the chance to address the annual gathering of state directors of community college systems, in Nashville. It’s a knowledgeable and self-assured group, unafraid to challenge speakers. That’s good and bad. The discussion was lively, which was fun, but I had the embarrassing experience of realizing live, in front of an audience, that I hadn’t fully thought through one of my points. This will be an attempt to think it through a little more, even if not fully.
(Just one aside about Nashville, which was lovely: a country band I saw on Sunday ended its set with “Comfortably Numb,” by Pink Floyd. It sort of worked, but it would never have occurred to me that I’d hear that song by a country band.)
The point was about the geographic distribution of wealth in the US. Roughly half of the community colleges in the country were built in the 1960’s, and the vast majority of the rest went up within a few years of that decade, on either side. That was the height of the geographic dispersion of wealth across the country. At that time, you could find solid middle classes in most of the country. Productivity, in economic terms, was relatively even across the states. There were exceptions, such as Appalachia and the rural deep South, but they were exceptions. Community colleges fit that economy well, and found an enthusiastic reception because they fit emerging local needs.
Fifty years later, the economy has changed. Now, wealth is much more concentrated, both socially and geographically. A few major metros have taken off spectacularly, and a few smaller ones have gained real momentum. But much of the country is stagnating or declining. You can see the difference easily in housing costs. If you compare the ratio of costs between, say, Buffalo and New York City fifty years ago to the ratio now, you see New York City really pulling away. It’s not a perfect indicator, but it’s a pretty good one. And the same holds if you compare rural areas to urban ones in various states.
But the geographic distribution of community colleges, and the funding mechanisms for them, remain pretty much what they were fifty years ago. That means that in some areas of the country, they’re preparing students to leave. That doesn’t fit cleanly with local funding, local political support, or local philanthropy.
Someone objected that, in fact, community colleges supply the workforce that will attract new businesses. Yes and no. They supply the workforce that _may_ attract new businesses. Nearly any area is better off with a well-prepared workforce than without one. (My favorite argument for public higher education in both Massachusetts and New Jersey: we don’t have oil, or sunny/warm climates, or cheap land. If we’re going to compete, it’s going to be on quality of workers.) But in areas of the country that have shrinking populations and years of economic decline behind them, the argument that supply of workers creates demand for them may be hard to sustain. Heck, the academic job market itself refutes the idea that supply creates demand.
Someone else challenged me with some variation on “so you’re saying that we have too many community colleges, especially in rural areas?” (I didn’t write it verbatim.) That wasn’t the intent, but I was caught flatfooted by the interpretation. No, that wasn’t what I was saying. So what was I saying?
I see it more as an argument for rethinking both funding and delivery. To the extent that students are likelier to pack up and leave after graduation than they used to be, I see an argument for shifting the funding source up the food chain. (After the talk, someone from EMSI told me that now, about 70% of community college grads remain in the local area. That’s high in absolute terms, but twenty years ago it was 85%. In other words, in the last generation, the percentage that leaves has doubled, and is still climbing.) Put differently: during the same generation that has seen significant reductions in state-level support, graduates have become likelier to conduct job searches statewide. The growing mismatch between funding sources and reality on the ground may explain part of why it’s harder to get support than it once was.
Historically, community colleges have distinguished themselves from “state” colleges. Typically, the former offered associate degrees and the latter offered bachelor’s. (Florida and Ohio, among others, call some community colleges “state” colleges, but the general pattern holds.) But the distinction may be getting harder to justify. It may be time to speak of a higher ed ecosystem, with the different parts feeding each other. That may also help justify arguments for cross-sector funding parity, which would be a welcome change.
I can’t pretend that I have a fully thought-out answer, but the line of inquiry strikes me as important. What are the implications for a distributed system when wealth has concentrated in a few specific places?
Anyway, my thanks to the state system directors for the chance to speak to them, and for correctly identifying a gap in my argument. Wise and worldly readers, I need your help. What are other implications of wealth concentration for what community colleges should do?
Sunday, July 30, 2017
A Twitter exchange Sunday between Sara Goldrick-Rab and Ken Lindblom touched on a favorite topic, but one that I don’t think we take seriously enough. SGR asked for a definition of “college readiness” in accessible language; Lindblom responded that graduate schools don’t teach accessible writing.
They don’t. It’s a real problem.
The postmodernist trend of the 90’s had its strengths, but one of its greatest flaws was a semi-intentional premium on incomprehensibility. When you’re supposed to show your sophistication with terms like “always already” and “overdetermined,” simple statements come across as naive. Traditionalists used to make great sport of quoting particularly opaque sentences out of context, poking fun at highfalutin word salad. Admittedly, the search for sentences like that was often like looking for hay in a haystack.
Postmodernism aside, though, academic writing isn’t typically geared towards the educated public. We know the reasons for that, and some of the reasons make sense. Making a narrow point seven levels into an argument requires using shorthand for the first five or six levels, or you’d never get it done. (A few months ago a mathematician was asked to leave a plane because someone in the seat next to him found his notes jarring; the notes turned out to be a complicated math problem.) The public isn’t really into footnotes. Specialists use shorthand that non-specialists find daunting, and it makes sense that they do. On campus, I don’t stop to define “accreditation” every time I say it. It wouldn’t help.
But our failure, as a sector, to engage the public has created a vacuum. When we leave the public sphere to others, with their own agendas, they take advantage. Now the stories making the rounds about academia are about “dropout factories,” student loans, and political correctness. Those stories are based on varying degrees of truth, but our stories are missing. Why aren’t we hearing about those “lightbulb over my head” moments that changed lives? Why aren’t we hearing about universities as places for experiments? For that matter, why aren’t we hearing that one generation’s wild radical student cause is the next generation’s common sense? Alternately, why aren’t we hearing about the impact of sustained educational austerity on the next generation? Why don’t we get our stories out there?
We don’t train for that. We don’t hire for that. Maybe we should.
In one way, community colleges have an advantage. We hire for teaching ability, as opposed to research. In research, obscurantism can sometimes pass for profundity. But in teaching -- and especially in teaching students who may be the first generation in college -- clarity matters. Clear and effective communication matters more here because teaching matters more here. But the teaching loads, and the relative lack of help, can make it difficult to keep up a prolific writing schedule. I can attest to that personally.
Michael Lewis, Ta-Nehisi Coates, and Sara Goldrick-Rab have shown us that there’s a market for substantive, academically informed non-fiction if it’s written well. It can be done. Until now, academe has treated it as a distraction or worse. It shouldn’t. If we don’t win the public, others will. As I’ve told my own kids through their respective baseball and softball careers, strikeouts are part of the game, but if you must strike out, I’d rather see you go down swinging. Let’s start.
Thursday, July 27, 2017
A few days ago, a reader wrote to ask about how professors police cheating during in-class exams, now that smartphones are pretty much ubiquitous.
Wise and worldly readers who teach, how have you adapted in-class exams for the age of the smartphone?
Actual in-car conversation, having just picked up The Girl and some of her friends from a party:
Friend 1: She’s such a Veronica!
TG: She’s more of a Heather, I think.
(drop off the other girls)
Me: What was that?
TG: Oh, they’re talking about this musical called “Heathers.” I think it’s based on a movie. Have you heard of it?
Me: It’s only THE BEST MOVIE EVER MADE!!!
TG: That’s what (friend)’s Dad says, too!
It’s good to see the classics get their due. Generation X’s mark on the culture may have been fleeting, but I’ll happily own this one. Neither Winona Ryder nor Christian Slater was ever quite that good again.
“When you have an area that just isn’t working like Upper New York State…” - Donald Trump
In an interview this week, Donald Trump advised residents of “upper” New York to move to Wisconsin.
A few thoughts.
First, nobody there calls it “upper” New York State. It’s Western New York, or, in some cases, Upstate. Since he’s a lifetime resident of New York City, I would have expected him to know that. But that’s a minor point.
The President of the United States is writing off regions of the US with millions of people in them?
Um, not okay.
In Western New York, where I grew up, there’s a chronic sense of being in the shadow of New York City. NYC dominates state politics, and it dominates the state’s national image. When I got to college and people asked me where I was from, I learned quickly that if I just answered “New York” they’d assume I meant The City. When I mentioned Rochester, a classmate asked me which subway line it was on. Rochester is farther from NYC than Washington, DC is.
The rule is that you’re only allowed to criticize the place if you’re from there. If you haven’t personally washed down a white hot with a Genny Cream, or you have no idea what a Garbage Plate is, I don’t want to hear it.
Those of us who grew up there and moved away -- some of whom even work at IHE -- have complicated feelings about the place. But we earned those. And they’re based on knowing what we’re talking about.
It’s a new era, I know that, but I’m still put off by national political figures trashing states they didn’t win. That’s not what a _national_ figure is supposed to do. I don’t recall the presidents Bush trashing Massachusetts, or Obama trashing Alabama. The Clintons liked Upstate so much that they moved there (sort of).
Early next week I’ll be in Nashville for a conference. I intend to go the entire time without indulging in any regional stereotyping. Anytime Mr. Trump would care to learn from my example, I’d welcome it. Besides, he doesn’t seem like someone who would turn down a Garbage Plate.
Wednesday, July 26, 2017
Apparently Oregon and New York, each of which announced a variation on free public college, are falling short of fully funding their programs.
When “Promise” programs don’t keep their promises, I would expect some blowback. As Robert Kelchen pointed out, this will give scholars of disappointment effects (!) an excellent natural experiment. (“When I grow up, I want to study disappointment!”) But I can’t claim surprise.
In both cases, to my knowledge, the programs don’t have dedicated funding streams. Instead, they’re discretionary spending, subject to political horse-trading and the usual legislative shenanigans. The programs are new enough that we can’t blame growth. Worse, the underfunding is happening at a point in the business cycle when states are as flush as they can reasonably expect to be without major policy changes. States can’t run deficits, so when the next recession hits -- and it will -- revenues will drop at the exact moment that demand for college increases. If they’re falling short now, they’ll fall catastrophically short then.
In my perfect world, nobody would be allowed to serve in a legislature without demonstrating a basic working knowledge of Keynesian economics. I can envision a few possible policy responses. (I can also envision a few choice words from disappointed students…)
One is the usual policy-wonk retreat to “means testing,” or what laypeople call “slow murder.” The argument will be based on a sort of economic triage: if there isn’t enough for everybody, then help the neediest first. But in the political climate of my adult lifetime, that’s a recipe for decline. Programs for the poor become poor programs. If you want an expensive program to survive, its benefits need to be broad-based, and preferably universal. Otherwise you start to get into bureaucratic nightmares of income verification, apocryphal or exaggerated stories of abuse, and the like. The unsung heroes in our financial aid offices know how that works.
I could also imagine benefits being narrowly targeted at desired majors, as Arkansas is doing. The argument there would be that the state needs more STEM majors, or computer majors, or nursing majors, or whatever, so let’s build incentives for that. It’s better than nothing, but it assumes that entering students know what they want, and/or are utterly indifferent to what they study. Neither is true. It also runs the risk of saturating certain job markets over time.
Alternately, states could move to “first come, first served.” There’s a simplicity to that, but in practice it will tend to be regressive. It also defeats the predictability that made the initial promise of free college a potential game-changer. If a full-tuition scholarship is guaranteed, that’s an incentive. If you follow the rules but your application came in only 30 days early instead of 35, and you get nothing, that’s not much of an incentive. If anything, it’s a kick in the teeth. Once sympathetic stories of students who were denied for heart-tugging reasons start making the rounds, things will get ugly.
Tennessee has taken the utterly brilliant step of actually designating a specific funding stream. It uses lottery revenues. One can argue about the moral hazard of that, but it does provide a line of revenue (relatively) protected from political mood swings. As a result, Tennessee isn’t having the shortfalls that Oregon and New York are, and enrollments in community colleges received a healthy boost. At the state level, this strikes me as the best realistic outcome. Kudos to Governor Haslam -- a Republican, for those keeping score at home -- for getting this one right.
Eventually, a free college policy would make the most sense at the Federal level. Unlike the states, the Feds can run deficits, so sudden enrollment spikes in recessions wouldn’t threaten everything else. Give it a dedicated funding stream as a baseline, and use borrowing to cover gaps just as Keynes recommended 80 years ago. States that aren’t being contrary for purely ideological reasons will figure out quickly that having a healthy higher education sector will mean getting their share from the Feds; as incentives go, that’s pretty good. As for the states that will be contrary just for the sake of it, well, all the more for everyone else.
Of course, that’s not where we are, politically. In the short run, it looks like the Tennessee model makes the most sense, and I commend it to other states. But when the next recession hits, and it will, the rules against state-level deficit spending will come back to bite us. Because eventually, the political winds will shift again. Promise.
Tuesday, July 25, 2017
My friend Christine Nowik posted a great question on Twitter this week. Linking to a piece about department chairs who have stayed too long, she noted that in many departments there’s nobody willing to step up if the current chair steps down. In some cases, chairs stick around less out of eagerness for the position than out of a lack of alternatives. What to do when that happens?
I’ve seen this happen several times over the years. It’s particularly common in small departments, where the personalities involved are few and long-entrenched. Let’s say you have a department of three full-timers. One has been the chair for a very long time, with middling performance in the role. Of the other two, one is nearing retirement, and wild horses couldn’t drag them into the job; the other is a dedicated clock-puncher. Budgets make a new hire a non-option for the near future.
In that situation, the de-facto-chair-for-life may be the least bad option. You won’t get greatness, but the basic tasks will get done. With either of the other two options the basic tasks probably won’t get done, at least not reliably.
Sometimes, the best option in a case like that is either a merger with another department, or a threatened merger with another department. I’ve seen people who swore up and down never to step up change their minds when threatened with what they saw as a forced takeover. The threatened loss of autonomy can be enough to overcome a distaste for administrative tasks.
Depending on context, it can also make sense to reconfigure the role. Larger departments have had success with splitting the role between two people. The key there is in a clear delineation of duties. Having “co-chairs” as pure equals simply doesn’t work; you introduce a whole new level of ambiguity, and people learn to play the two off against each other. But if one co-chair deals with, say, the full-time faculty and department meetings, and the other is the go-to person for the adjuncts, that can work.
In some cases, unwillingness to step up can be a symptom of larger organizational dysfunction. In my own career, I’ve declined to apply for positions when I had serious doubts about the people to whom I’d have to report. It can be a barometer.
But it’s frequently more a combination of the general academic distrust of “going over to the dark side” combined with individual personal priorities. Personally, I don’t mind when people step up to chair roles with an eye towards eventually moving into deanships. Those folks have something to prove, and therefore an incentive to do a really good job. That’s a good thing, even if there’s a cultural taboo against admitting it.
I’ve heard of colleges moving away from department chairs altogether, on the theory that faculty are hired to teach, and the skill set for management overlaps only slightly with the skill set for teaching. I get the logic, and there can be specific local circumstances in which it makes sense. But as a long-term strategy, I’d be concerned about losing the talent development pipeline. Chair positions are often a toe in the water of administration; they operate as de facto audition periods on both sides. I’ve seen chairs who thought the position looked great decide quickly that accepting it was a tragic mistake; I’ve seen others discover previously untapped talent for management. The in-between status of chairs allows for a relatively low-risk exploratory period; if it doesn’t work out, returning to the faculty isn’t that hard. That’s much less true for full-time administrative roles.
I’m pretty confident that Nowik and I aren’t the only people ever to have seen this. Wise and worldly readers, have you seen a reasonably elegant solution to the problem of nobody wanting to step up?
Monday, July 24, 2017
On The Bernie Mac Show, the late, lamented Bernie Mac had a recurring bit in which he’d show frustration or disbelief by just staring silently at the camera and tapping his fingers. It slayed me every time. His body language conveyed silently that he was somewhere between “can you believe this?” and “what the...?”
Reading IHE’s piece yesterday about an analysis of budgetary “pass-throughs” of state budget cuts in the form of tuition increases had me in full Bernie Mac mode.
The article is a summary of the results of a study of the effects on tuition at public colleges and universities when public funding was cut. The article doesn’t mention community colleges, and the original study is paywalled, but my impression is that the study focused on four-year colleges and universities.
For me, this was the key paragraph:
State and local divestment accounted for 16.1 percent of tuition and fee increases paid by the average student since 1987. Disinvestment accounted for a greater share of tuition and fee increases more recently, though. It is responsible for 29.8 percent of the tuition and fee revenue increase since 2000 and 41.2 percent since 2008.
That lends itself to interpretation, of course. One, offered by Jason Delisle, was:
Policy makers will still wonder why, if appropriations cuts really drive tuition higher, the pass-through rate isn’t 100 percent, said Delisle of AEI.
(stare at screen, tapping fingers)
Okay, I know that policy folk look at practitioners roughly the way that biologists look at butterflies, but I have to respond. As someone who has spent the last decade dealing with flat or declining public funding at public colleges in two states, I can say confidently that anyone who assumes that budgets have not been cut simply doesn’t know what he’s talking about.
Take a look at the change in adjunct percentages since 1987, just for starters. Why do you think colleges have moved so heavily in the direction of part-time faculty? On my own campus, I’ve been authorized to replace fewer than half of the full-time faculty who’ve left over the last two years. Why do you suppose that is?
It’s because adjuncts cost less. That’s where much of the lost funding shows up. To the extent that we offset “money not received” with “money not spent,” we reduce the amount we have to raise tuition and fees.
Of course, it’s not just adjuncts. Look at offices run with fewer staff, tutoring centers with fewer tutors, or thirty-year roofs in their fortieth year. Deferred maintenance is another version of “money not spent,” until it abruptly has to be. (If the DC pundit class needs an example close to home, the Metro will do.) Look at travel budgets, as I mentioned in yesterday’s post. Look at the health insurance packages that employees get to pick from, and compare them to the plans from, say, ten years ago. Look at “hiring freezes,” raises foregone, and position consolidations.
Then look at non-optional costs that have increased over the years, whether in compliance, IT, or mandatory student services. As worthy as they are, they put pressure on everything else.
As I’ve been pointing out for years, colleges have handled flat or reduced support by splitting the difference between spending cuts and price increases.
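For readers outside the budget trenches, the “pass-through” arithmetic is simple enough to sketch. The dollar figures below are entirely hypothetical -- they’re not from the study or from any actual college -- but they show what a pass-through rate measures: the share of a funding cut that students end up covering through higher tuition, with the remainder absorbed as spending cuts.

```python
# Hypothetical numbers, just to make the "pass-through rate" concrete.
funding_cut = 10_000_000       # state support lost
tuition_increase = 4_000_000   # recovered from students via higher tuition

# Whatever isn't passed through to students has to be absorbed as cuts:
# adjuncts, deferred maintenance, hiring freezes, and so on.
spending_cut = funding_cut - tuition_increase

pass_through_rate = tuition_increase / funding_cut
print(f"absorbed as spending cuts: ${spending_cut:,}")
print(f"pass-through rate: {pass_through_rate:.0%}")  # prints "pass-through rate: 40%"
```

By that measure, the study’s numbers say students covered about 16 cents of every cut dollar since 1987, but about 41 cents of every cut dollar since 2008.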
The contribution of this study, to my mind, is that it shows that the era of less painful cuts is over. The steady increase in the “pass-through” rate shows that it’s getting harder to maintain a level of service without finding other revenue. Anecdotally, that’s spot-on. Shrinking a department from ten full-time faculty to nine is painful; shrinking it from three to two is much worse. And as much as policy folk don’t want to hear it, the supply of good adjuncts is finite. There comes a point at which the low-hanging fruit has been picked. Barring some sort of sea change, I’d expect the pass-through rate to continue its rapid climb.
Part of the reason I’ve written this column/blog for as long as I have is that I’m still struck by the absence of knowledgeable practitioner voices in the discourse around higher ed. After all these years, it’s still true. I understand the career politics behind that, but at some point, we who actually live this stuff need to speak up. Bernie Mac’s frustration was funny, but ours isn’t. We need to step up and speak the truth. If we don’t, these ideological abstractions win by default, and we all suffer.
Sunday, July 23, 2017
Academic travel is expensive, but academic isolation is more so.
Pamela Gay has a good piece in Medium about the unacknowledged costs of academic travel, particularly for early-career academics. She notes, correctly, that the lag between spending and receiving reimbursement amounts to an interest-free loan from the employee to the college; that may not matter much for folks with salaries high enough to pay off the credit cards in full each month, but for everyone else, it’s a real cost. Tips and incidental expenses often go unreimbursed, because they’re unrecognized or hard to prove. (How do you prove that you left a few bucks for the housekeeping staff?) Many flights and hotels charge extra for wi-fi, but that’s not always a reimbursable cost. And for folks in the early faculty years, even appropriate conference dress may be an extra expense, but it’s assumed to be the responsibility of the traveler.
In my experience, graduate school was when the travel issues were the worst. We were given something like a $200 per year cap on travel reimbursement. This was the 90’s, not the 50’s, so $200 didn’t go far even then. Even with shared rooms, grad student registration discounts, and the cheapest travel methods I could find, I took a significant loss every year. Then I’d see full professors at the conference going out to reimbursed dinners and, well, let’s just say I noticed who looked out for grad students and who didn’t. Some of the ones who proclaimed their commitments to social justice the loudest were the most selfish. I’ll withhold names to protect the guilty.
Grad school was at a research university, so at least travel funding existed. In the community college world, travel funding tends to be scarcer. Part of that is a relative paucity of grants, but most of it is a combination of a lack of a publication requirement, a lack of money, and a sense that travel is a “soft” budget line and therefore easy to cut. And in the very short term, it is.
But having seen the effects of a long-term underfunding of travel, I can attest that the cost of information missed and connections not made is cumulative. After a while, people don’t know what they don’t know. Too much time in a local bubble leads to a lack of a comparative perspective, and a tendency to conflate the way things have been with the way they must be.
You’d think that wouldn’t happen. Academics’ defining trait, as a breed, is supposed to be intelligent curiosity. This is a group of people -- among whom I proudly count myself -- who pursued lines of inquiry much farther than prudence would have dictated. The single strongest argument for tenure is that it’s supposed to create the security from which to pursue truth in whatever direction it goes. Presumably, that would sometimes involve looking in other places and talking to other people.
I echo Pamela Gay’s sense that we need to update some of the processes by which we allocate travel funding, such as thinking to include wi-fi as an expense. I’m also a fan of the “per diem,” as opposed to itemized meals; it covers tips, and it lets people allocate meals as they see fit. A previous college had a strict rule about caps for breakfast, lunch, and dinner; I never saw the point of that. Give me a daily cap, and if I choose a more ambitious lunch and a cheaper dinner, well, who cares? But these are small things.
The broader point is that we need, as a sector, to start to confront the long-term costs of discouraging travel. Why do innovations move so slowly across the sector? In part, because there may only be one or two people on a given campus who have ever heard of them. In a setting of shared governance, the “nah” contingent can wield considerable power. Without a critical mass of people who have seen other things, the only common reference point is the local past. That’s limiting, at best, and often self-defeating. Teleconferences are great, but they work best as followups. Without actually seeing what other people are doing, and the assumptions they take for granted as they do them, it’s easy to default to “that’s how we’ve always done it.”
There will be times when individual people can’t travel much; when the kids were in preschool, I kept travel to a minimum. But when entire colleges keep it to a minimum, they cut down the future to the size of the present. That should be the last thing academics should do.
So yes, by all means, let’s reform the processes for travel costs to make them fairer and more relevant. But at a deeper level, especially for community colleges and teaching-intensive institutions, let’s stop pretending that travel is for Other People. If we don’t get out of our own bubbles, we’ll keep making the same mistakes over and over again.
Thursday, July 20, 2017
Apparently, the California community college system is considering allowing students in non-STEM majors to fulfill a math requirement by taking statistics, rather than algebra.
The idea behind the proposal is twofold. First, algebra generates more student failure and attrition than almost anything else. (One of the guest speakers at Aspen said that his one piece of advice to any college president looking to improve graduation rates would be to fire the math department. We laughed, but he didn’t seem to be kidding.) Second, in many fields, algebra is less useful than statistics.
The objections are obvious. Most basically, it looks like watering-down. If the solution to increased college completion is to get rid of anything difficult, then college completion itself becomes meaningless. There’s an “exposure” argument, too, that says that many students don’t know they like math or STEM until they’ve found themselves wrestling with it; deprive them of that exposure, even “for their own good,” and the downstream effects are predictable. Pragmatically, there’s an argument from transfer; many four-year colleges won’t take math courses that don’t have an algebra prerequisite. And from within, there’s a valid argument to the effect that if you don’t have basic algebra, you won’t be able to generate most statistics; at most, you might be able to consume them.
For example, when I took stats, I was captivated by the idea of controlling for a variable. (“You can DO that?”) But the idea of a variable came from algebra. If you don’t have some level of algebra, I’m not sure how much sense the concept of controlling for one would make. Correlations and standard deviations also rely on some knowledge of algebra. “Base” and “rate” make sense algebraically. I’m not sure how the course would work.
It’s true that drop/fail rates for algebra courses tend to be higher than for stats courses. It’s also true that in my own scholarly discipline, and in my line of work, I use stats far more than I use algebra. I find the “median” of a distribution on a regular basis, but I don’t remember the last time I used the quadratic formula. It just doesn’t come up.
But the argument from usefulness is stronger against many other fields, and it doesn’t get deployed against those. We have a history requirement for A.A. degrees, for example. From a pure ‘usefulness’ standpoint, that’s hard to justify. But there’s a general consensus that the skills students develop through the study of history are valuable, even if it can be hard to demonstrate in as linear a way. (Knowing the future would be far more useful, but it’s hard to find good materials.) We have a humanities requirement that’s entirely independent of usefulness. Honestly, if we want to argue usefulness, I could imagine a compelling argument that history and literature majors shouldn’t have to take lab sciences. The usefulness argument is a slippery slope.
Stats courses tend to lend themselves to “general education” kinds of applications very well. I’m a fan of questions based on epistemology: “what statistical evidence would prove this claim?” Political journalism offers no shortage of “how to lie with statistics,” which can be excellent fodder for sharpening students’ critical thinking. Just being able to distinguish between correlation and causation is valuable.
I’d be curious to hear from folks who’ve taught a stats class that didn’t assume any previous knowledge of algebra. Can you sneak the relevant algebra in through the stats? Are the students able to grasp concepts like “control for a variable” without knowing what a variable is? If the students are able to get the critical thinking and quantitative reasoning skills from a stats class without an algebra prereq, I’m on board. I just don’t know if they can.
Wednesday, July 19, 2017
Course cancellations are sort of like snow days: no matter what you decide, someone thinks you’re wrong. And chances are, sometimes you will be.
We’re getting to that point in the summer when we start looking closely at section enrollments for Fall, and making go/no-go decisions on the small ones. It’s a frustrating process, made all the more frustrating by the inevitable uncertainty.
For economic reasons, we need a decent number of students per section in order to make ends meet. Some sections will have to run small for one or more of a panoply of good reasons: it’s the only section of a required class; it’s the only evening section; it’s the only section at that location; every other section is full; it’s the last course in a sequence and the students need it to graduate; it’s a clinical site. Eating the cost of the necessarily small ones requires setting the default minimum slightly higher than a strict average, to compensate.
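To make that arithmetic concrete, here's a toy version with invented numbers; actual cost and tuition figures vary widely by college, but the logic is the same.

```python
# Invented numbers, for illustration only: suppose each section costs
# $9,000 to run and each enrolled student brings in $600.
cost_per_section = 9_000
revenue_per_student = 600

# Strict break-even average: 15 students per section.
break_even = cost_per_section / revenue_per_student

# Now suppose one section in five has to run small -- say, at 8 students --
# for one of the good reasons above. The other four sections must cover
# that shortfall, so the default minimum creeps above the strict average.
small_section_enrollment = 8
shortfall = cost_per_section - small_section_enrollment * revenue_per_student
extra_per_covering_section = shortfall / 4
adjusted_minimum = break_even + extra_per_covering_section / revenue_per_student
# break_even = 15.0; adjusted_minimum = 16.75
```

That gap between 15 and roughly 17 is the quiet tax that the necessary small sections impose on everything else.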
In the community college sector, though, it’s not that easy. (Folks in private industry can replace that with “the community college space,” if it helps.) Students register in two big waves, with a lull in between. The early wave happens when registration opens in the Spring. The late wave happens in August, sometimes continuing into September. Early to mid summer is much slower.
Optimizing the numbers, then, would mean waiting until the first day of classes.
But that doesn’t work for the students whose sections got cancelled out from under them. Even if there are seats available in other sections of the same course, they may not be able to adjust their schedules. Late changes wreak havoc on financial aid, too, especially if they involve crossing the 12-credit threshold. From a student perspective, it’s much easier to make changes with a month’s notice or more than abruptly at the start of the semester.
But our information a month or more in advance is pretty spotty. And for faculty who are hoping that their sections will run, an early cancellation comes as a slap in the face. The inevitable pushback comes in the form of angry declarations that “it would have made it if you had given it a chance.” That’s probably true some of the time; it’s unprovable either way.
If students registered earlier and stuck with their choices, we could optimize easily. If we had peak enrollments, everything would run just because students would take whatever they could get. (That happened around 2009-10.) If we had infinite resources, we wouldn’t have to sweat small sections; if anything, we could see them as educational treats. If enrollments were steady from year to year, we could settle into patterns. But that’s not this world.
Data analytics hold some promise for helping with predictions, but not necessarily at the level of the individual section. Knowing that overall enrollment is, say, four percent lower than the previous year doesn’t necessarily tell you whether the Tuesday afternoon section will run. And we don’t have data fine-grained enough to predict that, at least at this point. If someone has seen software that helps at the level of the section, I’d love to see it.
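For what it's worth, here's a hedged sketch of what section-level projection might look like, using entirely made-up "fill fraction" numbers. The hard part in real life isn't the division; it's having historical data fine-grained enough to estimate those fractions for comparable sections.

```python
# Entirely hypothetical "fill fractions": for comparable past sections,
# the share of final enrollment already on the books at a given number
# of weeks before the start of classes.
fill_fraction = {
    ("day", 6): 0.55,      # day sections: 55% of final roster by 6 weeks out
    ("evening", 6): 0.40,  # evening sections tend to fill later
}

def project_final(current_enrollment, section_type, weeks_out):
    """Project a section's final enrollment from its enrollment to date."""
    return current_enrollment / fill_fraction[(section_type, weeks_out)]

# Nine students, six weeks out, projects very differently by section type:
day_projection = project_final(9, "day", 6)          # about 16.4
evening_projection = project_final(9, "evening", 6)  # about 22.5
```

Even this toy version shows why an across-the-board cutoff applied in July can be wrong in both directions at once.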
Wise and worldly readers, in the absence of either omniscience or a visit from the money fairy, is there a better alternative to the cancellation shuffle?
Tuesday, July 18, 2017
Like nearly everybody else, I saw Rebecca Schuman’s piece eviscerating the University of Illinois at Chicago for posting a job ad for someone with a Ph.D. to direct, and teach in, a German language program for $28,000 a year. If you haven’t yet read it, I recommend checking it out.
Schuman does some quick math on the length of time the various components of the job would probably take if you did them just well enough not to get fired, and calculates that it adds up to more than full-time. I quibble with one element of her math -- coordinating courses is not the same as developing courses -- but her larger point clearly stands.
She followed it up with some lurid fan fiction based on how such an ad might have come about.
She’s a better humor writer than I am, so I won’t try to compete there. But I’ve been in enough tense discussions about resource allocation for programs and positions in which I had to make a decision and catch flak for it that I thought I’d try my hand at portraying how such a thing might have happened. (The obligatory disclaimer: this is based on experience in the industry, not at UIC. I don’t have any inside information specific to UIC.)
Chair: Our German language program director left. We need someone to step in.
Dean: I’ve got ten position requests on my desk. I can fund two. Is this more important than (names several others)?
Chair: But it’s a replacement position! The money is already in the budget!
Dean: No, that money was already scooped up to fill the deficit. Every new person counts as a new hire, even if they’re just replacing someone.
Chair: That’s ridiculous!
Dean: The state, in its infinite wisdom…
Chair: I know, I know. But if this isn’t filled, there won’t be anyone to keep those classes from going off the rails.
Dean: Should we close the program?
Chair: There’s no time. We need a quick fix. September will be here before you know it! Besides, the classes are full, and we need the enrollments.
Dean: Hmmph. Will any full-timers do it for a course release?
Chair: (withering stare)
Dean: Worth a shot. What about adjuncts? If we split the funding for a position between this and (names another), would that be enough to entice an adjunct to step up?
Chair: (strained voice) Maayyyyybeee…
Dean: It’s better than nothing…
Chair: I guess…
Dean: Of course, to satisfy HR/union/state requirements, we’ll have to post the thing. But I can’t imagine anyone from outside jumping at this.
And the rest is history.
Schuman is clearly right that the job is absurd on its face, and I agree that anyone who doesn’t already work there would be well-advised to steer clear. It’s not the sort of job to relocate for. But if there’s a freeway-flying adjunct already teaching there, I could see her making the rational decision that it’s better than the alternative.
Our narratives aren’t all that far apart, really. Hers is a comedy; mine is a tragedy. In mine, basically well-meaning people are trying to patch a ridiculous situation with the budgetary equivalent of baling wire and bubble gum. The end result isn’t pretty, and doesn’t come anywhere close to the kind of job for which graduate students spend years of penury earning doctorates. But it doesn’t rely on an assumption of cluelessness, malice, or whim. It’s a story of conflicting imperatives making a terrible option the least-bad one.
That sort of thing happens more than one might like.
Admittedly, I’m assuming good faith. Someone along the way may just be a sadistic jerk. I can’t dismiss the possibility, but I’d hate to assume it. If that were the entire problem, the solution would be easy enough: fire the jerk. But if the problem is structural -- and based on the national job market, it has to be -- then swapping out the admins won’t help. It’s not about them.
None of this is to defend the position, or UIC, or, heaven knows, the state of Illinois. It’s just to say that if we start to come to grips with how well-meaning people could do this, we might actually start to make progress on fixing it. Thanks to Rebecca Schuman for catching this one and calling attention to it. If this is the bloody flag that rallies the masses for more funding for public higher ed, I’ll take it.
Monday, July 17, 2017
The dog, Sally, is scared to death of flies. One got in the house yesterday, and she spent the better part of the day cowering behind the couch or under a bed. This is the same dog that wandered around the forests of New England for 17 days a few years ago and emerged relatively unscathed. I don’t know why she’s afraid of flies -- she can’t tell us -- but I have to assume there’s a reason. It probably means something, though heaven only knows what.
I’ve been reading lately about the shift in the American political economy from the postwar era to the last couple of decades, and thinking about the different fears at different times.
Broadly, the economic shift was from The Great Compression -- relatively low ratios of high wages to low ones -- to the new Gilded Age, in which the ratios are much, much larger. The turning point was somewhere around 1980, give or take. From the end of World War II into the 70’s, wealth was spread more evenly, both by class and by geography. This was the period of suburban expansion, and a time when economic opportunity became more evenly spread around the country. It was also the time when most community colleges were established. They were part and parcel of the Great Compression, built to fill the demand for an expanding middle class.
The literature and art of the time were obsessed with themes of conformism. Conformism was often portrayed as mindless or soul-deadening -- think “The Man in the Gray Flannel Suit” -- but not fitting in was also terrifying, as in nearly every episode of “The Twilight Zone” ever made. Part of that, I think, came from the recent experience of two major wars with widespread military conscription; the military’s premium on conformity is obvious. And part of it came from horror at the “collectivist” roots of fascism and communism, as Americans understood them. (You don’t really see the word “collectivist” much anymore.) Ayn Rand’s hyperindividualism took collectivism as its foil; she just took “The Twilight Zone” and reversed the polarity. Friedan’s “Feminine Mystique” used bland suburbs as a foil, too, though to different ends.
Looking back, it’s easy to see where someone predisposed to fears of “massification” (another oldie but goodie) would find ammunition. There were three television networks to choose from, so each had to try to be inoffensive to as many people as possible. For all practical purposes, there were three brands of car, each imitating the others. Herberg’s “Protestant, Catholic, Jew” showed that religions were gradually watering down and coming to resemble each other, at least as practiced on the ground in the US.
Race stood as an obvious and glaring counterpoint to the narrative of growing equality, which I think is part of why so many midcentury thinkers had such trouble with it. But the narrative was widespread just the same.
Now, wealth is being concentrated both socially and geographically. Richard Florida’s latest book captures the dilemma facing many young people now: in the few places where opportunity is abundant, cheap housing isn’t. The President of the United States brags about his wealth, and feels no need to tone it down. We’ve been defunding collective goods for decades, and amassing the proceeds among the top (pick your small number) percent. Our political parties are far more clearly divided ideologically than they once were, and “swing seats” in Congress are vanishingly rare.
Rather than “collectivism” or “massification,” we’re obsessed with either “diversity” or “cultural breakdown,” depending on your politics. The science fiction stories now are about grinding poverty for the many while the few live in pilfered opulence. The vision of the future in The Hunger Games is markedly different from the original Star Trek, for all of the latter’s flaws. Now we don’t fear mindless conformity; we fear a Hobbesian war of each against all, or at least, of each subgroup against all. An explosion of cultural choices -- the kids literally don’t believe me when I tell them how few channels we had when I was a kid -- coexists easily with a massive concentration of wealth. Each cultural choice has to distinguish itself from all the others; Herberg’s description of religion in America reads like science fiction now.
Where once the common culture seemed oppressively ubiquitous, now it seems stretched beyond recognition. A disbelief in common purpose follows.
For community colleges, the shift has been both devastating and largely unacknowledged. They were built in one era, and designed around the assumptions of that time. But circumstances have changed. Community colleges are spread around the country, as wealth once was, but increasingly isn’t. In some places, they fulfill their mission by preparing students to move away. And the idea of education as a public good has been supplanted, as have most other public goods. Hobbesian warriors and Randian entrepreneurs don’t want to be bothered funding something that might benefit other people. Add race to this logic, and it gets ugly fast.
The fear now isn’t of being swallowed up; it’s of being left behind. Or of being held back by other strivers dragging you down. If I’m doing all I can to avoid falling into the pit, the last thing I want to do is to try to pull someone else up. Over time, that becomes self-reinforcing.
I don’t know how the stories of these fears will play out. But I do know that Logan’s Run never actually came to pass. Sooner or later, stories change. Sally comes out from behind the couch. I just hope we don’t lose track of our story in the meantime.