Tuesday, June 27, 2017
In the 1960’s, community colleges were established at an average rate of one per week. Now, new ones are rare birds. So a story about the new one emerging in western Pennsylvania seems worth noticing.
The new one, clunkily named Rural Regional College of Northern Pennsylvania, is starting out as a de facto extension site of Gannon University. Apparently its classes are conducted by “interactive television,” which in this context seems to mean synchronous distance classes held at various centers. I can see why they did that: the format provides the tight control, regular schedule, and human interaction of a classroom class, but can be run over distance. And given that broadband is not ubiquitous in rural northern Pennsylvania, dedicated connections at particular sites can provide reliable connectivity that students may not have at home.
(A few years ago, vacationing in rural northern Pennsylvania, I saw a roadside stand with a sign advertising “live bait and wifi.” I wish I had taken a picture.)
The format could get trickier as they move into more technical classes, but the basic concept strikes me as plausible. If they deploy tutors or advisors to the various centers, along with some generalist student support, they may be able to make it work reasonably well.
But I hope they don’t settle for that.
During the rapid growth period of the 1960’s, institutional isomorphism was the trend, mostly by default. There’s no faster way to get something off the ground than to copy something that already exists. (Brookdale was an exception with its embrace of “mastery learning,” a sort of competency-based approach before it was cool.) The cookie-cutter approach had the considerable merits of speed, economy, and simplicity, and it helped people avoid some basic mistakes. But it also meant that some pretty standard ways of doing things got entrenched without anybody really thinking them through. Now, after decades of kludge, people who want better results have to bushwhack through layers upon layers of sedimentary past decisions.
RRCNP -- it just rolls off the tongue -- has the chance to become a proving ground. It’s largely free of the kludge of legacy systems, “past practice,” and people who’ve done their jobs the same way since the Nixon administration. It has a unique opportunity to build entire systems based on what we know now. And it can even perform the service to the industry of becoming a sort of demonstration project.
To do that, it would probably need some level of philanthropic support, as well as considerable assistance in research design. It would likely be money well spent. Most community colleges are programs already in progress, but this one isn’t. It’s a rare chance.
So, Gates folk and ATD folk and Lumina folk, here’s a chance to do something you couldn’t normally do. (And I say this with no personal connection to RRCNP.) A relatively small investment of money, and a larger one of expertise, could be a game-changer. Then we can talk about that name…
Program Note: it’s vacation time! We’re heading to Canada, hoping to see the six-story rubber duck in Toronto harbor. (Seriously. Google it.) The blog will be back on Monday, July 10.
Monday, June 26, 2017
A biology professor at Youngstown State University turned down an overload class paid on a pro-rated basis, and for some reason, it made national news. I’m sure I’m not the only community college administrator scratching his head at that one.
Pro-rating may or may not be a good idea, but it’s common practice, and there’s nothing unusual in a professor declining to teach a section on that basis. I’m of divided mind on it, precisely because there are arguments on all sides.
Most colleges, at least at this level, have minimum numbers of students that a section is supposed to have in order to run. That’s mostly for economic reasons. Now that tuition is the majority of most community colleges’ revenue -- a historically new development -- paying full freight for lots of small sections would be a budget-buster. One way to handle that is to have a relatively strict go/no-go cutoff number, with the usual exceptions for classes with different facility requirements. (For example, clinical sections in Nursing always run below our standard cap.)
The pro-rating to which I’m referring happens when a section of a class has lower enrollment than would normally be required to run. Rather than cancelling the class outright, some colleges will offer the professor a “per-student” rate. The idea is that if the cutoff is, say, fifteen, the fact that only ten signed up doesn’t necessarily mean the ten don’t need it. (This typically only applies to sections taught on an adjunct or overload basis; I’ve never seen in-load prorating, though I suppose it’s conceptually possible.)
Pro-rating has its advantages. Most basically, it makes it easier for the college to afford to run small sections. That means fewer cancellations. Fewer cancellations are a very good thing for students, since every last-minute course cancellation -- and they’re frequently at or near the last minute, since it takes that long to suss out the final numbers -- throws student schedules into chaos.
Reducing the cost of small sections also makes it easier for professors’ pet courses to run. At the community college level, faculty typically teach the same few courses over and over again. Over time, that can get discouraging. Pet classes often fall well short of the enrollment floor. If not for prorating, they generally wouldn’t run at all. Allowing a per-student rate means allowing the occasional passion project to see the light of day, even if only eight students sign up for it.
All of that said, though, the amount of prep time and class time for a section of ten is no lower than for a section of fifteen. The grading is lighter, which can make a difference in classes with heavy writing assignments, but everything else is pretty much the same. And even with grading, it’s tough to argue that the difference between twelve and fifteen is dispositive while the difference between twenty and twenty-five is luck of the draw. It’s equally difficult to argue with a straight face that an adjunct with a small section should get pro-rated while a full-timer with a small section gets full credit. The work is the same.
Given that community college students often register late, and that the add/drop period is relatively active, colleges that pro-rate have to answer the question of “as of when?” Is the number for which someone gets paid the number on the first day of the semester, the first day of class, or the day (usually the tenth day) that the college reports its attendance numbers externally? If it’s the first day of class, then people are getting paid for students who walk away after the first day and never come back. If it’s the tenth day, then someone who thinks he’s getting paid for ten may find himself only getting paid for seven, at which point it’s too late to walk away without hurting students. In a perfect world, numbers would be set weeks in advance, but that’s just not how students behave.
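To make the stakes of the census date concrete, here's a minimal sketch of a pro-rating formula. The dollar amount, the cutoff, and the per-student scheme (a flat share of full pay below the cutoff) are all invented for illustration; actual contracts vary widely by college and bargaining unit.

```python
# Hypothetical pro-rating scheme: below the cutoff, pay scales with
# enrollment as a share of the cutoff. All numbers are invented.

FULL_SECTION_PAY = 3000   # pay for a section at or above the cutoff
CUTOFF = 15               # minimum enrollment for full pay

def prorated_pay(enrollment: int) -> float:
    """Return section pay, pro-rated per student below the cutoff."""
    if enrollment >= CUTOFF:
        return FULL_SECTION_PAY
    return FULL_SECTION_PAY * enrollment / CUTOFF

# The same section, measured on two different days:
first_day_count = 10   # enrolled on the first day of class
tenth_day_count = 7    # still enrolled at the tenth-day census

print(prorated_pay(first_day_count))  # 2000.0
print(prorated_pay(tenth_day_count))  # 1400.0
```

Under this (hypothetical) formula, the choice of measurement day swings the paycheck by twenty percent of the full rate, which is exactly why "as of when?" is not a trivial question.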
From a management perspective, the most frustrating scenario is the professor who refuses a section at the last minute. That’s a real danger of prorating. But if we don’t prorate -- if every section gets full pay no matter what -- then we have to cut down the schedule pretty severely. Is it better to offer to run a small section on a per-student basis, or to close it down entirely?
Wise and worldly readers, what do you think? Given late and fluctuating enrollment -- endemic to the sector -- and the lack of enough money to just make everybody happy, is there a more elegant way? If not, which way would you recommend?
Sunday, June 25, 2017
Why do people with credit cards use payday lenders?
Why do people with checking accounts use check-cashing stores?
I just finished “The Unbanking of America,” by Lisa Servon, which answers those questions in a disarmingly simple way. Servon, a professor at UPenn, got jobs working at a payday lender and a check cashing store, and she asked around. She talked to her bosses, coworkers, and customers. By treating her various sources as intelligent people responding rationally to their circumstances, rather than as helpless victims of evil predators, she was able to stitch together a pretty good argument for why people make the choices they make.
In its approach, it reminded me a little of Tressie McMillan Cottom’s “Lower Ed” or Matthew Desmond’s “Evicted.” In their different ways, each book addresses a policy question that is usually framed in terms of smart, crafty, evil people taking advantage of clueless, ignorant, poor people, and blows up the assumption. In no case are predators let off the hook, but the “prey” are actually (mostly) capable and intelligent people doing the best they can. Understanding why this is the best they can do, and what would give them better options, leads to a very different set of prescriptions.
Servon’s argument has that quality that many good ideas have of being obvious the minute after you hear it. What’s the appeal of a check cashing store? Servon answers:
Picture the interior of your bank. Now imagine for a moment that you are a new immigrant. Is information prominently posted to tell you what products are on offer and how much they cost? Now imagine the interior of a check casher - or visit one. It resembles a fast-food restaurant more than a bank. Posters tell you what products are sold, and large signs above the teller windows list every product, along with its price… (19)
The check cashing store, unlike the bank, earns patrons’ trust through transparency. Servon’s book is full of people who have lost patience with a bank after a ten dollar overdraft led to a thirty dollar fee and another overdraft, which led to another fee, and so on. Compared to that, a flat fee of $1.50 to pay a bill is a bargain.
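The arithmetic of that spiral is worth spelling out. Here's a toy model of the mechanism Servon describes: one small overdraft triggers a flat fee, which deepens the negative balance, so the next routine debit overdrafts too. The fee amount and the transactions are invented for illustration, not drawn from any particular bank's schedule.

```python
# Toy model of an overdraft-fee cascade. The fee and the debits are
# invented; the point is the compounding structure, not the exact numbers.

OVERDRAFT_FEE = 30.00

def run_transactions(balance: float, debits: list) -> tuple:
    """Apply each debit in order; charge a flat fee whenever the
    balance goes (or stays) negative after a debit."""
    fees = 0.0
    for amount in debits:
        balance -= amount
        if balance < 0:
            balance -= OVERDRAFT_FEE
            fees += OVERDRAFT_FEE
    return balance, fees

# Start $5 short of covering a $15 debit, then two small routine charges.
final_balance, total_fees = run_transactions(10.00, [15.00, 4.00, 6.00])
print(total_fees)  # 90.0 in fees on $25 of spending
```

Three ordinary purchases totaling $25 generate $90 in fees. Next to that, the check casher's flat $1.50 to pay a bill looks like the rational choice it is.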
The check cashers and payday lenders also make a point of customer service. Servon notes that in her time at the check casher, about 80 percent of the customers spoke Spanish; her own ability to speak Spanish was key in doing the job. And the tellers at the check casher and payday lender are given a certain amount of autonomy to grant fee waivers and extensions for steady customers.
In other words, the low-end financial service providers may be expensive, but they’re clear. The higher-end ones are sneakier. “Free” checking that always pays the largest bill first in order to maximize overdraft fees is a long way from free. Folks who’ve been burned a few times know that.
The check cashers also offer instant money. If you’ve been lucky enough to have had direct deposit for a while, this may fly below your radar, but for folks living on the ragged edge of disaster, a several-day float for a check is a disaster waiting to happen. If you get paid on Friday but the check won’t clear until Wednesday, and your kids are hungry now, the check-casher’s fee suddenly seems like a pretty good deal. Yes, it’s expensive in an objective sense, but you will get the money when you need it. If waiting until Wednesday to pay a bill means incurring a late fee, the late fee will almost certainly be more than the check-casher’s fee. Servon gives the example of a coworker who took out a payday loan to cover an emergency car repair. She knew it was an expensive way to get money, but without a working car, she’d lose her job. Given those (admittedly bad) options, the payday loan was the least-bad available choice.
Why do people with credit cards use payday loans? Payday loans don’t show up on credit scores. They’re protecting the credit card for deep emergencies, and using the payday loans for everyday emergencies. And given the number of employers who check credit scores when hiring -- using it as a proxy for general responsibility -- protecting the credit score makes sense. Keeping your debt below the radar actually improves the chances of getting a job that will allow you to pay it off.
I’m only scratching the surface of a remarkable book; it’s well worth reading slowly. The distance between the logic of Servon’s interlocutors and the logic of most policy talk is striking. I’ve read and heard quite a bit about cracking down on payday lenders, but they meet a real need. Cracking down on them without changing anything else would put the woman who needed a car repair right away in an even worse spot than she’s already in. If we want to starve this sector out, we need to do it by taking away its reason to exist. Servon offers several worthwhile suggestions, ranging from vastly simplified disclosures -- along the lines of the fast-food-style menu -- at banks, to vastly shortened “float” periods, to serious attention to decent wages. Some are more likely than others, but they all come from the valid assumption that poverty can be both expensive and self-reinforcing. Breaking those cycles requires conscious, deliberate effort.
As with McMillan Cottom’s and Desmond’s books, Servon sheds light through legwork. She gives us a patron’s eye view of a sector that’s usually either ignored or blindly vilified. It’s the kind of information without which we’ll keep making decisions -- with all of the best intentions -- that make hard lives that much harder. Check it out.
Wednesday, June 21, 2017
In a little over a week, the family will pile into the Family Truckster and go on vacation. As I explained it to the kids, they’re going to have a good time whether they like it or not. The trip will involve several five-plus hour days of driving as we make our way from one stop to the next. That’s a lot of windshield time, which means it’s time to crowdsource some ideas for podcasts to listen to on the way.
In the spirit of full disclosure, I’ll lead with some of my favorites.
Marketplace and Marketplace Tech. Although the ads have been getting longer and more intrusive lately, these are still two evergreen faves. They’re news shows about business, the economy, and (in the latter case) technology, but they’re much lighter than the description makes them sound. I considered it a career highlight a few years ago when I was interviewed for a segment, and a brief clip of my voice made it on the air. Over time, you start to pick up on the personas of some of the reporters; nobody does an aural smirk like Amy Scott. They recently added CNET alum Molly Wood, who fits the show eerily well. She and Kai Ryssdal even started a spinoff podcast called Make Me Smart that gives their banter some breathing room.
The Pollsters. This one is for the poli sci nerd in me. It’s a more-or-less weekly show by a pair of pollsters with different political affiliations. Margie, the Democrat, has a bit of Janeane Garofalo or Daria in her delivery; Kristen, the Republican, is typically more chipper. It rarely falls into sparring territory, though; they seem to genuinely like each other, and they share a taste for testing hypotheses against data. (They drop words like “crosstabs” into conversation, assuming that listeners know what they mean.) The combination of humor, civility, and empiricism grows on you.
This American Life. Well, yeah.
Death, Sex, and Money. The host, Anna Sale, has three great strengths. She can sniff out a story from an unlikely corner, she gets interviewees to open up as well as anybody, and she has the best laugh in the business. Her recent interview with Alec Baldwin actually made him seem likable, which takes some doing. Bonus points for her jaw-dropping interview with former senator Alan Simpson a couple of years ago.
Reveal. Despite inconsistent audio, the stories are often excellent. It’s investigative journalism from the bottom up.
The Dollop. It’s an American history podcast in which one comedian does a play-by-play of some obscure moment or person in history while the other riffs on it. Except when they have guests, or both start laughing, or Gareth forgets to be funny for a few minutes and just keeps repeating “right.” It’s a sort of audio Mystery Science Theater, but based on real events. The quality of the episodes varies, but the one about Sylvester Graham (featuring guest Patton Oswalt) is great fun, and the one about baseball legend Bill Veeck and disco demolition night at Comiskey Park had me laughing out loud as I drove. The language can be a bit salty, so it’s not kid-friendly, but the better episodes are worth it.
Higher Ed Happy Hour. I don’t especially care to listen to people drinking, but the conversations are often quite good. Kevin Carey and Libby Nelson come off as good sports, and the guests are often quite smart. I’ve got this one on my “career goals” list, too.
Comedy Bang Bang. This one is not family friendly. It’s set up as an interview show, but it follows the cardinal rule of comedy improv: “yes, and.” Given its length - well over an hour for most episodes - “yes, and” can lead pretty far afield. Paul F. Tompkins’ occasional impression of Andrew Lloyd Webber is a highlight, and any episode featuring Jessica St. Clair’s “Marissa Wompler” is more than worthwhile. (“Womp it up!”)
The Dana Gould Hour. It’s usually much more than an hour, and it comes out when it comes out, but it’s often brilliant. Gould is a comedian who used to write for The Simpsons, and if you follow the podcast long enough, you get to know him weirdly well. He’s obsessed with the Planet of the Apes movies, Kolchak the Night Stalker, and anything on Boston tv in the 1960’s and 70’s. He’s a terrible interviewer, but his guests are good sports, and it’s frequently funny. “Two Guys from Boston” is a great recurring bit that could easily be its own show.
And an honorable mention from the audiobook world goes to anything by P.G. Wodehouse. His stuff can seem stilted on the page, but when read out loud, it’s perfect. He once referred to his books as musical comedy without music, and there’s truth to that, but just listen for the use of language. His politics were obtuse and the plots insubstantial, but who cares -- he played the language like a maestro. Listen with writer’s ears.
Wise and worldly readers, what would you suggest? Heard any good podcasts lately?
Tuesday, June 20, 2017
Who should control curriculum at public colleges? States, or individual departments?
Each answer has a pretty clear implication for transfer.
I’m facing this one because some of the four-year schools in my state -- not naming any names -- have recently declared that they’re each unilaterally changing the rules for which courses they’ll take in transfer. In at least one case, the impetus seems to have come from one department within the receiving institution before spreading. This in a context of a state with a law mandating transfer, but with asterisks that seem to grow over time.
If you’re a believer in the autonomy of academic departments, this act of nullification may strike you as a blow for freedom, or standards, or some other good thing. If the Basketweaving department at Compass Direction State U wants to hoard credits for itself, it can simply declare ex cathedra that similar credits from community colleges supported by the very same taxpayers that support CDSU just aren’t good enough. Or similar enough. Or whatever enough.
If each college -- hell, each department within each college -- is permitted to nullify statewide agreements, though, it won’t be long before the requirements of the different four-year schools start to conflict with each other. (That’s already happening.) If you’re a community college with a strong transfer function, whose curriculum do you mirror? To put it differently, whose curriculum becomes your default choice?
I understand the temptations of devolution. A few weeks ago I got into a colloquy with Richard Florida about devolution in the context of the power relations between the Federal government and cities; broadly speaking, he favors more local autonomy, and I consider it a trap. His argument is that many cities are far ahead of national governments in addressing some very real social problems, and they shouldn’t be hamstrung by an agglomeration of rotten boroughs from doing what needs to be done. My argument is that in actual historical practice, locally oppressed groups have sought to socialize conflicts for a very good reason; it’s no coincidence that the rhetoric of “states’ rights” was deployed in the service of systematic racism. The smaller the venue, the easier it is for local tyrannies to reign.
The stakes are juuuuust a bit smaller in this case, but the basic logic is the same. If we conceive of public colleges as entirely freestanding, then yes, the argument from devolution makes sense. But if we see them as part of a larger ecosystem, then the idea that any department can go rogue at any time, for any reason, quickly becomes ridiculous. Assuming that the taxpayers would prefer not to subsidize the same student taking the same course twice at different schools -- a safe assumption, in my experience -- there’s a compelling argument for some sort of central authority to be able to override local preferences.
That can be done with varying degrees of grace, of course. In Massachusetts the state did it right, starting with a series of statewide gatherings of faculty from both sectors, clustered by discipline. It basically locked each discipline in separate rooms and told them not to come out until they had the outlines of a workable agreement about what everybody would teach and what everybody would accept. The state mandated that the colleges agree, but remained neutral as to the content of the agreement. That struck me as a smart way to do it. I sat in on the poli sci discussions, during which it quickly became clear that the expectations of the two sectors were almost entirely distinct. (Intro to American Government was the only point of consensus.) And while it would have been easy for a given department on a given campus to tell a dean who asks too many questions to go pound sand, it was much harder for a professor at Compass Direction State to tell a counterpart at Local Community College that he didn’t count.
Because at the core of these battles are students who lose credits in transfer. We know from the literature that credit loss upon transfer is a powerful predictor of attrition, and for obvious reasons. We also know that neither legislators nor taxpayers relish paying for the same thing twice. If we want to provide “guided pathways,” it would be much easier if the pathways upwards were consistent with each other. Otherwise, we’re left with an advising task of such complexity that it guarantees failure at scale.
Wise and worldly readers, is there a better way than the Massachusetts model? Or are we stuck with a choice between centralized dictatorship and entropy?
Monday, June 19, 2017
For Father’s Day, one of my gifts was control of the tv for the day. I used it to show the kids Monty Python and the Holy Grail, a classic to which they might not otherwise be exposed. (I fast-forwarded past the “Castle Anthrax” scene, for reasons obvious to anyone who has seen the movie.) The Boy, very much his mother’s son, liked it okay; he laughed a few times, and recognized the “it’s only a flesh wound” scene from some memes. The Girl, very much her father’s daughter, laughed out loud throughout. The catapulted cow brought an abrupt belly laugh that I never get tired of hearing.
I even took some parental pride in noting how she was able to follow the dialogue in the “King? We have no king!” exchange. A chip off the old block.
Later in the day, The Boy and I were out on some errands when the topic of white privilege came up. I wanted to convey that it’s a real thing, and a problem, without making him defensive, so I used the example of my daily drive through a high-income area on my way to work. In two years of driving through some places I could never afford to live, I’ve never been pulled over. That’s a sign of a certain benefit of the doubt that isn’t universally shared. It’s not my fault, or his, that we don’t get pulled over; if anything, I think that non-practice should extend to everyone. But to pretend it isn’t there is to miss something basic. He seemed to accept that.
Later, he started talking about putting together a college tour this summer. I warned him that I intend to be an absolute, unapologetic nightmare on campus tours. He’s resigned to it.
I bring these up because a couple of recent articles threw them into relief for me. First was Eric Hoover’s piece in the Chronicle about a college adviser at a Texas high school with many low-income, homeless, and/or undocumented students. The second was Annie Lowrey’s piece in the Atlantic reacting to Richard Reeves’ new book about upper-middle-class opportunity hoarding.
The college adviser in Texas works with students for whom the FAFSA can be an insurmountable obstacle, whether due to low income, familial chaos, or immigration status. One student had to write an essay to the financial aid office explaining how she lives on almost no money; I can’t imagine an upper-middle-class kid being asked to justify how he lives. Bureaucracy that more fortunate students can effectively delegate to their parents becomes a real problem when the parents either aren’t there or aren’t in a position to help navigate it. The recent decision to take down the IRS site that allowed access to previous years’ tax returns made an already difficult task that much harder, unless your parents were able to afford TurboTax or an accountant.
The advantages we’re giving our kids - lots of books, frequent discussion of politics and current events, a good school district, a stable home - will make it likelier that they’ll do well economically. The advantages accrue over time. That amounts, at some level, to the kind of hoarding that Lowrey/Reeves describe. That’s not anyone’s fault, but it’s real.
The issue is structural, as are the solutions. I don’t apologize for giving my kids lots of books, or for putting them in situations likely to help them thrive. As a parent, I consider that part of my job. They’re great kids -- I’m biased, but still -- and I want them to be able to develop into the best versions of themselves that they can. In my perfect world, every kid would get that chance. The ethical obligation here is to use politics to pay it forward. After all, we have no king.
Sunday, June 18, 2017
How representative a class do you give when you’re being observed by a dean?
I’m guessing that for many people, it’s sort of like how they drive when there’s a cop in the rear-view. It’s not faking, exactly, but there’s an element of wanting to put on a good show.
In the case of the cop, that’s harmless enough, and probably inevitable. In the case of teaching, though, that’s kind of a problem. The only people who see the truth of long-term performance are students, and they often have agendas -- conscious or unconscious -- of their own.
That’s why I was so taken with this piece in IHE last week. Alison Cook-Sather, a professor at Bryn Mawr and Haverford, used grant funding to develop a system in which students who aren’t taking a given class are hired to sit in on it and provide constructive, non-evaluative feedback to the professor on a regular basis.
It’s a nifty idea for several reasons. For one, it separates feedback-for-improvement from feedback-for-accountability. They really need to be separate, and in practice, that can only work when they’re done by separate people. Having students provide feedback that never makes it to the dean can help the faculty improve before a problem becomes a performance issue.
Even better, the constructive feedback is from a student’s-eye view, and not based on a single day. As hard as a peer or a dean might try to emulate a student’s view, it’s hard not to bring experience and training to what you see. That can be a good thing, but it’s not the same thing. Many years ago I observed a professor that most students considered a hateful jerk, and came away wondering why; he had a sly sense of humor that I really enjoyed. They may have sensed that they weren’t getting a joke, or there may have been something else. A student observer might have had a better shot at ferreting out the real issue. To this day, I don’t know what it was.
I also like that it employs students in a way that treats their student status as an asset. It suggests that they have something useful to say, and offers a venue in which they can say it.
Too many colleges have some version of feedback-for-accountability, but no formal mechanism for feedback-for-improvement. Instead, they try to get a single system to serve both purposes. In trying to serve two purposes, it does justice to neither. If I’m teaching, and I suspect that a shaky performance on observation day will have negative professional consequences for me, I’m likely to stick to the hits on observation day. That’s individually rational, but collectively destructive. I’m unlikely to break the cycle myself, though, because if I offer a warts-and-all performance and nobody else does, I look like a problem. But splitting the roles between two people allows for both functions to be served fully.
I’ve long held that we could and should try something similar in the classroom with grading. If we separate teaching from grading -- as in, swapping papers -- then the psychological dynamic between professor and student becomes simpler. Instead of the fraught and complicated “I’m helping you, but I’m also judging you,” it becomes “I’m helping you; that guy behind the curtain is judging you. It’s you and me against him.” Athletics often work like that, with either the clock or the other team serving as the de facto grader.
At Brookdale we’ve established a cadre of faculty who are “on call” to provide non-evaluative observations and feedback for other faculty. Uptake so far has been modest, but I remain convinced that there’s merit in the model. The observers are sworn to secrecy about who they observe and what they see; all I ask at the end of the semester is a total number. Anecdotally, the observers report that the post-class conversations are often quite good, since there’s nothing stopping them from telling the truth. And from an administrative perspective, I’d much rather have faculty performing well than performing badly for a whole bunch of reasons. If the occasional tune-up helps someone who’s struggling to get back on track, everybody wins.
Kudos to Prof. Cook-Sather for finding a way to take that model and expand it to include students. It’s a terrific idea, and the concept -- if not the funding -- is easy to replicate. I tip my cap.
Thursday, June 15, 2017
This piece on the idea of a universal basic income connects dots in some pretty disturbing ways. I’m still not sure what to make of it.
UBI is the proposal to make a certain income level a basic right of citizenship, like in Alaska. On the left, it’s usually conceived as a supplement to other benefits; on the right, it’s usually conceived as a substitute for them. The idea has caught on recently among some techies as a response to technology-driven worker obsolescence. Yes, automation, IT, and the emerging internet of things may cause mass unemployment, they concede, but with a good UBI program, who cares? Besides, the internet is the greatest mass entertainment system ever devised; as long as all those useless people have screens and connections, what’s the problem?
The UBI idea has a long history. It showed up in various forms of 19th century utopian socialism, for instance. Oscar Wilde argued in “The Soul of Man Under Socialism” that between technological advances and political ones, we could harness technology to free workers from boring tasks so they could focus instead on higher things, like art and philosophy. (This was before Netflix.) In the middle of the 20th century, some very smart people worried about the ever-shortening workweek and the cultural deadening that would follow from excessive leisure. John Maynard Keynes speculated about the coming fifteen-hour workweek, extrapolating from hundred-year trends. In the 1960’s, no less a thinker than David Riesman titled a collection of essays “Abundance for What?” Now, instead, full-time jobs often expect 60-plus hours per week, and we sneer at the fuzzy-headed idealists who think that parental leave should extend at least until the baby is old enough to walk.
Part-time jobs have grown, but with less-than-proportional pay. Making a living often requires stitching several of them together, at the expense of the art and philosophy -- and, yes, Netflix -- that Keynes and Wilde thought reduced hours could enable.
So would just paying people to step off the treadmill solve the problem?
Jobs are about more than income, as important as income is. They’re about usefulness, and all that entails.
Tyler Cowen notes in The Complacent Class that the cities in the US with the most integration by economic class -- Cincinnati, Buffalo, Rochester -- mostly reflect the 20th century economy, and are losing ground. The cities with the most segregation by class -- Boston, New York, Seattle, Austin -- reflect the 21st century economy, and are pulling away from the rest. It’s possible to read the 2016 election in part as a reaction against a growing, inchoate, visceral sense of irrelevance. The reaction is one thing, but the sense of irrelevance is another.
When students talk about wanting jobs and careers, part of what they talk about is wanting a place in the world. A good job offers an income, but it also offers a reason to get up in the morning. It offers the possibility of mobility, and a felt sense of agency. And in some cases, it offers real political and economic power. It’s no coincidence that the labor movement was based in, well, labor; having something of economic value to trade -- the ability to work -- offered leverage that could be used to gain political power. Lose the value of labor, and you lose that leverage. Apps don’t unionize.
People who feel useless aren’t at their best. To the extent that the economy cleaves into one group that works 80 hours a week and another that’s told to be happy watching screens in economic backwaters, I don’t see a healthy polity. Political polarization isn’t just about bad manners; it’s rooted in a deeper economic polarization. Add a difficult racial history to the mix and the picture gets even uglier.
Brookdale’s 50th birthday is next month. It, like most community colleges in America, was founded at the height of what economists call The Great Compression. It was the period when income polarization was at its lowest. Community colleges are physical incarnations of the assumption that prosperity and its demands are spreading everywhere, and for a while, that assumption was correct. Now prosperity is retreating into a few enclaves. Public higher education wasn’t built for that. It’s a difficult adjustment, and the jury is still out on how best to handle it.
At a basic level, though, I think we need to reject the idea that consigning much of society to permanent uselessness is somehow okay. There’s talent in unlikely places, and possibility where you might not expect it. And there’s a dignity that comes with having a contribution to make, whatever that contribution is. For all of their flaws, the single most appealing trait of community colleges and similar places is their belief in the dignity and possibility of everyone, no matter their income or family situation. They empower people to make better lives for themselves. They live out the assumption that usefulness isn’t the exclusive property of anybody. It isn’t.
I’m perfectly happy to consider UBI as a policy tool among others for improving lives. But to the extent that it rests on an assumption of widespread uselessness, I have to say no. A belief in empowerment and dignity for everyone may be dated, or fuzzy-headed, or corny, but I don’t care. It’s what gets me out of bed in the morning.
Wednesday, June 14, 2017
James Baldwin once noted that being poor is expensive, and that people who’ve never been poor don’t know that. He was right. If your car is old and unreliable, you lose job opportunities due to breakdowns, and emergency repairs throw your budget into disarray. If you need to use a laundromat to do laundry, you lose time and energy on a task that other people can do in the background. When you live in a low-income area, other people’s issues are inescapable; when you live in a more affluent one, you can buy a certain insulation from them. Their issues can become your issues.
The same principle applies to institutions. Paradoxically, austerity is expensive.
From a layman’s standpoint, I’d say that American treatment of passenger rail is a clear case of expensive austerity. The maintenance nightmares in, say, the DC Metro, the BART in the Bay Area, or Penn Station in New York are all direct and predictable side effects of too little maintenance for too long; at this point, their cost in both direct repairs and lost productivity from delayed riders outstrips any marginal savings on maintenance. Similarly, many Amtrak routes are both slower and rarer than they should be; as a result, ridership dwindles, which leads to still more cuts. At a certain point, the service becomes so degraded that it’s hard to argue for saving it.
Colleges have direct and indirect costs of austerity, too. For example, not replacing full-time faculty who leave can lead directly to an inability to offer enough sections of certain courses to meet enrollment demand. That leads directly to reduced enrollment, which forces -- wait for it -- more cuts. The upfront savings from not paying somebody are more than consumed by the lost revenue from sections that couldn’t run.
But the long run costs, while subtler, can be more devastating.
One, as with trains, is deferred maintenance. That’s an easy trap to fall into, especially when money for new construction is easier to find than is money for maintenance and renovation. In a given year, deferring some expensive repairs can be prudent, but you can’t do it forever. Small leaks become big ones, and eventually, patches of patches of patches fail. When that happens all at once, the collateral damage -- lost property, cancelled classes, and, in the worst cases, injury -- can make the deferred costs look like bargains by comparison.
Another is stasis, which leads inexorably to loss of quality. Yes, conference travel can be expensive, and I grant without argument that some people are good at blurring the line between professional and personal travel. But my nightmare isn’t that someone goes to a conference and sneaks out to an interesting lunch place between panels. I can live with that. My nightmare is that someone with tenure doesn’t go anywhere, ever, for decades. It’s a sort of deferred maintenance, but for the intellectual infrastructure, and it has similar effects. It leads to losing touch with developments (and people) in the field. The effects are gradual, but cumulative, and they start to show in performance.
Reduced performance is the greatest cost of austerity. It happens in several ways. As people stop traveling, they lose touch. As more full-timers get replaced by adjuncts in high-turnover fields, hiring managers gradually lower the bar to the level necessary to get the classes covered. Eventually, a sort of fatalism sets in, in which ambition itself is brushed aside with a cynical wisdom based in just enough truth to be hard to refute.
I mention all of this not because I have access to the Money Fairy -- I don’t -- but because in the absence of a willingness to try something different, austerity is the default path. In the very short term, it’s easier than taking a risk. But over time, it catches up to you. I’d rather take some risks while the roof is still holding. It’s cheaper that way.
Tuesday, June 13, 2017
I remember my hair. It was never my best feature, but it did its job. It even endured some questionable styling choices, including an 80’s mullet of which the less said, the better. It’s gone now, and I miss it. But it’s not coming back.
Part of getting older is coming to terms with certain kinds of loss. I wouldn’t want to be 19 again, heaven knows, though I miss the hair and metabolism of that age. Middle age brings with it some undeniable drawbacks -- anyone who has heard my knees when I stand up after a while knows what I’m talking about -- but it brings perspective, a certain social standing, and a different kind of confidence. I will never be The Hot Guy, but I no longer care, and there’s a power in that.
Serious students of history smile indulgently when they hear people talk about golden ages, broadly defined. (That’s not to deny that individual people or projects can have hot streaks; most listeners would probably agree that Paul McCartney’s work in the 60’s was better than his work in the 80’s.) Golden Ages rely on partial and selective memory. Youth brought with it a certain physical invincibility, but also an anxiety that pervaded almost every aspect of life. Now, when anxiety exists, it exists for a reason. That wasn’t always true, and I wouldn’t go back for anything.
I’m running up against some Golden Age thinking among peers, and it’s frustrating. It’s getting in the way.
In much of the country, community colleges are in a secular decline in enrollment. They’re up against greater public and political scrutiny than they once were; arguments from professional deference have largely given way to demands for accountability, even as many of the older deference-based rules have remained in place. Their funding is flat or nearly so, if it hasn’t been slashed or eliminated. Health insurance costs continue to climb much faster than any revenue source. Some tuition-driven four-year schools are lowering their standards to fish in our pond, exacerbating the enrollment problem.
But digging in heels and opposing anything new won’t bring the old days back. In fact, the old days led inexorably to the new ones. Had the old ways been sustainable, they would have been sustained. They weren’t.
In looking at ways to adapt to the new environment, I keep butting up against longing for the return of the golden age. If we just refuse to budge, the argument goes, the universe will relent and it will be 1977 again, only with more diversity and cooler phones. We can stand athwart history, yelling Stop!
Except that we can’t. And refusing to engage with the future amounts to giving up any meaningful agency in shaping it.
On a personal level, coming to terms with loss takes time. The same is true on an institutional level. My fear is that the longer we spend in denial, the less room we’ll have to move. I’d rather have some say in shaping the future than in simply having it happen to me. But golden ages die hard, even when they’re already dead.
Monday, June 12, 2017
Ms. Mentor has her niche and I have mine, but her column on portrayals of deans in literature was too much fun to pass up. Apparently, a recent spate of literature has characters killing their deans, who often go unmourned. Ms. Mentor notes that deans are “squeezed and budget-cutted [sic] from above, harangued and belittled from below,” which is about right. But their relative invisibility offers a useful perspective on a larger truth.
Most management books are written from the perspective of the CEO or the founder. They assume that your job is to make the big decisions and to hold other people accountable for executing those decisions. That’s a pretty simplistic view of how decisions are made, especially in academe, but it has a kernel of truth, at least in the corporate world. (In my observation, boards of trustees often take much larger roles at nonprofits than the literature tends to assume, constraining even the CEO’s autonomy. But that’s another post.)
But that’s not what most managers are. Most managers are closer to deans than to presidents. They’re in the middle. They can try to nudge upwards, and sometimes that works, but they don’t set the overall direction. And while they can facilitate good implementation, it’s not like they can bark commands and demand obedience. That’s simply not how this works. They have some room to move, but far less than most management literature assumes. And it’s not unusual for them to find themselves tasked with carrying out policies with which they personally disagree. When success in a role relies largely on “soft power,” that kind of disagreement can be a real strain.
Obviously, if the disagreements become too large or frequent, the right move is to step out of the role. But that’s the exception. More commonly, there’s a vague sense of “I wouldn’t have done it that way” that falls well short of a crisis of conscience, but can be enough to sap motivation. That’s especially true when budgets are tightening and adverse decisions are made for you.
There are great books waiting to be written on the dilemmas of middle management. (I like to think my own book on the subject is pretty good, come to think of it…) Maintaining your own credibility while nudging people to comply with policies about which you have reservations yourself is a tricky maneuver. The proper literary mode for a character like that is probably tragedy.
From the perspective of the middle, a clear sense of “why” is crucial. It’s much easier to work with a policy, even one you don’t love, if you understand the reason it exists. That puts a burden on central leadership to be clear about the why, and to make some sort of plausible connection between means and ends. The connection will have to be at a high level, leaving the details to the people closer to the work, but the storyline has to be clear. In the absence of a narrative, people will make up their own. And their own will reflect other agendas.
Most of the time, loathing of deans is misplaced. They stand as symbols of The Administration on one side, while being suspected of being collaborators on the other. Most of the time, they’re neither. They’re somewhere between translators and coaches. But almost nobody who hasn’t done the job knows that.
I tip my cap to the deans out there, melancholy or not. They don’t deserve to be cast as villains. Tragic heroes, on the other hand...
Sunday, June 11, 2017
Why is it that meal plans are covered by financial aid at elite universities, but cafeteria meals at community colleges aren’t?
Sara Goldrick-Rab has been asking lately why the free lunch program in public K-12 schools doesn’t extend to community college.
She has a point.
We know that students who are distracted by hunger can’t focus as well as students who are fed. We also know that substantial numbers of community college students are only precariously housed, often couch-surfing or bouncing from one bad situation to the next. We know from national data that significant percentages of community college students skip meals because they can’t afford to eat.
I’m thinking that it might actually be easier to implement in this setting than in K-12.
Most college students don’t attend class five days a week. Even full-time students frequently come to campus only three or four days per week. Assuming that four lunches are cheaper than five, that cuts the per-student cost significantly.
And at this point, most community colleges have accounts by which students can pay for food in the cafeteria with their ID’s. That means that a student who gets a free (or reduced price) lunch with her ID looks exactly the same to everyone else as a student who doesn’t. The stigma of standing out is removed.
Tying the free lunches to continued enrollment can provide an incentive for struggling students to finish their semesters; dropping out would mean losing access to food!
The details would take some tweaking, since most colleges have outsourced their cafeterias to various for-profit companies and run them as auxiliary moneymaking enterprises. But that doesn’t strike me as a deal-breaker. If the cards were loaded with x dollars per day, for instance, they would work perfectly well with outsourced enterprises. The key would be ensuring that the allocation is reasonably related to the cost of food.
Residential colleges, including some very expensive and high-toned ones, routinely include the cost of a meal plan in the total cost of attendance, and students can use financial aid to cover it. Why meals can be covered at, say, Harvard when they can’t be covered at Bunker Hill Community College isn’t entirely obvious.
Or maybe it is. But I prefer to think it isn’t.
To be clear, when I say “lunch,” I actually mean “food.” That could mean breakfast or dinner. The key, I think, would be some sort of reasonable per diem on the ID card that students could use at the campus cafeteria. Because it would be a dollar figure, we wouldn’t have to get into the bureaucracy of deciding which items count and which don’t, or when breakfast ends and lunch begins. We could make it simple. Alexis gets, say, $7 a day to use on campus towards food. How she uses it is up to her. Keeping it simple would keep administrative costs to a minimum, allowing more of the money to go directly to food. Each day could be “use it or lose it,” so it would only cover the days students are actually on campus; that would keep costs down, and would provide an incentive to show up for class.
The actual cost would be fairly low, given that most students would only get three or four meals a week on campus.
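To put rough numbers on that claim, here’s a back-of-envelope sketch. All of the figures are hypothetical illustrations -- the $7 per diem from above, an assumed 15-week semester, and assumed on-campus schedules -- not data from any actual college:

```python
# Back-of-envelope cost of a use-it-or-lose-it campus food per diem.
# Every number here is a hypothetical assumption for illustration only.

def semester_meal_cost(per_diem: float, days_per_week: int, weeks: int = 15) -> float:
    """Per-student cost of a daily food allowance that applies only on
    days the student actually comes to campus (use it or lose it)."""
    return per_diem * days_per_week * weeks

# A full-time student on campus four days a week, at $7/day:
four_day = semester_meal_cost(7.00, 4)   # 7 * 4 * 15
# The same student on a five-day schedule, for comparison:
five_day = semester_meal_cost(7.00, 5)   # 7 * 5 * 15

print(f"4-day schedule: ${four_day:.0f} per semester")   # $420
print(f"5-day schedule: ${five_day:.0f} per semester")   # $525
print(f"Saved by tying the allowance to campus days: ${five_day - four_day:.0f}")
```

Under those assumptions, the tab comes to a few hundred dollars per student per semester -- real money, but nowhere near the cost of free tuition.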
It’s far cheaper than free college, but it would make a real difference. Students would know that they could get lunch in between classes, even when they’re otherwise broke. They could focus on their work, and maybe have some relaxed time with friends between classes. It’s the sort of thing we subsidize routinely at Stanford and Williams. Why not here?