Wednesday, December 20, 2017
Every so often I hear a question that makes me wonder why I wasn’t wondering about it already.
We’re getting a new governor shortly. He ran on “free community college,” though it’s still unclear what that will mean in practice.
In a conversation on campus this week, someone mentioned concern about a possible impact on philanthropy. If word of free community college spreads, will donors think that issues of access have been solved and redirect their giving elsewhere?
As Sara Goldrick-Rab has pointed out repeatedly, tuition is only one cost. Taking it off the table -- which I support -- still leaves the costs of books, transportation, and daily life, along with the opportunity cost of paid hours given up to make time for classes. But potential donors may or may not be aware of that, or appreciate what it means.
So I’ll put out a call to my wise and worldly readers who live in places with well-publicized “free community college” programs. Have you seen an impact on philanthropy one way or the other?
Along those lines, it’s worth remembering -- as this piece reminds us -- that free community college is a bipartisan issue.
Thanks to President Patricia McGuire of Trinity Washington University for a shout-out in her keynote speech at NEASC this month. Her speech is a welcome call to action to her counterparts in the four-year sector to take community colleges more seriously as educational partners.
In a speech that references American Nazis, it’s nice to be named as being on the other side...
I’ll be taking a blogging break for the holidays, returning in the first week of January. Best wishes to my wise and worldly readers for peace, happiness, and a 2018 much more worthy of you than 2017 was...
Tuesday, December 19, 2017
About a year ago I was at a workshop in which the participants were asked to write down their managerial superpowers. I wrote “talent scout.” I stand by it.
Talent scouting in academia is different than in sports or show business. In sports and show business, you’re seeing people play or act, just as you’re scouting them to do. You may be looking for clues as to how they’ll perform at a higher level of competition, but the basic tasks are the same. A high school baseball player does the same tasks as a college or professional one. And usually, the prospective players or actors go out of their way to get noticed.
Talent scouting here is different.
In teaching, sometimes it means delivering the tap on the shoulder to let a particular student know that she’s better than she thinks. Clint Smith’s tweet yesterday reminded me of that. I’ve had a few students over the years who were much more thoughtful than they gave themselves credit for; once I gave them explicit permission to be smart, they took off. Teachers have that power, at least sometimes.
Using that power well means distinguishing between the perfection of a flawless task and the errors of growth. Flawless execution of set tasks is all well and good. But the ambitious ones who sometimes reach a little beyond their grasp -- the rookie who gets thrown out trying to stretch a single into a double, to extend the baseball metaphor -- are often the ones who benefit most from being noticed. Errors of sloppiness or apathy are fundamentally different from errors of growth; the best teachers know immediately which are which. The catch is that sometimes the students themselves don’t know the difference, and can undermine themselves by dwelling on their mistakes. When you single out the one who’s stretching and offer constructive praise, the results can be amazing.
In administration, talent scouting often means finding latent abilities in people -- abilities other than the ones they were hired, and are paid, to perform. That’s very different from the baseball scout watching a high school pitcher. It’s closer to figuring out which player would make the best manager. The skills overlap to some degree, but they’re distinct enough that the best at one role are not necessarily the best at the other.
When I moved from full-time teaching to full-time administration, I used to describe the change as being like switching from sprinting to distance running. They’re both demanding, but in different ways. The best sprinter may or may not be the best distance runner, and vice versa.
I’ve known some absolutely amazing classroom performers who leveraged their considerable charisma and flair for the dramatic to deliver lectures for which they could have sold tickets. When they were on stage, they commanded it. In the right setting, with the right boundaries, that can be wildly effective. But that doesn’t really translate to administration, where the tasks are often more about collaboration, diplomacy, and patience than about owning the room.
In discussing a controversial campus proposal, for example, I’m typically more impressed by a really good question than by a fire-breathing tirade. The person who can come up with the third, better option to a dilemma catches my eye. And the one who can see past what’s said to get at what’s meant -- even if the speaker isn’t consciously aware of it -- and build a solution on that, is a star.
The limited hiring of the last few years has been frustrating, not least because it has reduced the relevance of talent scouting. But it never stops mattering. Sometimes education is about showing someone a talent he didn’t know he had, whether it’s a professor with a project or a third grader with a poem.
Thank you, Clint Smith, for a timely reminder.
Monday, December 18, 2017
We cost too much to insure, because our employees’ average age is climbing. And our employees’ average age is climbing because we can’t afford to replace most who leave.
(insert sound of head hitting desk, over and over again)
Insurers express the ratio of what they spend on claims to what they receive in premiums in the form of an “experience rating.” An experience rating of 80 means that the insurance company pays out 80 percent of what it receives from a given employer. An experience rating of 100 means that it’s paying out every dollar it receives. An experience rating over 100 means that it’s actually losing money on that employer.
If your experience rating is over 100, you have no ability to shop around for different providers. Any provider not required to cover you, won’t. You’re too expensive.
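The arithmetic is simple enough to sketch. The dollar figures below are hypothetical, purely for illustration:

```python
def experience_rating(claims_paid, premiums_received):
    """Claims the insurer pays out, as a percentage of premiums it takes in."""
    return 100 * claims_paid / premiums_received

# An employer whose insurer pays out $800k in claims on $1M of premiums:
print(experience_rating(800_000, 1_000_000))    # 80.0 -- the insurer keeps 20 cents per dollar
# An employer whose claims exceed premiums:
print(experience_rating(1_100_000, 1_000_000))  # 110.0 -- over 100; the insurer loses money
```

Once the rating crosses 100, no insurer that isn’t already obligated to cover you has any reason to quote you a price.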
My college’s experience rating is over 100, and climbing.
It’s not because our premiums are modest, heaven knows. It’s because the combination of life tenure, the absence of a mandatory retirement age, and years of cutting by attrition -- that is, failing to replace retirees with new people -- means that our generational distribution is top-heavy. In the aggregate -- meaning that yes, there are individual exceptions -- people in their 60’s and beyond consume more medical care than people in their 20’s and 30’s. We can’t afford to hire very many people because we’re spending too much on insurance. And we can’t get our insurance rates down because our average age is high and climbing.
It’s a frustrating spot to be in, and not just for the obvious financial reasons. The various elements of the dilemma came about independently of each other, and none of them is really subject to argument.
When Congress outlawed mandatory retirement ages for tenured faculty -- a change that took effect shortly before I went on the faculty job market -- it did not stipulate that states or counties provide extra money to colleges to compensate. It simply extended tenure, which the AAUP redbook originally stipulated would end at the “normal retirement age,” until death. Suddenly, there was no fixed upper bound on employee age. While that presented a windfall for some incumbent employees, it wreaked havoc on employers. Cutting by attrition suddenly became both more haphazard and more urgent.
Meanwhile, collective bargaining agreements typically base salaries largely or entirely on seniority. Theoretically, a contract could include a salary cap, but it would be impossible to bargain, and would almost certainly fall victim to claims of age discrimination anyway.
Colleges responded the only way they could. They started hiring far fewer tenure-track faculty, and placing more of the teaching onto the ranks of adjuncts. By a sort of risk homeostasis, making one group bulletproof required shifting more risk onto another group. The retirement age may have gone away, but institutional risk did not. It had to go somewhere.
In the short term, it’s hard to argue with the economic logic of the move. You may not be able to do anything about the cost of very senior employees, but you can convert their lines to part-time status when they go. Health insurance may continue to go up, but if it applies to fewer people, you can blunt the rate of increase for a while.
Now we’re seeing the economic limits of that strategy. If you don’t have a decent-sized cadre of younger employees in the insurance pool, the rates go up even faster. (And that’s without addressing the impact on generations X and Y of having what used to be full-time jobs turn to adjunct positions. As a gen X’er who got his start as a freeway-flying adjunct, I’m keenly aware of this.) The very expedient that solved a short-term problem created a longer-term one.
To my mind, the blindingly obvious solution is to decouple health insurance from employment altogether, and make it a right of citizenship. That would make experience ratings irrelevant, and would dissipate perverse incentives. It would also be the morally right thing to do. But that doesn’t appear likely, at least for a while.
There’s a decent “intergenerational justice” argument for reinstating a retirement age, too. Though given the size of the Boomer voting bloc, I’m not holding my breath on that.
So for now, we’re kind of stuck. It’s not any one person’s fault, which is the good news and the bad news. So I’ll just appeal to our political leaders. If these are the rules by which we have to play, we need help. Either change some rules or pony up some money. We can’t keep doing this. My certainty rating on that is well over 100.
Sunday, December 17, 2017
What would we do, as an industry, if one LMS provider paid for “fast lane” internet access and the others didn’t?
It could happen, now that the FCC has voted to repeal the requirement for “net neutrality.”
Without a legal prohibition, landline internet service providers are now allowed to charge websites extra for ‘fast lane’ service, to slow content with which they compete, and to block entirely any content they want. (My impression is that those rules never applied to mobile providers, such as those that cover smartphones. They were about home broadband.) So if, say, Comcast wanted to stop cord-cutters, it could slow or block content from Netflix or Sling TV. And if a politically motivated owner were to take charge of an ISP, it could make life difficult for sites on the other side politically.
As someone tweeted last week, if a liberal CEO of Comcast were to suddenly start throttling pro-gun sites, we’d have net neutrality back within a week. (There’s historical warrant for that. When the Black Panthers started brandishing guns in Oakland, then-Governor Reagan suddenly saw the wisdom of gun control laws.) Laws to stop change tend to favor the folks currently on top. At least for a while.
Ajit Pai, the chair of the FCC, has claimed that moving enforcement of internet rules from the FCC to the FTC will still allow enforcement of judgments against anticompetitive practices. That may or may not be true as a practical matter, but even if it is, it’s only after the fact, and only in a commercial context. The FTC offers no protection to political speech. That’s not a part of its mission. The FCC does, but not for the internet anymore. If, say, Comcast decided to give Fox News preferential access and to throttle the Washington Post, it could.
Netflix would probably be willing and able to pay the toll for a fast lane. But the next Netflix, still incipient, wouldn’t. It’s hard enough to compete with Google now; give Google a literal head start into everyone’s home, and it would be impossible.
It doesn’t take much imagination to see, say, Amazon or Google buying Blackboard, and letting it piggyback on its ‘preferred’ status as a fast lane site. That would immediately put users of other learning management systems in a bad spot. Over time, that would lead to even greater consolidation in a field that has already seen significant consolidation. We’d be stuck paying whatever licensing fee it asked, and living with whatever features and support it bothered to offer. In the absence of meaningful competition, it would have no reason to spend much on improvement. Colleges and students would suffer.
In most places, home broadband suffers from a lack of competition now. A disturbing number of students report that they write papers on their phones, because that’s their only internet access off campus. Now the ISPs will be emboldened to charge more and provide less, simply because there’s no legal or competitive pressure not to. Add consolidation in the LMS market, and our students will be even more marginalized than they already are.
Part of the visceral appeal of electronics and the web has been the remarkable openness and speed of innovation. We stand to lose that now, with the internet congealing quickly into a few giants that stomp out anyone or anything new. And our students stand to be even more at the mercy of a few behemoths, just like we will be. (Tressie McMillan Cottom has a characteristically smart take here; check it out.)
Technically, Congress could override the ruling, but I’m not holding my breath. It doesn’t seem to care much about the popular will or the common good these days. Our best hope, heaven help us all, may be litigation. At least until there’s a more fundamental change.
In the meantime, here’s hoping that the consolidation wave spares the LMS market long enough for a new administration to come in and undo the damage.
Thursday, December 14, 2017
A new correspondent writes:
I have been working at a medium-sized, multi-campus community college district in flyover country for a decade. After years as a full-time faculty member -- attaining tenure in a social science discipline, serving in our faculty senate, working on curriculum development, and sitting on various committees (even chairing a significant one) -- I developed an interest in coming up through the faculty ranks and going into academic administration. My goal was to aim for a Dean of Instruction position after two three-year terms as a department chair, and I’ve received a lot of encouragement from both faculty and administration. To make sure it was the right fit for me, I just completed my first year as a multi-discipline department chair. I’ve been handling budgets, scheduling, faculty and student complaints, hiring and evaluations, shared governance, faculty leadership, and teaching a much-reduced load. I’ve also taken advantage of professional development funds along the way. This kind of work really appeals to the way my mind works, as much as I enjoy the classroom, and the year confirmed that my choice to make the jump into pseudo-administration wasn’t completely incompatible with my personality, skills, and values.
So, due to retirements and restructuring, several Dean of Instruction positions are opening up, and I’ve gotten a lot of encouragement to apply. Conventional wisdom says to wait until I’ve completed at least one three-year term as chair. I feel that I need to “prove myself” more and put in the time. I’m aware that this is a very gendered instinct, and that while women are often promoted on their merits and experience, men are often promoted more on their potential.
Any advice on whether I should apply for a Dean of Instruction position when it seems to be too early in my career? Any other advice for faculty interested in going over “to the dark side” of administration?
I’ve mentioned before a key moment in grad school. I was angst-ing about whether my dissertation was really finished, or it needed still more revision. The following exchange with my roommate clarified matters:
RM: How many chapters do you have?
Me: Five.
RM: How many do you need?
Me: Well, five…
RM: Turn it in. Make them tell you what’s wrong with it.
He was right. I turned it in, and my advisor immediately set up a defense. It wasn’t perfect, by any stretch, but it had the undeniable virtue of being done.
The same is true of job applications. Postings will usually specify “minimum” and “preferred” qualifications. “Minimum” is supposed to mean that you won’t even be considered if you don’t have it; “preferred” means it might help, but it isn’t required. If there’s a job you want and you meet the “minimum” requirements, I don’t see any reason not to apply. Make them tell you what’s wrong with it.
Admittedly, this strategy involves risking rejection. But if you don’t have a reasonably thick skin, you really shouldn’t go into administration. Just treat each interview as an opportunity to learn about another college and get better at interviewing. If an offer materializes, great.
In my experience, both as a dean and in supervising deans, I’ve never found age or experience beyond a certain minimum to be terribly predictive of performance. Other qualities matter much more. Contexts vary, but generally, the best deans have excellent communication skills, poise under pressure, dedication to the mission of the college, and a strong academic sense. More time as a department chair may help a little with developing a sense of which conflicts to escalate and which to defuse, but you may very well have that already. And fire in the belly goes a long way.
Rather than scrutinizing yourself, I’d recommend finding out what you can about the places you might apply and the realities of the jobs. Self-awareness, as opposed to self-scrutiny, can help you determine which ones are likely to be good fits. Then, with the jobs that appeal to you, take a shot. Make them tell you what’s wrong.
Wise and worldly readers, what do you think? Should she start applying, or should she wait until a second term as chair?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Wednesday, December 13, 2017
‘Tis the season for evergreens, so I thought I’d bring out this one. It’s from last year, but with a few details changed, it’s just as true today. To update it, just replace every reference to “ten percent” with this year’s figure of “thirteen percent.”
Brookdale recently ratified a contract with its faculty union, after a bit of a bumpy ride. I was on the management negotiating team, so I had a front-row seat for most of the process.
I can’t disclose anything confidential, but I don’t need to. Here’s what it boiled down to:
Union: Health insurance is eating our raises!
Mgmt: Health insurance is eating our budget!
Insurance Company (in the corner): Nom nom nom nom (burp) nom nom nom (chair collapses) nom nom nom
The bulk of the conflict was over how to divide the rapid increases in the cost of health insurance. The rest of it was relatively straightforward.
I suspect we’re not alone in this.
The catastrophic cost -- and rate of increase -- of health insurance is the 800 pound gorilla of higher ed finance. It’s the primary driver behind adjunctification. It’s increasing faster than any of our revenue sources, and it seems to be picking up steam. In negotiation sessions, it’s the sun around which every other issue orbits.
(For those keeping score at home, that makes it a nuclear fusion powered 800 pound gorilla that knows how to drive a steam-powered car, and anchors a series of satellites. Scary stuff.)
To make it concrete, we have three major sources of operating funds: the state, the county, and students. State and county funding have been flat for years, and enrollment is dropping. Meanwhile, the cost of health insurance goes up by at least ten percent per year. Do the math, projecting out a few years. It’s not pretty.
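Doing that math, with made-up round numbers, looks something like this. A minimal sketch: the figures are illustrative, not the college’s actual budget, and revenue is held flat while insurance compounds at ten percent:

```python
# Hypothetical illustration: flat revenue vs. insurance compounding at 10%/year.
revenue = 100.0    # total operating funds, in millions -- flat, per the text
insurance = 10.0   # health insurance cost, in millions -- a made-up starting point

for year in range(1, 6):
    insurance *= 1.10  # "at least ten percent per year"
    print(f"Year {year}: insurance ${insurance:.2f}M, "
          f"everything else ${revenue - insurance:.2f}M")
```

In five years, insurance grows by about 61 percent while revenue grows by zero; every added insurance dollar comes straight out of instruction, staffing, and everything else.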
Labor negotiations are difficult because one of the parties -- the one getting the best deal -- isn’t at the table. It just jacks up prices, and the rest of us pay them. Internal disputes are over how much of each year’s cost jump is borne by whom. Nobody internal comes out ahead.
Of course, over the long term, unsustainable trends aren’t sustained. This one clearly can’t be.
Our health insurance system, if you want to call it that, was an accident of history. It emerged in its present form as an end run around wage and price controls during World War II. With pay levels frozen, companies that wanted to recruit workers had to find other enticements, so they developed packages of benefits. By the time President Truman (!) got around to proposing national health insurance, the AMA was able to argue that it was largely unnecessary. Add some red-baiting (“socialized medicine!”) and the racial politics of the New Deal coalition, and the end run became the new normal by default.
That’s why literally no other advanced country has anything like it.
Postwar prosperity made the system tenable long enough for it to start to seem natural, but it never really made sense. Now we’re seeing the flaws in the system get so large that they start to deform or consume other sectors of the economy. Prospective entrepreneurs don’t start companies because they can’t afford to pay for their own health insurance. Employers everywhere pay careful attention to maximum hours for part-time status, because the marginal cost of going over is prohibitive. If you don’t believe me, ask your HR office what the monthly premium for COBRA is.
Locally, we managed to piece together a deal that puts off the day of reckoning for a few more years. I’m glad we did -- really, really glad we did -- but the basic underlying trendlines are still there. That’s not something we can solve locally. That requires a national solution. Absent that, I foresee the rides getting bumpier and bumpier until something breaks.
Bumpier and bumpier? Check. January’s increase will cost the college a million dollars. That’s from a total college operating budget of about 78 million. And that’s just health insurance, just the increase, and in just one year. In a time of declining enrollment.
Trees, even evergreens, don’t grow to the sky. Here’s hoping this one stops before it blocks the sunlight for everything else.
Tuesday, December 12, 2017
Several years ago I was part of a conversation with a professor about the college’s budget. Predictably, the budget was tighter than any of us would have liked. He put forth an impressive wish list, which he suggested funding with a double-digit tuition increase. When I balked at the size of the increase, he responded that “it’s just beer money” and the students could easily afford it.
He didn’t get the increase he wanted, but the comment stuck with me.
Based on what I know of him, I don’t think he was malicious. He actually thought he was right. And there are some students for whom he was right. But for the ones about whom he was wrong, he was terribly wrong. The blind spot could have had serious consequences.
I was reminded of that in reading the latest paper from Katharine Broton and Sara Goldrick-Rab, “Going Without.” It’s based on multiple surveys of tens of thousands of students at two-year and four-year colleges. Among its findings: at least a third of community college students are “housing insecure,” including 14 percent who are homeless.
That’s bad enough, but it gets really shocking when Broton and Goldrick-Rab disaggregate the numbers. In Table 5, they indicate that over 57 percent of the students coded as “homeless” are working, and working an average of just over 31 hours per week.
That was a jaw-dropper for me. Students who are working for pay over 30 hours a week are homeless. If you take the unemployment rate as your primary economic indicator, you’ll miss that. Anyone who’s working 30 hours per week and going to college can’t be accused of slacking. That’s not the core issue. The core is a pincer movement in which wages have stayed low while housing costs, textbook costs, and, yes, tuition, have increased.
Among the four surveys they examined, anywhere from 21 to over 36 percent answered “yes” to “Were you ever hungry but did not eat because there was not enough money for food?”
This isn’t a matter of beer money. It’s a matter of basic survival.
Yet the gaps we force students to straddle largely fall into the blind spots of policymakers. Part of that is because of the average age of policymakers, as opposed to students. Part of it is that most policymakers attended four-year colleges, often well-funded ones, and had no direct experience of serious hunger or finding a place to sleep at night. In their own experience, “it’s just beer money” may have been true. Broton and Goldrick-Rab note that hunger and homelessness are far more common at community colleges than at four-year colleges. (What that says about the appropriateness of measuring the performance of the two sectors by the same metrics, I’ll leave as an exercise for the reader.)
One quibble: Broton and Goldrick-Rab claim, in passing, that “[c]ommunity colleges rarely provide on-campus housing…” That’s less true than it used to be; according to Walter Bumphus’ talk at Middle States last week, nationally, 27% of community colleges offer on-campus housing. The number is higher than I expected, and climbing. It varies dramatically by state and region; for example, it’s commonplace in New York but nearly absent in New Jersey. In places where it exists, it may provide one option for housing-insecure students. But even allowing for a growth of on-campus housing in this sector, the numbers of students struggling for a safe place to sleep are staggering.
Sustainable solutions will probably require concerted efforts across sectors. I’ve read about some community colleges designating people on campus whose job it is to connect students in need to available social services in the area, which strikes me as a fantastic idea. Emergency funds, often from college foundations, can make the difference between a student staying housed and getting evicted, or between staying in an abusive situation for the sake of shelter and finding someplace safe. Even something as straightforward as Open Educational Resources (OER) instead of expensive textbooks can free up money for food, shelter, and transportation. That’s well within our power, as a sector. If the same Pell grant has more left over because a school has moved to OER and the student doesn’t have to buy books, those dollars become available for basic needs.
Larger changes require politics. Disinvestment in public higher education is a political choice. Upward distribution of wealth is a political choice. So are exclusionary zoning, financial aid rules based on the assumption that a student is 18 and full-time, and the minimum wage. No college or set of colleges can undo all of that alone.
But a tip of the cap to Broton and Goldrick-Rab for doing the shoe-leather epidemiology that may help us actually start to get a handle on this. No student who works 30 hours a week and takes college classes should be homeless. If we could stop thinking about aid as subsidized beer money, we might actually make some headway.
Monday, December 11, 2017
Remember when it was possible to forget?
Last Friday I was able to attend a panel discussion with several Brookdale faculty, organized by the students of the Phi Theta Kappa chapter here. The topic was supposed to be shifting standards of beauty over time, but the discussion had a mind of its own, and it quickly turned to the stress today’s students feel in trying to live “curated” lives on Instagram and Snapchat.
I had never seen as pronounced a generation gap as I did there. The students, mostly of traditional age or close to it -- honestly, the older I get, the harder it is to tell -- told the faculty (and other older folks in the audience, such as myself) about how they live their lives around social media. One young woman on the panel mentioned as an example that before going to a concert, she picks an outfit that she thinks will look great on Instagram, and spends the first half-hour or so at the venue setting up her shot. The point of the concert is the photo. A professor on the panel responded that when she was younger, she’d choose outfits for concerts, too, but the point of the outfit was to pick up guys. Now, the record of the moment becomes more important than the moment itself.
Looking back on my teen years, it’s only by the grace of God that most of my worst cringe-inducing moments have vanished down the collective memory hole. I remember most of them, but there’s no documentary record. They never went viral, and some of them could have. Today’s young people have all of the same pressures around self-image and dating that we did, plus a new layer of public recording that we didn’t.
As useful as social media can be, there’s also something to be said for forgetting as an act of mercy. Those early years are a time of trying on different identities, ideas, styles, and ways of being in the world. Some of them work, and stick, and some of them don’t. Keeping the clunkers around forever could inhibit the trial-and-error that allows real growth.
I’m thinking here of Richard Sennett’s classic “The Fall of Public Man,” in which he argued that the gradual replacement of social “roles” with the idea of social authenticity was actually inhibiting, because in the new regime, flaws or missteps are taken to reveal something broken about the actual person. Hearing 19 year olds now worry that pictures they post of themselves being exuberant may be used against them later when they apply for jobs or run for office struck me as the reductio ad absurdum of Sennett’s argument. The curated self is very much a role, a performance, but we don’t read it as one. We take it literally, and form judgments about the person based on the persona.
As the discussion went on, I wished it had addressed the shift at the turn of the millennium from “mass” media to “social” media. I’m old enough to remember when television had four channels: NBC, CBS, ABC, and PBS. When the means of cultural production were few and tightly centralized, most of us encountered them only as consumers. Now that most young people are producing culture daily through social media, they’re living the pressures of producers, as opposed to consumers. And they’re doing it before they’ve had a chance to grow into themselves as adults.
In many ways, that’s terrific. Young people who don’t fit in to the dominant culture where they live can find kindred spirits online, drawing hope and strength from discovering that they’re not alone. They can share artistic breakthroughs in real time, rather than having to find and charm some skeevy network or music company executive.
But they miss what we had and didn’t know to appreciate: the luxury of having your awkward phases forgotten. They don’t get to rehearse before taking themselves public.
I’ve mentioned before a temperamental allergy to arguments for turning time backward, so I won’t try to argue for that. How do you keep them on the farm once they’ve seen Instagram? Instead, I suspect the way around this is through it. While many, many more of us have seized the opportunity to produce, we’re also all still consumers. If we gradually start to look at what we’re consuming through the eyes of producers, we may start to realize just how ridiculous it would be to punish some future rising star for an unguarded moment she posted at 19. We may not be able to forget anymore, but we can still choose to forgive.
Mercy can come from the fates, but we can produce it ourselves, too. As long as we’re producing everything else, a little mercy might not be a bad idea. My generation and those before it were granted the accidental mercy of a lifetime of do-overs. Today’s students deserve no less.
Sunday, December 10, 2017
The last day of the Middle States conference finally got around to a discussion of issues relevant to community colleges. Walter Bumphus, the President of the AACC, gave the Friday keynote.
It was a kind of survey of issues, not going terribly deep on any of them. It was lively and entertaining, but I came away disappointed, because it could have been so much more.
He opened with a reference to Elizabeth Warren’s description of regional accreditors, in the wake of the meltdown of the for-profits, as “the watchdog that didn’t bite.” I was hoping he’d go somewhere with that, but he just cracked that “the last thing we need is the Feds in accreditation” before moving on.
That was a missed opportunity. The day before, Peter McPherson mentioned (correctly) that part of the problem with accreditation as a tool for quality control is that it’s binary: either you’re accredited or you aren’t. That may be okay for a startup, but for a mature institution, it tends to lead to a certain skepticism. The City College of San Francisco wasn’t allowed to fail, because it’s too large and important to the city. Loss of accreditation is a sort of nuclear option, but every existing option short of that relies on the believability of the nuclear option. If nobody seriously believes that an accreditor will go nuclear, the intervening threats lack a certain bite.
McPherson proposed instead that sanctions could come in parts or stages. When I asked for an example of what that might look like, he came up with caps on financial aid availability. For instance, a college might become eligible for only 90 percent of the previous year’s Title IV allocation. I don’t think that particular method makes sense -- student loan eligibility follows the student, and some schools have dropped out of the program altogether -- but the concept makes sense. If there’s something between “double secret probation” and complete shutdown, the accreditors might become more willing to act, and colleges would be forced to take the threat of oversight more seriously. A less gun-shy accreditor might be able to head off disaster earlier.
Bumphus went on to give an overview of the contours of the debate around the reauthorization of the Higher Education Act, as well as the politics of higher ed in the Trump years. He fired off some good lines -- “If you aren’t at the table, you’re on the menu” -- but otherwise covered well-worn territory. I was heartened to hear a distinction between “registered” apprenticeships and “recognized” apprenticeships, but otherwise it was largely about enrollment and funding. (To be fair, he did spend some time on DACA, which is a very real issue in this sector.) He took the existing business model for granted.
That’s understandable, but again, a missed opportunity. Part of the widespread presidential turnover throughout the sector that he mentioned -- he cited 250 presidencies turning over per year, out of just over 1100 community colleges nationally -- comes from a failure to come to grips with the need to change the underlying business model. Growth forgives many sins, so the flaws in the model could be tolerated as long as there was a demographic and/or political tailwind. But with demographics in much of the country working against community colleges, and with increasing hostility from various parts of government, the gaps in the model are becoming apparent. That puts presidents in a tough spot, since the short-term political cost of structural change is often higher than simply coasting downwards. And many boards simply don’t understand the issues at hand well enough to distinguish necessary conflict from unnecessary, or what’s under a president’s control from what isn’t. They wind up blaming presidents for demographic shifts, or expecting miraculous change without anyone getting upset. Nobody can live up to that.
Bumphus mentioned a new “onboarding” program the AACC is offering new presidents, and that’s fine. But if you don’t address boards and business models, you’re setting the newbies up to fail. Yes, we’re due for a recession, and that may provide a short-term enrollment boost. But unless we address the longer-term issues underlying the sector’s struggles, we’re asking presidents to be superhuman. Nobody is superhuman.
Still, I was gratified to see community colleges take center stage. Now we need to stop pretending that the challenges are entirely short-term, and start digging deeper.
Thursday, December 07, 2017
I’m in Philadelphia at the annual conference of the Middle States Commission on Higher Education, which is the regional accreditor for the mid-Atlantic states and Puerto Rico.
It’s different from the League or the AACC conference in that it doesn’t focus particularly on community colleges. It covers everything from community colleges to research universities, which means I sometimes get to hear presenters from other sectors of higher education. And I can’t help but notice certain patterns.
The Thursday morning keynote was by Peter McPherson, the president of the Association of Public and Land-Grant Universities. He addressed accountability in higher ed from the perspective of the group he represents. Which is to say, it was odd.
He rightly called attention to the relative paucity of low-income students at some of the more prestigious universities, and also noted correctly that much of the issue around student loan defaults is really about dropouts. By his statistics -- and I didn’t catch the source or the frame of reference -- the default rate for student loans among college dropouts is 24%, as opposed to 9% for graduates. So far, so good.
But it was largely downhill from there.
His solution to the problem of elite campuses getting ever more elite was...drum roll...a Gates-funded program to increase the number of low-income students there.
Nothing inherently wrong with that -- it’s a good thing, as far as it goes -- but it’s a boutique solution. Those of us in the trenches on student success issues know that “boutique” is a dirty word. If you want to make a significant and lasting change, you have to get at structure. In this case, that would mean making it easier for community college students -- a much more demographically diverse group -- to carry their credits with them when they transfer.
That would make a sustainable difference over the long term. It would ratify community colleges as on-ramps to the higher echelons of higher education, as they were intended to be. It wouldn’t even require grant money to keep going. But it would involve political battles, both among institutions and within them.
If he mentioned community colleges at all, I missed it.
To add insult to injury, he also endorsed “risk-adjusted assessment,” which would let elite institutions off the hook for much of what we have to do. It’s annoying enough at that level -- Harvard gets a free ride while we have unfunded mandates -- but it would also enable the unfunded mandates to increase unchecked over time, since the elites who largely set the expectations wouldn’t have to meet them themselves. It would be akin to Congress exempting itself from civil rights and sexual harassment rules. How did that work out?
McPherson was speaking as a representative of research universities; I get that. His suggestions made sense from that perspective. But higher education isn’t just research universities. Any serious discussion of class polarization in higher ed has to include community colleges. Any serious discussion of accreditation or assessment has to recognize that putting the most expensive protocols on the least-well-funded institutions isn’t likely to lead anywhere good.
The whole point of getting the sectors together, to my mind, is to enable a broader look at the entire higher ed ecosystem. That involves acknowledging each component of the ecosystem. Should the Ivies and state flagships have more students from the bottom half of the income distribution? Absolutely! Wherever might they find talented students from the bottom half who have shown the capacity to excel at college-level work?
Tomorrow Walter Bumphus, from the AACC, will keynote. Here’s hoping for a needed corrective...
Wednesday, December 06, 2017
DeVry is being handed over, for no money, to a for-profit college that has 600 students. Its current parent company isn’t even getting a player to be named later.
I’m trying to figure this one out.
Several recent moves in the for-profit industry have left me agape, but this one is striking for the sheer incongruity of size. An international chain being taken over, for no money, by a college the size of my high school graduating class?
As longtime readers know, I used to work at a DeVry campus. It was my first “real” job out of graduate school. DeVry was in rapid-expansion mode at the time -- it was the late 90’s tech boom -- and the nonprofits had already made the move to adjuncts. I could adjunct at Rutgers or be full-time at DeVry; I chose the latter, because rent doesn’t pay itself.
This may sound like rose-colored glasses now, but I swear it’s true: for a while there, DeVry was actually trying to gain a sort of academic legitimacy. I caught the tail end of that and the beginning of the decline. When the decline started to accelerate and the “for-profit” part started to drown out the “education” part, I bolted for the community college world. That was over fourteen years ago.
Still, it’s hard to imagine the place being swallowed by a college of 600 students. When I worked at the North Brunswick campus, it had 4,000 students by itself. And it was one of two dozen campuses around the country.
DeVry’s decline was largely self-inflicted, mostly because when it faced a choice between improving quality and improving quantity, it chose the latter. You can only hollow out quality for so long before something terrible happens.
I’ve been trying to suss out the motivations on both sides for the “sale.” (“Handoff” comes closer to the truth.) As near as I can figure, the “buyer” stands to gain scale quickly without paying for it. The last decade has shown pretty clearly that tuition-driven small colleges will struggle. It’s easier to change the “small” part than the “tuition-driven” part. And the “seller” stands to offload a whole bunch of potential legal judgments against it. The “buyer” may be gambling that its very lack of assets will deter lawsuits, just because there’s nothing to win; the “seller” has probably determined that it’s better off just washing its hands of DeVry and walking away.
I can’t really see the “buyer” doing well long-term, unless it knows something substantial that I don’t. The organization had been so cost-conscious for so long that I don’t see a lot of efficiency gains to be had. And its course offerings aren’t particularly distinctive. You can get a business or CIS degree in a lot of places, and often less expensively. Some of what was once innovative about it has long since become common practice elsewhere. In a competitive industry, I’d be hard-pressed to name what would make it special.
Of course, as I write this, Congress is looking at changing the Higher Education Act in ways that will make for-profit colleges’ lives easier and public colleges’ lives harder, so the “buyer” may be taking a calculated risk that it can wait until the good times for for-profits come back. If the buyer is struggling to survive anyway, I could see the logic behind a “what the hell” move.
Wise and worldly readers, is there a logic to this move that I’m missing? At best, this looks like a “cut your losses” move by the “seller,” and a Hail Mary pass by the “buyer.” Is there a better reading?