Thursday, February 21, 2013
Oscar Wilde supposedly once claimed that he would have been a socialist, but he liked to keep his evenings free. All that civic participation would have crimped his style.
I was reminded of that this week in discussion with some faculty who were balking at the time commitment involved in serving on search committees. They all believe in heavy faculty involvement in searches, but all that participation really adds up.
They’re certainly right that search committees are major time commitments. We have some pretty sophisticated protocols for staffing them, trying to balance veterans and newbies, faculty and staff, men and women, subject matter experts and fresh eyes. Unlike many private sector companies, we don’t let HR do the first round of screening; the search committee culls through the entire set of applications before deciding on who it would like to invite for first-round interviews. Depending on the position, the applications can run well into three digits.
Just scheduling committee meetings is a major endeavor. Faculty have different teaching schedules from each other, and staff members’ calendars are different still. Each committee has to be “charged,” to get its affirmative action training, and to have its “what are we looking for?” conversations. Then it decides who to invite for first round interviews and has to arrange the internal logistics for 8-10 of those. Finally, it has to decide on 3-4 finalists to send forward.
It’s a lot of work. We have a rule that anyone on a search committee is excused from all other college service for that semester, in recognition of the time it takes. (College service refers to other sorts of committees, but not to teaching.) Even with that, some people find the task too onerous.
But there’s no appetite for streamlining, either.
It wouldn’t be all that hard to streamline. Let HR decide who to invite for first-round interviews, and bring the committee into play for the first time at that point. Done and done.
Culturally, though, that’s just not an option. The committees don’t want to give up control, and control requires work. The process can be participatory, or it can be low-impact, but it can’t be both. Participation takes time.
In a context in which most people are teaching four or five classes per semester, that’s not just carping. Time is at a premium. That’s even more true as the semester progresses, and just fitting in all the interviews before the deadline becomes a challenge. And course releases for search committee members are neither economically sustainable -- you’d be surprised how quickly the cost adds up -- nor practical, given that hires tend to come in areas where we’re short-staffed already. When a department is already running thin, adding several course releases makes it even thinner.
Wise and worldly readers, has your college or company found a relatively practical and sustainable way to balance participation and efficiency in hiring?
(Program Note: Due to some life events, the blog will be quiet next week. I’ll return on Monday, March 4.)
Wednesday, February 20, 2013
I like this story a lot, even though it’s a little pessimistic.
Apparently, Klamath Community College, in Oregon, has decided to make a series of changes to improve student success rates. Some of the changes are relatively straightforward, such as requiring academic advising and new student orientation. But it has gone farther than that, and eliminated late registration.
Predictably, eliminating late registration came with a short-term cost. The college has lost about $400,000 in tuition revenue this year, in a climate in which every dollar counts. There’s no easy way around that. Ripping off the band-aid will hurt.
The college is gambling that it will eventually make up most or all of that through improved retention and graduation rates, and possibly through improved state funding that’s contingent on performance. To the extent that the change will increase performance, Klamath stands to gain.
I like the story a lot because it represents an attempt to verify empirically what many of us suspect theoretically. And unlike some student success initiatives that are either unethical or fiscally unsustainable, this one may be sustainable over the long term after the sting of the first year wears off.
It makes sense, intuitively, that students who sign up at the last possible moment are putting themselves at an academic disadvantage. Depending on how late “late” is, they may have already missed a week of classes. Textbooks sometimes sell out, so a student who arrives late may not have access for a little while. Financial aid and transportation arrangements sometimes take a little while to gel; if the student is already behind academically, it may be difficult to apply the necessary focus while so many balls are in the air.
The intuition seems empirically correct. Nationally, the literature suggests that the last in are the most likely to drop out. (My own college has run its own numbers, and found the same thing.) Yes, there’s a short-term fiscal temptation to take the late arrival’s money, but the odds of a good outcome aren’t nearly what they should be. And from a pedagogical standpoint, faculty are much more likely to do their best work when every student is present from day one, books in hand, finances arranged, and able to focus. That won’t solve everything, but it helps. Better to have the entire semester to work with the student than to have the weakest students show up already behind.
Ideally, a “no late registration” policy should be paired with an option for “late start” or “part of term” courses, so that a student who shows up, say, September 7th won’t have to wait until January. If that student is instead allowed to register for classes that start, say, October 1, then she’s less likely to vanish. And if she isn’t hustled into classes that have already started and for which she doesn’t have books, she’s less likely to fail.
Ideas this good often backfire, due to perverse incentives baked into various funding systems. I hope that fate doesn’t befall Klamath. The idea is too good to sacrifice to some idiotic technicality. Kudos to Klamath for taking the plunge.
Tuesday, February 19, 2013
In response to yesterday’s post about the seeming invisibility of the social sciences, a commenter asked me why, if I value the social sciences so highly, I strongly advise against people getting Ph.D.’s in them. Shortly after that, I saw Michael Berube’s essay about graduate admissions, in which he kinda, sorta suggested that they should be cut back, but not unless the departments are willing, and it’s complicated, and anyway aren’t we all “awesome.”
A few thoughts.
First, no intelligent observer can deny that the production of doctoral candidates in the evergreen disciplines far exceeds the demand for them in academic positions. According to this piece in yesterday’s IHE, that’s even true in many scientific fields. (The headline suggests that tenure-track positions are the true “alt-ac.”) But degree inflation isn’t limited to academia; as the New York Times pointed out yesterday, many employers now use a college degree as a first-level screen for job applicants, even for jobs that don’t use any college-level skills. It’s enough of an employer’s market that they can. The story profiles a law firm in Atlanta in which even the gofer has a four-year degree; among the benefits is a healthy crop of college football rivalries to make office parties lively.
So there’s that.
That’s why I see no contradiction between saying “students would benefit from taking Intro to American Government” and “I wouldn’t encourage my kids to do what I did.” The former speaks to the intellectual richness of the subject; the latter speaks to the institutional economics of higher education. Those are not the same thing. I can simultaneously believe that many students would benefit from a thoughtful examination of politics, and that it’s unlikely that tenure-track jobs in poli sci will suddenly explode. I love jazz, but I have no illusions that this year it will outsell Bieber.
A more useful line of inquiry, I think, would be to look closely at the structure and assumptions of graduate education, given the institutional economics of higher education in America. Does it really make sense to continue to produce so many people whose training is geared so strongly to tenure-track jobs that are increasingly scarce?
Over the long term, which is getting shorter all the time, I don’t see the current system as sustainable. Law schools are making some painful adjustments, and I don’t see why graduate schools should be immune. But in the meantime, some pretty basic reforms seem in order.
First, every graduate program with an academic focus should include some sort of introduction to the institutional economics of American higher education. It’s simply inexcusable that they don’t. Most programs have some sort of first-year seminar; that seems like a perfect place to do a basic introduction to the realities of the profession. If that leads to a certain amount of attrition, as disillusioned students pursue greener pastures, that’s not necessarily a bad thing. And if it doesn’t, at least the students will have a more realistic picture of what’s out there. Yes, research universities are conspicuous and, in many ways, attractive. They’re also atypical. Unless you’re coming out of a top-ten program, you’re likelier than not to land elsewhere.
Ideally, that course could take both “macro” and “micro” perspectives. From a macro perspective, what are the academic employment trends since, say, 1970? From a micro perspective, what do actual entry-level jobs actually pay? What do they actually require? Do they resemble the idealized image many high-achieving undergrads chase when they go to grad school? Let’s tell some truth. If Berube is right -- and I think he is -- that “we have to secure the future of institutions that permit freedom of inquiry and freedom of thought,” then let’s take a good, hard, serious look at what it means, and what it requires, to secure those institutions in this political economy.
It may have been reasonable to skip the introduction to the profession in the boom years, but the boom ended forty years ago. It’s not reasonable anymore.
Second, and I can’t believe this is still controversial, stop overproducing. Yes, that may lead to a loss of “access,” but in a setting in which so many people can’t find anything more lucrative than adjuncting, I have to ask “access to what?” Some graduate programs should shrink, some should re-focus, and some should phase out. After forty years of this, it’s unethical not to.
Finally, and again, I can’t believe this is controversial, pay some attention to teaching. At both community colleges and teaching-centered four-year colleges, the faculty positions that do exist go to people who teach well. (To its credit, English has done a better job of this than many other fields, including my own.) Grad students pick up messages from their advisors; if teaching is considered nothing more than a distraction from research, those grad students will have to do some serious unlearning before becoming useful outside of a few rarified places.
A more realistic graduate education system might result in fewer dashed hopes, better teaching, and more people with enough sense of how things actually work that productive reforms would be likelier to ensue. I see all of those as positive goods, even if they aren’t quite awesome.
Monday, February 18, 2013
STEM initiatives are all the rage in academia these days. They’re popular with policymakers, who see them as a form of high-end workforce development; they’re popular with parents, who see them as high-end job placement; and they’re somewhat popular with students. At the community college level, developmental math has long been -- and continues to be -- a major challenge for graduation rates; it continues, rightly, to receive substantial attention. From the bottom of the curriculum to the top, STEM fields are in a kind of heyday.
Meanwhile, the higher education press is rife with humanists. One would be forgiven for mistaking the Chronicle of Higher Ed for the house organ of the MLA. In the popular press, to the extent that higher education is discussed at all, it’s often portrayed as a battle between the “fuzzy” humanists -- variously understood to be hand-wringing liberals, stuffy antiquarians, or tattooed lesbians, depending on taste -- and the pragmatic business/engineering types who are busily preparing students for the Real World, with varying degrees of success. The subtext is that the business/engineering types are winning; whether you want to read that as progress or decline is up to you.
As a card-carrying social scientist, I can’t help but wonder at the relative silence around the social sciences.
They’re still pretty widely taught. Intro to Psychology is typically one of the most popular courses among American undergraduates. Intro to American Government -- my old haunt -- is weirdly marginalized, even though the subject matter is of obvious interest. “Sociology” as a brand is having a rough go, but the topics it addresses remain compelling. Economics is mostly honored in the breach. Even history -- whether you classify it under social sciences or humanities -- mostly gets ignored in the popular discussion, except for increasingly tired tirades about whether social history is progress or decline.
Yes, I have some bias here. But it’s also true that psych and soc and poli sci don’t get anywhere near the attention that math and English do.
I’m told that, at the K-12 level, the relative neglect of “social studies” is an outgrowth of No Child Left Behind. I’m old enough not to believe that. It didn’t get much attention before NCLB, either; at most, NCLB may be guilty of making a bad situation worse. But it was already bad.
Popular versions of the social sciences sell quite well. Malcolm Gladwell, Dan Ariely, Nate Silver, and the Freakonomics guys wouldn’t have the careers they do if it were otherwise. The subject matter of social sciences -- money, power, sex -- certainly holds popular interest. And from a scholarly point of view, the social sciences offer a wonderful duality. They’re both intuitive and empirical. They lend themselves nicely to both qualitative and quantitative analysis. In fact, the best work tends to draw deliberately on both. As “general education,” it’s excellent.
Yet it’s largely forgotten. I can’t remember the last time I read about a statewide focus on improving student outcomes in Intro to American Government. (“Civic engagement” gets some traction, but that’s not the same thing.) We have national foundations competing with each other to put forward the Next Great Idea for math, but I haven’t heard anyone address Intro to Psych.
It’s an odd elision. In my more conspiratorial moods, I like to think that the relative demotion of the social sciences is a conspiracy against critical thinking applied to social issues. But then I calm down and realize that it’s probably more a matter of their simply being taken for granted.
To my mind, the social sciences provide excellent fodder for quantitative reasoning (“correlation is not causation”), communication skills (can I convince you of my position on this hot-button issue?), and information literacy (is Fox news a reliable source?). They address wonderfully rich issues, and at their best, they can suggest that things we take as “given” are, in fact, changeable.
Wise and worldly readers, am I just getting this wrong, or are the social sciences getting mostly ignored? And if they are, should they be?
Thursday, February 14, 2013
From the “other duties as assigned” file: yesterday I had to go in for a medical procedure. In the course of making small talk with the tech, she mentioned that her daughter is looking at colleges, but they’re both worried about student loans. I mentioned where I work, and for the next half hour, the conversation was all about transfer, comparative tuition levels, student loans, and the difficulty for new grads who can’t find jobs but have huge loans to pay off. All of this during the procedure.
If nothing else, it brought home to me that the issues we’re dealing with are not abstract in any way.
The Girl: “The future may take a while.”
I’ve mentioned before my complete bafflement at the ubiquity of flat roofs in snowy climes. As those of us who’ve lived through snowy winters can attest, snow has a way of melting. And when the roof is flat, the water has a way of just sitting there. All that water is up to no good.
Now I’ve got some scholarly backup. Apparently, the flat roof fetish was an outgrowth of a really unfortunate flirtation with modernism in the mid-twentieth century. By a cruel accident of timing, the worst of the modernist fad hit at the exact same time as a building boom in public schools and colleges.
Gravity, people. Gravity. Pitched roofs are your friends.
The government’s “college scorecard” is out.
It’s a bit mystifying. It doesn’t include any measure of academic quality, which you’d think would be a key component of “value.” And its cost of attendance figures are simply baffling. It’s supposed to help prospective students determine how much bang for the buck they’ll get at one school as opposed to another, but there’s no measure of “bang,” and a severely flawed calculation of “bucks.” Color me unimpressed.
If the scorecard is to achieve something, it should take a page from Moneyball, and figure out the academic outcomes a college achieves as measured against the outcomes expected from the profile of its entering students. If a college punches above its weight, then it’s doing something right; if it skews wealthy but still gets lousy results, it’s doing something wrong.
I understand that any single scorecard will necessarily be reductionist; that’s the nature of a scorecard. It will have to focus on just a few numbers. All the more reason to choose the right numbers.
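For what it’s worth, the “punching above its weight” calculation can be sketched in a few lines. Everything below is invented for illustration — the college names, the predictors, and the coefficients are all hypothetical — but the shape of the idea is simple: predict an outcome from the entering-student profile, then treat the gap between actual and predicted as the interesting number.

```python
# Toy sketch of a "value added" metric: actual graduation rate minus the
# rate you'd predict from the entering-student profile. The coefficients
# and colleges are made up; real work would fit the model to national data.

from dataclasses import dataclass

@dataclass
class College:
    name: str
    pell_share: float        # share of entering students on Pell grants (income proxy)
    avg_hs_gpa: float        # average high-school GPA of the entering class
    actual_grad_rate: float  # observed graduation rate

def expected_grad_rate(c: College) -> float:
    """Hypothetical linear prediction, clamped to [0, 1]."""
    raw = 0.25 + 0.15 * (c.avg_hs_gpa - 2.0) - 0.20 * c.pell_share
    return max(0.0, min(1.0, raw))

def value_added(c: College) -> float:
    """Positive means the college outperforms its entering-class profile."""
    return c.actual_grad_rate - expected_grad_rate(c)

colleges = [
    College("Scrappy State", pell_share=0.55, avg_hs_gpa=2.8, actual_grad_rate=0.42),
    College("Leafy Private", pell_share=0.10, avg_hs_gpa=3.7, actual_grad_rate=0.55),
]

for c in colleges:
    print(f"{c.name}: expected {expected_grad_rate(c):.2f}, "
          f"actual {c.actual_grad_rate:.2f}, value added {value_added(c):+.2f}")
```

On these invented numbers, the poorer, less-prepared cohort’s college shows the larger residual — exactly the kind of signal a raw graduation rate hides.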
Convinced that The Boy got taller in the three seconds I was looking in another direction, I measured him the other day. At age eleven, he’s five foot eight.
The future may take a while, but he seems to be in a hurry. I don’t know what we’ve been feeding him...
Wednesday, February 13, 2013
I’ve been a fan of Kevin Carey’s for some time. He gets a lot right, and even when he’s off, he’s interesting.
This week, he’s true to form.
In response to President Obama’s hinted-at plans to open up financial aid for higher education providers other than traditional colleges, Carey developed a wish list that’s well worth reading. He’s almost certainly right that direct government price controls on colleges wouldn’t work, or at least, wouldn’t work in a constructive way. (Employer-based health insurance was born as an outgrowth of wartime price controls. How’s that system working out?) Real innovation typically comes either from new entrants into a field, or from panicked incumbents threatened by new entrants. Without new entrants, you don’t get major change. Opening up the financial aid system to new providers with new methods doesn’t guarantee a good outcome, but it certainly improves the chances.
Carey starts from that correct observation, and moves quickly to “and here’s how to do it.” And that’s where he gets...interesting.
On the one hand, he calls for accreditation to move from institutions as wholes, or even degree programs as wholes, to the course level. As he put it, “what if you want to specialize and provide nothing other than the world’s greatest Linear Algebra class?”
I’ll flip the question around. What if you wanted to be able to offer a whole host of courses in a bunch of different majors? You’d have to go through the accreditation process for every. single. course. The administrative overhead -- assessment, documentation, verification, and the like -- would skyrocket. It’s even worse if you want to offer something outside the traditional disciplines. Linear Algebra is pretty well established. But what if you want to offer, say, a philosophy course on the St. Louis Hegelians? (Worst baseball team ever. “There is nothing in the essence of the knuckleball that will not become evident in the series of its appearances...”) Where would you even find the standards to follow? What if you wanted to offer a course on the latest cutting-edge findings in a field -- the stuff that hasn’t been standardized yet? I get a headache just thinking about it.
Worse, students would be left on their own to cobble together coherent and recognized programs of study from the scraps provided. Many years ago, the economist Ronald Coase noted that the great utility of the “firm” in economic terms is that it reduces transaction costs. It routinizes, which is its great strength. Shatter firms, and every transaction has to start from scratch. In a market with a severe information asymmetry, such as higher education, the idea of loosing untrained 18 year olds (or busy and distracted 38 year olds) into the virtual wilds to piece together what they can is a recipe for disaster, even assuming that each individual piece is good.
Colleges as “firms” provide more than just courses. They provide structure. They provide guidance (usually called “advising” or “counseling”). They provide legibility. They provide social support, quiet places to study, and the reassurance of knowing that others have done what you’re trying to do. They allow for the serendipity moments of discovering that the course of study you thought you wanted wasn’t really for you, and that you’re actually much more fascinated by something else.
Offload all of those costs onto students, and you’re creating an inefficiency of monstrous proportions. Worse, the students who would lose the most are the ones with the fewest resources at the beginning. The wealthy, well-prepared, well-connected kid may be able to fend for himself relatively well in the virtual wilds; the poorly prepared and unconnected will likely either fall prey or fall out.
The one virtue of the course-by-course approach is that it could conceivably open the field to a host of scrappy new providers. But then Carey falls into an uncharacteristic bit of wealth worship:
Organizational capacity. If Harvard and MIT form a non-profit to do this, their capacity, academic and financial resources should carry weight. If Carl Wieman wants to get in the Physics 101 business, his status as a Nobel prize winner and researcher on best practices in teaching introductory physics should work in his favor. If Pixar wants to teach computer animation, Wall-E should count in their favor. (Cars 2, less so.)
Um, this is not helpful. The problem with the current system is not that Harvard, MIT, and Pixar are somehow shut out. They do just fine. The problem is that scrappy new providers -- and, to be honest, scrappy existing providers -- would be crushed by the wealthy and powerful. Only the wealthy and powerful could afford to run separate accreditations for every single course, and only the wealthy and powerful would have the name recognition to grab the uninitiated from all over the country. And as that happens, the argument for public support for access for everyone will get harder to sustain politically. As both a citizen and an educator, I have to call that a disaster.
Where I draw hope from Obama’s message -- keeping in mind that it’s a long way from footnotes from a speech to enacted legislation -- is in recognizing that innovation requires new blood. It does. Creating space for new providers to try new things -- and thereby put some useful pressure on existing providers to get more experimental -- strikes me as a genuine good. Let’s just not forget the strengths of what we have in the excitement of the possible.
Tuesday, February 12, 2013
One of the reasons I like President Obama is that he’s clearly a Dad. I don’t just mean that he has children; I mean that he’s obviously an involved parent. (If you haven’t seen the video of the two-year-old at the Medal of Honor ceremony, check it out. Obama responded as a seasoned parent would.)
That isn’t always easy. I smiled in rueful recognition at this piece from the Washington Post about the prices that involved Dads pay at work. In brief, fathers who take time to be with their kids are penalized at work even more than mothers are. It’s a kind of gender-deviance penalty. And it’s real. When TB was born, it was on a Monday; my boss told me that he expected me in the office by Friday. He never did that with new Moms. In this line of work, a reduced travel schedule and a reluctance to move on a regular basis bring real costs. Dads who are willing to slough off parenting duties don’t pay those costs; Dads who aren’t, do.
I smiled ruefully again at this piece from the Harvard Business Review. The Nordic countries are known for their world-leading family friendliness. They provide paid parental leave for over a year, and they require that the Dads take at least some of it. As a result, it has become normal for new fathers to take some time with their kids. The benefits of that are everywhere: the mothers get a break, the fathers develop parental competence, the kids get the benefit of attachment to two competent parents, and the workplace gets a generation of both men and women who understand what it is to be both an employee and a parent. As a result, the workplace is structured around an assumption that being a caretaker of some sort is a normal part of life.
We tend not to work that way here. Part-time pay isn’t scaled to full-time in most circumstances, so earning a decent salary (and health insurance) generally involves at least one full-time job. Parental leave is usually brief and unpaid, to the extent it exists at all. I couldn’t help but notice in the HBR piece that the male Scandinavian managers who did so well in Scandinavia struggled when they had to manage American men, who read their egalitarian ways as weak. A brutal system rewards brutal behavior; blind spots reproduce themselves. In a culture that gets some basic issues wrong, thoughtfulness can be a liability.
So I respect the American Dads who do the extra work and walk the walk. They -- and on a good day, I like to think “we” -- are swimming upstream in this culture, but it’s worth doing.
Last night was the town Daddy-Daughter dance, so I took The Girl. We both dressed up -- I even bought her a wrist corsage, the first time I’ve bought one of those since the Reagan administration -- and we went to a local country club for the annual event.
The place was chockablock with Dads and their daughters, mostly ages five to about twelve. (TG is eight.) The first couple of times we went, TG stuck close to me and mostly danced with me. Last year, she split her time about evenly between her friends and me. This year, I was a distinct second place; still welcome, but clearly not the point. (I’m told that in a few years, I won’t even be welcome.) I didn’t mind.
Watching TG with her friends, I couldn’t help but be proud. She was exuberant but not obnoxious, dancing in her patented ways and inventing some new ones. She led one of her friends in a version of the tango, which I didn’t know she knew. Her cluster of friends started a conga line during “Call Me Maybe,” which made up for its incongruity in pure charm. And yes, I got out there with her a few times. I even managed to keep my composure during “I Loved Her First,” a song designed specifically to reduce Dads to quivering masses of jello.
But my proudest moments with her were afterwards. She’s comfortable talking to me. It’s the kind of comfort that comes from putting in the time. She floats theories, asks questions, cracks wise, and listens like it matters. We actually enjoy each other’s company.
The clock is ticking on this stage; adolescence with its angst lurks around the corner. At that point, a little more distance may well be functional. But for now, I’ll take a certain countercultural pride in knowing that The Girl -- and The Boy -- are getting real parenting from both Mom and Dad. And I’ll keep pushing, in my way, for a world that recognizes that caregiving is a normal part of life. Let’s get some thoughtful parents building workplaces that allow for thoughtful parents. It can be done, and it’s worth doing. It’s time to stop reproducing blind spots, and start noticing the shards of sheer genius that fall out of eight year old girls’ mouths at the end of the day.
Monday, February 11, 2013
I need some help from my wise and worldly readers on this one. A longtime reader writes:
I used to be staff in a marketing-type capacity at a community college, and after a break in private industry, find myself back in a similar spot at a different college -- this time a four-year school. About half of our incoming students are transfer students.

I have seen several marketing-type reports related to what incoming freshmen are looking for in our materials. ("Top tasks", in the lingo of some of the experts.) This has been helpful in prioritizing work as well as working out what words to use where.

But no one seems to have anything similar for transfer students! Do you know of any data, reporting, etc., on the information-seeking habits of transfer students when looking at four-year colleges? I would make a guess that some things are the same, but if there are differences, then we should be taking them into account.
My first thought is, I don’t know. I haven’t seen any research on that, though admittedly, I haven’t looked. Anyone who can cite anything specific is invited to share in the comments.
From conversations with students who’ve transferred and come back to share their experiences, I can think of a few things they should look for:
- Credit acceptance policies. Nothing grinds a student’s gears more than being told she has to re-take a class she has already passed -- and paid for -- elsewhere. Articulation agreements and transfer blocks are supposed to prevent that, and they help, but the devil is in the details. Frequently a college will proclaim loudly that it takes all credits, but then relegate a bunch of them to “free elective” status. “Free elective” status is where credits go to die. Since very few four-year programs have many “free electives” in them, students wind up having to take (and pay for) far more than they should. In the cases I’ve seen, the culprit is usually the department in which the student’s intended major is housed. It doesn’t want to “give away” too many credits. As “conflicts of interest” go, this is pretty basic.
- Transfer scholarships. This should be self-explanatory.
- Support for transfer students. Is there some sort of recognition of the stress of transfer, or are students just thrown in the deep end and told to figure it out? Is there some sort of community?
Of course, what students should know, and what they actually look for, may not always be the same. Your question was more about the latter, so I’ll throw it open. Wise and worldly readers, has anyone seen any actual research on what prospective transfer students look for when they look at four year colleges?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Sunday, February 10, 2013
The last time I wrote about Sandy Shugart, I gave him a bit of a hard time. Based on his essay last week, I regret that. It’s a wonderful piece, well worth reading.
It’s about the “completion agenda,” and the useful and destructive ways that it can be interpreted. Shugart, the president of Valencia Community College in Florida, points out correctly that reifying “completion” as a goal in itself sort of misses the point; the point is to produce students who are capable of learning at a high level. If we produce highly capable students, the rest will take care of itself, assuming we don’t go out of our way to mess things up.
I’m a bit less sanguine about the whole “take care of itself” piece, probably because I’m very aware of certain basic structural issues. But what really sold me on the piece was Shugart’s recommendation that we stop looking at individual institutions, in judging performance, and start looking at higher education as an ecosystem. After all, that’s how most students experience it.
To take the easiest case, community colleges have significant numbers of students who are degree seeking, but for whom the degree sought is a bachelor’s. So they do a semester or a year at the community college to save money and gather momentum, and then transfer. As far as our graduation numbers go, those students count as dropouts, even though they got exactly what they came for. (To add insult to injury, they don’t count in the graduation percentages at their destination schools, either.)
At this point, the ‘typical’ bachelor’s degree student has accumulated credits at more than one college. Some level of transfer is the norm. And since students generally don’t have to complete an associate’s before transferring, many don’t. (Online degree completion programs are likely to accelerate this trend.) To hold that against the community college strikes me as a counting error, rather than any real indicator of performance.
(I disagree with Shugart that the answer is to require an associate’s degree before transfer. That seems needlessly paternalistic. I’d rather see measures that consider the reality of how students actually behave, and then let students follow the paths they choose. If they want to do a year at a cc and then move on for a bachelor’s, I have no principled argument why they shouldn’t be allowed to do exactly that. If the student achieved her own objective, give the community college the credit it’s due.)
Shugart argues further that we should only judge college completion rates when the students in question arrive college-ready. I can’t agree -- it feels too much like an abdication of mission -- but it’s certainly true that it’s more difficult to get a student who arrives with eighth-grade reading skills through a college level program.
A few years ago I saw Gail Mellow, the president of LaGuardia Community College in New York City, argue that if a community college takes someone with an elementary school reading level and gets her to, say, ninth grade reading, that progress should be seen as a meaningful success. There’s real truth in that. I think that’s what Shugart was getting at in his reference to “value added.” Did the student leave stronger than she arrived? If so, that should count for something.
But crediting that requires thinking about higher education in a different way. Historically, we’ve viewed higher education as a credentialing device, with bachelor’s and associate’s degrees comprising the focus for undergraduate education. Credentials matter, obviously, but the idea that everyone should fit into one of two categories is getting harder to sustain. People’s needs are much more varied than that. Serving those varied needs well requires moving beyond simple, reductionist measures.
That’s one reason I’m happy to see the discussions around PARCC start to get momentum. Though far from perfect, PARCC at least recognizes that it’s silly and self-defeating to separate a college completion agenda from a serious look at the K-12 system. Many of the states that berate community colleges for teaching all those developmental classes only require two years of math in high school. If we’re serious about improving student success in higher education, we need to see it as part of the same universe as primary and secondary education. From a student’s perspective, it absolutely is.
My guess is that in the coming years, we’ll move from a “many paths to one goal” model to a “many paths to many goals” model. The transition will be halting and messy, with some false starts along the way, but the net change should be positive. Students’ needs are more diverse than one or two degree categories can, or should, cover. If we want to meet those diverse needs, then we’ll need a much more varied and robust set of measures based on ways that actual students actually behave.
Thursday, February 07, 2013
This week I had one of those “duh” moments when I realized that I had been missing something basic.
The dean of the Health division -- which includes nursing -- and I took a field trip to a local health care provider to talk about working together to give nursing students some exposure to what goes on there. These wouldn’t be full-fledged clinical placements -- we already have those -- but a sort of structured introduction to a part of the health care system that isn’t always top-of-mind for nursing students. The meeting went well, and I think there’s potential for something good to happen.
In the course of the meeting, though, the dean and the director of the facility got to talking about the difference between the ways that they were taught when they went to school, and the way the health field actually works. When they went through, they had “lectures” which taught “theory,” and “clinicals” that taught “practices.” Students who were relatively good at one weren’t always good at the other, and the connections between the two weren’t always obvious.
Now, largely due to technology, theory and practice are all mixed up, and students are better for it.
We have increasingly complex simulators in class, to create the clinical situations the professor wants, on cue. To make time for those simulations in class, lectures have largely been displaced to out-of-class time, online. By the time the students are loosed upon actual people, they’ve already had to bring theory to bear on situations repeatedly. Even better, a student can make and learn from a harrowing medical mistake on a simulator without harming an actual person. (Last year, apparently, some students made a dosage mistake with a simulator and nearly “killed” the “patient.” After that, those students got religion in a significant way.)
In this model, MOOCs and their variants aren’t threats to our business model or our usefulness; they make us better. They let us focus on where we can add the most value, and they pick up the part of instruction that was least customized anyway.
To the extent that we use online resources this way, I see our business model doing just fine. MOOCs and similar expedients can pick up some of the most rote, least interesting elements of what we do, leaving us free to tend to the integrative, interactive, more applied stuff that often gets neglected. Used well -- that is, integrated thoughtfully into a curriculum -- they can actually help students learn more effectively than they did in the older style.
Nursing lends itself to this kind of blending particularly well, but I could see other disciplines doing something similar. Let the online tool take the students through “Congress is divided into the House and the Senate,” and use class time to have students work on a constitution for a society on a desert island. (I always enjoyed that one.) Let the online tool illustrate the layout of the audience in the Globe Theatre, and use class time to show how Shakespeare wrote for each section.
The choice to be made is to re-imagine class time as a scarce resource. Given new and alternative ways to “deliver” information, can class time be used to help students actually wrestle with the information and make it their own?
That should have been obvious -- a “duh” moment -- but it’s largely missing from the public discussion. Can we use the new online tools to offload the least interesting parts of the classroom experience, precisely so we can upgrade the classroom experience?
Wednesday, February 06, 2013
Apparently, New England is in line for a “repent your sins” snowstorm on Friday.
I know this because I heard it from at least a dozen different people on campus. Then again from the kids.
The kids, of course, are giddy. To them, a huge snowstorm represents a day off from school, and a chance to go sledding, build snowmen, and throw snowballs at each other. It’s all good.
(The only downside is that a Friday storm could postpone the Daddy/Daughter dance. TG is not happy about that.)
I expect kids to be giddy. I remember savoring snow days as a kid, and kids don’t have to deal with many of the hassles of storms. For them, it’s all upside.
But it’s been fun watching adults react almost exactly the same way. It’s like a group flashback to childhood.
Administratively, an entire snow day is much less of a headache than a delayed opening or an early closing. When you split the day, there is literally no single moment during the day that doesn’t create some sort of dilemma. You have a delayed opening that eliminates half of a class period. Is the class still required? A lab section is shortened, but the experiment itself takes a set amount of time. Somebody’s shift is bisected. Some people can’t come in because their kids’ schools are closed. Somebody can’t make it but forgets to call in, and her students storm your office, upset that they braved the elements for no reason.
But a full day is much cleaner. Yes, it causes syllabus issues, and there’s always some making up to do. But at least you don’t have a flurry of no-win judgment calls. At this point, when a monster storm approaches, I root for it to arrive early enough to take out a full day.
The kids have a set of rituals that they use to bring about snow days. I won’t reveal them all here; suffice it to say they know what they’re doing.
Snow days during Intersession are a problem, only because there isn’t much time to make them up. And I live in fear of snow days during December finals. But a snow day in early February isn’t so bad. There’s time to make up what needs to be made up.
And it’s fun watching an office full of adults revert to some dimly remembered childhood habits.
Tuesday, February 05, 2013
For-profit colleges are having a rough go of it these days. Just this week, Everest College (a branch of Corinthian Colleges) was forced to shut down operations in Milwaukee after only two years, during which it burned through two presidents. In my own state, Attorney General Coakley has announced a broader investigation into various for-profit providers in the wake of the abrupt closure of American Career Institutes.
Unfortunately, much of the debate on for-profits has been ideologically driven. Republicans in Congress treat for-profit higher ed as a necessary bulwark against the tenured radicals they assume have taken over the public sector. And traditional academics largely assume that for-profit higher ed is a form of naked exploitation, preying on the poor and the naive. The “gainful employment” regulations that the Department of Education issued, and that are now in some sort of judicial limbo, have hit community colleges hard too, even though community colleges are the lowest-cost providers of higher education for most people.
I’ve worked in both for-profit and public higher education, so I’ve seen both from the inside. This may be why I find the current debate so unhelpful. I’ll propose something different.
Allow for-profit higher ed to grow, and even prosper, but regulate the product. Restrict the realm of competition to actual quality. If a differently-organized college can get equivalent or better results for its students, acting ethically, then bring it on; that’s healthy competition. But if it’s offering a watered-down product, shut it down.
In other words, use student learning outcomes to measure the effectiveness of instruction. Compete on quality.
If we went in this direction, I’d expect to see for-profit higher education bifurcate. At the “low” end of the spectrum, it could continue to offer the programs that the publics don’t. (I’m thinking here of the classic bartending or truck-driving schools, but specifics will vary by location.) But I could also imagine a new focus on the high end. If they have to charge more than the publics and the MOOCs -- which they would, since they’re taxed and unsubsidized -- then they’d have to offer some kind of value beyond what the publics and MOOCs do. A for-profit that chose to specialize in one or two programs could conceivably do a very good job with them. I could even imagine special cases in which a given community college might cede a particular program to a for-profit provider that does a particularly good job with it.
Even better, this approach would provide a framework for simultaneously dealing with MOOCs and whatever the next big tech breakthrough will be. Rather than either circling the wagons against the future -- a losing strategy if ever there were one -- or uncritically embracing The Next Big Thing, I’d rather see us compare its actual results to what we’ve been doing. I’m confident that we’re doing a good job, by and large, and that a fair test would show that. But confidence and knowledge are not the same thing.
Instead, most of the dialogue has focused on student loans, placement rates, and advertising. Those are symptoms. The underlying issue is quality. Most of us would accept the proposition that a premium product could legitimately command a premium price. Since there’s often a serious information asymmetry in the market -- many prospective students have no reliable way of judging quality -- it’s possible for a provider to substitute sizzle for steak and do well for itself for a while. But the information asymmetry strikes me as largely curable. And if it is, and prospective students get the benefit of various institutions competing on quality, I see everyone being better off.
Our politics is poisoned, because we pretend that “the market” and “regulation” are somehow opposed. Good regulation enables a functioning market. Let’s try it.
Monday, February 04, 2013
This one is for the parents out there.
What’s your philosophy on “technology time” for your kids?
We have a few techie toys around the house that the kids enjoy using: a Kindle Fire, a Nook running Android, and an iPod Touch. (For those keeping score at home, all of those were bought used, and all work quite well. I’m not sure why this option isn’t more popular.) There’s also the family laptop on which TW and I do work, and the kids play Minecraft.
(For the uninitiated, Minecraft is a game in which kids build virtual buildings. The Boy loves doing multilevel stadiums or castles. I don’t even know if there’s a competitive component to it; he just loves building stuff. Judging by its popularity at Lego League, Minecraft is the Space Invaders for this generation.)
Naturally, the kids love playing with tech more than they love, say, doing homework, practicing their instruments, or hanging their coats in the closet. At their age, I would have been the same way.
I’m a little bit torn.
Part of me agrees with treating tech as analogous to television: acceptable in small doses, but in large doses not leading anywhere good. The kids aren’t programming or hacking; they’re just playing games. As such, I’m fine with a little screen time as play or stress relief, but I’d hate to see more worthy endeavors sacrificed to it. This approach suggests rationing tech time and treating it as a sort of reward for getting the necessary stuff done. (TW is firmly in this camp.) Call it the “spinach before dessert” approach.
But part of me sees comfort with tech as a key skill/habit in the coming years, and I don’t want to weigh my kids down with my own misgivings. From what I’ve seen of the techies I’ve known, most of them spent what looked like unhealthy amounts of time at some point in childhood messing around with whatever tech they could get their hands on. For many of them, games were the initial appeal, but soon they started going beyond what was presented. A sort of excess became the foundation for later exploration and invention.
As the kids get older, we’ll have less control over this sort of thing, which is why I want us to get it right while our opinions still matter.
Wise and worldly readers, have you found ways to encourage your kids to engage with the creative side of tech without too much time-suck from other things? I want them to be comfortable with tech, but I don’t want to see them lose touch with everything else. And I worry about presenting either homework or tech as spinach.
Sunday, February 03, 2013
This is a “think out loud” piece, rather than an actual proposal.
What if financial aid went directly from the federal (or state) government to the student, rather than running through individual campuses? The students could use the aid to attend any accredited institution.
It’s not an entirely new model. When I graduated high school in western New York back in the 1980s, New York State awarded “Regents scholarships” to students who met certain academic criteria. (I don’t remember what they were.) At the time, the scholarships were a hefty $250 a year -- inflation-proof since the 1950s, according to local legend -- so they didn’t mean much. (Even then, private college tuition was well into the five figures.) But they were applicable to any accredited college or university in the state. The idea, we were told, was to keep bright minds around. Which might have worked, had the dollar figures adjusted to reality.
Private scholarship funds sometimes work that way, too. A local Rotary club might award some scholarships to promising local high school graduates, which they can take with them wherever they go. Even National Merit scholarships used to work that way, and I assume they still do. A student who received a National Merit award could take it with her anywhere in the country.
The models I’m aware of tend to be merit-based, rather than need-based, but it’s not entirely clear to me that they have to be that way.
As I understand it, one reason to run financial aid through individual campuses is to be able to adjust for differing total costs of attendance. A student who chooses a private four-year residential college will probably need more aid than a student who lives at home and attends a community college.
If most aid came in the form of grants, that might be an argument for keeping the current system. But even a “full” Pell grant comes nowhere near covering the full cost of a year at a typical private college, or even state university. At the more expensive places, “aid” is typically a combination of discounts, grants, and (increasingly) loans. There’s nothing stopping colleges from discounting on their own, if they choose; students would then knit together packages of grants and/or loans and shop around.
One might object that this would put too much power in the hands of the student. But if there’s a public good served by underwriting higher education beyond what individual students would pay for on their own -- and I strongly believe there is -- then we should underwrite it directly. It’s far more transparent and less labor intensive to simply increase appropriations to public colleges and universities than to run everything through a complicated set of hurdles.
Even better, shifting the resources from the colleges to the students would free up colleges to experiment with competency-based learning, different definitions of seat time, and other innovations that are either explicitly or effectively barred under the current system. As long as the caliber of outcomes was the same or better -- that’s where accreditation comes in -- then I don’t see why we shouldn’t be able to try to find new and potentially more appealing ways to get there.
The politics of the shift are a little tricky. Right now direct operating support for public colleges and universities tends to come from the states, while the bulk of financial aid comes from the feds. (I’m ignoring research grants for present purposes.) There’s a pretty good argument to be made that the shift, over the last ten years, from state support to student support has amounted to a shift from state to federal aid. What if we ratified that shift and addressed it directly?
I’m sure there’s a host of issues I haven’t considered, both positive and negative. Wise and worldly readers, what do you think? Would the efficiency gains of a single system be worth it?