Thursday, June 21, 2012


Learning Communities, Student Success, and Real Pizza

I spent Thursday at the “Strengthening Developmental Education” conference presented by the MDRC at Columbia University in a shockingly hot New York City.  It was an odd cluster of presentations.  On the one hand, the intellectual firepower present and the quality of evidence mustered were encouraging.  There was an honesty about findings, and a humility in the face of facts, that’s all too rare at academic conferences.  On the other, though, that meant that many of the findings suggested that much of the student success toolkit -- learning communities, summer bridge programs, and dual enrollment, to name a few -- just won’t live up to our hopes.  

A few highlights:

Martha Kanter, from the Department of Education, made the point that we need to align Federal funds with evidence-based reforms.  When questioned about the persistent mismatch between an education department that favors innovation and a financial aid bureaucracy that’s in clampdown mode, she mentioned that the Department of Education has “experimental site authority” to work around financial aid regs in special cases.

Reader, I was heartened.

The tone set, much of the rest of the day was about experiments at various community colleges around the country.  Refreshingly, since most of the presentations were not first-person accounts, they could be honest about things that didn’t work, or about promising early effects that faded over time.  We desperately need more of this.

Mary Visher, of the MDRC, presented findings suggesting that learning communities have no impact on student persistence, and only a small, fast-fading impact on credit accumulation that vanishes entirely by the third semester.  The one exception was a college that only offered LC’s to students who came in “college-ready.”

Limiting exit points was a major theme of the day.  Shanna Jaggars, of the CCRC, presented some findings on the Accelerated Learning Program at the Community College of Baltimore County and its close cousin at Chabot College in California.  At Chabot, shortening the developmental English sequence paid off most for the students at the top of the academic range, less for those farther down, and not at all for those who placed into ESL.  Angela Boatman looked at the Tennessee Developmental Course Redesign Initiative, and reported that a modular approach to developmental math seemed to pay off for students near the top of the cutoff.

In the followup breakout session, Katie Hern from Chabot clarified that students are given a choice for developmental English: they can do one semester or two.  Most choose one, unsurprisingly, and those who choose one make it to college level English in higher numbers than those who choose two.  (Hearteningly, the developmental English class -- which includes both reading and writing -- is taught as what Hern called “a junior varsity version” of the college-level class.  Students write papers of the same genre they’d write in the college level course; they don’t do the “first you do sentences, then you do paragraphs” model.) The higher success rate has held true even as the proportion of one-semester offerings has grown, so it doesn’t appear to be a function of student self-selection.

The Baltimore County ALP -- as explained by Peter Adams -- puts developmental and college-level students together in the same college-level class, but follows it with an appended extra help class for the developmental students.  The idea is to provide the extra help at the point of need.  The effects on student success have been remarkable.

At lunch, Scott Jaschik, Kevin Carey, and Kay McClenney fired off a few zingers.  McClenney asked -- in response to someone noting the difference between knowing what should be done and actually being able to do it -- where the boundary is between academic freedom and academic malpractice.  Scott Jaschik noted that, oddly enough, if you want really good empirical data on experiments that work, the for-profits and the military academies are the best places to get it.  That’s because neither is shy about telling faculty how to teach, so when they mandate a new approach, they get one, and they get clean data.  (From my time on the faculty at Proprietary U, I’ll just say that there’s compliance, and then there’s compliance.)  Kevin Carey noted that the major issue with many of the innovations in the literature is scalability.  Yes, project x worked well with cherry-picked faculty and students and lots of per-capita money, but that doesn’t tell you how it will perform if it’s generalized across an entire college.  

The afternoon plenary was oddly fatalistic.  Cecilia Speroni presented findings based on a study of dual enrollment in Florida that found that dual enrollment gets good results when it occurs on a college campus, but does not get those same results when taught in a high school.  (She noted that Florida defines eligibility for dual enrollment relatively stringently, so it draws an unusually capable group of students.)  Judith Scott-Clayton showed that placement tests don’t do a very good job of predicting success in college-level courses, though she found that a composite indicator combining placement tests with high school GPA does somewhat better.  Intriguingly, the tests were considerably more accurate in math than in English.  And Heather Wathington did a study of developmental summer bridge programs that found that they had no impact on enrollment, credit accumulation, or persistence.  

Vaguely dispirited, I left in search of solace.  Happily, I found it.  My current region of the country has its charms, to be sure, but the pizza -- and I mean this in the nicest possible way -- sucks.  They just get it wrong.  What they call pizza sort of resembles pizza visually, but, well, it is to pizza what reality television is to reality.  It’s just not the same.

But New York City is different.  I found a perfectly ridiculous hole in the wall called Big Nick’s, on Broadway.  It’s about five feet wide, and it looks and feels like a galley kitchen.  It’s ugly as sin, poorly ventilated, and festooned with pics of d-list celebrities from the 60’s.  

But the pizza!

It’s nice to know that some art forms are still practiced.  Learning communities, summer bridge programs, and dual enrollment programs may all let us down, but as long as there’s pizza this good in the world, there’s hope.

On to day two.

Sounds fascinating!

Question: Where was the study about placement tests plus GPA done?

I believe the result for math, based on my own empirical data from something quite different (which indicates that actual retention, measured by a "cold" test, matters less than a previous grade, but on a sliding scale). Math is recoverable with guided practice. The key is to fix small, but potentially fatal, problems that were never caught in HS.

I also believe the results for dual enrollment, although I would like to know what a "bad result" is for the in-school group. Is it how they do when they get to a university, tracked somehow? I'm sure the difference is due to a difference in culture. I only teach on-campus dual-enrolled students, and they frequently comment (in a positive way) about the different environment created by the working adults and vets around them.

My biggest concern is with studies that use IPEDS data for anything. I keep seeing examples of students who enroll with the specific intent of transferring after one semester, along with the mass of reverse transfers. They are using us as a bridge program because they missed some cut or deadline for direct admission, or had an "unfortunate" semester, but there are NO flags in the data for these kinds of cases. The first group will score as a failure for us (since they never get a degree from us) and a success for no one (since they are no longer FTIC), and the second is an invisible success (as you've written about before).
I don't know about the study that DD referenced, but there is some really great work on placement using high school grades being carried out by researchers at Long Beach City College. The work is most applicable to English so far. Go to the LBCC website and search for "Promising Pathways". It should show up as a Research Document. Follow the link; the omnibus presentation is at the very bottom of the page.
From an upper-west-side of NYC native: Sal&Carmine's pizza, W102xBroadway. Better than Big Nick's.
I have pizza envy. Come home.
New York pizza is good, but New York bagels are to die for. Be a hero and take some home. I recommend Absolute Bagels, Broadway between 107th & 108th.
I second the Absolute Bagels recommendation. Get some before you head home (if you haven't left yet)!

I would point out that the developmental summer bridges did have an impact on getting through the full dev ed sequence and passing college-level courses in math and English, at least in the first semesters. The lesson, to me, is that an intervention targeted at a specific goal (e.g. getting students through their dev ed sequence) may have an impact on that specific goal, but we shouldn't expect too much from it on later, long-term outcomes. It's hard to make up in one single summer what twelve years of elementary and secondary education have failed to do.
Thanks for the info about the work at LBCC. I like that it is backed by some solid institutional research. It was particularly interesting that they paid attention to WHICH specific HS English classes the grades were earned in, and that it was A and B grades that mattered. I have seen indications that "C" grades in HS are the most suspect when it comes to passing entirely on extra credit.

I also like that it should be scalable. I wonder if one example of failure to scale at my CC is because grouping "college unready" students together creates precisely the wrong kind of learning community.
CCRC Working Paper 19 from Teachers College captivated my research reading attention this spring. The high likelihood of committing Type I or Type II placement errors when using COMPASS or ACCUPLACER testing is most striking, and the damage either error causes for learners should give any DE admin pause. I even noted this research when interviewing for a DE admin position. It seems to show that it doesn't matter so much whether you have one exit point or one hundred. The bottom line is that if you commit a placement error, those affected negatively will find the door -- formal exit point or not. Read the study; it's revealing research done well.