For 20+ years there have been all kinds of boards and national commissions recommending changes in classroom practice, but little seems to come of it (maybe that's just my perspective?). There's plenty of innovation in curricula, pedagogical practices, and general approaches to teaching that has received substantial funding from agencies like the NSF (that's just what I know; I'm sure someone else could speak to other fields). Some examples: in physics there's Project SCALE-UP (resource intensive), and in math there are "group-work focused" curricula for stats, "general math," calc 1, 2, and 3, differential equations, ... But, for all of these, market penetration is tiny.
Thinking in terms of higher education research generally, there are lots of fairly traditional routes for dissemination: talks at conferences, papers in appropriate journals (academic and practical). But how much really penetrates via these routes?
Are there non-traditional routes that, given your experience as an administrator, you would suggest, routes a faculty researcher wouldn't necessarily know about?
Are there routes of dissemination to faculty that make sense, but don't seem to be tried much, beyond the usual list of papers, presentations, and workshops and booths at conferences?
That is, we spend all this time doing research and, hopefully,
learning useful things. Or, is it that none of what we learn is
actually that useful?
This is one of those “harder than it looks” ideas, and I’ll stipulate upfront that much depends upon context.
It’s certainly true that there’s a wealth of information out there that often doesn’t find its way to the folks who could gain from it. Off the top of my head, I can think of several reasons for that.
The first is incentives. All else being equal, doing something the way you always have is easier than trying a new approach. Learning curves are real, and a new preparation or a new methodology is much more time-consuming than one you’ve done many times before. In a college that values research over teaching, there’s a valid argument to be made that too much time spent on teaching will result in too little on research, with severe career consequences. In a teaching-focused college, the sheer size of the courseload becomes the obstacle. Yes, there’s a conceptual interest in improved student success, but when you have your hands full just doing what you’re doing now, and the benefits from your additional work don’t accrue to you, there’s a strong gravitational pull to just not bother.
(Underlying that paragraph is the annoying truth that outside of a very few settings, the more a college values teaching, the more teaching it requires. The more teaching it requires, the higher the cost of pedagogical innovation. Mass production requires standardization.)
Theoretically, we could get around that by drastically reducing faculty courseloads. But the economics of that are simply prohibitive.
Then there’s the cultural taboo among faculty against being “told what to do.” Any innovation that comes packaged with administrative approval is often regarded as suspect simply because of the approval. I’ve seen settings where that was warranted, and settings where it wasn’t, but there it is.
Field specificity also matters. Techniques that may work beautifully in a lab science may not help much in history. Courses intended as stand-alone samples of a discipline often have more flexibility than first courses in a sequence. “Skills” courses and “content” courses have different demands. These days, there’s also a divide between traditional classes and online classes.
Travel funding matters, too. Most community colleges, as far as I know, have pretty limited travel and conference funding. That means that even when there’s a will to go exploring, there may not be a way. Virtual conferences and webinars try to fill the void, but they’re not the same thing.
Finally, there’s knowing where to look. Limited travel funding and limited time place a premium on sure things. In the absence of the time and money to take flyers on multiple different approaches, there isn’t always much incentive to go looking for alternatives.
It’s a shame, really. The students who most need innovation -- those for whom traditional instruction has largely failed -- are often the least able to get it.
I’ve seen a few expedients that help a bit, but I’d love to hear from my wise and worldly readers with more.
One is to make it somebody’s job to keep up with this stuff. Titles like “instructional designer” describe people whose job it is to maintain currency with the latest pedagogical innovations, and to work closely with faculty to adapt what’s useful to a particular context. That costs money, of course, but it has promise.
Another is to remove travel funding from the operating budget and put it under some sort of endowment. At community colleges, that’s often done through a “Center for Teaching Excellence” or something along those lines. If the Foundation funds it, then it’s insulated from state cuts and other budget reductions.
In a more perfect world, individual faculty would have actual job-based incentives for classroom improvement; the issue there is in definition and evaluation. The union at my college is convinced that merit pay is a tool of the devil, and simply won’t hear of it; accordingly, pay is based on seniority. Over time, you get what you pay for.
Wise and worldly readers, I’m sure I’ve only scratched the surface on this one. Have you seen (or come up with) an effective and realistic way to disseminate pedagogical innovations so that they might actually get used?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.