(“The Prereq Temptation” was a rejected John le Carré title. A lot of people don’t know that…)
In my imagined, more perfect world, there would be exactly one reason for a prerequisite to attach to a class: the students would need to know material from the prior class to be successful in the second one. For example, a student wandering into a calculus class who had never taken algebra or trig could be expected to be lost. (“Why are you doing math with letters?”) In sequences of courses that build on each other, the folks who teach the later courses should have some reasonable assurance that they don’t have to go all the way back to square one.
But in this world, that’s not the only reason prerequisites get put on classes. For example:
- Frustration with student performance. I’ve seen professors in all sorts of disciplines argue for English 101 as a prereq, because they’re frustrated by poor student writing.
- Screening out younger students -- either dual enrollment or first-semester -- without having to say so in so many words.
- A vote of no confidence in local high schools.
- Felt prestige. At a previous college, I once had a department admit in a program review that the addition of a prereq to its intro class made no difference in student success; to its credit, it even included the numbers to prove it. But it argued for keeping the prereq anyway, as a “statement” about expectations.
- Self-defense in an arms race. If every other class that fulfills a distribution requirement has an English 101 prereq, and yours doesn’t, then you will get more than your share of the less prepared students, simply by default. After a while, even some folks who generally object to prereq proliferation will yield just to level the playing field.
- Transfer requirements. Certain large public institutions -- not naming any names here, but they know who they are -- won’t accept certain courses unless those courses carry specific prereqs. They take the presence of prereqs as a sign of rigor. Even if we could show locally that the prereqs achieved nothing except to delay students, we’d still have to keep them.
- Gaming graduation numbers. If every credit-bearing course requires students to have cleared the developmental level, and developmental courses don’t add up to 12 credits, then the college can de facto exclude all developmental students from its “first-time, full-time” graduation rate. It’s unethical, but it happens.
Leaving aside the more sinister and self-serving reasons, people often argue for prereqs out of a sincere, if unproven, belief that they’ll set students up for success. The argument could be tested empirically, but almost never is. It should be.
Individual prereqs can make sense, but when they proliferate -- as they tend to do -- they make timely completion of a degree much harder. A student who has to wait for a prereq class to fit her schedule may add a semester or a year to her time-to-completion, just because she’s following an unproven rule put in place through a combination of ego and wishful thinking.
To my mind, the burden of proof should be on prereqs. In the relatively rare cases in which the relevance is obvious and well-demonstrated, keep them. But subject all of the existing ones -- not just new ones -- to actual empirical tests. If our four-year counterparts would do the same -- hint, hint -- we could drop the prereqs that are only there to appease them.
The prereq temptation is subtle and pervasive, but it does real harm. If we could get that long list of reasons down to a single one -- where it actually helps -- students would benefit tremendously.