Thursday, November 20, 2008
Harder Than It Looks
If only it were as simple as the article suggests.
The part that it gets right is that the 'non-credit' side of the college plays by very different rules than the 'credit' side. The credit side runs on a pretty strict semester system, with state-mandated rules about 'seat time' (or its equivalent), and state and federal rules about financial aid eligibility, and regional accreditation rules about nearly everything. A new course – let alone a new program – has to run through the entire shared governance process, which takes at least a year if you do it right, and more if you don't. The per-credit tuition rate is set collegewide, so a three-credit English class and a three-credit Business class have to charge the same tuition. Degrees have to include a certain number of credits, distributed in a particular way. (Associate of Arts degrees have different Gen Ed requirements than Associate of Science degrees, but any degree that gets either designation has to play by a given set of rules.) The credits are supposed to be (and usually are) transferable toward a bachelor's degree, so a student can do two years with us and two more years someplace else.
On the non-credit side, the picture is different. Courses can run for any length, in any combination of days and times, and at any price. Instructors' qualifications aren't prescribed, and the courses don't have to run through the governance process. We can go from zero to sixty in a month if we want to. (In academic terms, that's lightspeed.) The courses aren't built for transfer, and the 'certificates' awarded can designate anything from completion of a single four-week training course to completion of a sequence of several courses. Subject matter is dictated mostly by market demand, so it tends to be a combination of employment training, personal enrichment, and adult basic education (which is the stuff that comes before remediation – adult literacy classes, for example).
Like Dr. Seuss' moose juice and goose juice, everything works well when the two are kept separate. On the credit side, we abide by all manner of rules to present thoroughly vetted courses that will carry weight in the wider academic world. On the non-credit side, we present what we want, when we want, how we want, charging what we want, and we let the market tell us when we got it right or wrong. On the credit side, we're educators; on the non-credit side, we're vendors.
The classic model of non-credit workforce development is the company that comes to us asking if we can train some of its employees on a new technology or software package. We throw together a four-week hands-on program, taught either on campus or at the company, and hire a trainer to teach it. This model works really well when you have savvy people running it for an extended period, since they build up networks and reputations.
Lately, though, I've seen two trends come along that are making the distinction between the two sides much murkier than it used to be.
One is the desire among graduates of the non-credit certificate programs, after the fact, to get some kind of credit for what they've learned. Converting non-credit to credit isn't always easy. (And there's an argument to be made that it shouldn't be easy, lest we inadvertently make end runs around accreditation too easy.) Telling students who have taken non-credit training workshops over and over again for years that they'd have to start a degree just like any other freshman is a hard sell. In areas with CLEP exams and similar options, there's a reasonably elegant way to weigh claims of equivalency: if you pass the test, you're in. But how many “Microsoft Word” workshops add up to Intro to Computer Science? (Hint: they don't.)
The other, which is becoming a real challenge, is the increasing focus by grantors on 'bridging' the non-credit and credit sides of the college. The usual idea runs something like this: industry x is growing, and it needs employees. Region y has unemployed people who are turned off at the prospect of the long, hard slog to a degree. If only we could somehow grease the skids to employment by hurrying these students through, giving credit for prior learning...
The grantors have no concept of accreditation requirements, or state regs, or faculty union contracts, or shared governance. And cash-starved colleges sometimes chase these grants simply because they need the money. But the headaches that arise from trying to square the circle are massive, and increasing. The faculty bristle at what they see as encroachment. The 'vendors' bristle at what they see as needless dawdling. The financial people struggle to reconcile different sets of rules. The administrators try to balance it all, which basically consists of spreading the dissatisfaction relatively evenly.
Over the long term, I suspect that this blending will continue, and that we'll have to take some serious looks at some of the walls we've built between the two sides. But for now, it's a messy, complicated, ugly, frustrating picture that the Times missed completely.
There is also the difference between credit and credibility.
Learning is work. And, work is also, um, work. Both give you credibility. Credibility is good. It just isn't the same thing as completing a college-credit course, which ideally confers both credit and credibility.
I think grantors, legislators, etc. are misconceiving the problem when they seek to erode accreditation criteria. They're disrespecting knowledge and the learning process by focusing on the degree as the factor limiting the work force.
To compare this issue to giving credit for life experience: if you spend years working at a job, you put that on your resume. If you then spend years' worth of nights earning an Associate's at a CC, you put both on your resume. I think it's misleading to future employers to legitimize double dipping by calling up a CC, passing a test, and then claiming a degree you didn't spend time earning. (Maybe it would be fair if you could somehow trade in the right to claim your work experience.)
This is my absolute pet peeve, and I think you're right. As a lawyer, I get why so many places do this -- it's a lot easier, and a lot less lawsuit-prone, to sort people by a qualification than to attempt to actually assess their skills -- but it's absolutely maddening. As an example, it's hard as hell to find decent legal secretaries these days, and you have all these talented young people barred from entry because they don't have a degree and the firms don't want to hire without a degree ... and then you have barely-competent 40 wpm no-people-skills applicants, but BY GOD they have a degree.
And when the college degree becomes just a passport that you can get routinely stamped SOMEWHERE, the degree, unless it's from a top institution, is devalued across the board ... and we up the ante to requiring a master's.
Okay, a little off-topic, but it drives me crazy, and I see DD's combined credit/non-credit wanting to head the same way.
We have a type of course called noncredit (or continuing education) that does have course outlines, minimum qualifications for faculty, and state approval (which can sometimes take up to a year), and that is part of the college's accreditation. These classes are free to students, and many colleges offer vocational programs through this area.
There are also community services classes, which sound more like what the article is talking about. These classes don't give college credit, but they also don't go through the approval process, minimum qualifications, etc. Students pay whatever the market will bear. The big benefit, as the article states, is the speed with which you can get programs up and running.
Then there are contract education classes, which can be credit-bearing or more like community services classes; the big distinction here is that local employers pay the cost of these classes for their employees.
In the CE courses we teach, the syllabus and the instructors are vetted by the certifying bodies. I'd also say that the content delivered is comparable to a 100-level laboratory class. However, the students don't have to demonstrate understanding of what they've learned. They attend, they do the exercises, and they get their certificate of completion. Since their employers send them to the course as a training exercise rather than as an evaluation tool, that's OK.
We also take CE classes ourselves, which is useful, but it's more like auditing a lecture or seminar as opposed to taking a course for credit.
By contrast, our credit-bearing courses have a very strong evaluative component: midterms, graded projects, everything you'd expect from a college course. (We teach grad students, so we're pretty strict with them.) A passing grade in a college course means not only that the student has been exposed to certain concepts, but that they have demonstrated mastery at a certain level. What that level might be depends on the school, professor, and grading standards, of course.
I don't see how you can mix the two types of instruction for college credit, unless the CE student takes an evaluative test at some point. I could see a system where (X CE credits + grade of Y or better on exam) = (Z college credits).
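That last formula could be written out as a small sketch, just to make the shape of the rule concrete. Everything here is invented for illustration: the thresholds, the names, and the credit amounts are hypothetical stand-ins for whatever a college and its accreditor would actually negotiate.

```python
# Hypothetical sketch of the proposed conversion rule:
#   (X CE credits + grade of Y or better on exam) = (Z college credits)
# All numbers below are placeholder assumptions, not real policy.

def convertible_credits(ce_credits, exam_grade,
                        required_ce=12, min_grade=80, awarded_credits=3):
    """Return the college credits awarded, or 0 if either the
    CE-credit count or the evaluative exam grade falls short."""
    if ce_credits >= required_ce and exam_grade >= min_grade:
        return awarded_credits
    return 0

print(convertible_credits(12, 85))  # enough CE work, passing exam: credits awarded
print(convertible_credits(12, 70))  # exam grade too low: no conversion
print(convertible_credits(6, 95))   # too little CE work: no conversion
```

The key design point matches the comment: the CE seat time alone never converts; the evaluative exam is a hard requirement, so the credit still certifies demonstrated mastery rather than mere attendance.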