Thursday, April 17, 2014
How long _was_ I out?
Welcome to rubber chicken season of spring 2015. We won't talk about the 2014 midterm elections.
Did you need a haircut?
As a faculty member working hard to keep my department in compliance with the accreditation requirements (at least those within my reach), I find it troubling that the accrediting commission has not provided us with detailed constructive criticism during their site visits, yet decides it's appropriate to write an op-ed piece publicly suggesting that we voluntarily give up our accreditation.
It seems to me that if they were well-intentioned but their hands were truly tied, if they wanted to help us improve instead of standing behind a curtain and judging us, they would have suggested this last July when they decided to revoke our accreditation. They would have suggested it right away, not waited nine months until they were feeling pressure from several lawsuits, an audit from the State of California, and a reprimand from the Department of Education.
While I recognize that there are some very severe problems at CCSF, many of these problems are not the fault of our employees. Many are repercussions of decisions forced upon us by our position as a community college in the most expensive city in one of the most poorly governed states in the nation. The accrediting commission wrote us up for having financial problems? Well, we can start by blaming the voters who passed Proposition 13 back in 1978. We can continue by blaming the California Community College System for setting tuition far below the other 49 states (which have fewer financial problems), and for giving each community college an equal share per student (despite the fact that everything is much more expensive in San Francisco than it is in Yolo County).
Then if you want to blame the actual people who work at CCSF, sure, there are some who have added to the problems. But most of the faculty come in every day, prep their classes, teach their students, grade their papers, do their committee work, assess Student Learning Outcomes, volunteer on extra committees because of the accreditation crisis, and work their asses off - doing absolutely nothing wrong - for less money than we were making five years ago (despite an added workload). The vast majority of the nearly 2,000 faculty members and hundreds of other employees are working hard to give our students the best educational experience that we can. We don't deserve to lose our jobs. Our students don't deserve to lose their educational opportunities. There are problems, but hitting the accreditation kill switch doesn't help anybody.
(I wish this could have been more eloquent, and I wish I could say everything that I have on my mind (this is really just the tip of the iceberg), but I felt as though someone needed to say something in praise of all of the hard work and good things that we're doing with limited resources. It's easy to point the finger and say that there's something wrong with us, and there certainly are many things wrong with our school, but nothing that I've done has negatively affected our accreditation status, and I believe that's true for most of CCSF's employees. But the lack of transparency from the ACCJC, and their lack of direct communication with us, coupled with random op-ed pieces, are actions that DO directly hurt us and our students.)
It's hard, but I've known this was the game I was in and what these risks were for QUITE some time. I've been a decade-plus in independent higher education, full-time and tenure-track consistently throughout. Compared to the majority of my peers, I'm fortunate.
(And, for what it's worth, my read on the CCSF situation was the same as Anonymous 9:09 PM - candidate status was being floated as an option from the outside; it wasn't something that CCSF was even remotely enthusiastic about pursuing, because it would be perceived as a de facto admission that the ACCJC was right to pull accreditation. I won't make any judgments, since I'm not on the ground there, but I can understand how a faculty member would be hacked off if their accreditation status became the substance of an op-ed.)
Glad that your illness, though severe, was short. Great work keeping up with the blog.
RE: CCSF. Yes, you missed the interesting detail that giving up accreditation was proposed by two of the people reviewing CCSF. I read that as a signal from them that the college isn't anywhere close to showing progress on the issues raised in the original report, although one might question the ethics of a jury making a statement like that in public before the trial is complete.
With all due respect to the CCSF prof Anonymous@9:09PM -
If a majority of the faculty were doing outcomes assessment in the decade after the initial negative report from the accrediting agency, in the previous re-accreditation cycle, I don't see how they could have produced that laundry list of OLD, unaddressed issues with the standards. Granted, I only know what I have read in those voluminous reports and responses, but I didn't see evidence that all of those things were fixed but just not reported properly.
We're actually doing quite well with course-level outcomes assessment, at least compared to other area schools, according to the ACCJC. There are only two of eight areas in which we aren't meeting the standards.
One of them is that fewer than half of our institutional learning outcomes (ILOs) have ongoing assessment. This is a result of us mostly assessing outcomes at the course level within our departments, rather than having one department speak to another about outcomes. We appointed an SLO coordinator in 2012 and have been making improvements in this area, though apparently not quickly enough.
I take personal issue with these ILO assessments, though. If you look at the ILOs, they're extremely general. For example, under "Critical Thinking and Information Competency" we have "Use reason and creativity to make decisions and solve problems." I'd imagine that in every physics class you offer, you strive to teach your students to use reason and creativity to make decisions and solve problems, and I'd imagine that most of your students do, with varying degrees of success. So how worthwhile is it to spend time developing and administering an assessment of this ILO? What could we learn from it? And since we're doing this at an institutional level, your assessment results will somehow be combined with those of philosophy instructors, Spanish language instructors, economics instructors, etc. You can't use the same assessment for all of these disciplines, so you have to create different ones, which devalues the results as a whole.
If your institution is effectively doing this, please feel free to share how you've implemented this and what you've learned. It would be great to see a method that's effective but not overly time-consuming (remember that we also have to assess SLOs on the course level and program level). But since I haven't seen an instance in which there has been much to be learned aside from the obvious (some students are doing a better job of using reason and thinking creatively to make decisions and solve problems), I feel that it's a big waste of time.
The other area in which we aren't meeting the standards is that we have not regularly completed and updated comprehensive assessment reports. While I agree that we were definitely not in compliance pre-2012 (when we were placed on Show Cause status), we moved very quickly to implement this afterward. In the 2012-2013 academic year, the SLO coordinator developed a procedure for reporting assessment results that everyone was required to follow. I cannot speak for other departments, but every course undergoing assessment in my department did complete an assessment report (and courses not currently being assessed were given status updates through this system). However, the assessment cycle calls for courses to be assessed in the fall, with results analyzed in the spring (plus possible additional assessment in the spring), and to give faculty long enough to do proper assessments, these reports were not due until AFTER the ACCJC visited the school. So of course the reports were not yet "regularly" completed: they were not yet due! Having created and implemented a reporting system for a very large institution in under a year, we could not possibly also demonstrate regular reporting in less time than a single assessment cycle takes to complete.
During this accreditation crisis, people have been talking about CCSF as though it's a group of buildings filled with people not doing their jobs. They've said that we may be "too big to fail," comparing us to big banks filled with shady, law-breaking employees that used loopholes and broke laws to steal people's money all in the name of profit. I take great offense to that. We may have our issues, and I don't agree with everything that is done at CCSF, but for the most part we are working hard to provide the best education we can to as many students as possible.
Faculty advocates are arguing that the accrediting body never said that the teaching at CCSF was below par. That’s true, but misleading. The reason they never said it is that there’s no meaningful outcomes assessment, or even internal supervision, to enable anyone to say one way or the other. My guess is that the teaching there ranges from great to not-so-great, just like at any large institution. The fact that I have to guess is, itself, the problem.
The problem is that the SLO assessment that the ACCJC wants still doesn't give us this information.
For each course at CCSF, there are a handful of Student Learning Outcomes (usually between three and ten). Let's look at one course as an example. Since I'm responding to CCPhysicist, we'll pick a physics course. The first one I clicked on is Physics 4A. You'll notice three Student Learning Outcomes:
A. Apply the concepts of kinematics to describe and analyze the motion of an object moving in one or two dimensions.
B. Apply Newton's Laws of Motion to analyze and predict the motion of an object with multiple forces acting on it.
C. Determine which physical quantities, if any, are conserved and apply conservation principles when appropriate.
These three SLOs give an outsider a very basic understanding of what the course is about, but they lack specifics. Instructors instead focus on the Contents section, which lists the 38 topics to be covered throughout the semester. You can see that satellite orbits and escape velocity are covered. You can see that Bernoulli's equation looks to be one of the capstones of the course. You actually get a good idea of what's supposed to be going on in the classroom.
So logically, it would be good to actually assess how well we're covering these contents. Are all instructors getting all the way through? Can students use Bernoulli's equation to solve problems? Are we teaching simple harmonic motion well enough? Do students understand the difference between elastic and inelastic collisions? These would be excellent discussions to have.
But assessing all of that formally is so much work that we don't have time left to just sit down as a department or in a committee and talk about how we cover certain topics.
And the only straightforward way to get more detailed outcomes assessments on all of these different results is to have shared final exams among the whole department, with shared rubrics, and a collection and analysis of scores for each question. But nobody wants to do that for a plethora of reasons (we all teach differently, some people have different styles of writing out word problems, different faculty members have slightly different focuses in class, etc.). And even if we did that, we'd still have the same information: most students have an adequate understanding of that topic, and many students are struggling.
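For what it's worth, the mechanical part of that last option (pooling per-question scores across sections and averaging by topic) is trivial to script; it's everything around it that's hard. A minimal sketch, in which the section names, the question-to-topic map, and the scores are all invented for illustration:

```python
# Hypothetical sketch of pooling shared-final scores across sections.
# All data below (sections, questions, topics, scores) is made up.
from statistics import mean

# scores[section][question] -> list of per-student scores (0.0 to 1.0)
scores = {
    "Phys4A-01": {"Q1": [0.9, 0.7, 0.8], "Q2": [0.5, 0.4, 0.6]},
    "Phys4A-02": {"Q1": [0.8, 0.85], "Q2": [0.45, 0.5]},
}
topic_for = {"Q1": "kinematics", "Q2": "Bernoulli's equation"}

# Pool every section's scores for each question, then average by topic.
pooled = {}
for section in scores.values():
    for q, vals in section.items():
        pooled.setdefault(q, []).extend(vals)

report = {topic_for[q]: round(mean(vals), 2) for q, vals in pooled.items()}
print(report)  # e.g. {'kinematics': 0.81, "Bernoulli's equation": 0.49}
```

The hard part isn't this bookkeeping; it's agreeing on shared exams and rubrics in the first place, and even then the output only tells you which topics students struggle with, not why.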
So none of this work that we're required to do and none of this reporting that we're required to do actually answers Dean Dad's question of "how is the teaching at CCSF?" And very little of it helps us make substantive changes to the way we teach classes. In fact, having to go through so much data reporting takes up valuable time that could probably be better used actually discussing how we teach different topics and in what ways our students struggle in different areas. But alas, we're stuck with the system the ACCJC is making us follow.
I would love to hear from CCPhysicist and Dean Dad (or other wise and worldly readers) about how SLO assessment at your schools shows whether the teaching is below par or not, and what sort of useful information you have learned from SLO assessment.