Thursday, September 24, 2009
The White Glove Test
Based on a conversation I had today with some faculty, I'm wondering if we wouldn't benefit from a local variation on the white glove test.
In discussing computers, equipment, and funding, they complained that we spend a lot on brand-new computers that mostly get used for word processing and low-level web surfing, and then don't have the money to pay for minor classroom or equipment repairs. The weird result is that some very cutting-edge stuff gets underused, while some of the classrooms slowly fall into visible shabbiness. The students, seeing with fresh eyes, pick up on the shabbiness immediately, with predictable effects on morale. Since some of the high schools from which they came can be pretty rundown, it confirms an already-present and destructive implied message.
I have to admit there's something to this.
I'm temperamentally allergic to arguments that assume a fall from a golden age ("Students never used to cheat!" Um, yeah, they did.) But even I will concede that new buildings don't stay new forever, and that the damage from partial repairs is cumulative over time. And lower-tech classroom equipment just doesn't get the attention of the high-tech stuff.
As with so many things, these patterns probably made some sense when they first developed. I'm old enough to remember when even a basic computer cost a couple thousand, and even a high-end one couldn't do much beyond word processing, basic math, and maybe email. (I still remember the first time I used a web browser. Within ten minutes, I was convinced that it was an epoch-defining innovation. I still believe that.)
In those days, when the buildings were newer and the computers more expensive, these spending patterns could be defended. Now, not really.
I'm thinking it might make sense to stratify tech purchases based on their likely uses. If a given lab will use computers just for word processing and web surfing, why not go cheap? I write most of my blog posts on a netbook that cost 400 bucks when I bought it, and that probably costs 300 now. For this purpose, it does just fine. (In labs, where we could use desktops, we could go even cheaper.) Then, we could reallocate some of the savings to do a white-glove test of classrooms and labs, and devote money to the lower-tech but still crucial stuff like lighting, blinds, paint, screens, and such.
Wise and worldly readers -- has your college done this? Has it found a sustainable way to keep the boring-but-important low tech stuff in good repair over time? I'm looking for a model I could adapt.
A fresh coat of paint can do wonders for making a place feel shiny and new. I think it's worth shifting resources to minor repairs.
No, seriously. Have one computer lab with very nice machines. Or maybe two -- one in the media arts department (for 3D rendering and video editing and whatnot) and one in the science and engineering building (for running computer simulations).
Give the rest of the computer labs names like "word processing center" or "internet commons" or whatever, so that students know that's all those computers can do.
The majority of students will be perfectly happy with those machines, and the ones who need to do something more computationally intensive can go to the labs with the heavy-duty machines. You can even allow them to book time on one, if they want a guaranteed spot.
I'd say 500 cheap word processing and web browsing machines and 30 power number crunchers would do students a lot more good than having 300 "decent" computers would anyway, and might be done for the same amount of money.
Or cut it back to 300 cheap ones and thirty monsters and use the savings for new window blinds.
(All numbers made up.)
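Taking the commenter's admittedly made-up numbers at face value, the back-of-the-envelope comparison looks something like this (the prices here are illustrative assumptions, not real quotes):

```python
# Rough cost comparison of a stratified fleet vs. a uniform one.
# All prices are hypothetical placeholders, echoing the made-up
# figures in the comment above.
CHEAP_PRICE = 400      # basic word-processing/web machine
MONSTER_PRICE = 3000   # high-end number cruncher
DECENT_PRICE = 900     # mid-range "decent" machine

stratified = 500 * CHEAP_PRICE + 30 * MONSTER_PRICE
uniform = 300 * DECENT_PRICE

print(f"Stratified (500 cheap + 30 monsters): ${stratified:,}")
print(f"Uniform (300 decent):                 ${uniform:,}")
```

Under these assumptions the stratified fleet runs about $290,000 against $270,000 for the uniform one -- roughly the same outlay, but with far more total seats plus a handful of genuinely powerful machines.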
What was most interesting about that committee, which I chaired for a couple of years, was that it became a terrific place to talk about pedagogy. For example, the question of what kinds of chairs and tables one wants is tightly tied to the teaching styles one imagines the room supporting and that faculty think are best. Technology, too, becomes linked to the pedagogies being employed by the departments in those buildings, rather than just buying the cutting edge for its own sake ("do we really need a SmartBoard here, or will a regular screen do?")
Your mileage may vary at your own institution of course.
Something like 2/3 of IT costs go into just fending off entropy and keeping things working, so when you double the number of system types by having `good' and `bad' machines, that has real costs in the staff hours required to keep stuff going.
If, as Mary suggested, the ratio of good machines to bad were like 50:3, it wouldn't be such a big deal; you get most of the advantages of a uniform environment with just a couple labs worth of special machines, and you have a different staffer take care of those. But the more usual case is that the ratio drifts closer to 50:50, and the added manpower costs can easily dwarf the hardware savings (*especially* as machines get cheaper).
Is this really true? I've actually had a lot of trouble finding a cheap desktop. It would seem to me that the desktops would be cheaper, but that hasn't played out in my search so far.
The general point of the post, though, is a great one. When I was an undergrad in the late 90's & early 2000s, my college was already taking this approach. They had computers with monochrome monitors that were fairly old set up to only access the campus email system through Unix. They also had state of the art computer labs for when that was necessary.
Sometimes, we even have computers in the word processing/internet labs that are "hand me downs" from the high-end ones. Often, even though the computers don't meet the cutting edge needs anymore, they are only a few years old and work just fine in the other labs.
Surplus computers are old, but cheap. It's tempting to many administrators to pick them up and create a "new" computer lab. Sure, they're failure prone, and shabby, but suddenly there's a "science" computer lab where there was simply a classroom before.
Our main savings technique is to hire a lot of staff that are Dell certified. Because we can do our own warranty repairs, it's easy to bargain for discounts with the threat of just doing going without support contracts. There's already been a decision to move from a 4 year replacement cycle to 5, and just let the repair burden fall on staff and hope repair parts are cheap.
As far as capability testing, the greater challenge you'll face is compatibility with legacy binary software. Like a DOS based chemistry program that demands low level access to a student floppy drive that instructors insist is more effective than later versions.
The other change was to move away from using our own e-mail system. We're now with Google, although it says Research 1. The was a godsend, because we were DROWNING in Russian spam that overwhelmed the U's servers (over 150+ messages a day).
As for the original question, about the balance between spending upon making the labs less dingy vs spending on computers -- how about getting input from the students on this? Approach a local student organization and ask them to appoint a student representative to the committee. My experience with that kind of student representation has been quite good: the students have generally been quite conscientious about thinking through the tradeoffs, soliciting views from their fellow students, and making recommendations about what is most useful to them.
Our faculty in the sciences were buying laptops under that model (grants paying for them) until auditors came in and noted the laptops were being used in the classroom. The university now has to pay for the laptops.