Thursday, September 24, 2009


The White Glove Test

Readers of a certain age have probably heard of the white glove test. As I understand it, it was a test of cleanliness in which a woman (it was always a woman) wearing a white fabric glove would trace her finger along a tabletop, and it would pass if her glove didn't get dirty. I don't know if this ever actually happened or if it's like the guy who knows a guy who knows a guy who jumped into the Erie Canal and landed on a cow, but the expression survives.

Based on a conversation I had today with some faculty, I'm wondering if we wouldn't benefit from a local variation on the white glove test.

In discussing computers, equipment, and funding, they complained that we spend a lot on brand-new computers that mostly get used for word processing and low-level web surfing, and then don't have the money to pay for minor classroom or equipment repairs. The weird result is that some very cutting-edge stuff gets underused, while some of the classrooms slowly fall into visible shabbiness. The students, seeing with fresh eyes, pick up on the shabbiness immediately, with predictable effects on morale. Since some of the high schools from which they came can be pretty rundown, it confirms an already-present and destructive implied message.

I have to admit there's something to this.

I'm temperamentally allergic to arguments that assume a fall from a golden age. ("Students never used to cheat!" Um, yeah, they did.) But even I will concede that new buildings don't stay new forever, and that the damage from partial repairs is cumulative over time. And lower-tech classroom equipment just doesn't get the attention the high-tech stuff does.

As with so many things, these patterns probably made some sense when they first developed. I'm old enough to remember when even a basic computer cost a couple thousand, and even a high-end one couldn't do much beyond word processing, basic math, and maybe email. (I still remember the first time I used a web browser. Within ten minutes, I was convinced that it was an epoch-defining innovation. I still believe that.)

In those days, when the buildings were newer and the computers more expensive, these spending patterns could be defended. Now, not really.

I'm thinking it might make sense to stratify tech purchases based on their likely uses. If a given lab will use computers just for word processing and web surfing, why not go cheap? I write most of my blog posts on a netbook that cost 400 bucks when I bought it, and that probably costs 300 now. For this purpose, it does just fine. (In labs, where we could use desktops, we could go even cheaper.) Then, we could reallocate some of the savings to do a white-glove test of classrooms and labs, and devote money to the lower-tech but still crucial stuff like lighting, blinds, paint, screens, and such.

Wise and worldly readers -- has your college done this? Has it found a sustainable way to keep the boring-but-important low tech stuff in good repair over time? I'm looking for a model I could adapt.

I think going with cheaper computers is a great idea. It's true that once upon a time even word processors needed lots of power to run. Not so much anymore. If you can get away with it, you might even consider switching as much software as possible to open source, e.g. going with OpenOffice. Nowadays the look and feel is almost identical to Microsoft's products, and it costs nothing (though it'd be nice to donate some cash their way if you're going to install hundreds of copies). Seriously, most students won't notice, and documents can be saved in lots of different formats. It's worth considering.

A fresh coat of paint can do wonders for making a place feel shiny and new. I think it's worth shifting resources to minor repairs.
At our school, computer labs are largely updated with grant funds, so we end up buying expensive computers every 6 years, when what we should be doing is buying computers for half the price and replacing them every 3 years. But, of course, the grants don't work that way.
Reminds me of the "Broken Windows" theory of policing.
Go back to the even older days, when colleges used to have one state of the art machine and people had to book time on it...

No, seriously. Have one computer lab with very nice machines. Or maybe two -- one in the media arts department (for 3D rendering and video editing and whatnot) and one in the science and engineering building (for running computer simulations).

Give the rest of the computer labs names like "word processing center" or "internet commons" or whatever, so that students know that's all those computers can do.

The majority of students will be perfectly happy with those machines, and the ones who need to do something more computationally intensive can go to the labs with the heavy-duty machines. You can even allow them to book time on one, if they want a guaranteed spot.

I'd say 500 cheap word processing and web browsing machines and 30 power number crunchers would do students a lot more good than 300 "decent" computers would, and might be done for the same amount of money.

Or cut it back to 300 cheap ones and thirty monsters and use the savings for new window blinds.

(All numbers made up.)
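Mary's back-of-the-envelope comparison can be sketched with some equally made-up prices. The per-machine costs below ($500 cheap box, $1,200 "decent" box, $2,500 number cruncher) are pure assumptions, just like the seat counts in the comment:

```python
# All prices and counts are hypothetical, matching the comment's
# "(All numbers made up.)" disclaimer.
CHEAP, DECENT, MONSTER = 500, 1_200, 2_500  # assumed dollars per machine

mixed = 500 * CHEAP + 30 * MONSTER    # 500 browsing boxes + 30 number crunchers
uniform = 300 * DECENT                # 300 "decent" all-purpose machines
trimmed = 300 * CHEAP + 30 * MONSTER  # the cut-back option; savings go to blinds

print(mixed, uniform, trimmed)  # 325000 360000 225000
```

Under these assumed prices the mixed fleet serves 530 seats for less than 300 uniform "decent" seats cost, and the trimmed fleet frees up six figures for the low-tech repairs the post is about. Different price assumptions will move the totals, but the shape of the tradeoff stays the same.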
A few years back we created a separate faculty committee, made up of one person from each classroom building on campus as well as the relevant administrators, to annually review classroom needs such as lighting, chairs, and technology. We asked each building to identify the rooms with the greatest need and then collectively ranked the needs across campus, all in close collaboration with our IT folks. The committee's request then went into the normal capital budget process. This allowed the Powers that Be to balance those "white glove" needs against the more purely IT ones and try to find the balance that you're talking about.

What was most interesting about that committee, which I chaired for a couple of years, was that it became a terrific place to talk about pedagogy. For example, the question of what kinds of chairs and tables one wants is tightly tied to the teaching styles one imagines being used in the room and that faculty think are best. Technology too becomes linked to the pedagogies being employed by the departments in those buildings, rather than just buying the cutting edge for its own sake ("do we really need a SmartBoard here, or will a regular screen do?").

Your mileage may vary at your own institution of course.
I don't know the situation at your school, but there are real and significant advantages to having all the computers be as much the same as possible. Besides the flexibility it provides in scheduling rooms, etc., and real advantages in purchasing, it makes maintenance (both software and hardware) enormously easier.

Something like 2/3 of IT costs go into just fending off entropy and keeping things working, so when you double the number of types of systems by having 'good' and 'bad' machines, that has real costs in the staff hours required to keep stuff going.

If, as Mary suggested, the ratio of cheap machines to powerful ones were something like 50:3, it wouldn't be such a big deal; you get most of the advantages of a uniform environment with just a couple of labs' worth of special machines, and you can have a different staffer take care of those. But the more usual case is that the ratio drifts closer to 50:50, and the added manpower costs can easily dwarf the hardware savings (*especially* as machines get cheaper).
DD -- there is research in the field of computers and composition that supports buying less expensive computers suited to their modest uses. If you want to go forward with a plan for reworking the technology budget, let me know and I'll give you the citations.
"In labs, where we could use desktops, we could go even cheaper."

Is this really true? I've actually had a lot of trouble finding a cheap desktop. It would seem to me that the desktops would be cheaper, but that hasn't played out in my search so far.

The general point of the post, though, is a great one. When I was an undergrad in the late '90s and early 2000s, my college was already taking this approach. They had fairly old computers with monochrome monitors set up only to access the campus email system through Unix. They also had state-of-the-art computer labs for when that was necessary.
That's pretty much what we do -- the labs with high-end needs get the powerful computers (and/or the Macs), and the word processing/internet labs and classrooms get computers with less memory, etc.

Sometimes, we even have computers in the word processing/internet labs that are "hand me downs" from the high-end ones. Often, even though the computers don't meet the cutting edge needs anymore, they are only a few years old and work just fine in the other labs.
We've actually got a policy that labs get the new, high-end boxes; when the labs need to be refreshed, the old machines are refurbed with some additional memory and then passed on to individual faculty and staff. I'm happily working on a 5 year old machine with a memory upgrade and a newer, super-big monitor. I'll probably be fine for another couple of years--then I'll get a 2-3 year old machine from a lab. All is well in my world and we make efficient use of our scare tech money.
"Scarce" tech money, although the budget is so low it's pretty scary. Sorry for the slippery fingers.
Probably, most computers on any given campus do fall into the cheap end, and a few labs have something extreme for, say, 3D animation. The challenge is what to do with the computers you have once they fall out of the replacement cycle.

Surplus computers are old, but cheap. It's tempting for many administrators to pick them up and create a "new" computer lab. Sure, they're failure-prone and shabby, but suddenly there's a "science" computer lab where there was simply a classroom before.

Our main savings technique is to hire a lot of staff who are Dell-certified. Because we can do our own warranty repairs, it's easy to bargain for discounts with the threat of just going without support contracts. There's already been a decision to move from a 4-year replacement cycle to 5, and just let the repair burden fall on staff and hope repair parts are cheap.

As far as capability testing goes, the greater challenge you'll face is compatibility with legacy binary software: say, a DOS-based chemistry program that demands low-level access to a student floppy drive, and that instructors insist is more effective than later versions.
Even at Research 1, we've started to economize on computing. Faculty members used to be given both a desktop and a laptop, to be updated every 4 years. We're now given a Toughbook and a docking station at work. The IT boys will order an additional docking station for home offices, but faculty have to pony up for that expense. The update cycle is a year shorter (3 years rather than 4), but I've been delighted by the switch. I've been using my laptop exclusively for years, so this makes sense to me.

The other change was to move away from running our own e-mail system. We're now with Google, although the addresses still say Research 1. That was a godsend, because we were DROWNING in Russian spam that overwhelmed the U's servers (150+ messages a day).
Well, I'm surprised to hear from faculty that their school buys them computers. Interesting. At my University, if we want a desktop or laptop in our office, we pay for it ourselves. Perhaps that's a consequence of the fact that we're expected to raise a lot of grant money, so we're on our own to raise any money we need for our own work.

As for the original question, about the balance between spending upon making the labs less dingy vs spending on computers -- how about getting input from the students on this? Approach a local student organization and ask them to appoint a student representative to the committee. My experience with that kind of student representation has been quite good: the students have generally been quite conscientious about thinking through the tradeoffs, soliciting views from their fellow students, and making recommendations about what is most useful to them.
Our faculty in the sciences were buying laptops under that model (grants paying for them) until auditors came in and noted the laptops were being used in the classroom. The university now has to pay for the laptops.
You had to explain "white glove test"???? Do you think we live in a cave or do you just think we're far too young and stupid to have ever heard of it?
I'd say that if you have an IT department which is capable of caring what usage patterns on its computers are, you're way ahead of the game.