Joshua Kim’s piece yesterday reminded me of a basic, but widely ignored, truth.
In most industries, new technology is adopted because it’s expected to lower costs and/or improve productivity (which lowers costs over time). It doesn’t always succeed, of course, and the usual vagaries of faddism are certainly there. But by and large, the point of adopting a new technology is to make the underlying business stronger.
But that doesn’t apply in either higher education or health care. In both fields, institutions adopt technology to meet rising expectations, whether it lowers costs or not. Much of the time, it actually increases them.
For example, take the typical college library. Libraries don’t bring in much revenue on their own, if any; they’re pretty pure ‘cost centers’ for most colleges. They’re central to the educational mission of the college, to be sure; I’d suggest that in the context of a commuter campus, that’s even more true than elsewhere. But income is tied to credit hours, and libraries don’t generate credit hours of their own.
In the past, typical library costs included labor, acquisitions, utilities, and not much else. Tables, desks, chairs, and carrels could be expected to last decades (and judging by some of the graffiti I saw at Flagship State, they did). Yes, you might find microfilm or microfiche, but even there the space requirements were minimal and the purchases could last for decades. (For younger readers: microfilm was sort of like cassette tape...no, wait, you wouldn’t know that...it was sort of like movies watched really slowly...no, not like DVDs...ah, screw it, I’m old.) It wasn’t at all rare for the highest-tech thing in the library to be the coin-operated photocopier.
Now, students expect/demand that the library offer plenty of computer workstations with high-speed internet access, good wifi everywhere, all manner of ‘assistive technology’ for the visually or otherwise challenged, and access to proprietary (paid) databases for all sorts of materials. There’s nothing wrong with any of that, but none of it displaced what had come before, and none of it came with its own revenue sources. And that’s before mentioning the price pressures that publishers have put on traditional acquisitions.
As a result, the library is far more expensive to run than it once was. It isn’t doing anything wrong; it’s just doing what it’s supposed to do. The problem is that the technological advances it adopts -- each for good reason -- don’t, and won’t, save money.
Something similar holds true in the health-related majors. As medicine has adopted more high-tech equipment and methods, we’ve had to adopt them too, in order to train students on them. But we don’t capture any of the gains. We pay for the equipment; the productivity gains, if any, accrue to the industry rather than to us. Worse, many of the purchases are so complex and high-maintenance that they require dedicated staff, adding still more labor costs to the equation.
There are excellent societal reasons why that’s a good idea. I like the idea of the rookie Nursing student making his first medical mistakes on simulators, rather than on people, for the same reason that I like pilots to use flight simulators before they first fly planes. Fewer casualties that way.
But the college doesn’t capture the gains from that. It’s saddled with the costs, heaven knows, but it sees none of the other side of the equation. And in an era of declining state support, there are only so many places to go to make up the difference.
I agree that certain applications of technology can save colleges money, and that colleges should take those opportunities seriously. But to assume that technology will only be deployed where it saves money, or even that it will be a net financial gain overall, strikes me as reaching. We train people on the latest stuff because we have to, whether it saves money or not.