Tuesday, July 13, 2010

 

When Technology Doesn’t Help

Joshua Kim’s piece yesterday reminded me of a basic, but widely ignored, truth.

In most industries, new technology is adopted because it’s expected to lower costs and/or improve productivity (which lowers costs over time). It doesn’t always succeed, of course, and the usual vagaries of faddism are certainly there. But by and large, the point of adopting a new technology is to make the underlying business stronger.

But that doesn’t apply in either higher education or health care. In both of those, institutions adopt technology to meet rising expectations, whether it helps with cost or not. Much of the time, it actually leads to increased costs.

For example, take the typical college library. Libraries don’t bring in much revenue on their own, if any; they’re pretty pure ‘cost centers’ for most colleges. They’re central to the educational mission of the college, to be sure; I’d suggest that in the context of a commuter campus, that’s even more true than elsewhere. But income is tied to credit hours, and libraries don’t generate credit hours of their own.

In the past, typical library costs included labor, acquisitions, utilities, and not much else. Tables, desks, chairs, and carrels could be expected to last decades (and judging by some of the graffiti I saw at Flagship State, they did). Yes, you might find microfilm or microfiche, but even there the space requirements were minimal and the purchases could last for decades. (For younger readers: microfilm was sort of like cassette tape...no, wait, you wouldn’t know that...it was sort of like movies watched really slowly...no, not like DVDs...ah, screw it, I’m old.) It wasn’t at all rare for the highest-tech thing in the library to be the coin-operated photocopier.

Now, students expect/demand that the library offer plenty of computer workstations with high-speed internet access, good wifi everywhere, all manner of ‘assistive technology’ for the visually or otherwise challenged, and access to proprietary (paid) databases for all sorts of materials. There’s nothing wrong with any of that, but none of it displaced what had come before, and none of it came with its own revenue sources. And that’s before mentioning the price pressures that publishers have put on traditional acquisitions.

As a result, the library is far more expensive to run than it once was. It isn’t doing anything wrong; it’s just doing what it’s supposed to do. The problem is that the technological advances it adopts -- each for good reason -- don’t, and won’t, save money.

Something similar holds true in the health-related majors. As medicine has adopted more high-tech equipment and methods, we’ve had to adopt them, too, to train the students on them. But we don’t get any of the gains from that. We have to pay for it, but the productivity gains, if any, accrue to the industry rather than to us. Worse, many of the purchases are so complex and high-maintenance that they require dedicated staff, thereby adding higher labor costs to the equation.

There are excellent societal reasons why that’s a good idea. I like the idea of the rookie Nursing student making his first medical mistakes on simulators, rather than on people, for the same reason that I like pilots to use flight simulators before they first fly planes. Fewer casualties that way.

But the college doesn’t capture the gains from that. It’s saddled with the costs, heaven knows, but not with the other side of the equation. And in an era of declining state support, there are only so many places to go to find the difference.

I agree that certain applications of technology can save colleges money, and that colleges should take those opportunities seriously. But to assume that technology will only be deployed where it saves money, or even that it will be a net financial gain, strikes me as a reach. We train people on the latest stuff because we have to, whether it saves money or not.

Comments:
I can't speak to the health programs, but in the library at least, another key piece of the picture is that previously, the library purchased items: books, journals, print indices. Now, with the exception of most print books, the library licenses, or effectively, rents those same items.

What was, before, a one-time cost (a year's subscription, and then the journal is on your shelf forever) is now an ongoing fee just to retain access to things you've already paid for once. This is, of course, all to the benefit of publishers, who know perfectly well the size of the barrel over which they have us.
 
You didn't even mention the cost of putting Smart Boards in every classroom, upgrading computers so they can run new bloatware, or even the cost of replacing burnt-out projector bulbs. Has anyone measured the "learning" improvement resulting from these expenses to see if increased retention pays for them? Ditto for course management software (*).

However, I thought the most interesting part of the article was the argument for using hybrid classes to reduce the capital costs for new classrooms. I know that is an issue for some schools (it is definitely an issue for us at times), but it raises the question of who will be teaching all of those extra sessions and what they will be paid. The professor has to be online for each group of students, so you can't simply triple the teaching load while tripling the number of classes in a particular room unless you are (ab)using adjuncts. But it can work. Our college has saved a ton of money by shifting a course from full-time to adjunct faculty with significant computer support that costs a lot less than the salaries saved by this shift.

(*) I refuse to use the term LMS because I have yet to see any software actually manage learning. At best it facilitates learning within a course management system, one that is too stupid to implement the grading algorithm in my syllabus.
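Purely as an illustration of the kind of rule the commenter presumably has in mind -- the weights, the drop-lowest rule, and the "final replaces the worst exam" rule below are invented for this sketch, not taken from any actual syllabus -- here is how a custom grading policy might look once a gradebook's built-in point totaling can't express it:

```python
# Hypothetical syllabus policy, invented for illustration only.

def course_grade(quizzes, exams, final, labs):
    """Return a 0-100 course grade under a made-up grading policy."""
    # Drop the single lowest quiz score before averaging.
    kept_quizzes = sorted(quizzes)[1:] if len(quizzes) > 1 else quizzes
    quiz_avg = sum(kept_quizzes) / len(kept_quizzes)

    # If the final is higher than the worst exam, it replaces that exam.
    adjusted_exams = sorted(exams)
    if adjusted_exams and final > adjusted_exams[0]:
        adjusted_exams[0] = final
    exam_avg = sum(adjusted_exams) / len(adjusted_exams)

    lab_avg = sum(labs) / len(labs)

    # Weighted total: 20% quizzes, 40% exams, 25% final, 15% labs.
    return 0.20 * quiz_avg + 0.40 * exam_avg + 0.25 * final + 0.15 * lab_avg

print(round(course_grade([70, 85, 90], [75, 88], final=92, labs=[100, 95]), 1))
```

Nothing exotic, but the drop/replace interactions are exactly the sort of thing a generic gradebook tends to choke on.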
 
Brian Mulligan (Institute of Technology Sligo) wrote on the IHE version of this blog that:
It also does not help when you are in an industry hidebound by traditions, where those responsible for particular work (teaching) are more interested in something else (research), and where there is very little individual accountability (tenure).

1) I think DD needs to explain to his wider readership that faculty at a US Community College are not more interested in research than teaching.

2) Reading that remark, which contains one of DD's favorite phrases, made me think that perhaps DD should consider providing some individual accountability: calculate the percentage of students in a particular instructor's class who pass the next class in a sequence (clearly applicable to writing and math, and to any other classes that have a writing or math class as a pre-req). You might not be able to show those results to the community at large, but you can show them to everyone who teaches that same course when doing your annual review. (A rough sketch of that calculation is below.)

Just thinking....
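Taking the suggestion above at face value, a minimal sketch of the pass-the-next-course metric might look like the following; the record layout (student, course, instructor, grade) and the definition of "passing" are assumptions made for illustration, not anyone's actual student information system:

```python
# Minimal sketch of the proposed accountability metric, with an invented
# record format and an invented definition of "passing."

PASSING = {"A", "B", "C"}  # assume D/F/W don't count as passing

records = [
    # (student_id, course, instructor, grade)
    ("s1", "MATH-101", "Smith", "B"), ("s1", "MATH-102", "Jones", "C"),
    ("s2", "MATH-101", "Smith", "C"), ("s2", "MATH-102", "Jones", "F"),
    ("s3", "MATH-101", "Smith", "A"),  # never took the next course
]

def next_course_pass_rate(records, instructor, course, next_course):
    """Of students who passed `course` with `instructor`, the share who
    went on to pass `next_course` (with any instructor)."""
    passed_first = {s for s, c, i, g in records
                    if c == course and i == instructor and g in PASSING}
    if not passed_first:
        return None
    passed_next = {s for s, c, _, g in records
                   if c == next_course and g in PASSING and s in passed_first}
    return len(passed_next) / len(passed_first)

print(next_course_pass_rate(records, "Smith", "MATH-101", "MATH-102"))  # 1 of 3
```

The tricky parts in practice would be policy questions (withdrawals, students who delay the second course, transfers), not the arithmetic.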
 
If they ever found a way to sue the school for big $$ when a nursing student made a mistake, and the simulator decreased the error rate, the cost benefit would become obvious. Some costs are not overt.

Our library pays for WiFi with student fees - so there is another revenue stream associated with that service.

That said, all this sounds like the short argument for having the state pay more per student for the education they receive. I'm not holding my breath.
 
My current job basically puts me in contact with this stuff every day. Most of my time is spent creating and servicing new computer systems to serve academics, librarians, and students.

Private investment in equipment is done to reduce labor costs (i.e., costs / productivity). Colleges don't always bring in new technology for this purpose, but it happens and gets taken advantage of. I've seen professors who use the systems to automate grading, distribute lectures, and refocus class time on hands-on activities.

What I've seen is that none of this stuff translates into more productive workers. Proctoring hands-on activities gets foisted onto grad students, prep time for lectures is cut owing to the recorded lectures, while teaching loads remain constant. All the surplus goes to the faculty, some of whom seem to have designed themselves a three-day work week.

I'm posting this anonymously, because I'm thinking maybe this is a consequence of unionization, a highly polarizing opinion. It's well documented that adjuncts run most of these colleges, and ours is no different -- we have more adjunct positions than full-time faculty. It's a Haves vs. Have-nots situation that unionization is supposed to solve, but appears to have done amazingly well at perpetuating. If colleges had a framework to capture a portion of the productivity gains, perhaps they'd be better equipped to make such investments. But I can think of no surer tactic for fanning strike flames than to suggest raising the number of students per faculty member.
 
I've been wondering for years why this isn't done more in my field. The thing is, I've been wondering why it isn't done more regarding one very, very specific aspect of the field - learning kanji (the characters you need to read). For Japanese, you need to know about 2,000 kanji, most of which have multiple readings and combine with other characters to make many new words. Yet the standard teaching method is to just sort of meander from class to class and school to school over the years, picking them up at random while actually focusing on grammar. Seriously. After the first couple hundred, you learn the kanji as you see them in readings, and often you only see a character every so often in wildly different readings. And that's not even getting into obsolete kanji, which aren't used anymore but sometimes show up in older works.

Which leads to my idea: why not offer a less-than-full-load course that would be mostly self-study via an online study system a la smart.fm, culminating in a proctored exam checking one's knowledge of a predetermined set of kanji and vocabulary? The government keeps an official list of what students should know, so this isn't difficult. You could leave the study system online year-round, and only grant credit for the course upon completion of the exam. Sure, you'd pay salaries to build the system, host it, troubleshoot it, and proctor the tests, but it would also be drastically more accessible and (presumably) more used than higher-level courses, while giving some encouragement to the many students who would otherwise drop Japanese because of the kanji.
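For what it's worth, the scheduling side of a smart.fm-style study system can be quite simple. A minimal sketch, assuming a Leitner-style review queue and an invented 90%-mastery bar for unlocking the proctored credit exam (none of these details come from the comment itself):

```python
# Sketch of a Leitner-box review scheduler; intervals and thresholds are
# invented for illustration.
import datetime

REVIEW_INTERVALS = {1: 1, 2: 3, 3: 7, 4: 21, 5: 60}  # box -> days until next review

class Card:
    def __init__(self, kanji, reading, meaning):
        self.kanji, self.reading, self.meaning = kanji, reading, meaning
        self.box = 1
        self.due = datetime.date.today()

    def review(self, correct, today=None):
        """Promote on a correct answer, demote to box 1 on a miss,
        and schedule the next review accordingly."""
        today = today or datetime.date.today()
        self.box = min(self.box + 1, 5) if correct else 1
        self.due = today + datetime.timedelta(days=REVIEW_INTERVALS[self.box])

def due_today(deck, today=None):
    today = today or datetime.date.today()
    return [c for c in deck if c.due <= today]

def ready_for_proctored_exam(deck, mastery_box=4, threshold=0.9):
    """Hypothetical gate: the credit exam unlocks once 90% of the official
    list has reached a high box."""
    mastered = sum(1 for c in deck if c.box >= mastery_box)
    return mastered / len(deck) >= threshold

deck = [Card("日", "nichi", "day/sun"), Card("月", "getsu", "month/moon")]
for card in due_today(deck):
    card.review(correct=True)
print(ready_for_proctored_exam(deck))  # False: nothing has reached a high box yet
```

The real costs would be the content (the official kanji list, readings, example vocabulary) and the proctoring, not the software.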

That seems to me a way of getting the benefits everyone is hoping for, without some of the problems that are being mentioned. Certainly that approach wouldn't work for all classes, or even necessarily all disciplines, but there are some areas where technology could help solve some problems that aren't really being addressed right now. Chinese, for example, has the same problem. I imagine a similar set up could be applied to vocabulary in all high-level language courses - you memorize a given set of however-many higher-level words, you get a couple credits. I would think that that application would be particularly useful for community colleges, as I imagine some students come in with several years of a language already under their belts, but professors have to focus on low-level language courses.
 
I think that the cited article is conflating the back end of technology (i.e., those parts that are used to run the business) with the technology offered as a service to "customers."

When a college installs back end equipment (database servers, interdepartmental e-mail, room-scheduling software, electronic payroll management systems, and the like), it is acting just like a business in that these technology systems *are* designed to make things cheaper.

But when a college offers things like workstations in the library, wifi in the dorms, free laptops, etc., it is *also* acting like a business, albeit one that uses technology to attract more customers. Think of apartments with free cable and wifi, car stereos with iPod adapters, Starbucks with free internet access, computers with multiple USB slots, etc.

These kinds of customer offerings do not and are not designed to make things cheaper for the company; they are designed to make the product more attractive to its customers. Colleges are the same - neither a college library nor Starbucks offers free wifi because it's going to make things cheaper than not offering it; they offer it to improve the customer experience and possibly get repeat customers.

Which is not to say that every technology adopted to improve the customer experience actually does so - but this is true of colleges as much as it is of companies; neither is a perfect decision-maker.

But my main point is that to even discuss this issue meaningfully, you have to be talking about the *same kind* of technology.
 
microfiche was like really, really small stone tablets.
 