Tuesday, August 26, 2008
Dittos and Ditching Darkrooms for Digital
Ditto machines were basically rollers with a hollow drum that would be filled with a mildly hallucinogenic purple liquid. Freshly run dittos had a distinctive smell, and students could commonly be seen sniffing new dittos intensely. (The movie Fast Times at Ridgemont High has a scene of an entire class sniffing dittos.) Primo ditto had a way of brightening an otherwise dreary day. My pet theory on why illegal inhalant use rose so much in the 80's and 90's is that students lost access to fresh dittos. In the 70's, we got it from our teachers. It was a different time.
I thought of dittos again as I had a conversation with some of the folks in the art area about photography. As the whole world knows, photography has migrated fairly quickly from film to digital media. I literally don't remember the last time I bought film, but it has been several years at least.
Film is a delicate medium, requiring lots of care and feeding. A photography program routinely required well-ventilated darkrooms, enlargers, and all manner of distinctive chemicals with complex disposal protocols. (I still remember the pungent aroma of stop bath.) Processing black-and-white film took some doing, but processing color film was really not for the faint-of-heart. Some programs did their own black-and-white processing in-house, but outsourced the color due to the sheer expense and difficulty.
Digital photography requires an entirely different infrastructure. You don’t need darkrooms or stop bath, but you do need rooms full of very high-powered computers. (In my observation, the cognoscenti typically use Macs.) In place of enlargers and stop bath, you have very high-end printers and Photoshop. Although it's still usually found in Art departments, the equipment required looks more like what you'd find in a media or computer animation program.
And yet, whenever someone has the gall to mention that maybe it's time to admit that the 90's are over and it's time to ditch the darkrooms and get on with it, we get the "we need both" arguments.
To hear the photography profs tell it, letting students start with the current technology would be to prevent them from understanding the medium. It's as if the computer science department insisted on Altairs and Apple Lisas alongside their current offerings, or colleges had to run typewriter pools parallel to their computer labs. Skip dittos, and you won't appreciate, really appreciate, photocopying.
Color me skeptical.
Yes, it's fun to take trips down memory lane. And yes, it's probably hard to admit that a technology you spent so long mastering has gone by the wayside. (Apparently, much of Kodak's downward spiral was due to little more than denial.) But money and space dedicated to denying the passage of time are money and space not spent on something else. Keeping the darkrooms open and running is a real cost, both in terms of the operating budget and the opportunity cost. Those rooms and funds could have been used for something else.
This is when I really wish that we had a stronger system for tying curricular decisions to budgetary decisions. Should we keep the darkrooms alongside the digital, or should we use those resources to increase the number of students we could take in the allied health programs? Should we continue to keep a dying technology on life support, or should we use that money to expand our information security program? If you don't make the opportunity cost concrete, it's all too easy to make decisions based on conflict aversion, nostalgia, and personalities. When the possible futures are almost as concrete as the living past, it's easier to get clarity.
Technological progress has its uncomfortable moments, but there's something to be said for facing up to reality. I remember dittos well enough to know that photocopies are just plain better, even if they don't smell like lilacs in the springtime.
The relationship between a photographer and their reprographic technologies is not the same as a teacher and theirs. The teacher has a primarily functionalist account: just reproduce the image, in the best way possible. In art, the process of reproduction is a way of reflexively engaging the history of the discipline. What you're describing as a procedural issue is actually more like a methodological issue. It's a bit like "we don't need to teach English anymore now that we teach media studies and journalism, since that's the context most people write in". Most artists lack the interdisciplinary background to make these kinds of analogies, but I believe they are nevertheless true.
Having lived through some art school restructures, I think the real challenge is to set up a viable digital imaging workflow that can work across many different departments. Students in media, CompSci and art departments all need to produce digital images to prints, and *if* (big if) you have someone who can understand the vastly higher technical-aesthetic needs of the art department, you could potentially run the same infrastructure across the board. But if you try to put the art dept into a situation where their work ends up looking crappy in the name of administrative efficiency, you sell the discipline out completely.
As for the darkrooms - let's go back to your original analogy. If they're the equivalent of dittos (we had the branded term Gestetners in Australia), no one will want to keep using them when you have a better system in place. My observation (as someone working in an art school with a lot of photographers) is that the darkroom is still fundamental to understanding what photography is. I haven't used a film camera in five years either, but I am not trying to work out anything about the history of photography in my snaps, whereas all your students are. I think you've gotta let them roll for a while yet (even at a smaller scale).
A better analogy for this probably would be how when Monet paints water lilies it's as much about the technique - the brush strokes, the canvas chosen, etc. - as it is about the mimetic representation of water lilies.
What troubles me most about your post though, DD, is the fact that in your perfect world, where budget drives curriculum, things like art photography or anything else that doesn't have immediate use value commercially would be thrown out in order to serve things like "allied health programs" or "information security." Or even things like "graphic design" instead of painting or, as db notes, "journalism" instead of "English." I'm not saying that programs that are driven by vocational training are a bad thing AT ALL - they're necessary. But if we don't fight for those non-essentials, we're basically saying that the whole point of education is getting a piece of paper that will get a student a job. I believe education is more than that and should be more than that - even at a CC or a low-tier 4-year - and if it is, we may need to support things that seem like a waste of money when they are only evaluated in terms of the bottom line.
(This is not to say that the bottom line can or should be disregarded: it's just to say that when it becomes the only factor in determining curriculum, a whole lot gets lost.)
Where in my post - or in any post I've written in the last four years - have I ever claimed that in my perfect world, commercial value would be the criterion? Hint: I haven't. I don't believe that, and I'd think after all this time you'd know better than to trot out such a tired cheap shot.
First of all, the budgets to which I'm referring are internal. It's simply not the case that the most marketable majors externally are moneymakers internally; in fact, it's typically the opposite. We make money on philosophy and lose money on nursing. A more thorough accounting of costs, such as the one I advocate repeatedly, would be a boon for the liberal arts.
Secondly, your disparagement of allied health and information security (computer security, and more) is puzzling. If you think either of those is easy, I invite you to chat with the people in those fields.
It's exactly this kind of easy, ignorant posturing that proves my point about the need to introduce actual numbers to the discussion.
Film also has a much greater dynamic range than even the best digital sensors, for the moment.
As digital technology improves, costs will continue to drop and some of the technical limitations of sensors will be overcome. But even leaving aside db's comments about methodology, a pure cost-benefit analysis might not be as simple as it seems, at least not if your photography students are planning to do landscape and architectural photography, where fine detail and camera-based control of convergence is still very important. A Hasselblad 6x6 digital back costs about $10,000, and digital backs for view cameras can go upwards of $15,000.
To say the darkroom should be abandoned completely for digital printers, etc. is tantamount to arguing that art departments must eliminate drawing as a preliminary step to painting, or figurative painting to abstract painting. Or for that matter, perhaps to ditching painting altogether because graphic design software exists.
You may have a good argument to make, but I don't think photography is the right metaphor (nor would books be for literature folks, or DVDs for film studies people).
"This is when I really wish that we had a stronger system for tying curricular decisions to budgetary decisions. Should we keep the darkrooms alongside the digital, or should we use those resources to increase the number of students we could take in the allied health programs? Should we continue to keep a dying technology on life support, or should we use that money to expand our information security program?" As I read this section of your post, and I may have misread it, it appeared that you were saying that in order to fund applied programs that it made sense to do away with what you saw as inessential expenses in non-applied programs, which faculty - experts in their fields, presumably - have resisted. You believe that this is because they're nostalgic for days gone by. I pointed out this part of your post because I think that this is a broader issue that gets at the heart of what we think education is - not merely one that's about getting on with the new or about balancing budgets. I apologize if my comment came off as overly harsh. I certainly didn't intend it that way.
And, if you look at my comment, I didn't disparage allied health professions and information security. I didn't say that they were easy, or anything like that. In fact, I said that they are necessary.
And I said that we do need to take the bottom line into account when we make decisions about what to fund. I just think it makes sense to take professors' advice about what is necessary to teach in their field into account, too.
Again, I'm sorry if I misread your post, and I'm sorry that you perceived my comment as a cheap shot. It wasn't intended as such at all.
I'm not even a photographer, but I know that the development process shapes the final image as much as the shot itself - light and shade, depth and intensity - and you can do things in the bath, like putting your hand or a filter over part of the light forming the image, and get quite different results.
For photojournalism - I think that's going digital.
I went to an exhibition by 16-17 year old art students where they used photo paper, and objects on top of it to create images in the dark room without a camera. Great stuff.
One thing I request of Dean Dad (and I normally 98% agree wholeheartedly with you and 110% admire your prose style) - it would be great if you tagged your posts.
Some of those days it would be nice to click on a tag (rubber chicken, Scandinavian-ness, TB and TG wisdom, budgets, CC-media-invisibility-field)
To be honest, a bit of TB and TG collected stuff would be just the ticket on the odd occasion!
OK, darkrooms: the right answer probably depends on what sort of courses the department is teaching. If there is a strong "fine arts" component to the photography curriculum, then yes, you do genuinely need both. I'm not a photographer myself, but I have worked with several, and film is not dead in that realm.
From a budgetary standpoint, the space/resources devoted to traditional film developing could probably be reduced from what would have been needed in the 1970's, when every photography student had to be in there for every class. If the pressure is extreme, and the darkrooms must be chucked entirely, then re-brand the curriculum as digital photography intensive. It is a significant change.
You could make four-color (blue, black, red, and green) dittos if you were skilled (ahem) and creative enough to do it. Try that on your copier!
The reality in photography is that you can't afford to buy your students an 80-megapixel camera, but you can afford to develop a 35 mm b/w negative or color slide and scan it to that quality level on a $300 commercial scanner. You then do the cropping and "darkroom" work in Photoshop. I should know. I have a picture in print that was done just that way.
I also know that at least one famous b/w photographer shoots on 4x5 and scans it to a size (200 megapixels) that no digital camera can match, then prints it digitally at a quality such that few will notice the difference. It is fun talking to a person who never took a computer class, but rather moved directly from using a one-foot-diameter "dodge" tool in his very large darkroom to using one in Photoshop on (of course) his Mac.
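Those megapixel figures are easy to sanity-check. Here's a quick back-of-the-envelope sketch, assuming nominal frame dimensions (a 35 mm frame is roughly 1.0 x 1.5 inches; a 4x5 sheet is 4 x 5 inches) and illustrative scanner resolutions:

```python
def scan_megapixels(width_in, height_in, dpi):
    """Pixel count (in millions) from scanning a film frame of the
    given dimensions (inches) at the given resolution (dots per inch)."""
    return (width_in * dpi) * (height_in * dpi) / 1e6

# 35 mm frame at 4800 dpi (typical for a consumer film scanner)
print(scan_megapixels(1.0, 1.5, 4800))   # -> 34.56 megapixels

# 4x5 sheet film at 3200 dpi
print(scan_megapixels(4, 5, 3200))       # -> 204.8 megapixels
```

The 4x5 case lands right at the 200-megapixel figure mentioned above, which is why large-format film plus a scanner still out-resolves any digital camera of the era.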
The biggest problem with darkrooms comes from the chemicals. I am told that our photo operation gets written up more than our chemistry labs for hazmat issues. This is more about training than cost.
PS on the management side:
Our college puts all the high-end machines (Macs) in the graphics/animation lab and the photo folks use them there. One person supports all of them in one place plus the ones in faculty offices.
I am married to a woman of similar vintage who is a graphic designer. Graphic design went through a similar change as photography. One cannot make a living today as a designer without being proficient in a variety of design programs. (This is not entirely true of photographers.) But being proficient in design programs like Photoshop, InDesign and QuarkXPress is not the same as being a designer. I have heard a lot over the years from my wife that she can tell the difference between designers who were trained in computer-only programs and those who still start the old-fashioned way.
I think this is a good analogy to photography. Your profs are probably right that they need both. Students need exposure to both old fashioned photography and new methods. This is where photography is. While their tone may be like every faculty member who fights change, don't hold it against them, they are right.
A disclaimer - disclaiming I don't know what - I was faculty in digital media art until the school holding my latest 1-year appointment decided to eliminate my position altogether (budget).
As for photography, to expand on other comments, in a fine arts context, in a very McLuhan-esque fashion, the process really is a big part of the content (i.e. is the "message"). This is true because it determines the appearance, as well as for more explicit, conceptual and intertextual assertions. For that reason, accomplished, mainstream artists still make daguerreotypes, pinhole cameras, and other seemingly anachronistic media.
Those things, by the way, obviated the primary uses of painting in the late 19th century. Then, to add to that, as far as "employable" applications of painting go (illustration, etc.), nearly all of that is now also produced digitally.
However, most colleges still offer painting courses. Some, though, can't spend the money on a ventilation system that will permit using oil paint. In those cases, students are required to use a 20th century technology called "acrylic" that cannot be manipulated in the same manner as oil. So, how's that for an analogy?
So, if you run a big art school, certainly you'll want to offer all of the above. However, especially in our less fortunate CCs, and even in a large state U I taught at, the discussion about axing the darkroom pops up over and over again. And, from what I've seen, unless funding for the arts is grand (a longshot for sure...) darkrooms at CCs are dropping like flies.
My thought is to ask the department whether or not they can fill sections of both traditional and digital photo. If so, great. If not... digital photo certainly better prepares students for current standards in the field. I think it's reasonable to accept the probable future of darkrooms as the exclusive purview of upper level university courses and extremely lucky CCs.
A few thoughts of my own:
1) Is the photography program at Dean Dad's school primarily artistic or primarily vocational? Even I (a non-photographer) can come up with examples where film is better than digital, but that's just not the point, folks. (And isn't Dean Dad's school primarily a transfer school? So that students could, I don't know, transfer to another school and learn print techniques there?)
2) I have rarely seen a well-run computer lab, and my guess is that the cost of purchasing and maintaining them is vastly underestimated (or that money gets yanked out of that pot). I don't know how much a print photo lab costs (chemicals, infrastructure, personnel), but I know twenty computers ain't cheap unless you buy crappy ones and let them rot.
Frankly, I think it's right to have these discussions, but cost/benefit analysis does seem to be anathema to some.
If you talk to them and they say they don't need students to have lots of darkroom experience, you can't in good conscience keep it in the curriculum. And if darkroom space is vast and computer lab space is not, I would say there's room for a one to one trade. Make the art department decide what to keep and what to add. This is the place where your ability to approve curriculum becomes real power.
At the end of the day, the important thing is that students get what they need out of your programs. So ask the 4-years, ask industry, and ask newer faculty (or those less entrenched in “but we always did it this way before!”) what do students need to know now? How can we teach them what they need to know?
We've altered and cut all kinds of things to make room for new material in biology. We've sacrificed lab time so that students can have more upper division lectures. Would they be better off with more labs? Yes. Would they be in school for extra years? Yes. Do we have the faculty to staff those labs or the right to make students stay in school until they learn *everything* we think they need to know or the money to pay for supplies? No! So we design a curriculum that gives them a good foundation and allow it to evolve as science does.
At the end of the day, budgets, schedules, space, and staffing being what they are, we have to be strategic in what we do. I see this post as a call for strategic thinking, not a referendum on darkrooms. We need not discard the photo with the stop solution but we do need to be realistic. When new things are added to the curriculum, other older or less relevant things must be dropped. If you want a new computer lab, you might have to give up one of your darkrooms.
That said, red flags go up when an administrator starts telling a department that their curriculum is inappropriate. It suggests that the administrator thinks he/she knows more about the field than the folks with terminal degrees in that field. (If an administrator were to tell our music department, for example, that acoustic instruments were outdated and that we should only teach music technology, he/she would lose all respect from our faculty. Sure, we're willing to embrace new technology, but not to throw out tried-and-true techniques entirely when they are still working well for our students...and when our students are still getting good jobs with the skills we're teaching them.)
There comes a point at which you have to trust that your faculty are well trained, that they know what they are doing with their curriculum and what their students will need, and that they are making good decisions with limited funds. (If you don't have faith that they are competent or that their students are leaving the program well-prepared, that's another matter entirely.)
We do need more information to evaluate the metaphor properly. It seems that DD's CC has an art department ("folks in the art area"), which suggests "fine arts" to me. What do the painters say? The potters? The glass artists? Do the photographers on staff work in film or digital media themselves?
The art profs I know are very conscious and careful about their curricula. It seems to me that if they say you need both, you need both. If you cut the darkrooms over their protests, make sure you drop their requirement to do an institutional effectiveness document, because they won't be able to pull it off with a straight face. (But then who of us really can?)
I'd cut photography altogether rather than cutting print (or digital, for that matter). Or ditch the darkrooms, split photography off from art, and stick it in "digital media." Make sure students know they won't be able to produce the images they'll look at in their history of photography courses, too, though maybe those aren't really that important either. I took courses like that myself, but that was, I confess, back in the 1990s.
Here's a key test: is the new technology better than the old one? For dittos and photocopies, the answer is yes. For photography, the answer is: not necessarily.
Terminal Degree's call to "trust the faculty ... that they know what they are doing" could be applied to administrators too. The challenge DD faces is how to support a program that has sharply increased its costs in an environment where there is no way to increase the revenues (and certainly no desire to pursue other cost cutting measures, like moving even further to use of contingent faculty).
In my experience, an administrator who is struggling with an unaffordable program often has to start asking sometimes-poorly-informed questions like "can't we just close all the darkrooms" because the faculty can't be convinced to focus on the really critical issues until they see that someone else is going to make the hard decisions if they won't. Better to see this type of question as an invitation to join the discussion about resources than as a call to attack the messenger: a department that simply digs into an impossible position risks being axed entirely if that becomes the only acceptable choice.
--Anon R1 dean
The problem with "trust the faculty" is that the photography faculty don't see the opportunity cost of what they're advocating. With the costs of the program having doubled and no new revenue for it, we have to adjunct-out the liberal arts even more than before to pay for it. To my mind, that's an awfully high cost for 'trusting the faculty.'
a couple years ago one of my housemates was an instructor at a Brooks Photography campus. All his students had to take a film/darkroom series of classes that ran along the whole first year, and still he would come home ranting and raving that the students didn't really know how their camera worked or why the digital manipulations they were trying showed their ignorance of how _film_ and the chemicals worked. And he taught photojournalism. So if your students wanted to transfer into that school, or be hired by photogs at the local paper who agree with my housemate, then they better have both darkroom and digital experience.
As a graphic design professor, I can say there is an educational rationale for teaching students traditional materials. In fact, traditional hands-on production technique is severely lacking in students, in favor of a computer-solves-all-problems mentality. Traditional photography is a fundamental - plain and simple.
Digital offers great speed at some loss in quality, but the price is not less than the price of film!
I would agree that there is little need for much more than a single darkroom equipped for b/w print work, but can you afford to get rid of it completely? I gather from a comment by one of our art faculty that darkroom work was a key part of the portfolio of one student who transferred to SCAD. You should know the expectations of the major private design schools that your students would want to attend.
The analogy of film photo to painting doesn't work. Film photo to hand-mixed, heavy-metal-based paint? Maybe. But none of the analogies people are posting seem right to me. Digital and film are different technologies, not different media, and they are not progressive (i.e., you don't need to know how to use a darkroom to use Photoshop). The knowledge for using either a film or digital camera on manual is incredibly similar. The darkroom question is only about the production of the image; it is not about the taking of the picture.
My dad built the photography program at the community college where he taught. He was an amazing photographer and taught a great many people to be professional photographers. I'm bragging, but he really made the photo program the best in the region, including the 4-year schools.
The program is completely digital—no darkroom, and it’s an excellent program. When he switched to digital in his commercial work, he was thrilled because he hated the chemicals in the photo lab. He said he would never go back to print, and didn't (and his specialties were architecture and photojournalism). When the CC, at his behest, switched to digital, it was because darkroom knowledge is no longer necessary for commercial/professional photography. Most professional photographers use digital, and anyone wanting to go into a photography-based field needs to be skilled in digital technologies. Demand for darkroom use in the profession is rare. He insisted that students know how their cameras work, but there is no darkroom at his school because students don't need that knowledge to be good, competitive professional photographers.
Darkrooms have become non-essential, not because the skill-set is undervalued as “art” without a commercial outlet, but because the skills are no longer necessary for producing the desired art.
While I'm sure there is lots of value in a photographer mastering darkroom production in terms of range of personal knowledge, etc., to us it is about the quality of the final image.
And we use digital files exclusively.
In my neck of the woods photographers, professional ones, have dismantled their darkrooms not just because clients (like my employer) demand digital, but because they can no longer source darkroom supplies at a reasonable cost. I think that tells you a lot about the industry and where it is going.
I think that a case can be made for keeping a conventional darkroom active if enough courses requiring one are offered--not to mention enough students taking them. Easier to justify as part of a large arts program than as an adjunct to photojournalism.
That said, if resources are scant, maintaining that darkroom (and, for that matter, other survivals of old tech) is hard to do--or at least to justify. Is your CC in the business of teaching art photography? If so, then the darkroom is still important, because a significant number of artists use those wet processes. If not, then it's likely that the attachment is more emotional than practical...hence eligible for a good hard look.
FWIW, my "day job" is in a state government department that produces sometimes hundreds of publicity and documentary photos per week. The last time we used conventional film was five years ago, when one of our officeholders discovered that his complexion was hard to reproduce charitably in digital...print film was more forgiving. Once our photographers figured out the digital "filter pack" that allowed him to look good, they hung up the conventional Nikons and invested in better Macs. They--and I--still play with film from time to time, but mainly, I think, to reassure ourselves that we still have the moves we learned--or taught--thirty or forty years ago.
I agree with posters above that if you're teaching photojournalism then an all digital education is certainly viable, unless you have a particularly research-driven program (e.g. into the nature of photography in journalism). But if it's teaching standard publication documentary, then no problem, subject to the caveats above that digital always looks cheaper in advance, if you underestimate licensing costs (which usually happens, especially when platforms come and go).
Everything you consider with the camera--composition, exposure, lighting, depth of field, shutter speed, focal length, perspective--is exactly the same with digital as it is with film. Except you can change ISO speed with every exposure when doing digital.
Development of film is straightforward following of a recipe, for the most part.
If you shot slides, then that's about it, which is analogous to looking at a digital shot on a monitor.
Then there's the whole separate issue of printing, the other half of the skill and artistic decisions in traditional photography. But all told, this was a bunch of chemical recipes to adjust the contrast and brightness of the image--side by side with Photoshop, or even a more basic (and cheaper) photo program, the traditional methods look like kludges. The simple "Levels" command in any photo editing program gives the photographer more control over the image than could be had with variable contrast printing.
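For readers who haven't used it: a Levels adjustment is just a remapping of tonal values. A minimal sketch of the idea in Python with NumPy (the function name and defaults here are illustrative, not any particular editor's implementation):

```python
import numpy as np

def levels(img, black=0.0, white=1.0, gamma=1.0):
    """Remap tones the way a Levels dialog does: the input range
    [black, white] is stretched to [0, 1], then a gamma curve
    adjusts the midtones. `img` holds values in [0, 1]."""
    out = (img - black) / (white - black)   # stretch the chosen input range
    out = np.clip(out, 0.0, 1.0)            # clip anything pushed out of range
    return out ** (1.0 / gamma)             # gamma < 1 darkens, > 1 lightens midtones

# A flat, low-contrast gradient (values 0.3 to 0.7) regains full contrast
flat = np.linspace(0.3, 0.7, 5)
print(levels(flat, black=0.3, white=0.7))   # -> [0.  0.25 0.5 0.75 1. ]
```

Variable-contrast printing could approximate the stretch, but only globally and only in fixed filter grades; here the black point, white point, and midtone curve are each continuously adjustable, which is the extra control the comment is describing.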
The number of things film does better shrinks with each generation of digital cameras, and by now I think the degree of artistic control over the final image is far greater with digital than with film.