Why I shoot film by John Kossik

John Kossik shares with us his thoughts on shooting film and gives us a science lesson in the process.
When I was in 1st grade back in 1966 I would walk the 10 blocks from my home in Trenton, Michigan to St. Timothy's Elementary School with a little plastic old-school collapsible briefcase in hand. The only item I can ever remember being in this briefcase was a red workbook entitled "Think and Do." It was some kind of early reader that all kids my age used, "See Dick and Jane run." I remember little of the specifics, but the title has stuck with me for some 48 years.

You see, this seemingly innocuous title is quite profound, being based on the "Scientific Method." The Scientific Method, which originated with Ibn al-Haytham over 1,000 years ago, has since formed the basis of our understanding of the universe.

In its base form, the Scientific Method tells us to: (1) Formulate a Question to be Answered; (2) Create a Hypothesis that answers the Question; (3) Make a Prediction based on this Hypothesis; (4) Conduct an Experiment; (5) Analyze your Results to see if your Prediction is correct.

If your Results match your Prediction, then you are done and you have an understanding of how the Universe works, an understanding that you can use to predict the future. If your Results do not match your Prediction, then go back to (1) and try again.

Formulate, Hypothesize, Predict (THINK)

Experiment, Analyze (DO)

THINK and DO. Note that the order is important here, first we THINK then we DO, and this is where our so-called “digital revolution” can let us down.

Take “data-mining” for example. Companies like Google, Microsoft, the NSA, etc. collect terabytes and terabytes of data just for the sake of collecting it and “hoping” that they can “find” some trends or patterns or human tendencies buried in it to use to make money or control people. Unfortunately they do not even know in most cases what they are looking to find. They are DOING before they are THINKING!

Well, you say, this is not inherently bad, but it is, as it is teaching the younger generation the direct opposite of the scientific method, and as such we are losing analytical skills that we have carefully cultivated for millennia. Case in point: check out the Wikipedia entry on the Monte Carlo method (http://en.wikipedia.org/wiki/Monte_Carlo_method). There you will see this complex computational tool used to approximate the value of pi, which requires thousands of random inputs. On the other hand you could just take your bicycle wheel, mark a point on it, roll it until you reach that point again, measure the distance traveled, and divide it by the diameter of the wheel!
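For concreteness, here is a minimal sketch (in Python; the function name and sample count are my own, not from the Wikipedia entry) of the Monte Carlo approach being contrasted with the bicycle wheel: scatter random points in a unit square and count the fraction that lands inside the quarter circle.

```python
import random

def monte_carlo_pi(samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling random points in the unit square
    and counting how many fall inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # (points inside) / (total points) approximates pi/4,
    # the ratio of the quarter circle's area to the square's
    return 4.0 * inside / samples

print(monte_carlo_pi(100_000))  # close to 3.14159, but only after 100,000 draws
```

Note that even 100,000 samples typically gets only two or three correct digits, while the wheel measurement, circumference divided by diameter, gives pi directly from its definition.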

This ludicrous example is noted here to illustrate the dangers of humans being too dependent on electronic devices and methods of which they have no understanding. How many young people even know how to use a compass? Better yet, if you are in the northern hemisphere, how many can tell you North by simply looking for the Big Dipper and the North Star? I was recently purchasing a cell phone and specifically wanted a model with a piezoelectric chip sensor in it so I could use it to determine the barometric pressure. I went to three stores, and none of the people at any of them even knew what I was talking about when I mentioned barometric pressure, let alone the piezoelectric effect. When the general public does not understand the workings of devices that they depend on every day, then they become (unknowingly) slaves of the small minority that controls them. A democracy itself is in peril under these conditions.

What does this have to do with photography and the discussion of digital vs film? Everything! This can most easily be seen in the phenomenon called "Chimping," clicking a digital picture and immediately checking it. In doing so we are DOING then THINKING. Aside from the danger of walking in front of a bus while "chimping," there is the fact that no THINKING is needed prior to taking the picture. Without this prior THINKING we are much less likely to formulate a viable hypothesis/prediction sequence than if we followed the proper steps of the scientific method. To be fair, there are many who use digital cameras in the THINK and DO mode. First they set their exposure settings (not using AUTO) and frame their subject (THINK), then take the picture and review the results for exposure and content (DO). As a matter of fact, all good photographers do this, and this is the same method used in film photography, the only difference being that the time between THINKING and DOING can be a few days, or at least a few hours, due to developing time.

The problem is that most people using digital cameras do not do this, they simply DO then THINK (or just DO, DO, DO, DO, DO). Alas the scientific method is lost on them and they learn nothing from their efforts.

Will digital photography cause the downfall of the human race? Not quite, but it does serve us well as the "canary in the coal mine" of a path we seem to be traveling down that leads to a dead end.

Shooting film FORCES you to THINK then DO, and thus FORCES you to use the scientific method whether you know you are doing so or not. That’s why I shoot film.

12 Responses to Why I shoot film by John Kossik

It’s well on the way to a downfall.
Till a few years ago I was running a research lab with its share of graduate students who really thought this way. They kept hitting themselves on their heads with a hammer, because it felt so good when they stopped. Throwing away resources until "something happens" is just so much easier than asking for guidance or thinking about it. They all have their PhDs now.
Speaking of film, one bright spark asked me if my Polaroid camera could scan as well as print. She now has her PhD too.

I agree. Encouraged by my father, as a child I used to walk around with the thumb and index finger of each hand held in an L shape, and the two hands together to form a viewfinder of sorts, which I would hold up to my eye. I learned a lot about framing, and saved my dad some money on film developing, no doubt. I also probably looked odd to some! Eventually, I moved up to film and fixed-lens cameras, which again forced me into thinking and re-positioning before shooting.

The data mining example is one I use when I try to teach thinking before doing, as I believe virtually free and unlimited computing power is being used today in what is really a ‘lazy person’s’ approach to brute-forcing insight out of data. It’s all about big data, instead of the right data. This sells expensive software to clueless execs, who try to ‘keep up with the Joneses’ out of fear of looking out of step. Sadly, reasoning skills are not being taught in schools or applied in the workplace as they should, because of the abundance of ‘tools’. A further danger is that, in not understanding a tool fully — and who knows how a search algorithm works, really — one is doomed to simply taking the results on faith. This creates an unhealthy dependence on tool providers, and puts them in a permanent ‘guru’ position. I now think that if we ever suffer an extended power blackout, our ability to think and solve problems will come to a screeching halt and the collective IQ will drop into single digits. Back to the Stone age at the flick of a switch, as it were.

Back to photography, on the positive side, one could argue that digital offers ‘instant feedback.’ How many of us, however, are using that great opportunity to actually learn to re-frame, or change the lighting, etc., and how many just discard and shoot again with no great thought given to the task at hand? As mentioned, a similar point can be made about prime lenses vs. zooms to learn positioning relative to the subject.

Warren Buffett argues a similar idea in his investment approach, which he views as a dance card with a finite number of 'tickets', each one representing an investment that you must carefully vet before you dive in. The more common alternative, sadly, is diversifying ('diworseifying', according to Peter Lynch) into mutual funds, each holding hundreds of companies you don't know or understand, guaranteeing subpar or average performance at best, when what is needed to really make money is a focused bet on an outlier.

I have simplified because of needed brevity, but the idea is clear. You have to do your homework and put in the time to gain understanding about anything. And necessity, not abundance, is the mother of invention and progress. If you had only one roll of film, with one shot left, what image would you try to make?

Hallmarks of a curmudgeon’s diatribe: irrelevant, nostalgic tale about one’s glory days, espousing the one true, correct way of doing things as if one were Moses on the mountain, complaining about how the youth of today are learning all the wrong things…

If you think anyone who doesn’t already share your viewpoint is listening to your rant, then you’re delusional.

Also, a metaphor is a tool of poetry, not of logic. Just because you find a similarity between digital photography and data mining or the Monte Carlo method, doesn’t mean any actual correlation exists.

Hell, I shoot film for exactly the same reasons as you’re describing. But that’s my personal choice. Each to their own. Get off your soap box.

Looks like I sparked some debate. Pro or con, any debate is a good thing. I hope I did not come off as a Luddite, as I am far from it. I have used, and continue to use, many high-tech tools like rigorous computer simulations in my career as a chemical engineer, something I have practiced for over 30 years. Hell, I even write my own Android applications (http://www.63alfred.com/63alfredapps/).

As with everyone, though, my outlook on the world is primarily formed by my interactions with it, and those interactions show far too much dependence on complicated mathematics that can only be computed digitally. Prior to the extensive computing power we have today, the calculations (mostly empirical, based on actual field observations) were necessarily very conservative. Though wasteful of resources (structural beams oversized, etc.), they provided a safety factor that almost always paid for itself during the life of the system. Today there is a tendency to design systems with little safety factor in them, as our simulation programs allow us to trim this away. Of course this is only as valid as the equations, algorithms, and programming of the simulation tool you are using. I fear that as time goes on engineers will lose their "gut-feel" for the systems they are designing and their ability to conduct a "back-of-the-envelope" calculation to ensure that the computer is reflecting reality and nature.
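The safety-factor point can be illustrated with a toy back-of-the-envelope calculation. All the numbers below are hypothetical, chosen only to show how a trimmed margin behaves when reality exceeds the design assumption:

```python
# Hypothetical illustration: capacity = design load x safety factor.
design_load_kn = 120.0    # assumed maximum load from the design spec
sf_conservative = 2.0     # old empirical, conservative practice
sf_simulated = 1.2        # margin trimmed by trusting the simulation

capacity_conservative = design_load_kn * sf_conservative  # 240 kN
capacity_simulated = design_load_kn * sf_simulated        # 144 kN

# Suppose the real peak load runs 50% over the design assumption.
actual_peak_kn = design_load_kn * 1.5                     # 180 kN

print(capacity_conservative >= actual_peak_kn)  # True:  margin absorbs it
print(capacity_simulated >= actual_peak_kn)     # False: thin margin fails
```

The conservative design "wastes" steel, but it also absorbs the error in the load assumption; the trimmed design is only as good as the assumption itself.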

Nature is analog, we are analog; digital is only an approximation of nature, and we have only been able to use it as such since we developed the computing power to make our data points small enough and numerous enough to generate a smooth curve. Digital is here to stay, and it should stay, as it is a very helpful tool. But becoming dependent on a tool of which you do not have an understanding, whether it be a digital camera, the water heater in your house, or your car, puts you at the mercy of those that do.
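The "data points small enough and numerous enough" claim can be made concrete with a toy experiment (the function and sample counts are my own illustration): sample a sine wave coarsely and densely, and measure the worst-case gap between the true analog curve and the nearest sample.

```python
import math

def max_sampling_error(samples_per_cycle: int) -> float:
    """Worst-case gap between sin(t) and its nearest sample over one
    cycle, using simple nearest-sample reconstruction."""
    step = 2 * math.pi / samples_per_cycle
    worst = 0.0
    # Probe many points between the samples.
    for k in range(10_000):
        t = 2 * math.pi * k / 10_000
        nearest = round(t / step) * step  # closest sample instant
        worst = max(worst, abs(math.sin(t) - math.sin(nearest)))
    return worst

print(max_sampling_error(8))     # coarse sampling: large error
print(max_sampling_error(1024))  # dense sampling: error shrinks toward zero
```

As the sample count grows, the staircase approximation converges on the smooth analog curve, which is exactly the point: digital becomes usable only once the points are dense enough.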

Some are comfortable with that; some are not.

Nice discussion, keep it going:

“Freedom is hammered out on the anvil of discussion, dissent, and debate.” Hubert H. Humphrey

I have been shooting film for 34 years and a DSLR owner for 6 months. So, it will probably take me a long time before I develop the bad DSLR habit of chimping. I haven’t chimped any digital photo that I’ve taken yet; I will review the photos in my camera afterwards, but not immediately after.

John, I agree with a lot of your points, if not necessarily your reasoning. I prefer to understand the tools I use, mostly because I find it rewarding, and it encourages my creativity. The more I know about a process, the better I can bend it to my desired outcome. However, that’s just me. I know plenty of others who don’t think this way, and produce work that is just as valid and worthwhile and creative as mine, if not more.

I think it’s very worthwhile to investigate and improve the way one works, and discussions like these can provide some insight to the working methods of others. But I’m certainly not going to say my way is better, or that others should follow my lead. I don’t even stick to my own rules dogmatically. I use plenty of processes that I have little or no understanding of. If I had to understand the nuts and bolts of every piece of software or mechanics I used, I’d still be working at a stone-age level. I mean, if Newton was happy to stand on the shoulders of giants, why can’t I be?

Also, the idea that an analogue process is superior because nature is analogue is a bit far-fetched. Photography itself is an approximation of nature, so you’d better get comfortable with the idea. Not only that, but film could be thought of as ‘digital’ too, or binary at least. A silver halide crystal can only be either developed or fixed, there’s no smooth curve there…

I’m not scientific at all in the conventional sense, although I have a philosophical and somewhat analytical cast of mind. Yes, I believe in scientific method and the need to understand the tools one uses, but what I’ve found in getting into film with sophisticated as distinct from point and shoot cameras is that I often get unexpected results — and sometimes the unexpected and unplanned turn out to be pleasing.

Sometimes this might be due to inexperience or lack of knowledge, but as often as not it can be because the scene or subject one is photographing changes in unanticipated ways, or one sees in the resulting picture something not noticed while shooting. Such results, when they happen, can be most satisfying.

This is not a plea for careless photography, but rather an acknowledgement that much photography is not necessarily best approached as some kind of scientific experiment — despite containing scientific elements.

Also, although I love what my digital cameras can produce, I suspect I have a sentimental attachment to the look of film because of the generation to which I belong, and childhood memories of poring over the family photo album.