RAW vs. JPG Images

Wednesday, July 05, 2006

To shoot in RAW or JPG mode on your digital camera? Do you really
get the extra, fine detail that 16-bit RAW mode purports to deliver,
and that everyone is raving about? Or is it a case of the Emperor's New
Clothes, and no one wants to say anything? Or, more precisely, do people
even notice?

To begin, let's understand where the controversy started. When
the first digital cameras came out, they shot reasonable pictures, but
the cost of flash memory was so high that one had to spend a fortune
on memory cards to get a sufficiently large number of pictures onto a card
before running out of room. Most consumers didn't really care about
image quality as much as pros, so cameras were set to use very high
compression in JPG mode. This translates to very low quality, which was
fine for consumers, but pros had no choice but to use RAW mode to get
good quality images.

Since then, the technology has moved on: the quality of digital image
sensors improved, and the cost of memory went down. So, cameras eventually
evolved to the point where the difference in quality between JPG and RAW
has become virtually indistinguishable. But the mantra of "shoot in RAW"
never went away. To be clear, there are differences, and so long as one
is aware of them, one can make an intelligent choice about whether to
shoot in RAW or JPG. But unless you are astutely aware of those
differences, and skilled at using the tools that make RAW genuinely
useful under those rare conditions, you should be shooting in JPG mode.

The rest of this article explains the technical reasons behind all this.
And if you are in the business of selling pictures, all the more reason
you should be educated about this topic.

Images and the Human Eye

The human eye is limited to a color spectrum that can be expressed
completely by a JPG image. A RAW image contains far more
image detail than we can perceive. This extra data can be quite useful
under rare circumstances, so to explain that, we need to back up a bit
and explain what a pixel is.

In brief, a "pixel" is a "picture element": in an indexed-color image,
it is simply a number whose value is an index into a colormap. You've
no doubt seen
a multi-colored "wheel" that somewhat resembles a rainbow. This is an
image where each pixel's value increments by 1 to create the full
spectrum of colors. The first pixel is black, and the last pixel is
white, and everything in between is some blend of red, green and blue.
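To make the idea concrete, here is a toy sketch in Python of an indexed-color image, with a palette and pixel values invented purely for illustration:

```python
# Toy example: a pixel as an index into a colormap (indexed-color model).
# The 4-entry palette below is made up purely for illustration.
palette = [
    (0, 0, 0),        # index 0: black
    (255, 0, 0),      # index 1: red
    (0, 0, 255),      # index 2: blue
    (255, 255, 255),  # index 3: white
]

# A tiny 2x2 "image" where each pixel stores only a palette index.
pixels = [[0, 1],
          [2, 3]]

# Rendering resolves each index to its RGB color.
rendered = [[palette[p] for p in row] for row in pixels]
print(rendered[0][0])  # (0, 0, 0), black
print(rendered[1][1])  # (255, 255, 255), white
```

A larger palette simply means more entries to index into; the pixel itself is still just a number.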

A colormap can be very large (also called a "wide gamut") or very
small. The original web browsers could only count on a 216-color
"web-safe" palette, and even earlier computer monitors could only
handle 16 colors.

Today's computer monitors can actually display a wider colormap than
what the human eye can perceive. It's not that they need to; it's just
a byproduct of other features necessary for the screen to work.

As it pertains to perceiving color, the fact that cameras can discern
color variations at a granularity beyond 8 bits per channel is
technologically interesting, but it's not functionally useful. The RAW
mode is simply a way to capture this finer gradation of color.
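As a rough sketch of what "beyond 8 bits" means (a simple linear mapping; real cameras apply nonlinear tone curves first), here is how sixteen distinct 12-bit sensor values collapse into a single 8-bit level:

```python
# Sketch: mapping a 12-bit sensor value (0..4095) down to 8 bits (0..255).
# Real cameras apply tone curves first; this linear shift is a simplification.
def to_8bit(value_12bit):
    return value_12bit >> 4  # drop the 4 least-significant bits

# Sixteen distinct 12-bit values all collapse to the same 8-bit level:
fine_values = list(range(2048, 2064))      # 12-bit values 2048..2063
coarse = {to_8bit(v) for v in fine_values}
print(coarse)  # {128}: sixteen distinct inputs, one 8-bit output level
```

Those sixteen gradations are exactly the kind of detail the eye cannot tell apart in normal viewing, which is the point of the paragraph above.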

NOTE: I'm talking about color, not brightness. Highlights and shadows
are not affected by RAW vs. JPG. Which raises the question: How is
it that some people say that they can see the differences in detail
between RAW and JPG? They say that the RAW image captures more detail
in highlights and shadows than the JPG file.

The observation may be correct, but it has nothing to do with RAW vs
JPG. Indeed, you could save the original RAW image as a JPG and still
see the detail. So, if the JPG file can present precisely the same level
of detail as the RAW file, then the RAW format must not be responsible
for the better image.

The answer is that some cameras do a poor job of converting the image
into JPG format in the camera. That's right, it's not the JPG file format,
it's the function of the camera itself. While this was true more often
in the past, it is becoming less so all the time. For those older cameras,
photographers had to shoot in RAW first, then save their images to disk,
then convert to JPG using software (such as Photoshop).

As noted earlier, this is largely a historical artifact that no longer
applies today. At least, not with SLR digital cameras. While there are
lingering and diminishing consumer camera models that produce inferior
JPG files (meaning you would choose RAW if you actually owned one),
this will almost assuredly be entirely obsolete by 2011, if not sooner.

(NOTE: As of this update in 2012, I know of no camera whose JPG quality
is so low as to warrant using RAW for general shooting purposes.)

The RAW Exception

The one remaining case where RAW's granularity of detail could be useful
is one that professionals may want to exploit under some conditions. Even
though RAW images contain image detail that is not perceptible by the
human eye, one can amplify those additional bits to express an artistic
or creative effect. For example, consider a solid-colored fabric like a
blanket in shadowy light. Here, the color range is very, very narrow, but
there are very subtle nuances that can be preserved in the RAW format and
then accentuated using appropriate software techniques. Again, the human
eye wouldn't discern them in their native form, but the extra precision
can be exploited to reveal interesting artistic effects.
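A hypothetical illustration of that technique: stretch the same dim, narrow tonal range captured at 12-bit and at 8-bit precision, and count how many distinct output levels survive. The numbers are invented, but the banding effect is real:

```python
# Sketch: why extra RAW bits matter when stretching a narrow tonal range.
# Simulate a dim, nearly uniform patch occupying 1/64 of the full range.
lo_12, hi_12 = 0, 64   # the patch in 12-bit data: values 0..63 (of 0..4095)
lo_8,  hi_8  = 0, 4    # the same patch in 8-bit data: values 0..3

def stretch(values, old_max, new_max=255):
    # Linearly expand the narrow range to fill 0..new_max.
    return sorted({v * new_max // (old_max - 1) for v in values})

levels_12 = stretch(range(lo_12, hi_12), hi_12)  # from high-bit data
levels_8  = stretch(range(lo_8,  hi_8),  hi_8)   # from 8-bit data

print(len(levels_12))  # 64 distinct output levels: a smooth gradient
print(len(levels_8))   # 4 distinct output levels: visible banding
```

The stretched 8-bit patch posterizes into a few flat bands, while the high-bit patch keeps enough levels to stay smooth, which is exactly the "accentuate the nuances" effect described above.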

Speaking of image editing skills, this leads to another perception that
misleads people into thinking RAW is better: many pro photographers say
they use RAW. Don't confuse their creative skills as both photographers
and post-processors (using Photoshop) with the RAW file format. The
same software engine can do the same things to a JPG image and yield
visually identical results. Just because a pro used RAW doesn't mean
he had to, nor that you should. One has nothing to do with the other.

Don't forget, people created amazing images with film for over 100 years,
and still do so today, and the dynamic range of film is far below what
even consumer-grade digital cameras can shoot today. Even in JPG mode!

In fact, many pro photographers talk about using a tool called "Adobe
Camera RAW", or ACR. As its name implies, it is an image-editing plug-in
to Photoshop that was originally designed to deal with RAW images. I've
used this tool, and while I genuinely like its capabilities, it's simply
a different processing engine than Photoshop. It requires different
techniques, and yields different results, much like choosing between
two different brands of color film. Both Fuji and Kodak are good if
you learn how to use them to take advantage of their particular
characteristics. Accordingly, you may wish to use either Photoshop
or ACR depending on your desired result, or your familiarity with either
tool. And since ACR can be used to edit both RAW and JPG images, I have
accordingly found that applying exactly the same manipulations on the
same file in RAW and JPG yielded no visually discernible differences.

The truth is, the largest contributor to an image's quality is the
creative and technical skills of the photographer, and these skills
are honed by focusing on the tools he or she happens to use, whether
it's film and a chemical darkroom, or a high-end camera shooting RAW
or JPG. Artists have to master their tools to achieve their desired
goal. Their abilities are not governed by the tools, but their mastery
of them. That these artists might not be aware of it is part of why
there's such misinformation about RAW in the first place.

It's not that I dismiss RAW entirely; there are occasions that call for
altering white balance in post-processing. To some, this is highly
useful, and it remains the primary reason why I would choose to shoot
in RAW. It's usually in complicated lighting scenarios, where there's a
mix of lighting sources: natural with incandescent, with fluorescent,
and so on.
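For the curious, one simple white-balance correction is the "gray world" method, which rescales each channel so the channel averages match; doing this on RAW data avoids the rounding that a camera's baked-in JPG white balance has already introduced. A minimal sketch with invented pixel values:

```python
# Sketch: a simple "gray world" white-balance correction, the kind of
# adjustment that is easiest to do well on RAW data. Numbers are invented.
def gray_world_balance(pixels):
    # Scale each channel so the three channel averages match.
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    target = sum(avg) / 3
    gains = [target / a for a in avg]
    return [tuple(min(255, round(p[c] * gains[c])) for c in range(3))
            for p in pixels]

# A warm (incandescent-looking) cast: red runs high, blue runs low.
warm = [(200, 150, 100), (180, 140, 90), (220, 160, 110)]
balanced = gray_world_balance(warm)
print(balanced)  # [(150, 150, 150), (135, 140, 135), (165, 160, 165)]
```

Real RAW converters use far more sophisticated models, but the principle is the same: per-channel gains applied before the image is committed to 8-bit color.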

Lastly, the biggest downside of RAW mode is that it is
proprietary. The format differs not just among camera manufacturers, but
from each iteration of your own camera to the next. Yes, the very RAW data you
shoot today is not guaranteed to be readable at any given point in
the future, even by your own camera if you ever upgrade it (or buy a new
model). If you archive your images in RAW mode, they may be readable for a
while, but one day, they will suddenly be unreadable. Should you ever need
to recover old images from backup disks, they may be totally inaccessible.

Replacing the proprietary file formats of RAW with a single standard that
all camera companies could employ would open up the entire workflow of
applications that touch images: these include software editing tools,
desktop applications that import images (say, Word), online social and
business sites, devices, printers, and so on. As long as RAW remains
proprietary, there's no incentive for the broader industry to make RAW
images useful.

Since originally writing this article in 2006, I have looked back on
my RAW images from that time and compared them to today's images, and have
found them to be not only obsolete, but less useful than their JPG versions.

I should note that shooting in JPG mode is different than
archiving images in JPG format. I never archive files in JPG format.
I shoot in JPG mode, and after downloading them onto my computer, I open,
edit and save directly to TIFF. This is a much better format for image
archival, and because it is standard, it's guaranteed to be around for
a long time. Other formats are becoming available as well; Adobe has
introduced the "DNG" format in hopes of convincing camera companies to
use it instead of their own RAW formats.

Why the Controversy?

If the issue of RAW vs. JPG is so academic, why the big fuss?

There are two forces at play here: photographers and camera manufacturers.
Together, they are creating a feedback loop, where the actions of each affect
the decisions of the other, and then back again. The problem is that true
information is lost in the noise. As it pertains to RAW vs. JPG, there
are those who don't understand the science and the nuances of what they are
told. Pro photographers misinterpret (or misapply) the message they
hear from camera manufacturers, and then publish their opinions and underwrite
the marketing messages that get people to buy particular cameras. These
recommendations trickle down the food chain, and consumers end up buying
cameras merely because of features that they are told are good for them,
such as the ability to shoot in RAW mode.

As cameras with such capabilities increase in sales, this both legitimizes
and amplifies the mistranslated marketing message from the camera companies,
and they produce more of the same. The camera manufacturers get into the mode
of developing technologies that their marketing departments think the
customers want, even though this "need" was really a false impression from
information that came to them through a media source that misapplied a
technical review from technophiles with good intentions, but a misguided
sense of real-world applications.

The "fuss" comes into play when someone in the crowd (like me) actually
takes a more pragmatic view of the emperor and proclaims that he's not
wearing any clothes, at least none that are visible to the naked eye.

In summary, "yes!" the techies are right in the most academic sense,
that 16-bit images have more data than 8-bit images. But in the real
world, that benefit rarely outweighs the extra work and other problems
necessary to bother with RAW mode in the first place. RAW is a very
important aspect of photography that must exist if for no other reason
than to keep track of what cameras do internally, but its practical use
to 99% of today's real-world photographers is close to nil.