You don't want to just clean up (ie., remove the Photoshop reference) from graphicdesign.stackexchange.com/questions/199/… and refer abusers to it? Both answers clarify quite succinctly the difference between Point and Pixel. All that's missing is "per inch".
–
Farray Feb 21 '12 at 0:45

It's a perennial. Comes up over and over in different forms on every graphics-related forum, listserv, blog or seminar. e100's right, though, we don't have a definitive statement that covers all the bases and cleanly addresses this exact question. I've made an attempt below. Feel free to edit.
–
Alan Gilbertson Feb 21 '12 at 1:09

@AlanGilbertson Sounds good. After we get some good canonical answer(s), perhaps we could edit excerpts into the TagWikis and hopefully reduce future confusion...
–
Farray Feb 21 '12 at 2:01

@e100 You are correct. I jumped the gun a bit on that comment based on the gut feeling that we've seen this question before. After reviewing a bunch of other DPI/PPI questions it seems we really haven't hit this subject in a way that future questions could be referred to for a definitive answer.
–
Farray Feb 21 '12 at 19:56

5 Answers

Hi and welcome to GD.SE. How exactly does this improve the existing answers? In general we tend to avoid one-line answers and go for more substance. It's not that we don't want you here, we do. But the point of the exchange is to create a repository of excellent answers. And quite frankly there's a bit of a learning curve... anyway, welcome.
–
joojaa Apr 11 at 21:09

Starting to answer my own question, I think one of the problems with explaining this is that only parts of the full story are relevant to anyone's day-to-day work, unless you cover the gamut from web/app designer to print designer to prepress expert. So I'll probably come back to this and tune it for different audiences.

Pixels per inch (ppi) is a measure of resolution in two different contexts.

(a) the resolution of an image printed at a specific physical size

This is not an intrinsic property of an image file. Pixels have no real-world dimensions, and so ppi has no meaning until the image is printed, or at least specified in a print layout which uses physical (inches or mm) measurements.

e.g. 1. You have a 1000px square image in Photoshop and you print it so it's 3 inches square. The resolution of the print is 333ppi.

e.g. 2. You then put it in an InDesign layout at 1.5 inches square and PDF it. Acrobat's prepress tools tell you it's 666dpi - when printed.

e.g. 3. You view the same image on a desktop monitor, a laptop and a phone. It occupies 1000 pixels on each, but its physical size, measured with a ruler, is different on every screen.

e.g. 4. You go back into Photoshop and notice the image has been set as 72ppi all along, and it didn't make any difference to any of these cases. You change it to 144ppi (without resampling, i.e. the number of pixels is unchanged) and find it makes no difference to any of the cases either.

e.g. 5. After a bit of experimentation, you find that ppi does make a difference when you print from Photoshop at 100% scale. But so does changing print dimensions - they are directly (and inversely) related, and both are just metadata which specify how the image should be printed, and can be easily overridden.
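A minimal sketch of the arithmetic behind these examples (the function name is mine, not Photoshop's): print resolution is just pixels divided by the physical length they cover.

```python
def ppi(pixels, inches):
    """Print resolution: pixel count divided by the physical length it covers."""
    return pixels / inches

# e.g. 1: a 1000 px wide image printed 3 inches wide
print(int(ppi(1000, 3)))    # 333

# e.g. 2: the same image placed at 1.5 inches in a layout
print(int(ppi(1000, 1.5)))  # 666
```

Note that the pixel count never changes; only the physical size in the denominator does.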

(b) the physical resolution of a display device

Going back to e.g. 3, the image displays at different physical sizes because each screen has a fixed hardware grid of pixels. So PPI in this case is an intrinsic property of the hardware.

You can measure the PPI of a screen easily if you know its pixel dimensions, just divide its height (or width) in pixels by its height (or width) measured with a ruler. The first Macs had 72ppi; laptops are up to about 130ppi; current smartphones 200ppi+, up to the iPhone 4's 330ppi.
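The same division works on the diagonal, which is how screens are usually quoted. A quick sketch (Apple's official figure for the iPhone 4 is 326 ppi; the small discrepancy here comes from using the nominal 3.5-inch diagonal):

```python
import math

def screen_ppi(width_px, height_px, diagonal_inches):
    """Hardware PPI from the pixel grid and a ruler measurement of the diagonal."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# iPhone 4: 640 x 960 pixels on a nominally 3.5-inch (diagonal) screen
print(round(screen_ppi(640, 960, 3.5)))  # 330
```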

Dots per inch (dpi) is a physical measure of a print device's output resolution.

The dots are individual blobs of ink.

(to be continued)

NB The term is very commonly used to mean ppi in both senses above.

Personally, I regard this as incorrect, however entrenched it may be, and the source of continued confusion.

Good start! In the end, in terms of 'image data density' PPI and DPI are interchangeable. Whether we call the units of data a pixel or a dot is somewhat moot and arbitrary. I'm leaning towards PPI being a more up-to-date term. But still don't see any difference in definition with DPI when we're talking about image data density. The terminology differences become more apparent when we're talking about hardware: Printers vs. Screens.
–
DA01 Feb 21 '12 at 22:38

e100 has me now questioning my opinions on this (which is a good thing!) and in doing some more research, I need to update my opinion a bit.

HISTORICALLY, DPI refers to the print process...namely how many discrete dots a printer can produce on paper. PPI refers to the number of pixels per inch of your monitor.

So, one is about the process of printing on paper, the other is about your display hardware.

Neither is directly about the 'resolution per inch' of your digital image, though both seem to have correlations to the point of being almost identical in many ways.

So, let's say you have a 1200dpi laser printer. That means for every inch of paper, that printer can create 1200 unique dots.

Now...on to the issue of assigning a DPI/PPI setting to a digital image.

A digital image is composed of pixels. As long as the image remains digital, it doesn't care what PPI you tell it it is, as it will always render based on the total pixels. For instance, a 1000px x 1000px image, by default, in a web browser, will be 1000px wide.

Where giving an image a DPI/PPI setting matters is when printing...that's how the software calculates the physical size of the analog output. So your 1000px x 1000px image with a DPI setting of 500dpi will result in an image printed 2 inches square.
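That calculation is just the inverse of the earlier one: divide pixels by the resolution setting to get physical output size. A sketch (the function name is hypothetical):

```python
def print_size_inches(pixels, ppi_setting):
    """Physical print size implied by the image's resolution metadata."""
    return pixels / ppi_setting

# A 1000 x 1000 px image whose metadata says 500 ppi
print(print_size_inches(1000, 500))  # 2.0 inches per side
```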

Where things get confusing is that IIRC, Photoshop historically used 'DPI' as the term for this. Though it clearly looks like it now uses the term 'PPI'.

Because of that, I'd say either term is appropriate when referring to the output resolution meta-information in your graphics editing software.

So, in summary, my definitions would be:

DPI = a spec for your printer explaining how detailed an image it can print
PPI = a spec for your monitor explaining how detailed an image it can display for a given physical size

In terms of setting up an image for printing = either PPI or DPI is an appropriate term. I think PPI makes more sense for the person dealing with creating the digital file. In the end, you will need so many pixels for decent output. DPI makes more sense for the person printing the digital file, as that is a term that makes sense in the service bureau/printer world.

Still not found where Photoshop gets it wrong...
–
e100 Feb 21 '12 at 9:31

Photoshop has a DPI setting that it uses to try and emulate a specific PPI when you view an image at 'actual size'. Also note that images don't really have PPI settings to begin with. PPI is a measurement of your screen density and not really your image. As far as your image, a pixel is a pixel.
–
DA01 Feb 21 '12 at 15:04

Where exactly is this DPI setting in Photoshop (I don't have the app to hand)?
–
e100 Feb 21 '12 at 21:09

@e100 it looks like PhotoShop now uses PPI to change the print resolution of an image. I'm not sure when PhotoShop changed that terminology. So, I suppose to be in adherence with PhotoShop, PPI is the right term. To be in adherence with 30 years of print shop and service bureau experience, DPI makes more sense. It's a toss up.
–
DA01 Feb 21 '12 at 21:24

PPI is only for calculating physical dimensions for the purpose of output. Nothing changes in the image itself, only its metadata. Once upon a time it was possible to tell Photoshop what the actual screen resolution was in pixels (screen dots, really) per inch, so that "Actual Size" was accurate. I can't find that setting on a quick search in CS5. There was a write-up on this by Scott Kelby, iirc, several years ago. I'll have to look it up.
–
Alan Gilbertson Feb 22 '12 at 0:10

Alan covered most of the basics quite nicely. I would also like to highlight not just the difference between DPI and PPI, but also the relationship between PPI and display resolution (which is just the raw numbers of pixels in a display).

I think many remember those heady days when we got to move from VGA to SVGA to XGA resolutions. It was nice for a while when most consumers had monitors with specs similar to 1024px X 768px @ 72 PPI and we had a pretty safe target for display graphics.

This complacency is still evident, but ever more unrealistic. Many mobile manufacturers have been focusing on increasing resolution while the physical screens change slightly, if at all. The result is PPIs that are on the rise in small hand-held form factors. This provides many opportunities to accidentally create graphics that simply aren't useful (or, nearly as bad, get automatically scaled up by the operating systems at the expense of those clean edges you worked so hard on).

An easy real-world example is Apple's iPhone. The first 3 editions had an 89mm (diagonal) screen, with 320px X 480px (163 PPI). Then they introduced the "Retina" display (hello, Marketing) which bumped the resolution up to 640px X 960px but was still only 89mm in physical size (326 PPI!). BlackBerry has made similar moves (though without the marketing panache) and it looks like the same is about to happen with the iPad as well.

One other term that may also be of interest is DIP or DP. Microsoft refers to them as "Device Independent Pixels" and on the Android platform they are "Density-independent Pixels". They have slightly different names, but are the same core concept. The goal is to allow apps to be "DPI aware" and scale text/objects around constant conventions. Density-aware applications are aware of the display's PPI and your desired "DPI", and then scale text and objects appropriately.

Taking this back to our ever-increasing-PPI scenario, if you designed a website or app with a 80px font, you may think that's pretty large on your 163 PPI screen. Your text would be nearly a half-inch tall, which is reasonably large on a phone. But then the next version of the phone comes out and suddenly your text is only 1/4" tall at 326 PPI. This would be especially bad if some of your text was scaled in another unit of measurement and suddenly body text is larger than header text. (Sometimes I still run across websites that made this mistake.) If instead you were using DIP/DP for scale, you could hold both phones side-by-side and the text would be the same physical dimensions (though perhaps noticeably sharper on the higher-DPI model).
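Android's density-independent pixel convention works roughly like this: 160 ppi is the platform's baseline density, and physical pixels are scaled from dp by the ratio of the actual screen density to that baseline. A sketch reusing the two screens from the paragraph above:

```python
def dp_to_px(dp, screen_ppi):
    """Android-style conversion: px = dp * (screen ppi / 160 baseline)."""
    return dp * screen_ppi / 160

# The same 80 dp element on the two screen densities discussed above
print(dp_to_px(80, 163))  # 81.5 px
print(dp_to_px(80, 326))  # 163.0 px

# Physical size stays constant: px / ppi = 0.5 inch on both screens
print(dp_to_px(80, 163) / 163)  # 0.5
print(dp_to_px(80, 326) / 326)  # 0.5
```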

At any rate, while PPI may be the most confusing of the measurements, it is incredibly important to pay attention to it. If you have the option of designing with DIP or DP units, it should handle some of the guesswork for you and allow you to target a wider array of displays without increasing your technical burden.

My apologies that this answer is so screen-centric, I rarely dabble in print.

AFAIK, Microsoft used to call DIPs TWIPS. They are useful for cases where the screen is more of a viewport: many applications on smartphones allow for zooming, and in this sense the screen is a viewport measured in logical units. The higher PPI screen allows for better looking images at low zoom levels.
–
horatio Feb 21 '12 at 19:00

A pixel (the word was originally coined, iirc, by IBM and derives from "picture element") is the smallest indivisible unit of information in a digital image. Pixels may be displayed, or they may be printed, but you can't divide pixels into smaller pieces to get more information. How many channels and bits per channel make up one pixel is the measure of how subtle the information in a pixel may be, but the basic fact is that 1 pixel is the smallest increment of information in an image. If you do video, you know that pixels don't have to be square -- they are non-square in all older video formats. Square or not, a pixel is still the smallest unit of a picture.

An inch (okay, so you know this already -- bear with me) is a unit of linear measurement on a surface, which could be a screen or a piece of paper.

A dot is, well, a dot. It can be a dot on a screen, or it can be a dot produced by a printhead. Like pixels, dots are atomic. They're either there, or they're not. How much fine detail a screen can display depends on how close the dots are (what they used to call "dot pitch" in the old CRT days). How small the dots are from an inkjet, a laser printer or an imagesetter determines how much fine detail it can reproduce.

Dots per inch is fairly easy. A screen has so many dots (each comprising R, G and B elements) per inch of screen. It's the same on paper. A 1200 dpi printer can lay down 1200 dots in one linear inch. In describing screen detail or printer output, dots per inch is the correct term.

PPI is where the confusion comes in. An image has so many pixels. Its metadata contains an output size in inches, cm, mm, M&Ms, whatever. It's the width in pixels divided by the output width in the metadata that "per inch" comes from. So the same image with different metadata may be 72 ppi, 150 ppi or 8000 ppi. The image information is the same; all that's changed is the metadata.

A quick and easy demo that somewhat illustrates the point is to make some marks on a piece of elastic, say five to an inch. Stretch the elastic to twice its length. The number of marks hasn't changed, even though the "marks per inch" is now 2.5.

You can see this in Photoshop if you turn off Resample Image and change the size. The ppi value changes to reflect how small the pixels must be reproduced in order to hit the measurement value in inches/cm/mm etc. Note that in this case the Pixels fields are disabled. You can't change those values unless you resample.

Mass confusion entered in when image pixels were mapped to screen dots in web browsers. A 200 pixel image shows up as 200 pixels in a browser. How large it is, measured with a ruler, depends on the dots per inch of the screen. The image metadata might say it's 200 ppi or 72 ppi or 1 ppi, it will still occupy exactly 200 screen dots. The world remains fixated on "72 ppi for the web," so the question of "what's the right resolution for web images" keeps coming up, and the correct answer, "it doesn't matter," keeps being supplied ad nauseam.

If you're still with me, there's one last step that brings the two together.

A 720-pixels-wide image at 10 physical inches wide has a resolution of 72 pixels per inch. If you print it on a 1200 dpi printer, there will be 1200 dots per inch on the paper, but the image is still 72 pixels per inch. That's why it looks like crap. On the other hand, a 7200-pixels-wide image printed at 1 inch wide will exceed the resolution of our 1200 dpi printer. Photoshop (let's say) and the printer driver decide which pixels to throw away and which to actually print. Some of the printed dots will be averaged among adjacent image pixels, but, regardless, some of the image information has to be thrown away. The output will be 1200 dpi, but the resolution of the printed image will have been reduced to at most 1200 ppi by the software.
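The two cases in that last step can be sketched numerically (names are mine; the printer dpi simply caps how much image resolution can survive onto paper):

```python
def effective_ppi(pixels_wide, print_width_inches):
    """Resolution of the printed image, independent of printer hardware."""
    return pixels_wide / print_width_inches

PRINTER_DPI = 1200  # the hypothetical laser printer from the example

# 720 px printed 10 inches wide: well below the printer's capability
print(effective_ppi(720, 10))  # 72.0

# 7200 px printed 1 inch wide: exceeds it, so pixels must be discarded
print(min(effective_ppi(7200, 1), PRINTER_DPI))  # 1200
```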

Hmmm, good stuff but think the mention of "dots per inch" for screens is confusing...
–
e100 Feb 21 '12 at 9:27

It's probably worth mentioning as well that each pixel in an image can be any colour in the working colour space, dots (especially in digital printing) are generally far more limited. A 1200 dpi printer can print 1200 dots per inch (usually 1200 dots per inch per colour), but each of those dots is either an on/off or available in a limited number of sizes (usually four or fewer). That means that you need a group of dots taken together to accurately create the appearance of a pixel's colour. (cont'd)
–
Stan Rogers Mar 2 '12 at 7:45

Printing a 300ppi (print resolution) image on a 1200dpi printer would mean that there are 16 dots of each colour (with a low "depth" for each dot) to represent each of those pixels. While printer/driver systems are pretty good at dithering, that's still a pretty poor gamut. That's why digital printers with more than four ink values (lighter blacks, light cyan, light magenta and often one or more pure colours that would normally be mixed) can print so much better than a $50 1200dpi four-colour all-in-one—each of the dot groups used to print a pixel can represent a much larger range of tones.
–
Stan Rogers Mar 2 '12 at 7:54
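The arithmetic behind Stan's dot-group figure, as a quick sketch: each image pixel gets a square patch of printer dots whose side is the dpi-to-ppi ratio.

```python
def dots_per_pixel(printer_dpi, image_ppi):
    """Square group of printer dots available to render one image pixel."""
    per_side = printer_dpi // image_ppi
    return per_side * per_side

# 300 ppi image on a 1200 dpi printer: a 4 x 4 group per pixel
print(dots_per_pixel(1200, 300))  # 16
```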

@Stan - I would encourage you to fold this info into the answer.
–
e100 Mar 21 '12 at 16:54