There’s a wild meme on the loose. A significant number of people seem to have decided that “px” in CSS is an angular unit rather than a unit of length, and moreover that it is in some sense “non-linear”. This is wrong – WRONG! Confusion runs deep and common sense is imperiled. I’m going to try to set this straight.

The confusion appears to originate with this article. You can see how far it has spread by googling for “px angular” (minus the quotes). I first came across the meme in this tutorial, and it appears to be repeated all over the place – here, here, here (where it received only upvotes), etc. (Edit: It has now popped up on reddit, too.)

Now, a (very) charitable reading of the whole of the original article suggests that the author may actually just about understand how this works, at least in part. However, the explanation is severely misleading and there are many individual statements there that are plain wrong – most significantly the title itself: “CSS px is an Angular Measurement”. It simply isn’t. It’s a unit of length.

How px units are actually defined

The “px” unit is basically intended to correspond to a real-world pixel. However, the authors of the CSS spec recognised the obvious fact that “pixel” is a somewhat nebulous concept. Simply saying something like “a pixel is defined as the smallest object that can be displayed on a device” would have made px a completely useless unit, as designs specified in pixels would vary wildly in actual size depending on the display technology, from low-res monitors at one end of the spectrum to high-res printers at the other.

To understand how they addressed this problem, you first need to be aware of how they approached the entire question of having multiple different units. Rather than allowing the various units to vary in size relative to one another depending on the circumstances, they (very sensibly, IMO) decided on a completely rigid system of units: the ratios between the different units are absolutely constant and do not depend on the display technology in use.

It certainly won’t surprise you to know that 1in is always equal to 2.54cm, but it may be less obvious to you that 1px is always equal to 0.75pt, and that 1pt is always equal to 1/72nd of 1in. This means that 1px is ALWAYS equal to 0.75 × (1/72) × 2.54 = 0.0264583333… cm. That is, 1px is around 0.265mm.
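The fixed ratios can be checked with a couple of lines of arithmetic. This is just the calculation above, written out (the constants are the ones the CSS 2.1 spec fixes):

```python
# The fixed CSS 2.1 unit ratios, verified numerically.
PT_PER_PX = 0.75        # 1px = 0.75pt
PT_PER_IN = 72          # 1pt = 1/72 of 1in
CM_PER_IN = 2.54        # 1in = 2.54cm

cm_per_px = PT_PER_PX / PT_PER_IN * CM_PER_IN
print(f"1px = {cm_per_px:.10f} cm")   # ~0.0264583333 cm, i.e. about 0.265mm
```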

There is no wiggle room in this. The standard is absolutely clear. The following is taken from the section of the CSS 2.1 standard linked above and contains everything needed to perform the calculation:

in: inches — 1in is equal to 2.54cm.

cm: centimeters

mm: millimeters

pt: points — the points used by CSS are equal to 1/72nd of 1in.

pc: picas — 1pc is equal to 12pt.

px: pixel units — 1px is equal to 0.75pt.

This system ensures that the relative sizes of things that are specified in different units remain constant regardless of the display technology, so that’s that problem solved.

(For completeness, I should mention here that the “em” and “ex” units are exceptions to the above. The lengths 1em and 1ex do vary relative to the other units, because they depend on font size.)

How the entire system of units is “anchored”

So it’s time to rest my case, then? 1px = 0.265mm, as we’ve established. Any claim that it’s an angular unit is obviously complete rubbish, right? Well, not quite. There is a bit more to this.

The thing is, when the standard says “cm” and “mm” above, it’s not referring to the familiar “cm” and “mm” we deal with in the real world, those sturdy units of length defined by the SI system. What is being referred to here is CSS “cm” and “mm” units, which are not the same thing at all.

You see, if the CSS authors had just said that a CSS “cm” is always the same thing as a real-world SI “centimeter”, they would still have had a problem with varying device resolutions. Although now all devices would represent “1px” as the same physical distance (up to the ability of the device to represent it), much of the time this distance would not be an integer number of actual device pixels. Only on a device where a pixel really was 0.265mm, or some simple fraction or multiple thereof, would an integer px length routinely result in an integer number of pixels.

This wouldn’t be much fun, as px-measured lengths would be subject to aliasing effects (or almost-as-bad anti-aliasing effects). Who wants their 1px border to disappear because the screen pixel is bigger than 0.265mm? This would again have rendered “px” pretty much useless as a CSS unit.
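To make the rounding problem concrete, here is a sketch of what would happen if 1px were pinned to a real-world 1/96in. The pixel pitches below are illustrative DPI values I’ve picked for the example, not measured figures for any particular device:

```python
# Sketch: if 1px were tied to a fixed physical distance (1/96in), most
# displays could not render it as a whole number of device pixels.
CSS_PX_MM = 25.4 / 96          # ~0.2646mm, the "real-world px" candidate

def device_pixels_per_css_px(dpi):
    pitch_mm = 25.4 / dpi      # physical size of one device pixel
    return CSS_PX_MM / pitch_mm

for dpi in (96, 109, 300):
    print(f"{dpi}dpi display: 1 CSS px = {device_pixels_per_css_px(dpi):.3f} device pixels")
```

Only the 96dpi display comes out to a whole number; on the others a 1px border falls between device pixels, which is exactly the aliasing problem described above.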

The way the CSS authors handled this problem (again, sensibly, IMO), was to allow the user agent (ie. the browser) to choose a useful precise size for the “px” unit and then size all the other units relative to that “px” unit. This is referred to in the standard as “anchoring” the system of units. From the standard:

For print media and similar high-resolution devices, the anchor unit should be one of the standard physical units (inches, centimeters, etc). For lower-resolution devices, and devices with unusual viewing distances, it is recommended instead that the anchor unit be the pixel unit.

As an aside here, notice that for printers, which don’t really have a relevant concept of “pixel” (dots are too small to care about), the recommendation is to anchor to physical units, in which case 1cm in CSS really would be one real-world centimeter. (N.B. I don’t know how accurately this recommendation is typically implemented.)

The results of this choice are summed up in a comment in the standard:

Note that if the anchor unit is the pixel unit, the physical units might not match their physical measurements. Alternatively if the anchor unit is a physical unit, the pixel unit might not map to a whole number of device pixels.

How big do you make 1px when you’re anchoring to the pixel unit?

We’re done now, surely? A pixel is 0.265 “CSS millimeters”, so if someone is writing a browser and they’re anchoring to pixel units they’ll make 1px the integer number of device pixels that is closest to 0.265 “real-world millimeters”, right? Wrong! Or wrong at least some of the time.

Imagine for a second that your webpage is displaying on some funky futuristic eyewear. Now imagine it’s being displayed on a ridiculously massive display wall. Now imagine the font size is set to 4.24 real-world millimetres in both cases (that’s 16px, if 1px=0.265mm in the real world).

You see the problem.

In reality, the physical size of a pixel doesn’t really matter that much, on its own (aside from the aliasing issue, of course). What matters is how big it appears, and that varies depending on how far away from it you are. In order to allow web pages to look acceptable both on small devices held close to the face (eg. mobiles), and on large devices positioned further away (eg. TV screens), the CSS standard authors came up with another concept: the “reference pixel”.

The reference pixel is intended to reflect “yer basic pixel”: a pixel on a standard 96dpi monitor positioned roughly an arm’s length from the observer. That is, your typical web viewing setup from the 90s-2000s. However, the reference pixel is intended to scale in size depending on the “typical” distance of an observer from the display. If your display is typically held about half an arm’s length from the observer, then its reference pixel would be half as large. Two arm’s lengths away? Twice as large.
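The scaling rule is just similar triangles, and can be written down directly (the 28in arm’s length is the nominal figure the spec uses, quoted later in this post):

```python
# By similar triangles, the reference pixel scales linearly with the
# typical viewing distance. 28in is the spec's nominal arm's length.
ARM_LENGTH_IN = 28
REF_PX_IN = 1 / 96             # reference pixel size at arm's length

def reference_px_in(viewing_distance_in):
    return REF_PX_IN * viewing_distance_in / ARM_LENGTH_IN

print(reference_px_in(14) / REF_PX_IN)   # half the distance -> half the size
print(reference_px_in(56) / REF_PX_IN)   # twice the distance -> twice the size
```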

The nice standards people even provided the following diagram of this happening, and if you remember your high-school geometry, phrases like “similar triangles” may be going through your mind:

Simple enough, right? The recommendation (when anchoring to px units) is to size the 1px unit to be the integer number of device pixels that gets you closest to the reference pixel size at the typical viewing distance.

Aaaand we’re done. Really, this time. That’s all there is to it.

So where did this “px is an angular measurement” thing come from?

Well, IMO, the standards authors screwed up a little when they chose the wording of the standard itself. They defined the reference pixel in a perfectly natural way, but they worded it really badly:

The reference pixel is the visual angle of one pixel on a device with a pixel density of 96dpi and a distance from the reader of an arm’s length. For a nominal arm’s length of 28 inches, the visual angle is therefore about 0.0213 degrees. For reading at arm’s length, 1px thus corresponds to about 0.26 mm (1/96 inch).

This is describing exactly the behaviour that I described above, but they are doing it in terms of the other high-school geometry phrase that might have occurred to you on seeing the diagram: “angle subtended”. The (perfectly reasonable) definition is that the reference pixel is sized such that it subtends the same angle at the eye as is subtended by a pixel on a 96dpi monitor at arm’s length.
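The spec’s numbers check out, by the way. Here’s the arithmetic for the angle subtended by a 1/96in pixel viewed from the nominal 28in:

```python
import math

# Reproducing the spec's figure: the visual angle subtended by a
# 1/96in pixel viewed from the nominal 28in arm's length.
ref_px_in = 1 / 96
arm_in = 28
visual_angle_deg = math.degrees(2 * math.atan(ref_px_in / (2 * arm_in)))
print(f"{visual_angle_deg:.4f} degrees")   # ~0.0213, as the spec says
```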

However, you can’t escape the fact that the quote above does say, quite clearly, “[t]he reference pixel is the visual angle…”

That wording, to my mind, is just wrong. What they are trying to explain is how the size of the reference pixel is determined, but they’ve ended up saying that the reference pixel “is” an angle, which doesn’t even make sense (as we’ll see below). A more precise, if less easy to read, version might be

The reference pixel is a length that subtends the same visual angle when viewed at the typical viewing distance as one pixel on a device with a pixel density of 96dpi subtends at arm’s length.

So what’s wrong with the articles you linked to?

The articles quoted above appear to be taking this part of the standard and misinterpreting it to mean that the “px” unit is actually an angular unit. This is perhaps an understandable interpretation, given the wording. I mean, if the reference pixel “is an angle”, and a browser that is anchoring to “px” units is trying to make 1px be the same as the reference pixel, then surely 100px is 100 times the reference pixel angle – ie. a larger angle?

However, this interpretation contradicts the rest of the standard. It also contradicts plain common sense. Let’s explore the implications if this “px is an angular measurement” interpretation were to be correct.

The first thing to observe is that in the cases we are interested in the display itself is clearly intended to be a flat, 2-dimensional surface. I don’t think the standard explicitly states this: the closest I can see to it is the mention of “print” and “screen” under “media types”, where “screen” is described as “Intended primarily for color computer screens.” However, it would stretch credibility to claim that they are talking about anything other than a flat screen, ie. 2-dimensional Euclidean geometry.

The next thing to observe is that on a flat surface, the angle subtended by an object depends on its position on the screen. Say you stand facing a flat screen, with a black square directly in front of you, in the dead centre of the screen. If that square then moves out towards the edge of the screen, it will subtend a smaller angle – ie. it will take up a smaller chunk of your visual space. It will also start to appear distorted by perspective.

So, if we’re to say that an object of size, say, 5px is required to subtend the exact same angle regardless of where it is placed on the screen (which is clearly what we must mean if 5px is an “angular measurement”), it’s clear that this object must be made physically larger the further it is placed from the centre of the screen.

This would already be looking pretty odd, as I’m sure you’ll agree. We certainly don’t expect things to change size as they scroll around the screen. What would be even more funny-looking, though, is that straight lines would no longer be straight! In order to maintain the constraint that fixed “length” objects must subtend fixed angles, lines would have to curve, with lines that appear “vertical” when they are close to the center of the screen starting to bow outwards towards the edge as they move away from the centre. Everything would have to become more stretched the further out you go. This would happen without limit – if you build a large enough screen then the “1px” unit would be a mile at the corners!
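We can put rough numbers on this absurdity. The sketch below (my own illustration, assuming a flat screen viewed from 28in) computes the physical length a segment would need in order to subtend the fixed reference-pixel angle at various distances from the point directly opposite the eye:

```python
import math

# On a flat screen viewed from distance d, the physical length needed for
# a segment to subtend a FIXED angle grows without bound as the segment
# moves away from the point directly opposite the eye.
d = 28.0                        # viewing distance, inches
theta = math.radians(0.0213)    # the reference-pixel angle

def angular_px_length_in(offset_in):
    a = math.atan(offset_in / d)                    # angle to the near end
    return d * (math.tan(a + theta) - math.tan(a))  # length spanning theta

for offset in (0, 100, 1000, 2000):
    print(f"offset {offset:4}in: one 'angular px' = {angular_px_length_in(offset):.4f} in")
```

Opposite the eye, one “angular px” is the familiar ~1/96in; a couple of thousand inches off to the side it balloons to several feet, and it keeps growing from there.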

What we are essentially describing is a spherical geometry, projected onto a plane. It’s certainly very interesting, but it’s definitely not how people expect a monitor to behave, or a piece of paper for that matter. A planetarium, maybe…

Also, all of this would have to change depending on the position of the observer! To do this properly, you’d have to make sure that as the observer moved their vantage point, the whole surface/projection adjusted itself to compensate. Although this would be a marvellous thing to see, particularly on a piece of paper, I don’t think it would catch on as a way of styling the web.

But that’s not what those people said!

Well, I did wonder about this a bit. To be fair, the original article does seem more confused than outright wrong. In particular, the formulae and the “Quick Conversion” bit in this article suggest that the author doesn’t actually think that px is a unit of angle in the sense that 1px equals some fixed number of radians. However, he almost seems to go out of his way to convince people that this is in fact exactly what he means, and a few people have clearly taken that impression away with them.

I’ll just finish by going through a few of the explicit claims made in that article and correcting them:

The title: “CSS px is an Angular Measurement” – I think I’ve covered this pretty thoroughly. Just in case there is any further doubt I’ll make the final observation that “px” is defined in the section called “Lengths”, of which the first sentence is “Lengths refer to distance measurements.”

The first sentence: “The “px” unit in CSS doesn’t really have anything to do with screen pixels, despite the poorly chosen name.” – this is obviously complete nonsense. The standards authors went out of their way to give browser authors a way to match the “px” unit up to the real-world device pixel (or some multiple/fraction thereof). The name is certainly not poorly chosen, any more than the others are (cm, in, pt, etc). They can all vary widely from their real world versions, depending on how the unit system is anchored, but they are supposed to reflect their real world equivalents in some, at least intuitive, sense.

“It’s actually an non-linear angular measurement.” – No. It’s a length measurement. It has a non-linear relationship to the angle subtended, because (in Euclidean geometry) the lengths of straight lines DO have a non-linear relationship to angles they subtend. This doesn’t mean the length measurement itself is in any way non-linear. Quite the opposite: if you double the number of px in your CSS measurement, you double the length that is implied. There is no non-linearity to it at all, except in its relationship to angles subtended, which is pretty much irrelevant to web design.
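The linearity of px, and the non-linearity of subtended angle, are easy to demonstrate side by side. This sketch assumes 1px equals the reference pixel and a 28in viewing distance:

```python
import math

# px-to-length is perfectly linear; length-to-subtended-angle is not.
# Assumes 1px equals the reference pixel, viewed from 28in.
d = 28.0
px_in = 1 / 96

def length_in(n_px):
    return n_px * px_in                 # linear: double the px, double the length

def subtended_angle_deg(n_px):
    return math.degrees(2 * math.atan(length_in(n_px) / (2 * d)))

print(length_in(2000) / length_in(1000))                      # exactly 2.0
print(subtended_angle_deg(2000) / subtended_angle_deg(1000))  # < 2: not linear
```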

“The formulæ to convert between radians and px are as follows…” – The formulae presented, if they are correct at all (which I haven’t checked), would only work in situations where 1px is actually equal to the reference pixel size. As discussed above, deviations from this frequently arise, for example to avoid 1px being a non-integer number of device pixels or because the system of units is anchored to something other than the pixel. It also assumes you are sitting at the “typical” distance from your display, whatever that is.

“That means that when you do “{ width: 24.3px }” in CSS 2.1, you’re making something as wide as the moon looks to be.” – No. What it means is more like “you’re making it 24.3 times as large as something that subtends 1/24.3 times the angle subtended by the moon”. Notice that the “subtends 1/24.3 times the angle” part is working with angles, whereas “24.3 times as large” is working with lengths. There’s also the assumption again that you are sitting at the “typical” distance from the display, and that 1px is equal to the reference pixel.
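For what it’s worth, the 24.3 figure is at least in the right ballpark under the article’s own assumptions: the moon subtends roughly 0.52 degrees, and dividing by the reference-pixel angle gives a number close to it (the 0.52 figure is my approximation, not something from the article or the spec):

```python
# Checking the moon figure under the article's own assumptions: the moon
# subtends roughly 0.52 degrees, the reference pixel about 0.0213 degrees.
MOON_DEG = 0.52
REF_PX_DEG = 0.0213
print(MOON_DEG / REF_PX_DEG)   # ~24.4, close to the quoted 24.3px
```

But, as argued above, the result is a count of reference pixels (lengths), not a multiplied-up angle.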

Well, I feel better for getting that off my chest. And I understand it all a bit better myself now, which is a bonus.

17 Responses to In CSS, “px” is not an angular measurement and it is not non-linear

CSS px-measured scaling is subject to really bad anti-aliasing effects. This happened a lot on Android devices that tried to cheat their way to a retina-density screen while packing fewer pixels than needed. Remember the failed previous-generation Nexus 7: the real pixel dimension is 1280×800, which translated to 960×600 CSS px dimensions (ratio 1.32). Bad line rendering ensues. If you go to a page with a lot of aligned lines, e.g. http://cubiq.org/dropbox/iscroll4/examples/simple/, you will see on the Nexus 7 that some lines are blurred, some are solid 1px lines, and some simply disappear. A similar thing happened with some Samsung Android phones: they simply don’t pack enough pixels while trying to achieve retina-like density. Horrible anti-aliased lines.

Sure, these are cases where the units are anchored to px but the CSS px has been set to a non-integer number of real pixels, a valid if perhaps not very successful implementation choice (using an integer number of real pixels here is only a ‘recommendation’ in the standard, but you can see why it’s recommended :-)). To ‘work around’ this would require media queries or device-customised content. It would also require a proper understanding of how CSS length units work. If you think CSS px don’t have any relationship to real pixels you’re not even going to understand why it’s happening, let alone how to avoid it.

What logic is there in making “1px” not 1 pixel? If you want a unit that scales with the resolution, you should be using pt or pc or in or cm or mm. If I write “px”, I want pixels, regardless of how it looks.

The creators of CSS were right not to include a “real pixel” unit. As Kevin Geng pointed out, it would cause problems on high-resolution displays.

More to the point, the desire to use the pixel as a unit of measurement betrays a fundamental misunderstanding of what a pixel is. See the memo by Alvy Ray Smith, “A Pixel Is Not A Little Square” (http://alvyray.com/Memos/CG/Microsoft/6_pixel.pdf)