Camera: Thinner, Faster, Better Low-Light

Section by Brian Klug

The iPhone 4S represented probably the single biggest leap in camera performance in the iPhone's progression. The combination of an 8 MP CMOS sensor with 1.4 micron square pixels, F/2.4 optics, and Apple's own ISP resulted in a great overall performer for its class. We predicted early on that optical performance would remain roughly the same with the next iPhone, and that largely turned out to be the case. With the iPhone 5, Apple's major design guideline seems to have been reducing z-profile, and one of the biggest obstacles to that goal was reducing the thickness of the entire camera system. Getting to a thinner optical system with the same performance characteristics is quite a challenge.

Superficially the iPhone 5 camera specifications are almost unchanged. We're still talking about 1.4 micron pixels (roughly two wavelengths of red light), not the smaller 1.1 micron pixels that are in the cards for the future. F-number remains 2.4, and total pixel count is still 8 MP. Focal length is shorter, as expected (this is a thinner system, after all), resulting in a slightly wider field of view. I've made a table with the relevant specifications for the iPhone 5 cameras.
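As a rough sanity check on the "shorter focal length means wider field of view" point, the diagonal field of view can be estimated from pixel pitch, pixel count, and focal length. The focal lengths below are assumed EXIF-reported figures, not official specs, so treat this as an illustration of the geometry rather than measured data:

```python
import math

def diag_fov_deg(pixel_pitch_um, width_px, height_px, focal_mm):
    """Diagonal field of view of an ideal thin-lens camera from sensor geometry."""
    diag_mm = pixel_pitch_um * 1e-3 * math.hypot(width_px, height_px)
    return math.degrees(2 * math.atan(diag_mm / (2 * focal_mm)))

# 8 MP (3264 x 2448) sensor with 1.4 micron pixels; focal lengths are
# assumed EXIF values, not official Apple specifications
fov_5  = diag_fov_deg(1.4, 3264, 2448, 4.10)   # shorter, thinner system
fov_4s = diag_fov_deg(1.4, 3264, 2448, 4.28)
print(round(fov_5, 1), round(fov_4s, 1))       # roughly 69.7 vs 67.4 degrees
```

A couple of tenths of a millimeter of focal length translates into a couple of degrees of field, which matches the "slightly wider" characterization.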

That's not to say the rear facing camera's performance is unchanged, however. Apple talked about dramatically improved low light sensitivity thanks to a low light boost mode. As we'll show later, this does make a big difference in overall sensitivity, thanks to the combination of 2x2 pixel binning at ISO 3200 to keep noise under control and better fixed pattern noise rejection in the ISP.

Apple claims that this is an entirely new ISP. Oddly enough, I found that the interface is still named the same (AppleH4CamIn ISP) as what I found on the 4S, but I don't doubt that there have been at least some tweaks, though this is still relatively opaque without lots of digging. During the keynote, Apple claimed that image capture is now 44 percent faster than on the 4S thanks to this improved ISP. The improvement is actually hard to measure; the iPhone 5 doesn't have a burst mode, so capturing quickly requires tapping as fast as you can. Shot to shot latency is essentially zero on the iPhone 5, gated only by how fast I can tap. I put together a short video comparing the 4S and 5.

The camera launches faster, HDR images capture quicker, and all around the iPhone 5 camera experience is just smoother and faster, which isn't a surprise.

Camera Performance Comparison

Property                       iPhone 3GS   iPhone 4   iPhone 4S   iPhone 5
Camera Launch Time (seconds)   2.8          2.3        1.4         1.2
HDR Capture Time (seconds)     -            4.9        3.2         1.6
Working Distance (cm)          -            7.0        6.5         6.1

Apple has once again gone with a Sony CMOS sensor for the rear camera, though this time (thanks to Chipworks) we know it ditched the IMX145 markings. Apple is frequently able to have its suppliers make specific one-offs with changes just for itself, and it's highly likely that's the case here. I have almost no doubt that the changes made to IMX145 accommodate this extremely high ISO (3200 is almost unheard of for pixels of this size due to noise) with some tweaks to the amplifier in each active pixel (which is what we really mean when we talk about a CMOS sensor versus a CCD). Either way, I have no doubt time will tell that this is an IMX145 derivative with some tweaks to both aid the low light boost mode and possibly hit Apple's desired chief ray angle if IMX145 couldn't do it already. Unfortunately, like so much in the smartphone space, there's very little in the way of open documentation for IMX145.

Apple claims that it's aligning optical elements with even tighter tolerances now, which makes a big difference in the kinds of optical designs that become feasible. There's also a sapphire front window in place of what was previously just optical glass. Sapphire of course has an extremely high surface hardness, second only to diamond, in addition to excellent chemical resistance and good transmittance. The real advantage here is again one of thickness: you can run thinner sapphire windows in place of standard glass windows and get better transmittance. Sapphire windows in optical systems are colorless and chemically composed of single crystal aluminum oxide (Al2O3). Upon inspection, the iPhone 5 sapphire does indeed appear to have an antireflection coating as well.

The story on the front facing camera is one of dramatic improvement. The iPhone 4 was the first iPhone to include a VGA front facing camera, which carried over unchanged to the iPhone 4S. That system was arguably good enough for FaceTime, which seemed to be its original reason for existing, but it finally gets updated to 1280 x 960 on the iPhone 5. The new CMOS is an OmniVision part with 1.75 micron pixels, topped with an F/2.4 optical system. Images captured on the front facing camera are dramatically better, and video is now 720p.

Before we talk about image quality, I'd like to make brief mention of the user experience on the iPhone 5 camera. I touched on how the interface is even faster than the 4S, thanks in part to the faster A6 silicon and improved ISP onboard, and Apple continues to keep things very minimalist, with virtually no options for changing shooting modes manually or configuring ISO. In addition, every image taken on any Apple camera is always captured at the highest resolution and quality setting. Essentially all of this functionality is abstracted away from the user, leaving the shooting experience fully auto all the time. This includes the low light boost mode, which kicks in below a certain threshold.

What's puzzling about the iPhone 5 user experience is that the aspect ratio of the viewfinder preview no longer matches the aspect ratio of the CMOS or the final images. As far as I can tell this is done purely for aesthetic reasons, to avoid the extreme letterboxing that would happen with a 4:3 image in a 16:9 viewport. Instead of giving you that letterboxed but 100 percent field of view preview, Apple crops off the top and bottom and presents a roughly 3:2 image in the preview screen.
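To put a number on how much of the frame the preview hides, here's a quick back-of-the-envelope calculation, assuming a full-size 3264 x 2448 capture and a same-width 3:2 center crop:

```python
# Full 4:3 capture versus the roughly 3:2 preview crop (assumed geometry)
full_w, full_h = 3264, 2448          # 4:3 sensor output
preview_h = full_w / 1.5             # same width cropped to 3:2 -> 2176 rows
hidden = 1 - preview_h / full_h      # fraction of rows the preview never shows
print(f"{hidden:.1%} of the image height is cropped out of the preview")
# -> 11.1% of the image height is cropped out of the preview
```

In other words, roughly one ninth of the vertical field you're actually capturing never appears on screen while composing.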

I encountered this somewhat unexpectedly while taking images of the ISO12233 chart and trying to align the chart with the 4:3 CMOS, and I was thoroughly confused, to the point of questioning my basic math skills, until I realized the preview was a crop of the real image area. As of this writing, in iOS 6.0 there is no way to double tap the preview and get an aspect-correct, 100 percent preview with letterboxing, like you can with 16:9 video in video capture mode. Instead, you're always locked to a 3:2 center crop of a 4:3 image. This makes absolutely no sense to me and means that composition in the preview screen will never quite match the end result. I sincerely hope an update adds a double tap gesture for an actual 4:3 preview.

One other new behavior I noticed on the iPhone 5: if you let the phone get too hot, it will disable the LED flash until the device cools down. I have never seen this behavior on the 4 or 4S.

Lastly, iOS 6 adds a panorama mode to both the iPhone 5 and iPhone 4S, a feature which has actually been lurking hidden in iOS for some time now. Panorama mode in iOS continually integrates over the field of view, first for the full field, then over a small center strip until you reach the end. The mode produces results that are at most 10800 pixels wide and around 2590 pixels tall, depending on whether you swept through the horizontal field of view without any vertical shift. In addition, the mode supports portrait panoramas if you rotate the camera 90 degrees and scan upwards.

I stuck my iPhone 4S, iPhone 5, HTC One X and Samsung Galaxy S 3 in the dual camera bracket and took a number of panoramas for comparison purposes. There's a surprising amount of difference between the approaches I see handset vendors taking for panorama. The One X takes a few exposures and stitches them together, the Galaxy S 3 does continual integration but produces strangely blocky results, and iOS continually stitches a small center strip together, as I mentioned already.

Still Image Quality Evaluation

To evaluate still image quality we turned to our standard set of tests, which seems to keep growing. That consists of a scene in a lightbox with constant, controlled illumination of 1000 lux, taken using the front and rear cameras with as close to the same field of view as possible; images of a distortion grid; a GretagMacbeth ColorChecker card for checking white balance; and an ISO12233 test chart for gauging spatial resolution in an even more controlled manner. Because I've moved houses and lighting will never be exactly the same, I have decided to move the three test charts into my lightbox rather than putting them on a wall and illuminating them with studio lights. This warrants a completely new set of comparison images, hence the smartphone 2012 camera bench for the three charts and front facing camera.

Let's start with the most objective test first: the tangential and sagittal spatial frequency crops. You can really see here that Apple's camera design team kept performance roughly the same between the 4S and 5; I can count up to roughly 16.5 (readings are in hundreds of line widths per picture height) on both devices. The Samsung Galaxy S 3 appears to also be around 16, along with the HTC One X. The iPhone 4 and Galaxy Nexus are at a huge disadvantage with their 5 MP CMOS sensors; I can see up to 15 or so before there's a contrast reversal from crossing through an MTF of 0. The PureView 808 actually outresolves the test chart both at full size and at the 8 MP on-device oversample; obviously you can't beat that device with just an 8 MP CMOS.
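For context, a chart reading of 16.5 (1650 line widths per picture height) can be compared against the sensor's theoretical ceiling. The sketch below assumes the usual convention that the Nyquist limit in LW/PH equals the pixel count along the picture height; the shortfall from that limit is what you'd expect from Bayer demosaicing and the anti-aliasing filter:

```python
# Rough resolution sanity check for an 8 MP (3264 x 2448) sensor.
# Assumes the common convention that the Nyquist limit, expressed in
# line widths per picture height (LW/PH), equals the pixel row count.
height_px = 2448
nyquist_lw_ph = height_px            # theoretical ceiling for this sensor
measured_lw_ph = 16.5 * 100          # chart reading of "16.5"
fraction = measured_lw_ph / nyquist_lw_ph
print(f"{fraction:.0%} of Nyquist")  # demosaic + AA filter cost the rest
```

Resolving on the order of two thirds of Nyquist is a respectable result for a Bayer sensor behind an anti-aliasing filter.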

The tangential frequency crops tell basically the same story, which isn't a surprise. It's shocking how close the iPhone 4S and 5 are here. I strongly suspect that team was basically ordered to keep MTF the same and just reduce the thickness of the 4S-era optical system. You can also look at the 100 percent size versions of the tangential and sagittal crops, rather than these, which are resized to a maximum of 600 pixels wide to fit online.

From the rest of the test charts we can see the iPhone 5 has slightly more pincushion distortion than the 4S in the distortion chart, but not a whole lot. It is also evident from the GMB chart and other photos that I’ve taken over my time with the iPhone 5 that the revised ISP also has better auto white balance.

The remainder of the well-lit tests tell the same story. In outdoor lighting (with both cameras automatically selecting the same ISO and exposure time) I can't find any major difference in camera performance between the 5 and 4S; they're very close. Apple also changed the LED diffuser design with the iPhone 5; it is visibly different and now results in a much more even field of illumination in the lights-off lightbox test.

On the front facing camera the increase in resolution and overall quality is dramatic, however.

It is in low light performance that the 4S and 5 radically diverge, thanks to the low light boost mode which kicks in automatically at a preset threshold on the iPhone 5. You can tell when this happens just by looking at the preview, since there's a sudden, dramatic shift in exposure. The iPhone 5 does a 2x2 pixel bin, then upscales that image to the same full resolution as a normal 8 MP capture (3264 x 2448). The result trades spatial resolution for lower noise without an inordinately long integration time. According to EXIF, the iPhone 4 will go to a maximum of ISO 1000 at 1/15th of a second, the 4S to a maximum of ISO 800 at 1/15th of a second, and the iPhone 5 to between ISO 1600 and ISO 3200 at 1/15th of a second. The difference is quite dramatic, as expected.
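Here's a minimal sketch of what 2x2 binning does to a sensor readout, assuming a simple luminance array and straight averaging. The actual on-sensor implementation, including how the Bayer color pattern is handled, isn't publicly documented, so this only illustrates the principle:

```python
def bin_2x2(pixels):
    """Average each 2x2 block of a row-major 2D list, halving both
    dimensions: four photosites contribute to one output value, which
    boosts effective sensitivity and averages down random noise."""
    h, w = len(pixels), len(pixels[0])
    return [
        [
            (pixels[y][x] + pixels[y][x + 1]
             + pixels[y + 1][x] + pixels[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A 4 x 4 readout becomes 2 x 2; each output pixel averages four inputs
sample = [[10, 12, 20, 22],
          [14, 16, 24, 26],
          [30, 32, 40, 42],
          [34, 36, 44, 46]]
print(bin_2x2(sample))  # [[13.0, 23.0], [33.0, 43.0]]
```

Upscaling the binned result back to 3264 x 2448 restores the nominal pixel count but not the lost spatial detail, which is exactly the tradeoff described above.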

I took samples in the lightbox at a controlled 4 lux, shooting the test target with the phones I selected, and enabled low light modes wherever the camera UI offered one. On the PureView 808 I manually forced the maximum ISO of 1600, since there is no low light preset. The resulting image from the PureView isn't a mistake; that's what it actually looks like in both full and PureView modes. It's interesting to see how the different cameras handle this extreme low light. The Samsung Galaxy S 3 shoots at ISO 640 and integrates over a full half second to produce its result, the PureView 808 takes the ISO 1600 I set and integrates over a full second (according to EXIF), and the One X goes to ISO 1200 but doesn't report how long it integrates; meanwhile, all three iPhones select a maximum exposure time of 1/15th of a second and their respective maximum ISOs. Considering the exposure times of some of those cameras are far too long to hand hold (I use a tripod for these comparisons), I would say that Apple's cap of 1/15th of a second makes a lot of sense.
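Multiplying ISO by integration time gives a crude relative-exposure figure that makes the tradeoffs above easy to compare. This ignores differences in aperture, pixel size, and processing, so treat it as a rough illustration only:

```python
# Crude relative exposure: ISO x shutter time, normalized to the iPhone 4S.
# Ignores aperture, pixel size, and ISP differences -- illustration only.
shots = {
    "iPhone 4":     (1000, 1 / 15),
    "iPhone 4S":    (800,  1 / 15),
    "iPhone 5":     (3200, 1 / 15),
    "Galaxy S 3":   (640,  1 / 2),
    "PureView 808": (1600, 1.0),
}
baseline = shots["iPhone 4S"][0] * shots["iPhone 4S"][1]
for name, (iso, t) in shots.items():
    print(f"{name}: {iso * t / baseline:.1f}x")
```

By this crude metric the iPhone 5 gathers roughly 4x the exposure of the 4S without any increase in shutter time, while the phones that beat it do so by integrating for a half second or more, well beyond hand-holdable territory.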

Before we depart still image quality entirely, I think it's worth visiting the evolution of all the iPhones, from the original generation to the latest and greatest iPhone 5. We've come a long way in a short time since 2007, from 2 MP cameras that basically crammed a webcam module into a smartphone to 8 MP shooters with custom optics and ISPs that are now arguably good enough to take the place of a point and shoot. Things haven't entirely plateaued yet, either.

Purple Haze

The final thing I'd like to talk about regarding still image capture on the iPhone 5 is the so-called "purple haze," a purple glare which sometimes appears with a bright light source placed just outside the field of view of the camera. When this started getting public attention, many assumed that light was somehow picking up purple from the sapphire cover glass. I guess this assumption caught on quickly because many sapphire gemstones have a purple cast? Regardless, the reality, as I touched on earlier, is that optical grade sapphire windows, whether for expensive wristwatches or camera systems, impart no color on light passing through them. In fact, when I saw this I immediately tweeted that this was merely a matter of some stray light bouncing around inside the camera module and picking up a purple cast, probably from magnesium fluoride (MgF2, a very common antireflection coating choice that looks purple) or some other antireflection coating. Note that these coatings are designed to work over a limited range of acceptance angles; outside that range they can indeed reflect, in spite of the name.

The iPhone 5 does exhibit this a bit more than the 4S, but that's to be expected given the wider field of view and larger chief ray angle. Photographers are used to using a lens hood, or simply shielding the camera with a hand, to block stray light from reflecting around inside an optical system and creating this type of glare; obviously, in the case of a smartphone, nobody is realistically going to attach baffles or a lens hood (maybe there's a market for that, though). Note that it is not correct to call this a chromatic aberration; it's just stray light that has picked up some color.

The two circular purple artifacts are clearly reflections

I captured two photos which to me conclusively prove this is an internal reflection of some kind (in case you don't believe the Apple support statement, which parrots what I've said already). The first photo shows the purple flare that most people see; in the second, I've tilted the phone down slightly and the purple now shows up as two circles, which to my optical engineering eyes instantly read as two visible reflections. I was actually going to set up an optical bench and track the angle, until I stumbled upon this while playing with the camera during a late night trip to CVS. Again, all of this is easily mitigated by blocking the stray light.
