“I’m a reformed professional photographer that has shot just about every kind of camera from film to digital, professional and pocket. Weddings, portraits, landscape, wildlife, sports, industrial, you name it. I’ve processed film and prints by hand and machine and have taught photography as well,” Matthew Panzarino writes for TechCrunch. “I don’t know everything photographic there is to know, far from it, but I’ve been around a bit.”

“Unfortunately, cameras from many other phone companies like Samsung and Motorola simply don’t match up to the quality of images coming out of the iPhone. I’ve tried many, many different Android devices over the years which promised better images but none have delivered,” Panzarino writes. “Over the last few years, the iPhone has really become my go-to camera. The DSLRs have sat on the shelf and even a compact Panasonic 4/3 camera only comes out infrequently. This means that when Apple introduces a new device I’m all ears when it comes to what they say about its camera. The iPhone 5S is no exception, and there is some pretty great stuff here. Obviously, this is not a review of the camera, just an exploration of the specs and what they might mean for other iPhoneographers.”

Apple’s all-new iPhone 5s featuring True Tone dual-LED flash

Panzarino writes, “True Tone Flash: This thing is the crown jewel of the new iPhone’s camera capabilities, in my opinion. Yes, many people will probably still avoid using a flash, but the sheer engineering prowess here is insane.”



The dual flash. I have seen dual flash on other phones, so I don't understand: what have the others been doing with their resources that makes Apple's dual flash a first? I mean, did they say, "we aren't going to use dual flash until it actually does something"? Or did the others just suck at doing it right?

How can someone not see that using two tones, or even better three tones could actually make better pictures?

That’s the same way I felt when Apple announced it :) I could not believe that no one had ever thought of doing this before. It just makes sense. But then again, that’s the innovation at Apple: giving us features we didn’t know we needed and will use every day.

Other cameras have had two-tone flash, but the results fall very short of its potential. Doing it right requires intense real-time image analysis to select the right mixture of white and amber light for the best and most natural-looking exposure, and that requires a unique combination of very good software and raw computational power.
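To make the idea concrete, here is a minimal sketch of how a two-LED mix might be chosen. This is not Apple's actual algorithm; the function names and the LED color temperatures (5500 K white, 2700 K amber) are my own assumptions. Blends of two illuminants are roughly linear in mired space (1,000,000 / Kelvin), so the weighting falls out of a simple linear solve:

```python
# Sketch (NOT Apple's implementation): pick a blend of a cool white LED
# and a warm amber LED whose combined color temperature approximates the
# measured ambient light. Mixing is done in mired space, where blends of
# two illuminants are approximately linear.

def mired(kelvin):
    """Convert color temperature in Kelvin to mireds (micro reciprocal degrees)."""
    return 1_000_000 / kelvin

def flash_mix(ambient_k, cool_k=5500, warm_k=2700):
    """Return (cool_weight, warm_weight) summing to 1.0 that best matches
    the ambient color temperature, clamped to the achievable range."""
    target = mired(ambient_k)
    cool, warm = mired(cool_k), mired(warm_k)
    # Solve target = w * cool + (1 - w) * warm for the cool-LED weight.
    w = (target - warm) / (cool - warm)
    w = max(0.0, min(1.0, w))
    return w, 1.0 - w

# A tungsten-lit room (~3000 K) calls for a mostly amber flash.
cool_w, warm_w = flash_mix(3000)
print(f"cool {cool_w:.2f}, warm {warm_w:.2f}")
```

For a 3000 K tungsten-lit room this lands at roughly a 20/80 cool/warm split, which matches the intuition that warm indoor scenes want a mostly amber flash.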

It’s fakery, have no doubt. But Apple nailed it brilliantly. Only they have the intent and patience to calibrate the color for this technology. No other company would even think of bothering. They’d just toss out a rough estimate and expect the customers to eat it and like it. It’s all a matter of RESPECT for both one’s self and the customer.

White balance is really hard for a computer to get right, where “right” is defined as when a human looks at the picture and says, “yeah, that’s how it looked.” First off, the camera has a linear response to light while the human eye has an essentially logarithmic response. In practice, the human eye can adjust for brightness on a per-“pixel” basis where a camera cannot. Because of that, when the camera tries to assess white balance, it will get it wrong whenever there is no true white or true grey in the scene (digital photographers look for 18% grey, though it’s becoming less important with newer technology). Older iPhones really couldn’t get white balance correct unless you made sure there was something white in the photo: I’ve stuck the corner of a piece of paper into a scene just long enough for the iPhone to adjust white balance, then hurriedly removed it and snapped the picture.

And that’s just with available light. When your flash throws light of a different temperature than the available light, the camera, being linear in response, records the color of the flash (usually a lot bluer than the ambient light) bouncing off the shiny surfaces (foreheads, noses, etc.) while receiving the ambient light from everything else. That’s why the intro showed the two photos, one of which was quite “flashy” in appearance.

So what Apple has done right, probably thanks to good engineering and the speed of the camera and processor in the iPhone, is correctly interpret the ambient light and use that to set the power levels of the two flashes, all while accounting for what those two flashes will themselves contribute to the scene.
I don’t have inside knowledge of Apple’s design, nor that of other phones, but looking at Apple’s moves (the London photo shoot this weekend) I’m convinced they are applying the available technology in a way guaranteed to outsmart the other phones and the professional camera systems costing many thousands of dollars.
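A concrete illustration of why a white or grey reference in the frame helps: the classic “gray world” heuristic, one simple way cameras estimate white balance, assumes the scene's average color is neutral and scales each channel to match. (This is a sketch of the general technique, not what any particular iPhone does.) When nothing in the scene is actually neutral, the assumption, and therefore the white balance, goes wrong:

```python
# Sketch of the "gray world" white-balance heuristic: assume the scene's
# average color is neutral grey and scale each channel so the averages
# match. A white/grey reference in frame makes the assumption hold;
# a strongly colored scene breaks it.

def gray_world_gains(pixels):
    """pixels: list of (r, g, b) tuples in linear light.
    Returns per-channel gains normalized to the green channel."""
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    return tuple(avg[1] / avg[c] for c in range(3))  # gains for r, g, b

# A scene lit by warm light: the red average is inflated.
scene = [(200, 120, 80), (180, 110, 70), (220, 130, 90)]
gr, gg, gb = gray_world_gains(scene)
# Applying the gains pulls the channel averages toward neutral grey.
balanced = [(p[0] * gr, p[1] * gg, p[2] * gb) for p in scene]
```

Here the warm cast yields a red gain below 1 and a blue gain above 1, pulling the image back toward neutral, exactly the correction an accurate flash mix tries to make unnecessary.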

Golum, when you describe having seen dual flashes on other phones, have they been implemented in the same way Apple has done on the iPhone 5s? (I mean having two flash bulbs encapsulated in one enclosure.)
I have what may look like a dual flash on my compact camera, but the two are separate. One determines the distance of the subject via infrared whilst the other bathes the subject in light depending on the setting. So, if my intention is to eliminate red eye, then the infrared light will be brighter and on for longer, so that the subject’s eyes focus on the light before the actual flash bursts at least three times, by which time the photo has been taken.

The principle of having colour-toned flashes arises from which colours are absorbed and which are reflected to create what we perceive as colour, a principle exploited very successfully by colour television. TV depends on light travelling through filters to our eyes to give a perception of colour, whilst in photography the principle is that of light bouncing back from the subject, giving us a perception of colour depending on which colours were absorbed by the subject and which were reflected.

I do not claim to be an expert on this subject, but for the sake of discussion have contributed nonetheless.

It’s usually 3-4 months before the geniuses realize that they didn’t get it…when the figures start speaking for themselves and the lines don’t subside and the stores are packed with people clamoring to touch and feel and wow and the new phones pop up everywhere…

Don’t you recall the doom forecasts for the iPhone 5 and so many other products and software Apple has released? All on deaf ears and mutt brains.

I love that other phone makers keep cramming more megapixels into their phones. The sheer file sizes will fill up the phone fast, not to mention the images are so big you can’t email them, and if you could, you’d eat up your data.

Exactly. I’ll use my own prosumer camera as an example. My 13 year old Olympus C-3040 is a 3.3 megapixel camera. Over the years I have borrowed much newer cameras that had 4, 6 or 10 megapixels. When I compared them to my lowly 3.3 camera, they didn’t look any better and some looked worse.

The reason for this is the Olympus had excellent quality optics and a very good CCD. I printed an 11 x 16 from that camera once and just about everyone who looked at it thought it was from film.

So Apple is on the right track. Make every step in taking a photo as good as possible. One weak point will ruin the shot.

As a professional photographer myself, I agree. The True Tone Flash and the ability to take multiple photos, examine each one for sharpness, and then automatically merge the sharpest parts of each photo is nothing short of insane!
Can’t wait to try it out for myself.
If another phone/camera company made this claim I would be skeptical, but I think Apple can actually pull this one off.

I’ve never bought an iPhone for its camera. I have professional lenses and have had many professional cameras and know how to use them.

However, when I watched the presentation I came away thinking that Apple has jumped ahead of the “real” camera companies in some areas. Of course nothing beats the larger sensor, quality lenses, and versatility of a DSLR, but if I decide to buy the 5s, the camera quality will be a major factor in that decision.

It does have its limitations, so it can’t replace a real DSLR, but it will be nice to have a quality camera with me all the time without having to lug around a heavy backpack.

Apple is just continuing its practice of offering last year’s technology at a lower price. The only difference is that Apple put that tech in a new colorful plastic case, to further increase differentiation. Apple obviously wants as many customers as possible to pay an extra $100 to get the latest and greatest iPhone. But I think you’ll be surprised by how popular the iPhone 5c is with customers who don’t care so much about pure specs.

Clearly. But I have been trained to know what there is to know. I can tell you that the phrase “True Tone Flash” makes me ill on several levels, because I know exactly what technology is being used on the iPhone 5S. Dig into the gory technical details and you’d get nauseous as well.

But I have to say, that for shooting from the hip with kind of good enough technology, Apple is remarkably innovative. I’ve NEVER seen anyone pull this trickery to sort of fake the correct color temperature. It’s BRILLIANT. Obviously they have calibrated this trickery rather nicely.

That is NOT to say that the resulting photo images will have ACCURATE color. They damned well will NOT! But this isn’t technical photography. It’s consumer photography. And Apple’s iPhone 5S image results clearly are a coup over the wannabe competitors. That’s my Apple.

Derek: you are FAR more educated and experienced than I am on this, but even I think you should wait and give it a try yourself before making some of these comments. In your expert hands, I wonder if the iPhone 5s might not surprise even you.