SAY CHEESE! —

Google is working on a new Android camera API with support for camera RAW

Could a new API shore up Android's biggest weak spot?

The Nexus 5 camera was a huge disappointment, especially after comments from high-ranking Googler Vic Gundotra stating that "we are committed to making Nexus phones insanely great cameras. Just you wait and see."

That was nine months ago. We waited and saw, and what showed up on the Nexus 5 wasn't very good. There may be an explanation for this, though. According to commits in the public Android source code, which were first spotted by Josh Brown on Google+, Google is working on a new camera API for Android. Work on the new API started in December 2012, which would make it seem targeted for KitKat, but about a month before the new OS's release, the API was pulled from Android's framework code. The commit that removed the API from the release Android code is here, with the comment saying:

DO NOT MERGE: Hide new camera API.
Not yet ready.
Bug: 11141002

This commit was pushed on October 11, about a month before the release of KitKat. A month before release was probably "feature freeze" time, when work on new features stops and everyone focuses on fixing bugs in time for release. The camera revamp didn't make the cut, so KitKat shipped with the original camera API.

The really good stuff is in the initial commit, which contains tons of documentation about the new camera setup. There's a new API package called "android.hardware.photography" (the current camera functionality lives under "android.hardware.camera"), and with the fancier name comes fancier capabilities:

Full-capability devices allow for per-frame control of capture hardware and post-processing parameters at high frame rates. They also provide output data at high resolution in uncompressed formats, in addition to compressed JPEG output.
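
"Per-frame control" means each captured frame can carry its own hardware settings (exposure, sensitivity, and so on) rather than one global camera configuration. Here's a minimal sketch of the concept in Python; every name in it is a hypothetical illustration, not the actual Android API:

```python
# Sketch of per-frame capture control: each frame in a burst gets its own
# request instead of inheriting one global configuration. All names here
# are hypothetical illustrations, not the real android.hardware.photography API.

def capture_burst(requests):
    """Simulate a sensor that honors a distinct settings request per frame."""
    frames = []
    for i, req in enumerate(requests):
        frames.append({
            "frame": i,
            "exposure_ns": req["exposure_ns"],
            "iso": req["iso"],
        })
    return frames

# An exposure bracket: three consecutive frames, each with its own settings.
bracket = [
    {"exposure_ns": 10_000_000, "iso": 100},
    {"exposure_ns": 20_000_000, "iso": 100},
    {"exposure_ns": 40_000_000, "iso": 400},
]
frames = capture_burst(bracket)
```

Something like this is what makes features such as HDR bracketing possible at full frame rate: the app queues differing requests instead of reconfiguring the camera between shots.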

The new camera API has a backward-compatibility mode for older devices, but "full-capability" devices now have access to a few new picture formats. The only new image format listed that isn't present in Jelly Bean is camera RAW:

General RAW camera sensor image format, usually representing a single-channel Bayer-mosaic image. Each pixel color sample is stored with 16 bits of precision.

The layout of the color mosaic, the maximum and minimum encoding values of the RAW pixel data, the color space of the image, and all other needed information to interpret a RAW sensor image must be queried from the {@link android.hardware.photography.CameraDevice} which produced the image.

Smartphone cameras normally output JPEG files, which are compressed, mostly finalized images. RAW is minimally compressed and unprocessed, so shooting in RAW gives the photographer much more flexibility after the picture is shot. Programs like Photoshop can do much more with a RAW file than with a JPEG.
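
The flexibility comes largely from precision: the documentation above says RAW samples are stored with 16 bits per pixel, while JPEG quantizes to 8 bits and collapses nearby tones for good. A toy demonstration (real pipelines also involve demosaicing and tone curves, so treat the numbers as illustrative only):

```python
# Why RAW gives editing latitude: JPEG quantizes the sensor's 16-bit linear
# samples down to 8 bits, merging nearby tones. A later shadow push can still
# separate them in RAW, but not once the JPEG encode has merged them.
# Toy math only -- real pipelines also demosaic and apply tone curves.

def to_8bit(sample16):
    """Quantize a 16-bit sample (0-65535) to 8 bits, as a JPEG encode would."""
    return round(sample16 * 255 / 65_535)

# Two distinct dark sensor readings that land in the same 8-bit bin:
a16, b16 = 3_000, 3_100
a8, b8 = to_8bit(a16), to_8bit(b16)          # both become 12

# Push the shadows up three stops (x8) before quantizing: the RAW data
# still separates the tones, but the JPEG versions are stuck together.
pushed_from_raw = (to_8bit(a16 * 8), to_8bit(b16 * 8))   # (93, 96) -- distinct
pushed_from_jpeg = (min(a8 * 8, 255), min(b8 * 8, 255))  # (96, 96) -- identical
```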

Camera RAW is not totally unheard of on a mobile phone; Nokia's upcoming Lumia 1520 will be able to shoot RAW, for instance. Besides making Photoshoppers very happy, the RAW file could be passed to an even more powerful on-board photo editor, which Google seems very keen on building into Android and Google+.

The new API also supports face detection. This feature includes bounding boxes around faces and center coordinates for the eyes and mouth. In addition to the face-focus capabilities, the system can assign unique IDs to each face (provided they stay on screen) so developers could do things like assign silly hats to multiple faces in a video feed. While you may have seen face detection on some Android devices, those were all solutions built by Android OEMs.
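
One plausible way to keep a face's ID stable across frames is simple nearest-neighbor matching against the previous frame's detections. The sketch below is purely illustrative — none of these names come from the Android API:

```python
# Sketch of stable face IDs across video frames: match each detected face
# center to the nearest face from the previous frame; unmatched detections
# get fresh IDs. Purely illustrative -- not the Android face-detection API.

import math
from itertools import count

_next_id = count(1)

def assign_ids(prev, detections, max_dist=50.0):
    """prev: {face_id: (x, y)} centers from the last frame.
    detections: list of (x, y) centers in the current frame.
    Returns a new {face_id: (x, y)} mapping with stable IDs."""
    current = {}
    unclaimed = dict(prev)
    for center in detections:
        # Find the closest previously seen, still-unclaimed face.
        best = min(unclaimed,
                   key=lambda fid: math.dist(unclaimed[fid], center),
                   default=None)
        if best is not None and math.dist(unclaimed[best], center) <= max_dist:
            current[best] = center          # same face, slightly moved
            del unclaimed[best]
        else:
            current[next(_next_id)] = center  # a new face entered the frame
    return current

frame1 = assign_ids({}, [(100, 100), (300, 120)])      # two faces: IDs 1 and 2
frame2 = assign_ids(frame1, [(110, 105), (295, 125)])  # same faces, moved a bit
```

With stable IDs like these, an app can pin a sticker or "silly hat" to face 1 and have it follow that face from frame to frame.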

There's support for burst mode, too—another feature that you would swear was already included in Android but isn't. On Nexus devices, the only "burst mode" involves the user pressing the shutter button really fast.

The camera device is removable and has been disconnected from the Android device, or the camera service has shut down the connection due to a higher-priority access request for the camera device.

The strangest new feature is probably support for a removable camera. We can't recall a single Android device of any kind that has had a removable camera, so feel free to leave your suggestions in the comments.

The most important possible improvement that wouldn't be visible in the source code: image quality. Android cameras arguably lag behind the iPhone in quality, so this new API may be Google's solution to that problem. Android's subpar image quality seems to be an across-the-board problem, so maybe the issue really is as low-level as the camera API. There's no way to be sure, though, until we get finished software and devices in our hands. With documentation using phrases like "substantially improved capabilities" and "fine-grain control," it certainly sounds like Google is out to fix Android's digital-imaging woes.

Ron Amadeo
Ron is the Reviews Editor at Ars Technica, where he specializes in Android OS and Google products. He is always on the hunt for a new gadget and loves to rip things apart to see how they work. Email: ron.amadeo@arstechnica.com // Twitter: @RonAmadeo

Oof, just imagine your monthly bill if you switch to RAW output but leave your phone set to auto-upload your images to G+.

Actually, in Google+ you can choose to auto-upload only while on Wi-Fi... I believe you can even set it to upload only on Wi-Fi and only while charging! So... no, data overages shouldn't be a problem. (And I believe G+ recognizes RAW images if you upload Canon or Nikon files to your Google Drive... so I'd think they'd not have a problem with their own, which hopefully would be in .DNG format so we could edit in Lightroom or Aperture.)

As for whether RAW will be useful... I say YES!!! The white-balance issue alone is big... you have to realize that a lot of data is thrown away (as in UNRECOVERABLE) when the raw image data is converted to JPEG. Yes, some minor correction is possible... but not at the same quality. I don't know how many images I've taken with on-camera flash that are blue or some other unrealistic color due to poor white balance... if I can correct for that AND retain full image data, those pictures will be better. As far as useful on a small sensor? I say ANY amount of recovery would be helpful... I'll take whatever improvements I can get to what I have... I don't expect DSLR quality just because it's RAW... I expect RAW to give me some latitude with my edits... letting ME decide the edits instead of the built-in processor, and hopefully make my picture rise above the rest who choose not to.

Unless the sensor is very high quality, IMHO there will be very little benefit in shooting RAW.

I doubt the Samsung Galaxy Camera (nice name, Sammy) will be the last camera that uses the Android interface. A camera has to run some kind of OS, why not Android? This makes a manufacturer's decision to utilize Android a bit easier if they want RAW to be a feature.

Of course, the quality will have to be good, which isn't going to happen today.

At the end of the article it mentions improvements to facial recognition. This can only be a good thing, as currently every attempt at facial recognition I have seen (Aperture, G+, iPhoto) fails miserably in many ways. Non-white, non-Anglo faces are particularly hard, apparently. The (false) stereotype of "they all look the same" seems to apply to facial recognition when it comes to my Chinese friends. I find it infuriating that these applications seem to lump both male and female faces, from a large number of subjects, into one badly recognised facial type. Even after training it, I still can't get Aperture to reliably recognise a single person. G+ is worse in my experience.

And don't get me started on how these apps miss faces completely if they are looking to the side.

Generational pictures cause problems as well (father/son etc.). Or people with beards. Or glasses. Or pretty much anyone who isn't in the Apple promo shots of this tech working.

If Google can crack this issue, and make it reliably recognise people to the point where it requires minimal effort on my part, then I will be a very happy bunny, except if it is then used for further advertising purposes and I will be sad again.

God knows how the police and military get on. The number of "is this Terrorist X?" false alerts that get raised must be huge (exaggerating for effect).

U.S. Internet service price/performance ratio = low, compared to most of the world.
Cellular internet service = severely capped, expensive, and not always available outside of major cities.
Size of RAW image data = high.
Nexus devices don't allow outboard SD card storage; you're stuck with the limits they decide upon for the market.

I'm not interested in yet another bandwidth-chewing, cloud-storage-based service until infrastructure improves. I'm sure it's helpful for some people, but it's a non-starter where I live. "Oh, I took 5 photos today. I'll just wait 30 minutes for those to upload to Google's servers."

I live 2 hours from the Twin Cities, and I pay $120/mo for 10/1 ADSL (the best possible service in the area). Spotty 3G is the best we get out here for mobile use. I'd be more interested in the JPEG2000 algorithm becoming standard than the option of RAW, or an external SDXC card option, which, sadly, seems to be going away on the flagship devices.

Several reasons for improving the API come to mind:
1) Google Fiber compatibility, since bandwidth limits go out the window. If a scientist wanted to film at 1,000 frames/sec in high definition and stream the uplink to his Drive, he could do that.
2) Knowing that amateur/pro photographers all think they are Ansel Adams, maybe they want to run a test vs. his RAW data before releasing the API.
3) For security cameras, maybe Google wants to support high-definition streaming video uplinks from 50 RAW cameras at a time.

There's support for burst mode, too—another feature that you would swear was already included in Android but isn't.

Now this could just be an LG app and not the stock Android one, but the camera app on my Lucid has "Continuous" mode, which is the same thing. I hold down the button and it takes up to x pictures in a row.

What I hate about it (the app) is that it lacks any sort of control over how compressed the images are, which results in images that look like complete crap at native res more often than not.

I'd also be more interested in seeing phone makers stop this silly race of more megapixels and focus on better quality sensors and glass. I'd be happy with smaller but sharper/more accurate pics.

I have to wonder how much help RAW really will be for a small-sensor camera. There are a couple of advantages to saving RAW images and post-processing. One is that DSLR-grade sensors have more dynamic range than what can be easily saved in an 8-bit JPG file, so post-processing can allow you to recover details that may have otherwise gotten clipped in the highlights or lost in the shadow noise. Small sensor chips don't have nearly the same dynamic range, so you don't have the information to start with.
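
The highlight-recovery point can be shown with toy numbers: if the camera's JPEG rendering applies a fixed exposure gain before quantizing to 8 bits, bright tones clip to 255 and become indistinguishable, while reprocessing the RAW with less gain keeps them separate. The gain value and the math below are illustrative assumptions, not any real camera's pipeline:

```python
# Toy illustration of highlight recovery: a fixed exposure gain baked into
# the in-camera JPEG clips bright 16-bit sensor readings to pure white.
# Reprocessing the RAW with one stop less gain preserves the distinction.
# All numbers are hypothetical; real pipelines use tone curves, not min().

JPEG_GAIN = 2.0   # hypothetical gain baked into the camera's JPEG rendering

def render(sample16, gain):
    """Apply an exposure gain to a 16-bit sample, then quantize to 8 bits."""
    return min(round(sample16 * gain * 255 / 65_535), 255)

sky, cloud = 40_000, 55_000    # two distinct highlight readings off the sensor
# In-camera JPEG: both clip to pure white -- the cloud detail is gone for good.
jpeg = (render(sky, JPEG_GAIN), render(cloud, JPEG_GAIN))          # (255, 255)
# RAW reprocessed with one stop less gain: the two tones stay separate.
recovered = (render(sky, JPEG_GAIN / 2), render(cloud, JPEG_GAIN / 2))
```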

The other main advantage of RAW, which would apply to a phone camera, is that the white-balance corrections aren't baked in (as they would be for JPGs). That makes color-correcting for different lighting a lot easier. If you don't care about that sort of thing (and I imagine most people just keep white balance set to "auto" and forget about it), RAW doesn't really buy very much.

I agree that RAW isn't going to do much for most end consumers who are delighted with Instagram. I find that none of the smartphone cameras can even match the IQ of compact digital cameras let alone DSLRs. I haven't actually seen the Lumia 1020. That may indeed be a game changer.

Having said that, from photos that I see (especially from HTC cameras), highlights get clipped all the time. I don't know whether that's because the sensors don't have the dynamic range or because they don't have underlying support from Android to get those pixels. If it's the latter, then this may lead to better out-of-the-box images from the phone manufacturers themselves. So while end users may not do any post-processing using RAW images, the built-in software may provide some benefit. Either way, worth watching.

Most new digital cameras, and even DSLRs and mirrorless models, get Wi-Fi built in or as an option. If Google publishes a good set of APIs, then it will become much easier for these cameras to be controlled by and communicate with an Android phone. Think Sony PlayMemories Mobile and the QX10/100, but much more "open." Google will not need to wait for LG or Samsung (or even Motorola, for that matter) to improve the in-phone camera and will still be able to capture at the best-ever quality... using the most popular or the most niche camera apps. Those images will then be stored on Google+, edited using some even better online editing tools... or edited by Photoshop Touch or even better apps on a tablet!

The imagination is limitless and I am looking forward to seeing that happen!