
32 Comments

As with everything else, Google arrives late and indecent. Hopefully this will be an improvement; I am sick and tired of the Android camera continuously refocusing and ruining footage even when the phone is on a stand shooting a stationary object. It is plainly broken.

BTW, still waiting for the low-latency audio Google has been promising for something like five years now... Pathetic, considering Android is just Linux, and the entire low-latency audio stack for Linux has been around for quite a while.

It "was coming" with JB and KK as well. My expectations have been tuned down to save myself the disappointment: they will probably shave a few ms off the latency, but not deliver truly low-latency audio that would be usable for music-making applications.

I've read about them "rewriting" the audio system. Hopefully they did not just rewrite it, but rewrote it right and decoupled it from the VM and the Java runtime, which is the reason it was so laggy to begin with. It was far from low-level, even when using the ALSA C API. Usable low-latency audio will probably require a real-time kernel, which unfortunately I don't see happening with Android. That means expected latency won't drop below ~20 ms, which is still pretty high; 10 ms and below would be nice. I've been able to get 2 ms of input and 3 ms of output latency, for a total of 5 ms, on Linux with JACK and an RT kernel on a machine about as powerful as current high-end Android devices.
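For context on where figures like that come from: in a JACK-style period-based setup, round-trip latency is just frames per period times the number of buffered periods, divided by the sample rate. A back-of-envelope sketch (the 48-frame / 48 kHz numbers below are illustrative assumptions, not measurements from any specific device):

```python
def roundtrip_latency_ms(frames_per_period, sample_rate, in_periods, out_periods):
    """Round-trip audio latency in milliseconds for a JACK-style
    period-based buffering scheme (back-of-envelope math only)."""
    total_frames = frames_per_period * (in_periods + out_periods)
    return total_frames / sample_rate * 1000.0

# e.g. a 48-frame period at 48 kHz, 2 periods of input + 3 of output buffering:
print(roundtrip_latency_ms(48, 48000, 2, 3))  # 5.0 ms
```

This is also why a ~20 ms floor is disappointing: it corresponds to several hundred frames of buffering that a well-tuned RT setup simply doesn't need.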

They still aren't low enough. They admitted as much, though it is around 20 ms as it stands in L. They are now using OpenSL ES for audio (the audio counterpart of OpenGL ES). You can watch it for yourself if you look up the I/O schedule and find the segment on audio.

I'm really excited about RAW support in upcoming phones. Here's a 100% crop comparison I made between a RAW out of the OPPO 7a that I ran through Lightroom and an out-of-phone JPEG. You can clearly see the additional sharpness in the bricks, foliage and asphalt.

i2.minus.com/imZpFpS3b7PnH.png

This is the 13 MP Sony IMX214 sensor that is probably in half of the Android devices that came out this year.

Colors and gamma are slightly off in the RAW version because Lightroom didn't have full support for the Oppo when I processed it.

The quality of the image has little to do with the JPEG format itself; the implementation of the image-processing pipeline is the main factor.

One can convert RAW directly to JPEG with no visible difference in the result. RAW allows a better image in the end because it lets you bypass the often weak image-processing software in the camera: the image can then be developed in software like Lightroom and finally burned into a good JPEG.
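To make the "develop it yourself" point concrete: RAW development is essentially demosaicing the sensor's Bayer mosaic, then tone/color adjustment, then encoding. Here is a deliberately crude nearest-neighbour demosaic sketch (an RGGB pattern and a NumPy array input are my assumptions; real converters use far better interpolation than this):

```python
import numpy as np

def demosaic_rggb_nearest(bayer):
    """Toy nearest-neighbour demosaic of an RGGB Bayer mosaic.
    bayer: 2D array with even height/width. Returns an HxWx3 RGB array."""
    h, w = bayer.shape
    rgb = np.empty((h, w, 3), dtype=np.float64)
    # Red sits at the top-left of each 2x2 tile; replicate it across the tile.
    r = bayer[0::2, 0::2]
    rgb[..., 0] = np.repeat(np.repeat(r, 2, axis=0), 2, axis=1)
    # Green has two samples per tile; average them, then replicate.
    g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0
    rgb[..., 1] = np.repeat(np.repeat(g, 2, axis=0), 2, axis=1)
    # Blue sits at the bottom-right of each tile.
    b = bayer[1::2, 1::2]
    rgb[..., 2] = np.repeat(np.repeat(b, 2, axis=0), 2, axis=1)
    return rgb
```

The quality gap between a phone's JPEG and a desktop RAW converter largely lives in how much smarter the desktop does this step (plus denoising and sharpening) before the final JPEG encode.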

The JPEG encoding used in phones usually applies a lot of compression, blurring the image considerably, and the denoise filter is also set too high. Of course, when converting from RAW to JPEG yourself, you can use little compression and a less aggressive denoise filter.
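For the curious, "a lot of compression" in JPEG terms mostly means how aggressively the encoder scales its quantization tables. A sketch of the widely used libjpeg quality-to-scale convention (the base table entries below are just example numbers, not a real camera's tables):

```python
def quality_to_scale(quality):
    """Map a JPEG 'quality' setting (1-100) to a quantization scaling
    percentage, following the common libjpeg convention."""
    quality = max(1, min(100, quality))
    if quality < 50:
        return 5000 // quality
    return 200 - 2 * quality

def scale_quant_table(base_table, quality):
    """Scale base quantization entries. Larger entries mean coarser
    quantization: smaller files, but more blur and blocking."""
    s = quality_to_scale(quality)
    return [max(1, min(255, (q * s + 50) // 100)) for q in base_table]

# Higher quality -> smaller quantization steps -> less visible damage:
print(scale_quant_table([16, 11, 12], 90))  # [3, 2, 2]
print(scale_quant_table([16, 11, 12], 25))  # [32, 22, 24]
```

So "use little compression" when re-encoding from RAW simply means picking a quality setting high enough that these steps stay near 1.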

Well, that's because you oversharpened a noisy image, producing a somewhat sharp but also very busy picture. The processing engine probably employed some blurring as a primitive form of denoising and didn't sharpen that much (maybe to keep the file size down).

I don't think RAW support will really matter, because in the end the sensors used in most smartphones are still utter crap, and I don't expect that to change much unless you're using a compact camera running Android. Trying to post-process RAW captures from a crappy sensor is just a waste of precious time; people who are halfway serious about producing usable images will still have to buy a dedicated camera with a bigger sensor.

The noise in the RAW is low if you consider it's a 750 px, 100% crop from a 4k×3k picture. A RAW file out of a DSLR has the same high-frequency luma noise before you put it through a noise-reduction filter (which I didn't apply to the crop).

It's okay if you don't like the detail, though; since RAW is non-destructive, you can apply whatever amount of filtering you want. Undoubtedly there will be Android camera apps that internally shoot RAW but convert on the fly to JPEG with your own custom processing profiles.

Unlike my phone, my DSLR or compact isn't with me everywhere, so I welcome every improvement to image quality on phones, and skipping in-phone processing happens to be a significant one.

The only thing I can say about having RAW on a mobile phone is that I love, love, LOVE this option on my Lumia 1020. Even though the internal JPEG processing became quite decent with the Lumia Black update, it is still way behind the quality I can squeeze out of those RAWs myself. Yes, even a beast like the 1020 in the phone-camera arena lags far behind any DSLR. But as MyRandomUsername just said, the Lumia is the device I have on me almost all the time. That, and not being at the mercy of the internal image processing (plus having quite nice manual options while shooting), makes a huge difference.

> The noise in the RAW is low if you consider it's a 750 px, 100% crop from a 4k×3k picture. A RAW file out of a DSLR has the same high-frequency luma noise before you put it through a noise-reduction filter (which I didn't apply to the crop).

> Unlike my phone, my DSLR or compact isn't with me everywhere, so I welcome every improvement to image quality on phones, and skipping in-phone processing happens to be a significant one.

I'm an MLC user, and in my own experience phone pictures come out so terrible anyway that RAW doesn't help a bit; usually it's snap, show, throw away. In most cases I don't even bother pulling out my Lumia, because bad photos annoy me very easily and I don't want to ruin my day.

The point isn't to replace a DSLR. The point is to have significantly more detailed files to work with, with much greater capacity for adjustment in post, all of the times that you DON'T have a DSLR on you.

You couldn't be more wrong about the sensors not being good enough. The sensors are more than capable for most uses (obviously not large prints, but neither are most DSLRs). The gains in sharpness and image detail we've been seeing from RAW on phones have been pretty good, and noise levels are pretty low below ISO 1600. They compare quite well to an entry-level DSLR.

This is coming from a photographer who spends around $4-6k a year on photo equipment for his business. There have been plenty of times when I didn't have my MkIII with me, had to use a cell phone for a quick picture, and wished I could touch it up without exposing all sorts of artifacts. This looks like it will solve that problem.

Isn't it possible to profile the CCD's response in different lighting situations using calibration cards and create an automatic multi-dimensional filter that fully optimizes the image after shooting? I just want iPhone-quality colors out of my photos.
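Something along those lines is in fact standard practice: shoot a color chart under a given illuminant, then solve for a color-correction matrix in a least-squares sense. A minimal sketch (the toy patch values are made up, and real pipelines also handle white balance, nonlinearity, and one matrix per illuminant):

```python
import numpy as np

def fit_ccm(measured, reference):
    """Least-squares 3x3 color-correction matrix M such that
    measured @ M.T ~= reference. Rows are RGB values of chart patches."""
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return M.T

def apply_ccm(rgb_pixels, M):
    """Apply the fitted matrix to an (N, 3) array of RGB values."""
    return rgb_pixels @ M.T

# Toy example: pretend the camera records red at half strength.
reference = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [1., 1., 1.]])
measured = reference.copy()
measured[:, 0] *= 0.5
M = fit_ccm(measured, reference)  # should roughly double the red channel
```

With RAW access, an app could carry a set of such matrices (one per lighting condition) and apply the right one automatically, which is essentially the "multi-dimensional filter" being asked about.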

Also, I'm curious: would these APIs make it possible to build an app that doesn't use the dedicated imaging chip on the HTC One M7? The purple-haze issue is caused by an overheating dedicated imaging chip, so if you could process images without that chip, shouldn't that fix it?

To be pedantic: you could do focus stacking with multiple exposures, and you might even be able to infer a depth map (or add a depth-sensing imager) and use a hyperfocal lens to re-estimate focus in software, but you cannot magically turn a single-lens-assembly focal-plane camera into a light-field camera (e.g. Lytro).
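The focus-stacking half of that is simple in principle: for each pixel, keep the exposure where local contrast is highest. A toy grayscale sketch (using a 4-neighbour Laplacian as the sharpness measure is my simplification; real stackers also align the frames and blend selections smoothly):

```python
import numpy as np

def laplacian_abs(img):
    """Absolute 4-neighbour Laplacian as a crude per-pixel sharpness map."""
    p = np.pad(img, 1, mode='edge')
    lap = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
           - 4.0 * p[1:-1, 1:-1])
    return np.abs(lap)

def focus_stack(frames):
    """Merge same-sized grayscale frames, picking each pixel from the
    frame where the local sharpness measure is greatest."""
    sharp = np.stack([laplacian_abs(f) for f in frames])
    best = np.argmax(sharp, axis=0)          # winning frame index per pixel
    stack = np.stack(frames)
    h, w = frames[0].shape
    rows = np.arange(h)[:, None]
    cols = np.arange(w)[None, :]
    return stack[best, rows, cols]
```

The depth-map idea in the comment is the same machinery run in reverse: the per-pixel index of the sharpest frame, combined with the known focus distance of each exposure, is itself a rough depth estimate.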