Like it or not, compared to a DSLR, or even a point & shoot, your smartphone's camera sucks. Phone cameras have gotten considerably better over the years, and will continue to improve, but their tiny sensors and limited optics mean that image quality, and their ability to accurately process a scene, still have a long way to go.

Your phone's camera already does its best to make an educated guess about what settings to use based on the light available in a scene. But if you've ever snapped a photo and noticed the colors were way off, you know it can use all the help it can get. And that's where Google's patent comes into play. Using GPS data, your phone would grab the current weather conditions for your location and use them to improve how it processes the image data from the sensor in terms of white balance, hue, saturation, and contrast.
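To get a feel for the idea, here's a minimal sketch of weather-aware white balance. Everything in it is an assumption for illustration: the condition names, the gain values, and the pixel-by-pixel approach are not from Google's patent, which doesn't spell out its exact processing.

```python
# Hypothetical sketch: nudge white balance based on reported weather.
# Condition names and gain values below are illustrative assumptions,
# not figures from the patent. Overcast light skews blue (high color
# temperature), so we boost red; warm sunset light gets the opposite.
WEATHER_WB_GAINS = {
    "clear":    (1.00, 1.00),   # (red_gain, blue_gain)
    "cloudy":   (1.12, 0.90),
    "overcast": (1.20, 0.85),
    "sunset":   (0.92, 1.10),
}

def apply_weather_white_balance(pixel, condition):
    """Scale the R and B channels of an (R, G, B) pixel for the weather.

    Unknown conditions fall back to neutral gains (no change).
    """
    r_gain, b_gain = WEATHER_WB_GAINS.get(condition, (1.0, 1.0))
    r, g, b = pixel

    def clamp(value):
        return max(0, min(255, round(value)))

    return (clamp(r * r_gain), g, clamp(b * b_gain))

# A bluish overcast pixel gets warmed up a little.
print(apply_weather_white_balance((120, 130, 150), "overcast"))
```

A real implementation would run on the sensor's raw data before demosaicing, and would fold in hue, saturation, and contrast tweaks too, but the core idea is the same: the weather report becomes one more hint for the camera's guesswork.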

Will it guarantee perfect shots every time? Probably not. Can the extra hints about how to properly process a photo taken at a given location help? Most certainly, particularly given the hardware limitations of your phone's camera. But what do you think? [US Patent & Trademark Office via PetaPixel]