Image Sensors: The Eyes Of AI And Intelligent Systems

With the advent of the smartphone, we have witnessed the rise of one of the most powerful tools humanity has ever seen. However, one of the less understood and less discussed phenomena accompanying the smartphone is the rise of the image sensor.

© www.forbes.com

By integrating progressively more powerful image sensors into smartphones, we have taken photography and the information age to the next level. Today I wanted to share some thoughts on the image sensor: what it has enabled, and where it might be heading in the future.

Bringing vision to AI

Without this technology, we wouldn’t have Facebook Live or the countless videos that give people access to what’s really happening on the ground. The smartphone has essentially turned everyone into a photojournalist, allowing the world to see, almost immediately, what is happening in people’s everyday lives. There are now billions of smartphones with cameras around the world, and image sensors aren’t limited to smartphones: even some of the cheapest ‘feature’ phones now have cameras. The ubiquity of these cameras translates into a very large potential dataset that can be harnessed for applications like big data, machine learning, and ultimately AI.

Right now, we are seeing an exponential expansion in the speed and accuracy of artificial intelligence. The ‘smart’ devices we are equipping with machine intelligence and artificial intelligence, such as drones, robots, self-driving cars, and even phones, need to be outfitted with image sensors in order to best know what’s going on around them and operate safely in the real world. Sure, they have access to location data and audio via microphones, but neither is as powerful as a real-time camera. This is why camera sensors are only becoming more important: they are essential for gathering the image data needed to train machine learning and artificial intelligence systems. Even once devices are trained, they still need camera sensors to provide context for what they are looking at, so they can properly act on the information they’ve been trained on.
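To make the idea of training on camera data a little more concrete, here is a toy Python sketch. Everything in it is hypothetical and purely illustrative (the article describes no specific algorithm): each camera frame is reduced to a single brightness number, a hand-coded rule labels frames ‘day’ or ‘night’, and a brute-force routine stands in for the training loops real systems use, picking the rule’s threshold from labeled examples instead of leaving it hand-tuned.

```python
# Toy illustration (hypothetical, not from the article): a fixed vision
# "algorithm" versus one whose parameter is learned from camera data.
# Each frame is reduced to one brightness value (0-255) for simplicity.

def fixed_classifier(brightness, threshold=128):
    """Hand-coded rule: label a frame 'day' if it is bright enough."""
    return "day" if brightness >= threshold else "night"

def learn_threshold(samples):
    """Pick the threshold that best separates labeled sample frames.

    samples: list of (brightness, label) pairs, label in {'day', 'night'}.
    Brute force over all 256 thresholds stands in for real training.
    """
    def accuracy(t):
        hits = sum((b >= t) == (label == "day") for b, label in samples)
        return hits / len(samples)
    return max(range(256), key=accuracy)

# Labeled frames collected from an (imaginary) camera feed.
data = [(40, "night"), (60, "night"), (90, "night"),
        (170, "day"), (200, "day"), (230, "day")]

learned = learn_threshold(data)
print(fixed_classifier(100))            # hand-tuned default threshold
print(fixed_classifier(100, learned))   # threshold improved from data
```

The point of the sketch is the second half: once labeled frames exist, the rule’s parameter can be tuned from data rather than by hand, which is a miniature version of why these devices need a steady stream of imagery in the first place.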

My computer has a webcam, my phones have multiple image sensors, my drone has multiple image sensors, my vacuum has an image sensor, and my car has image sensors. While there are plenty of cameras in the things we use today, much of which is powered by intelligent computer vision algorithms, the next step in the evolution of these devices is to add machine learning and artificial intelligence and to improve those algorithms dynamically over time. Essentially, where we are now: existing image sensors in everyday devices need to be imbued with machine learning, and existing ‘smart’ devices need more, and higher-quality, image sensors. While this marriage is already underway, it’s only going to ramp up in the coming years. […]