Most digital cameras come equipped with color filters that separate incoming white light into red, green, and blue channels before it reaches the sensor. The problem with these old-school RGB filters is that they block 50 to 70 percent of the light from ever hitting the sensor.

To fix this problem, a Panasonic team has replaced the traditional color filter with a set of micro color slits. The slits separate colors at a microscopic scale using diffraction rather than filtration. In other words, rather than forcing the light to pass through something that absorbs part of it (the filters in regular camera sensors), this new technology merely splits the light up and redirects it.

The array is made up of two diagonally arranged groups of semi-transparent deflectors, one red and one blue. The red deflectors diffract the red component of the incoming white light toward a neighboring pixel, leaving the remaining cyan light to reach the sensor directly. Meanwhile, the blue deflectors do the same with the blue component, allowing yellow light to reach the sensor.
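As a rough illustration, the deflector arrangement can be thought of as a linear mixing of channels: each pixel under the array receives white light either boosted by a deflected color or stripped of it. The model and the channel names below are illustrative assumptions, not Panasonic's published design.

```python
def split_channels(red, green, blue):
    """Forward sketch of what four neighboring sensor pixels under the
    deflector array might measure for one scene patch.
    This simple linear model is an assumption for illustration only."""
    white = red + green + blue   # full white signal W = R + G + B
    w_plus_r = white + red       # pixel gaining red deflected from a neighbor
    cyan = white - red           # pixel whose red component was deflected away
    w_plus_b = white + blue      # pixel gaining deflected blue
    yellow = white - blue        # pixel whose blue component was deflected away
    return w_plus_r, cyan, w_plus_b, yellow

print(split_channels(2, 5, 3))  # → (12, 8, 13, 7)
```

Note that every unit of deflected light still lands on some pixel, which is why this scheme wastes far less light than an absorbing filter.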

Before this mix of cyan, red, yellow, and white light can be composed into an image, a processor interpolates everything the sensor picks up back into a traditional RGB scheme. The Panasonic scientists say this sort of high-speed computational color remapping has not been possible until now.
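The remapping step could be sketched as simple linear algebra over those mixed signals. The channel names and the linear model here are assumptions for illustration; Panasonic has not published its actual algorithm.

```python
def remap_to_rgb(w_plus_r, cyan, w_plus_b, yellow):
    """Recover R, G, B from four hypothetical mixed measurements:
    w_plus_r = W + R, cyan = W - R, w_plus_b = W + B, yellow = W - B,
    where W = R + G + B is the full white signal."""
    red = (w_plus_r - cyan) / 2       # the difference isolates the red component
    blue = (w_plus_b - yellow) / 2    # likewise for blue
    white = (w_plus_r + cyan + w_plus_b + yellow) / 4  # all four average to W
    green = white - red - blue        # green is whatever white light remains
    return red, green, blue

print(remap_to_rgb(12, 8, 13, 7))  # → (2.0, 5.0, 3.0)
```

In a real camera this reconstruction would have to run per pixel across the whole sensor in real time, which is presumably where the high-speed computation the researchers mention comes in.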

All this might seem too technically complex to happen in the near future, but because the sensor itself is fundamentally unchanged, we could see these micro color splitters implemented in our cameras sooner than you might think.