I am a bit confused as I am considering upgrading my Canon 350D to a 600D. The pixel size has shrunk from 6.4 microns to 4.3 microns, yet the top ISO has increased from 1600 to 6400. How did that happen? Shouldn't larger pixels have less noise? At which step in the process did the noise actually go down? The sensors should be shot-noise limited, not readout limited.

Further, with smaller pixels, diffraction should also start to show its influence at larger apertures, shouldn't it?

What is going on?

I would be glad to have a detailed technical explanation, if that can clarify my doubts.

Sensors are still a long way from being shot-noise limited at high ISO. You ought to be able to get a recognisable image with an average of 10 photons per pixel, which corresponds to ISO 6 million! At ISO 600,000, which corresponds to 100 photons per pixel, shot noise is very mild and only visible on close-up viewing.
– Matt Grum, Oct 24 '12 at 15:22
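The shot-noise relation behind that comment is just Poisson statistics: for a mean of N photons, the noise is √N, so the signal-to-noise ratio is also √N. A minimal sketch (the photon counts are the ones quoted in the comment):

```python
import math

def shot_noise_snr(photons_per_pixel):
    """Photon arrival is Poisson-distributed: for a mean of N photons,
    the standard deviation (shot noise) is sqrt(N), so SNR = N / sqrt(N) = sqrt(N)."""
    return math.sqrt(photons_per_pixel)

for n in (10, 100, 10_000):
    print(f"{n:>6} photons/pixel -> shot-noise SNR ~ {shot_noise_snr(n):.1f}")
```

At 10 photons per pixel the SNR is only about 3, so an image is recognisable but very grainy; at 100 photons it is about 10, which is why the noise is mild at close viewing.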

1 Answer

Your intuition is correct. What happened is that technology has improved. Lots of small improvements have piled up: better microlenses, gapless microlens arrays, cleaner read paths, on-chip noise reduction, lower-noise gain circuits. All this and more adds up to a substantial improvement.

Increasing the megapixel count reduces pixel size and takes some of that back at the pixel level, but if you consider a fixed-size output, things are more or less even, depending on the camera. That is what sites like DxOMark measure; see their 650D vs 600D vs 350D comparison and look at the low-light figure to compare high-ISO performance.
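The fixed-size-output argument can be sketched numerically. When you downsample an image, averaging k independent pixels reduces random noise by √k; the noise figures below are purely hypothetical, chosen only to show how a noisier small-pixel sensor can break even after resizing:

```python
import math

def per_pixel_noise_after_resize(noise_per_pixel, n_pixels_src, n_pixels_dst):
    """Averaging k = n_src/n_dst pixels reduces random noise by sqrt(k),
    assuming the noise is uncorrelated between pixels (an idealisation)."""
    k = n_pixels_src / n_pixels_dst
    return noise_per_pixel / math.sqrt(k)

# Hypothetical: an 18 MP sensor whose small pixels are 1.5x noisier per pixel,
# downsampled to 8 MP for a fixed-size comparison against an 8 MP sensor.
resized = per_pixel_noise_after_resize(1.5, 18e6, 8e6)
print(f"noise after resize: {resized:.2f}")  # back to ~1.0, i.e. even
```

This is the idealised version of what DxOMark's "Print" (normalized) measurements do; real sensors don't scale quite this cleanly, which is why the answer says "more or less even, depending on the camera."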

Diffraction, on the other hand, is pure physics. Its threshold moves closer as pixels get smaller. On the 18 MP Canon EOS 7D, it kicks in around f/11. That is one reason, I suppose, we are seeing lower-cost full-frame cameras. The Nikon D600, for example, only starts diffracting around f/19, since its pixel size is equivalent to that of a 10 MP APS-C camera.
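The underlying physics is the Airy disk, whose first-minimum diameter is 2.44·λ·N for f-number N. A sketch of how the diffraction-limited aperture follows from pixel pitch; note the `pixels_per_airy` criterion is a judgment call, and more tolerant criteria than the one assumed here yield higher quoted limits (such as the f/11 figure above):

```python
def airy_disk_diameter_um(f_number, wavelength_nm=550):
    """Diameter of the first Airy minimum: d = 2.44 * lambda * N."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

def diffraction_limited_f_number(pixel_pitch_um, wavelength_nm=550,
                                 pixels_per_airy=2.0):
    """Solve 2.44 * lambda * N = pixels_per_airy * pitch for N.
    How many pixels the Airy disk may span before softness is 'visible'
    is a criterion choice, which is why published limits differ."""
    return (pixels_per_airy * pixel_pitch_um) / (2.44 * wavelength_nm / 1000.0)

for name, pitch_um in (("350D", 6.4), ("600D", 4.3)):
    n = diffraction_limited_f_number(pitch_um)
    print(f"{name}: {pitch_um} um pitch -> diffraction ~ f/{n:.1f}")
```

Whatever criterion you pick, the ratio between the two cameras is fixed by the pitch ratio: the 600D hits diffraction about a stop earlier than the 350D, which is the questioner's point.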

The bottom line is that yes you will hit diffraction sooner and yes things have improved.

Note, too, that increasing pixel (sensel) density decreases pixel pitch (the distance between the same physical point on two adjacent sensels on the chip), but more recent designs devote a proportionally larger share of each photosite to light sensing (that is, more of the surface layer of the chip is used for gathering light and less for other circuitry). That is not enough to make up the difference by itself, of course, but in combination with everything else, it makes the trick seem a little less magic.
– user2719, Oct 23 '12 at 21:55
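The fill-factor point in that comment can be made concrete. The light-gathering area per photosite is roughly pitch² times the fill factor; the fill factors below are hypothetical, purely for illustration:

```python
def effective_sensitive_area_um2(pixel_pitch_um, fill_factor):
    """Light-gathering area per photosite = pitch^2 * fill factor.
    The fill-factor values used below are hypothetical."""
    return pixel_pitch_um ** 2 * fill_factor

old = effective_sensitive_area_um2(6.4, 0.4)  # older design: more circuitry per pixel
new = effective_sensitive_area_um2(4.3, 0.7)  # newer design: higher fill factor
print(f"old {old:.1f} um^2, new {new:.1f} um^2, ratio {new / old:.2f}")
```

With these illustrative numbers the newer, smaller pixel still gathers somewhat less light than the old one (ratio below 1), matching the comment's point that a higher fill factor narrows the gap but does not close it on its own.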

Yes, and this is still nowhere close to an exhaustive list. A patent search would surely reveal hundreds, if not thousands, of innovations on the topic.
– Itai, Oct 23 '12 at 22:48