Never heard that before and not sure what would account for such a difference.

It's old tech. It used to be that CCDs gave better color uniformity; CMOS looked artificial.

Many people thought it was based on the method of charge conversion: CMOS integrated the A/D converter into the pixel, while CCD sent all the charge to a row converter. The theory was that the CCD method allowed calibration of the value to a standard with tighter tolerance than a single value from a pixel.

The last generation of CMOS from all the DSLR manufacturers seems to have much better color accuracy than a few years ago. Pictures now look realistic, like looking through a window, whereas CMOS output a few years back looked a bit artificial. Maybe it's improved Bayer demosaicing algorithms. Maybe it's the latest updates to RAW converters. Something has changed on the CMOS side.

CMOS never integrated the ADC into a pixel. The major difference is that CMOS essentially outputs a voltage, while CCDs shift the charge of each pixel to the edge of the sensor. CMOS needs a few gates in each pixel to handle readout, while a CCD pixel has no electronics at all.
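A toy model of the two readout styles described above (purely illustrative, not any real sensor's circuit): a CCD bucket-brigades each row of charge into a serial register at the sensor edge, one value at a time, while a CMOS array can address pixels directly through its per-pixel gates.

```python
import numpy as np

# Fake 3x4 grid of pixel charges for illustration.
charge = np.arange(12).reshape(3, 4).astype(float)

def ccd_readout(q):
    """Shift charge row by row into a serial register at the sensor edge,
    then clock each value out through a single output amplifier."""
    out = []
    q = q.copy()
    while q.size:
        serial = q[-1]            # bottom row moves into the serial register
        out.extend(serial[::-1])  # clocked out pixel by pixel
        q = q[:-1]                # remaining rows all shift down one step
    return np.array(out)

def cmos_readout(q):
    """Address each pixel directly (random access), e.g. row by row."""
    return q.ravel()

print(ccd_readout(charge))
print(cmos_readout(charge))
```

Both paths deliver every pixel's value; the point is that the CCD only ever touches the charge through one shared output stage, whereas CMOS reads through per-pixel electronics.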

On older Nikons the ADCs were off-chip; they were using Burr-Brown converters in external housings. I'm pretty sure that Canon also uses off-chip converters.

The most significant difference between CCD and CMOS may be that CMOS can be read out nondestructively. So essentially all CMOS vendors use a method called correlated double sampling (CDS): they sample each pixel once right after reset and again after exposure and subtract the two, which eliminates much of the reset noise.
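A minimal numeric sketch of the CDS idea (my own toy noise model, not any vendor's pipeline): the reset (kTC) noise is frozen at reset time and is therefore identical in both samples, so subtracting the reset sample from the signal sample cancels it, leaving only the smaller independent read noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                  # many trials of one pixel
signal = 500.0                               # electrons collected during exposure
reset_noise = rng.normal(0, 30, n)           # kTC noise, frozen at reset

def read_noise():
    """Independent amplifier noise added on every sample."""
    return rng.normal(0, 5, n)

sample_reset = reset_noise + read_noise()            # sample 1: right after reset
sample_signal = signal + reset_noise + read_noise()  # sample 2: after exposure

single = sample_signal                # naive single read keeps the reset noise
cds = sample_signal - sample_reset   # CDS: the common reset noise cancels

print(single.std())  # ~sqrt(30^2 + 5^2) ~= 30.4
print(cds.std())     # ~sqrt(5^2 + 5^2)  ~= 7.1
```

The residual noise after CDS is just the two read-noise samples added in quadrature, a large improvement over leaving the reset noise in.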

We both use Sony, and they have advertised on-chip A/D conversion for ages.

Yes, Canon and Nikon were doing something different; maybe that's why they had 14-bit while we were stuck with 12. Pentax also tried high-end external 22-bit converters, then went back to something basic in the next version.

To be exact, the Nikon D800/D800E, D600 and D3X use Sony Exmor sensors with on-chip converters. The D3/D3S and D4 use Nikon-designed chips with external converters. The Nikon D3X had both 14-bit readout (slow, 2 FPS) and 12-bit readout. I guess the reason Sony had 12-bit converters was more related to the Bionz processor being a 12-bit chip. The Alpha 77 I have now has 14-bit conversion.

Pentax used Samsung chips in older cameras; the K-5 has an Exmor with 14-bit readout.

I remember that those values are scaled to the same print size, but I actually feel they are more relevant than per-pixel values; they were also much easier to find.

You are absolutely right that it is not possible to encode more than 14 EV in 14 bit linear space.
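A quick sanity check of that claim: a 14-bit linear encoding spans nonzero codes 1..16383, so the ratio between the brightest and darkest distinguishable level is at most 2^14 - 1, i.e. just under 14 stops (EV).

```python
import math

bits = 14
max_code = 2**bits - 1          # largest 14-bit linear code value
stops = math.log2(max_code)     # dynamic range in EV between codes 1 and 16383

print(max_code)  # 16383
print(stops)     # ~13.9999
```

Anything beyond that either clips at the top or disappears below code 1, which is why more than 14 EV cannot survive a 14-bit linear container.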

The interesting point is that all the top cameras use Sony technology. Nikon uses Toshiba for the D5200, by the way. I guess Sony is not alone with on-chip per-column conversion; the new sensor for the Leica M is also a CMOS design with column-type on-chip converters.

I would suggest there are a lot of myths in this area. CMOS is mainstream; CCDs are used by small companies making high-end stuff. Most CCDs are made by Kodak (R.I.P.) or Dalsa. In general, CCDs often lack an OLP (optical low-pass) filter, which is good for perceived sharpness but creates a lot of fake detail.

Kodak and Dalsa may use a different CGA (color grid array) than, say, Nikon or Canon. So the myth goes that CCD has better color than CMOS, but that has little to do with CCD vs CMOS and a lot to do with CGA design.

Previously, you stated that better "uniformity" was "still" an advantage for CCD-based sensors.

I don't think there is anything inherent in CCD technology that would account for that.

I believed it until I started seeing a lot of pictures from people using the Sony A99 and A77 (I use Sony) and the Nikon D800 and D600. Maybe it's the down-sampling leeway of the 24MP of data. Maybe it's improved software. I don't know. All I know is these cameras are outputting images that look very realistic versus older-generation DSLRs.

The DR and the color tone accuracy are improved; whether from hardware or software is unknown.
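The "down-sampling leeway" part is easy to quantify with a toy example (synthetic flat frame, my own numbers): averaging 2x2 blocks of a noisy high-megapixel image halves the per-pixel noise (sqrt(4) = 2), one hardware-independent reason a 24MP image can look cleaner at a given print size.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic flat gray frame with Gaussian noise standing in for read/shot noise.
img = 100.0 + rng.normal(0, 10, (1000, 1000))

# 2x2 box average: each output pixel is the mean of 4 input pixels,
# so uncorrelated noise drops by a factor of sqrt(4) = 2.
down = img.reshape(500, 2, 500, 2).mean(axis=(1, 3))

print(img.std())   # ~10
print(down.std())  # ~5
```

This only holds for uncorrelated noise; fixed-pattern noise does not average away, which is one reason software and sensor improvements are hard to disentangle by eye.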

I have recently acquired a Sony Alpha 99. Naturally enough I shot some comparison pictures with my Alpha 900 and could see no difference at low ISO. I did shoot a horse jumping at ISO 6400 and printed A3, something I wouldn't dream of with the Alpha 900, so I'm pretty sure the Alpha 99 has a cleaner sensor (less readout noise and probably a higher full-well capacity).

Lightroom is beyond impressive to me ... it's the single most impressive piece of "consumer" software I have ever seen. The ambition and "game-changing" nature of the project are inspiring to me as someone who lives in the "legacy" world trying to innovate.

I manage a lot of software development as part of my professional life. One of the "catchphrases" I use often in the context of user-driven software companies and projects is: "Rather than build a little something for everyone, build everything for someone."

From my perspective, Lightroom is a very nice light-table, filing and printing program with a good-enough RAW converter. If you want superlative RAW conversion, and only RAW conversion, you can often get it for free with the manufacturer's software (e.g. Canon DPP), freeware like RPP, or resort to boutique products like Iridient's RAW Developer, or even Capture One, which I believe quite a few people on this forum have used.