Why do you think the 16-bit myth refuses to die? Who is still promoting it?

I just noticed a well-known dealer's recent ad (for a good deal) on a back, and the second thing they promote is "true 16 bit quality".

I remember my dealer giving me that line many years back when I laid out $20k for my first back, and I also remember the feeling when I learned the truth about that bit of sales-speak. Not that it made my back any worse, but it did make me question every technical thing they told me and reassess just how knowledgeable they really were.

The point is to explain to someone who isn't a scientist why most 22mp backs are better than most 22mp dSLRs. Why pixels are not all created equal. Assume you have about 10 characters' worth of space to explain that, and that the person is not among the 5% of users who get deeply involved in the science. "True 16 bit" is about as good as I can come up with, as most customers understand the difference between 8 and 16 bits elsewhere in photography. I also considered "Great tonal smoothness" or "Holds up well to strong styling in post-processing", but they didn't have the same ring and took up much more room. I'm open to suggestions. We're not trying to be dishonest or disingenuous.

The real-world advantage is real (e.g. how the file looks after you add a lot of contrast and local dodge/burn, or make a significant change in WB), but the explanation could take many pages covering the entire imaging chain (including the quality, not just the bit depth, of the A/D converter, plus dark-frame technology) and the emphasis throughout that chain on image quality over convenience, features, speed, or cost.

Anyone who has dealt with us (Capture Integration) knows our emphasis is on the real world, on results and real-world testing, not specs or marketing spiels. But the reality of marketing is that you have only a few seconds of attention span and a few characters in which to use it.

The point is to explain to someone who isn't a scientist why most 22mp backs are better than most 22mp dSLRs. Why pixels are not all created equal. Assume you have about 10 characters' worth of space to explain that, and that the person is not among the 5% of users who get deeply involved in the science. "True 16 bit" is about as good as I can come up with, as most customers understand the difference between 8 and 16 bits elsewhere in photography. I also considered "Great tonal smoothness" or "Holds up well to strong styling in post-processing", but they didn't have the same ring and took up much more room. I'm open to suggestions. We're not trying to be dishonest or disingenuous.

The true reasons you are alluding to have more to do with the area of the sensor. So why wouldn't you just say that? There are plenty of true reasons to want your products.

Saying "16 bit" is already false, but adding the word "true" as if to persuade the buyer "no really really" seems to cross the boundary. There is no need to make things up. You have a good product.

Quote

The real-world advantage is real (e.g. how the file looks after you add a lot of contrast and local dodge/burn, or make a significant change in WB), but the explanation could take many pages covering the entire imaging chain (including the quality, not just the bit depth, of the A/D converter, plus dark-frame technology) and the emphasis throughout that chain on image quality over convenience, features, speed, or cost.

But there are still true things you could say, instead of false things. The size of the sensor comes to mind, and the importance of that.

Quote

Anyone who has dealt with us (Capture Integration) knows our emphasis is on the real world, on results and real-world testing, not specs or marketing spiels. But the reality of marketing is that you have only a few seconds of attention span and a few characters in which to use it.

I'm serious when I say I'm open to suggestions.

I've seen you here enough to know that you put a lot of care into your relations. But the 16-bit claim shouldn't be excused as an expedient. You can say that it is a very high fidelity capture system without exaggerating its specs.

Why do you think the 16-bit myth refuses to die? I just noticed a well-known dealer's recent ad (for a good deal) on a back, and the second thing they promote is "true 16 bit quality".

The question could be: are they saying this back truly produces 16 bits, or that it is high-bit, and assuming or misunderstanding that Photoshop considers any document with more than 8 bits per color a “16-bit file”?

Or is the question, do people today really still doubt the usefulness of high-bit data? Well, one fellow continues to do so, but he’s best ignored.

The question could be: are they saying this back truly produces 16 bits

I am sure they are equipped with parts producing 16-bit data. So writing "true 16 bits" is no more false advertising than Epson writing "true 4800 dpi" for their scanners, or than YBA writing about the power their high-end amps can handle.

The question is whether these 16 bits contain more useful data than a 14-bit pipe would. There is little evidence pointing to a yes. Am I saying that backs do not have smoother transitions? Nope. I am saying that even if they do, the true reason is not the bit depth the imaging pipe can handle, but more likely the combination of a CCD sensor and the quality of the ADC parts used.

As a side comment, this is related to DR as well, since you obviously need more sampling information to cover a wider range of values. But it is also pretty obvious that 14 bits are sufficient to cover the DR these backs can handle.
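That relationship between bit depth and dynamic range can be sketched in a few lines of Python. Assumptions are loudly noted in the comments: an ideal, noise-free linear ADC, and the "~12.5 usable stops" figure is purely illustrative, not a measured spec of any back.

```python
import math

# An ideal linear N-bit ADC resolves 2**N codes; the ratio between its
# largest and smallest non-zero code is 2**N - 1, i.e. just under N stops.
for bits in (14, 16):
    codes = 2 ** bits
    stops = math.log2(codes - 1)
    print(f"{bits}-bit ADC: {codes:,} codes, ~{stops:.1f} stops at most")

# A back delivering ~12.5 usable stops (illustrative figure, not a spec)
# fits inside a 14-bit pipe with headroom: the two extra bits of a 16-bit
# pipe shrink the quantization step, but they cannot extend the range the
# sensor itself delivers.
```

In other words, extra bits buy finer steps within the captured range, not a wider range, which is exactly the distinction being argued here.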

I understand the point Doug is making about the difficulty of marketing this, given the lack of easily understandable/measurable metrics for the smoothness of color transitions in a file.

Besides, other than a few posts at LL among people who mostly do not own backs, the 16-bit "myth" probably works wonders among back owners. Just as a heavier amp Must sound better, a 16-bit imaging pipe Must deliver better transitions. Once the buyer is convinced of something, the deal is closed. :-)

So I would personally stick to "true 16 bits" if I were in the Phase One eco-system.

The truth, Doug, is that you are telling lies when you say “true 16 bit quality”, plain and simple.

Regards

Simon

Simon,

I don't know why you need to launch a personal attack on Doug. It only shows your ignorance, and adds nothing to the topic.

Yes, I use an IQ180 back, and that quality is great, and I don't care where the discussion of bit-depth leads. On the other hand, I have known Doug for several years, and have had the pleasure of getting to spend some time with him in Carmel, CA. I have nothing but respect for his integrity.

If you have better science, please feel free to share it, but please don't stoop to personal attacks.

I am sure they are equipped with parts producing 16-bit data. So writing "true 16 bits" is no more false advertising than Epson writing "true 4800 dpi" for their scanners, or than YBA writing about the power their high-end amps can handle.

Almost all DSLRs are equipped with 16-bit data buses, and maybe even some P&S cameras. So are they all equally entitled to call themselves "true 16 bit"? Or none of them? What would you call a camera that actually succeeds in recording 16 bits of DR -- a "really really real 16-bit"?

Quote

The question is whether these 16 bits contain more useful data than a 14-bit pipe would. There is little evidence pointing to a yes. Am I saying that backs do not have smoother transitions? Nope. I am saying that even if they do, the true reason is not the bit depth the imaging pipe can handle, but more likely the combination of a CCD sensor and the quality of the ADC parts used.

I think it's the area of the sensor. I don't see where the Exmor is lagging behind the CCD per unit area of the sensor.

Quote

...

Quote

Besides, other than a few posts at LL among people who mostly do not own backs, the 16-bit "myth" probably works wonders among back owners. Just as a heavier amp Must sound better, a 16-bit imaging pipe Must deliver better transitions. Once the buyer is convinced of something, the deal is closed. :-)

So I would personally stick to "true 16 bits" if I were in the Phase One eco-system.

This is strange logic. Because it's customary to lie, it isn't morally wrong, and closing the deal is the only thing that matters?

Almost all DSLRs are equipped with 16-bit data buses, and maybe even some P&S cameras. So are they all equally entitled to call themselves "true 16 bit"? Or none of them? What would you call a camera that actually succeeds in recording 16 bits of DR -- a "really really real 16-bit"?

I don't believe that any non-MFDB camera uses a 16-bit analog-to-digital converter.

The fact is that if he has made such a claim about their backs shooting in 16 bit, then it is a lie and is false advertising.

Simon,

No, it is not a lie, since the backs use 16-bit parts.

I don't see what is so hard to understand here.

Nothing prevents your favorite DSLR brand from also using 16-bit parts; the cost would be trivial considering the volumes. The raw files would be about 14% larger with little or no tangible benefit, but they could do it. They don't, mostly because their marketing department tells them the ROI would not be significant compared to other forms of improvement.

Also, it is well known that Japanese companies talk with each other to define the main steps of technological evolution, and they have obviously decided that it would be 14 bits for a few more generations, so you should buy an MFDB if it bothers you not to own the device with the highest bit-depth spec. :-)

I don't believe that any non-MFDB camera uses a 16-bit analog-to-digital converter.

Do you have any facts proving me wrong?

Cheers,

Bernard

When you referred to "parts producing 16-bit data", I assumed you meant firmware processes. What I referred to were the 16-bit "data buses" for internal DSP functions, used by all DSLRs to provide two extra bits of precision and avoid cumulative rounding error. There are no parts producing 16 bits of data otherwise. Surely even 16-bit converters could not produce 16 bits of "data" when bits 15-16 would be uncorrelated noise at best.

When you referred to "parts producing 16-bit data", I assumed you meant firmware processes. What I referred to were the 16-bit "data buses" for internal DSP functions, used by all DSLRs to provide two extra bits of precision and avoid cumulative rounding error. There are no parts producing 16 bits of data otherwise. Surely even 16-bit converters could not produce 16 bits of "data" when bits 15-16 would be uncorrelated noise at best.

So we do agree that backs have a factual spec differentiator (a 16-bit ADC) that no other camera has, and that this justifies the "true 16 bits" wording, right?

A 16-bit file is far more robust under heavy PS work (color grading / exposure adjustment). Many 35mm CMOS-chip cameras produce only 14-bit files; even when captured raw and processed to "16 bit", the values are simply scaled up into a 16-bit container from their native 14-bit capability, which creates no new tonal information. The difference in fidelity/integrity between a 16-bit, a 14-bit, and an 8-bit file is huge; if you understand the maths of bit depth, you will appreciate that the difference in descriptive capability of 16 bits of data over 14 bits is simply HUGE...
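For reference, here is the plain arithmetic behind those level counts, sketched in Python. The left shift is just one illustrative way a 14-bit value can be placed in a 16-bit container; the point holds for any scaling.

```python
# Number of distinct tonal levels per channel at each bit depth.
for bits in (8, 14, 16):
    print(f"{bits}-bit: {2 ** bits:,} levels")

# Placing 14-bit data in a 16-bit container (here via a left shift by 2)
# makes the numbers bigger, but no new levels appear.
native_14bit = range(2 ** 14)                    # every possible 14-bit code
in_16bit_container = {v << 2 for v in native_14bit}
print(len(in_16bit_container))                   # still 16384 distinct values
```

So a native 16-bit pipe offers 65,536 levels where rescaled 14-bit data still occupies only 16,384 of them, which is the 4x gap being claimed here.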

For those disbelievers, I seriously suggest you do some homework and run a simple test:

Take any raw file and process it to be, say, 3 stops underexposed and off in color balance by, say, 3,000 K, saving it in both 8 bit and 16 bit. Then correct the two files back to proper exposure and color, and look at the two histograms... Well, if you still feel good giving your client the 8-bit file, I'm really happy, because ultimately it means there is one more lazy shooter out there selling weak files, and that means my files will look comparatively better.
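The test above can be approximated numerically, without a raw converter, as a minimal pure-Python sketch. Assumptions: an idealized smooth gradient stands in for the image, "3 stops underexposed" is modeled as dividing linear values by 8, and the intermediate save is the only quantization step (real raw pipelines add tone curves and noise on top of this).

```python
# Push a smooth gradient 3 stops down, store it at a given bit depth,
# bring it back up, and count how many distinct tones survive.
def round_trip_levels(bits: int, samples: int = 100_000) -> int:
    top = 2 ** bits - 1
    gradient = [i / (samples - 1) for i in range(samples)]  # 0.0 .. 1.0
    underexposed = [v / 8 for v in gradient]                # -3 stops
    stored = [round(v * top) for v in underexposed]         # quantized save
    corrected = [min(top, q * 8) for q in stored]           # +3 stops back
    return len(set(corrected))

for bits in (8, 16):
    print(f"{bits}-bit round trip: {round_trip_levels(bits)} distinct levels")
```

Stored at 8 bits, the corrected gradient comes back with only a few dozen distinct tones, which is why its histogram turns into a comb of spikes; stored at 16 bits, thousands of levels survive the same round trip.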