The class action lawsuit centers on the glossy screens Apple introduced with the launch of its MacBook line last year. Apple advertised that the new glossy screens provided users with deeper blacks and more vibrant whites. However, many customers experienced graininess and sparkling effects typical of dithering techniques, according to the lawsuit.

According to the complaint:

Many such dissatisfied purchasers were chastised by Apple agents and employees for being too picky in their assessments of the quality of the display. Other dissatisfied purchasers were told that they were imagining the defects they complained about.

The complaint also notes that many of the disgruntled customers posted messages on Apple's own forums, only to later have their posts moderated or removed entirely by Apple forum administrators.

"It appears that Apple has engaged in substantial editing of the posts on the discussion forum," the lawsuit indicates.

The lawsuit alleges Apple uses dithering techniques to create an illusion of colors the displays cannot actually produce. In fact, the lawsuit claims that if a MacBook or MacBook Pro user installs Windows XP, they will notice superior image quality in areas such as gradients. That comparison seems to indicate some sort of software processing is at work in OS X.

"The displays are only capable of displaying the illusion of millions of colors through the use of a software technique referred to as 'dithering'," the lawsuit claims.

Comments


The Apple panels were 6-bit. You might be "surprised" and it may strike you as "utter BS," but Apple indeed marketed LCDs not capable of 8 bits/pixel. Even if one grants that at any single instant the whole pixel shows one of 2^24 colors (and everyone grants this except you), the panels in question have pixels that can only display one of 2^18 colors:

8-bit: one point (sufficiently large; i.e. 1 pixel) at one instant in time: 1 of 16.7 million colors.

6-bit: one point (idem) at one instant in time: 1 of 262,144 colors.

6-bit, as marketed: one point (idem) averaged over a sufficiently small WINDOW in time (i.e. long enough for our subpixels to alternate, but not so long as to blur with the next color sent to that pixel): 1 of 16.2 million colors.
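The arithmetic behind those three figures is easy to check (a minimal sketch; the 253-levels-per-channel figure for dithered 6-bit panels is the commonly cited approximation, not a number from the lawsuit itself):

```python
# Native color counts for true 8-bit and 6-bit RGB panels.
levels_8bit = 2 ** 8             # 256 brightness levels per subpixel
levels_6bit = 2 ** 6             # 64 brightness levels per subpixel

colors_8bit = levels_8bit ** 3   # 16,777,216 -- the "16.7 million" figure
colors_6bit = levels_6bit ** 3   # 262,144 -- i.e. 2^18

# Temporal dithering (frame rate control, FRC) alternates each subpixel
# between adjacent native levels over successive frames, simulating
# intermediate levels. The commonly quoted result is ~253 usable levels
# per channel, which yields the marketing number:
frc_levels = 253
colors_frc = frc_levels ** 3     # 16,194,277 -- the "16.2 million" figure
```

This is why a dithered 6-bit panel is advertised as "16.2 million" rather than "16.7 million" colors: the time-averaged levels fall just short of a full 256 per channel.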

I suspect you will agree that this leap is difficult to justify.

However, that is the industry standard. The correct targets are the panel manufacturers. Honestly, the best solution, though one that doesn't make people as happy as suing, is to take all the time, energy, and money wasted on this lawsuit and use them to force a revision of the standards governing the technical specifications of LCD panels. As it stands, this suit will probably be dismissed outright, because the real source of the deceptive information is the industry standard itself, which obfuscates the meaning of "discrete colors," and with which Apple and its panel manufacturers have operated in full compliance.

I have to agree. It would be simple and sufficient if displays, when sold, were clearly labeled as either 6-bit or 8-bit. When I was shopping for my display, I had to rule out about half the choices on the market because they simply didn't list whether they were 6/8/18/24-bit (however they wanted to write it) or whether they displayed 16.2 or 16.7 million colors (I require 24-bit color and an A-RGB color profile for photo editing).

Honestly, it wouldn't be that hard for the manufacturers to list this.

Show me the specs on the LCDs that Apple ships that clearly state the LCDs are only capable of 6 bits for each of Red, Green and Blue!

No one in this forum (or any other I have read) has given the manufacturers' specs on the LCD screens themselves. Until someone does, I still call BS.

All your ranting about "LCDs not capable of 8 bits/pixel" is pure BS until you give the model of the LCDs themselves along with manufacturer specs.

Additionally, each "pixel" is almost NEVER 8 bit unless you are talking about the 256 color mode! Each pixel is made up of 3 (count them! -- red, green and blue ... yep 3!) subpixels that make up one full pixel. There NEVER has been a 6-bit pixel! No one has ever mass produced a LCD panel which has only two bits per red, green and blue.

Many of the posts early in this thread stated clearly what the 6bit & 8bit designations meant.

Since you missed the above:

A 6-bit LCD has three color subpixels per display pixel. The numeric value for each of the three colors requires 6 bits. These displays are also designated 18-bit displays, from the number of bits required to specify all 3 color values.

An 8-bit LCD has three color subpixels per display pixel. The numeric value for each of the three colors requires 8 bits. These displays are also designated 24-bit displays, from the number of bits required to specify all 3 color values.

The above obviously applies only to LCD displays that use RGB. For monochrome displays, it is usual to convert the RGB value to the 8-bit value corresponding to the brightness of the pixel, or, on a 6-bit monochrome display, to the 6-bit grayscale value.

Also stated in many posts is that the manufacturers, resellers and OEMs building devices with these LCD parts do not disclose which of the 3 bit depths is used. LCDs supporting the 10-bit standard are likely to be the exception, as they will be selling into a specialty market that will pay premium prices for 10-bit hardware.

Simulated 8- or 10-bit-per-color output generates visible artifacts for certain colors. This inaccurate color rendering shows up whenever any of the problem colors is used. This is why real 8-bit has 16.7 million colors while simulated 8-bit has 16.2 million, and it is also why trained users can see a visible difference in the result.
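The temporal-dithering scheme the thread keeps describing can be sketched as follows (a hypothetical illustration, not Apple's actual implementation; the 4-frame cycle and the rounding rule are assumptions):

```python
# Hedged sketch of temporal dithering (frame rate control): a 6-bit
# panel approximates an 8-bit target level by alternating between the
# two nearest native levels across successive frames, so the
# time-averaged brightness lands between them.

def dither_frames(target_8bit, n_frames=4):
    # Map the 8-bit target (0-255) onto the 6-bit level scale (0-63).
    exact = target_8bit * 63 / 255
    low = int(exact)
    high = min(low + 1, 63)
    frac = exact - low
    # Show `high` in roughly `frac` of the frames, `low` in the rest.
    n_high = round(frac * n_frames)
    return [high] * n_high + [low] * (n_frames - n_high)

frames = dither_frames(200)          # 8-bit level 200 maps to ~49.41
average = sum(frames) / len(frames)  # the eye sees this time average
```

In practice the alternation runs at the panel refresh rate, and for the colors where the approximation is coarsest, the alternation itself can become visible as exactly the sparkle and grain the lawsuit complains about.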

"Nowadays, security guys break the Mac every single day. Every single day, they come out with a total exploit, your machine can be taken over totally. I dare anybody to do that once a month on the Windows machine." -- Bill Gates