Apple, others, sued over hard-drive size claims

Comments

The standards people had a chance to fix the ambiguity once and for all, but instead they chose a route that will cause it to drag on for a long time. If these common prefixes had been approved to mean a multiplier of 1024 when used with binary bits and bytes, that would have ended it.

You do that and you destroy the existing standard. The point of a standard is that it means one standard thing, not different things in different contexts.

I fail to see how stripping the last two letters from an SI prefix and adding "bi," for binary, is difficult for people anyway. If you can use the SI prefixes happily, you could use the new standard just as happily.

The standards people had a chance to fix the ambiguity once and for all, but instead they chose a route that will cause it to drag on for a long time. If these common prefixes had been approved to mean a multiplier of 1024 when used with binary bits and bytes, that would have ended it.

It would have begun a great deal of other grief, though.

Remember the fiasco NASA got into because they used both metric and imperial measurements in the same project? Well, now imagine that again, except that they're all using units with the same names, in the same system - it's just that your "mega" and my "mega" mean two different things!

The discrepancy between what the two mean with RAM vs. hard drives has been confusing and angering people for decades as it is. It's just recently that someone got pissed off enough to sue. I'm not sure about the merits of this particular case, but the complaint is rooted in a long-simmering and not unjustified discomfort with the status quo.

I wonder how many of your stances would shift if Apple were not part of this lawsuit.

Ha! These people are suing backward. They claim the one technically correct value is misleading, and want it changed so it is wrong and matches everything else that is wrong. At least that could be the take of anyone wishing to strictly adhere to prefix standards. Apple and others are not being sued for using prefixes incorrectly. Evidently these folks like the common 'misuse' of prefixes.

You're getting closer, but not quite there, because even 1000 binary MB is 1,024,000 KB, which in turn is 1,048,576,000 bytes. So it is worse than you state. Drive vendors say the GB is 1,000,000,000 bytes. The real, binary GB is 1,073,741,824 bytes. That is more than a 7 percent difference. When we get to TB drives, the difference will be worse yet, almost 10 percent. On the old floppies, measured in KB, the difference was hardly noticed.
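The percentages above check out. Here is a short sketch that computes the gap between the decimal (SI) and binary readings of each prefix, just to make the arithmetic explicit:

```python
# Compare decimal (SI) and binary interpretations of the common storage prefixes.
DECIMAL = 1000
BINARY = 1024

for name, power in [("KB", 1), ("MB", 2), ("GB", 3), ("TB", 4)]:
    dec = DECIMAL ** power          # e.g. decimal GB = 1,000,000,000 bytes
    bin_ = BINARY ** power          # e.g. binary GB = 1,073,741,824 bytes
    pct = (bin_ - dec) / dec * 100  # how much bigger the binary unit is
    print(f"1 {name}: decimal {dec:,} vs binary {bin_:,} bytes ({pct:.1f}% gap)")
```

The gap grows with each prefix: about 2.4% at KB, 4.9% at MB, 7.4% at GB, and 10.0% at TB, which matches the figures in the comment.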

Hmm... I'm confused. Books such as Mac Secrets tell me 1,024 MB. But Eugene says 1,000 MB. Oh, and to snoopy, yes, I knew that; I just didn't want to make things complicated. However, thanks for posting, now other people can understand that.

. . . it's just that your "mega" and my "mega" mean two different things!

The discrepancy between what the two mean with RAM vs. hard drives has been confusing and angering people for decades as it is. . .

There should be no confusion if mega always means multiply by 1024^2 when the units are bits or bytes, and multiply by 1000^2 otherwise. I don't think people have any trouble with that today. The ones who might get lost would not be helped in the least by using a different prefix. Nor can I see how a standard would be destroyed by introducing context, as Telomar claims. This prefix is simply word usage, not part of a computer program. I explained it to my son and he had no trouble with it.

The discrepancy between RAM and hard drives will not go away by simply using a new prefix for RAM. It might alert a few observant people to the fact that something is different, but it does nothing to solve the problem. People will still want the numbers used for hard drive capacity to be the same kind of numbers used by the OS. Since two types of prefixes are approved for describing bytes, the OS and hard drives can remain different forever, one using 1024 multipliers and the other 1000.

I must be the most tech-unsavvy guy on this board, because I just don't get all the technical explanations posted. So I can advertise a HD as being 60 GB, but really it's only 56 GB, and that's okay? And I'm supposed to know and accept this as an educated consumer? And what's this about decimal vs. binary?

I use Apple products because they work, and I don't need to be some sort of technical expert to figure things out. From my simple view, 60 GB better be 60 GB - I don't care for well-rehearsed arguments about different measuring systems and standards groups. Someone in the marketing division needs a swift kick in the pants for fibbing.

I believe the root of the problem is that the drive makers have been manufacturing their drives differently. They are making them with fewer bytes, but advertising them as having the same number of bytes through a tricky little marketing loophole.

If you bought, say, an LC II ten years ago and it came with an 80 MB drive, you'd have 80 MB to store stuff on. The hard drive would have a capacity of 83.89 million bytes, which is equal to 80 MB after converting. Life was good.

Then drive makers realized that if they eliminated those extra 3.89 million bytes, they'd still be able to advertise their drives as being 80 megabytes, because "megabytes" by definition is "million bytes," just as a "megaton" is a "million tons." Computers measure things differently, but the drive makers figured they'd still be on solid legal ground; they'd save about 5% on the cost of each drive while charging the same, and no one would really notice those extra three or four MB anyway. I think they could also pass the blame onto computer makers, because they're the ones pushing this whole weird system based on eights and powers of two. Hard drives have a certain number of bytes; it's simple. They just blame the lost megabytes/gigabytes on computer companies because of "formatting," and it appears to be working, because people are suing the computer makers instead of the hard drive manufacturers.

So to recap, ten years ago a hard drive company's "80 MB drive" had 83.89 million bytes, which the computer tells you is equal to 80 MB. Nowadays, if a company were to make an "80 MB drive," they'd put 80.00 million bytes on it, and the computer would tell you that it's equal to 76.29 MB. They are not the same size hard drive. The hard drive makers have just changed the way they make their drives and how they advertise them.
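Luca's recap can be verified with a couple of lines of arithmetic (the 80 MB figures come from the post above; nothing else is assumed):

```python
MIB = 1024 ** 2  # bytes in one binary megabyte

old_drive = 80 * MIB        # "80 MB" the old way: 83,886,080 bytes (~83.89 million)
new_drive = 80 * 1000 ** 2  # "80 MB" the new way: exactly 80,000,000 bytes

print(f"{old_drive:,} bytes")              # 83,886,080 bytes
print(f"{new_drive / MIB:.2f} binary MB")  # 76.29 binary MB -- what the OS reports
```

Same label on the box, nearly four million bytes of difference on the platter.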

Thanks - finally some clarity in this thread. So it is about marketing and cost-cutting, and not arcane arguments about decimal vs. binary. Lawyers, full-steam ahead!

GG

No. It IS about decimal vs binary and the definition of the term "gigabyte." Luca Rescigno may be right about older HDDs using the binary units, but it doesn't even matter. I'm not about to go check my old HDDs. You may be surprised how early manufacturers started using the SI prefixes to mean what they actually mean...

Are we really at the point where it's okay to sue somebody for lying when he is really telling the truth?

There is a more legitimate case in suing Apple for telling me my DV stream file is 3.3 Gigabytes when it is in fact 3.6 Gigabytes.

But why report in binary when the world uses decimal in common usage? The mismatch seems deliberately misleading, especially if, as Luca suggests, manufacturers initially built drives to the binary sizes but then switched to decimal advertising as a means to cut costs by reducing drive size while still being able to claim 60 GB instead of the 56 GB the computer reports.

This reminds me of those ads in the back of Consumer Reports - you know, the ones where a company lowers the serving size and then claims 20% fewer calories per serving. Legally and technically accurate, but quite misleading.

First, 60 gigabytes and roughly 56 gibibytes are the same amount of data. Both are accurate. This is the standard. The HDD manufacturers choose to use gigabytes (GB). It's not deceitful, because they are actually trying to conform to the right units. I don't know about you, but I've been conditioned to use multiples of 10. My brain wraps around that more easily.

The problem is that Mac OS X and probably Windows both actually measure in binary Gibibytes, Mebibytes, Kibibytes, and so on. They further obfuscate things by using the WRONG symbols with these units.

And I don't think in binary either, which is why any figure used for general consumption (e.g., advertising, the OS interface, whatever) should be done in decimal. If you want to do binary behind the scenes for the experts, then fine.

There's just a difference between how the computer sees it and how the manufacturer advertises it. The OS says you have "X number of GB available" when they should be using a weird term called "Gibibytes."

The SI prefix "giga" means "a billion"; there's no arguing with that. Unfortunately, the computer world has standardized on "giga" meaning "1024 times 1024 times 1024," or "1,073,741,824," or "a little more than a billion." If the drive makers added those extra few bytes to their hard drives, or if the OS started reporting space with the regular SI prefixes, it would be fixed.
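The second fix proposed above - have the OS report in plain SI units - is simple to sketch. This is only an illustration; the function name and formatting are made up here, not any real OS API:

```python
# Sketch: format a byte count with decimal (SI) prefixes, the way a drive
# label does, instead of the binary units an OS traditionally uses.
def si_capacity(n_bytes: int) -> str:
    """Return a human-readable size using powers of 1000."""
    for prefix, scale in [("GB", 1000 ** 3), ("MB", 1000 ** 2), ("KB", 1000)]:
        if n_bytes >= scale:
            return f"{n_bytes / scale:.2f} {prefix}"
    return f"{n_bytes} bytes"

print(si_capacity(60_000_000_000))  # "60.00 GB" -- matches the label on the box
```

Reported this way, a drive sold as 60 GB shows up as 60 GB, and the discrepancy the lawsuit complains about disappears.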

Isn't it true that we, as a group of posters to this discussion, simply do not agree on what the issue or problem really is? To some, the computer industry is at fault for not sticking to the defined meaning of metric prefixes when applying these prefixes to binary numbers. Others have no problem with use of prefixes in this manner, but fault drive makers for using the normal, defined meaning of a prefix that is attached to a binary number. However, I believe the real issue for most people is that drive makers and computer makers do not use the same convention. I don't think the average person cares whether a kilobyte is 1000 bytes or 1024 bytes, just so hard drives and operating systems use the same definition. If computer makers began using different prefixes for bytes, it would not help at all. The new prefixes simply add a formal blessing to the existing problem, and help ensure that it will never go away.

We can poke fun at those who are suing computer makers all we want, but at the heart of the matter is the fact that a big discrepancy exists in the reporting of hard drive capacity. I would not be surprised to see computer makers agree among themselves to begin advertising hard drive capacity differently. If they all do it, one brand does not have a perceived advantage over another. Doing so then puts some heat on drive makers to conform.

OK, I take a simpler approach. I don't remember this happening before with my other drives. If I buy a drive that says 250GB, I expect the drive to show up as 250GB minus a few hundred MB of formatting overhead. (Not 10GB!) If the unformatted drive shows up as less than 250GB, the specs on the box are misleading and need to be changed. My brother just bought a 250GB drive, and I was racking my brain trying to figure out where the missing capacity went. I think I understand now, but I spent HOURS trying to reformat to get the extra space back.

It really needs to be changed back.

No, you are just wrong.

Here's an example: How many bytes are in a kilobyte?

If you are using straightforward English semantics, then kilo = 1000, so you would say that there are 1000 bytes in a kilobyte. Unfortunately, as far as your computer is concerned, there are 1024 bytes in a kilobyte. The 1024 bytes per kilobyte is the correct number.

Either way, there are 100 billion bytes on your 100GB hard drive. It isn't as if you are losing space, just that your computer counts differently.

The lawsuit is baseless. It will lose. I can't believe that people are suing over this. Their lawyers are so doomed in court that I wish I could get the videos of the case.