Bits, bits, and more bits

Can I ask a stupid question? Well, too late for that, but I have been in thoughtful repose about the fancy specification attached to all gaming systems - the number of bits.
Someone, please tell me how a 32-bit processor like the one found in X-Box can outperform a console from 2 years ago (Dreamcast) that claims to be 128-bit graphics. I must be missing some point in the middle, but it seems like the number of bits is a hokey parameter that is meant to impress the J6P in a game store.
Please put your flamethrowers down...I am unarmed...
Jason

Once polygons came into the mix, "bit" became a pretty worthless term, console-wise;
pretty much the only time it carried much weight in the market was the difference between 8-bit and 16-bit systems.

Bits are a worthless technology term - now, most processors and motherboards involve so many different bit-widths that comparison is impossible.
Since game systems all use different types of processors, it's quite possible for an Intel 733MHz Pentium III (X-Box?) to lose out to a 200MHz Hitachi RISC (Dreamcast), a 485MHz PowerPC (Gamecube), or a 300MHz RISC variant (PS2).
Intel's x86 architecture is more than 20 years old, remember.
FWIW, keep in mind that Intel's SSE extended instruction set is technically "128-bit".
Right now, the closest thing to a performance measurement would be fillrate, but outside of the PC world (where there is a common platform), it's extremely hard to find out anything one way or the other: for example, 64MB of DDR RAM sounds fast, but it's a lot slower than 24MB of SRAM, and back and forth and so on.
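Richard's point that raw bus width tells you little can be sketched with some back-of-the-envelope arithmetic. Everything here is hypothetical, chosen only to show the tradeoff; these are not real console specs:

```python
# Peak bandwidth ~ (bus width in bytes) * clock * transfers per clock.
# All figures below are made up for illustration.
def peak_bandwidth_mb_s(bus_bits, clock_mhz, transfers_per_clock=1):
    """Rough peak throughput in MB/s (ignores latency, which also matters)."""
    return bus_bits / 8 * clock_mhz * transfers_per_clock

narrow_ddr = peak_bandwidth_mb_s(64, 200, transfers_per_clock=2)  # "64-bit" DDR bus
wide_sdr = peak_bandwidth_mb_s(128, 200)                          # "128-bit" SDR bus
print(narrow_ddr, wide_sdr)  # 3200.0 3200.0 -- same throughput, different "bits"
```

And peak numbers ignore latency entirely, which is why a small pool of SRAM can beat a big pool of DDR in practice.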
Richard

You've also got to remember that the bits of the CPU can be misleading. The following systems are all 32-bit, but vary in terms of power: the 3DO, Game Boy Advance, PlayStation, Saturn, X-Box, FM Towns Marty, Gamecube, Jaguar, and Virtual Boy. They're all 32-bit, but it's easy to see that the 3DO is much weaker than the Gamecube, for example.
Oh, and for the record, the Dreamcast is 64-Bit, not 128-Bit. The only current system on the market that is 128-Bit is the PlayStation 2.
------------------
Dave

"Oh, and for the record, the Dreamcast is 64-Bit, not 128-Bit. The only current system on the market that is 128-Bit is the PlayStation 2. "
The Dreamcast, when referring to the graphics chip, which is what most of the marketing refers to, is 128-bit, not 64-bit. The Dreamcast used the NEC PowerVR chip, which is 128-bit.
The Xbox uses a variation of the GeForce graphics chip, and it's 256-bit. The CPU, on the other hand, is a 32-bit processor. Speaking of CPU speeds, one cannot compare each CPU MHz to MHz, since they process instructions differently. For example, as most folks might not realize, a Pentium 3 at 1GHz would outperform a Pentium 4 at 1GHz (one doesn't exist, though)... they are not the same chip. Which is one of the reasons Intel makes them available at higher clock speeds than the Pentium 3. Did I lose anyone?
I can not comment on the PS2 or Gamecube.
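The MHz-versus-MHz caveat above boils down to throughput being roughly clock times instructions per cycle (IPC). The IPC numbers below are invented purely to illustrate the idea; they are not measured figures for any real chip:

```python
# Throughput ~ clock * IPC; hypothetical IPC values for illustration only.
def rough_mips(clock_mhz, ipc):
    """Millions of instructions retired per second, to a first approximation."""
    return clock_mhz * ipc

older_design = rough_mips(1000, 1.2)  # shorter pipeline, more work per cycle
newer_design = rough_mips(1000, 0.9)  # deeper pipeline, less work per cycle
print(older_design > newer_design)  # True: same 1GHz, different performance
```

A deeper pipeline trades work-per-cycle for headroom to run at higher clocks, which is why such a design ships at higher MHz.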

Mike_D,
OK, I am with you on this one. But I am still unclear on what the "bit" actually defines. Is it the amount of information per pixel (specifying the color, maybe the texture, etc.), or does it define something else?
Jason

Jason,
A bit, in the simplest terms, is basically like a light switch: it's either "on" or "off". A bit is represented by either a "0" or a "1". The processor uses strings of these bits to determine what it does. A 32-bit processor can process 32 of these 1's and 0's at a time, while a 128-bit one can do 128 of them. But each processor is different in HOW it processes them.
So basically it's all these different combinations of "on" and "off" that make a computer (or game console) operate. My knowledge of HOW they actually WORK is limited, but I think I explained the basics. Anyone care to explain further?
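The light-switch picture can be made concrete with a few lines of arithmetic (purely illustrative):

```python
# A word of N bits can hold 2**N distinct on/off patterns.
for width in (8, 16, 32, 64, 128):
    print(f"{width}-bit word: {2 ** width} patterns, "
          f"max unsigned value {2 ** width - 1}")

# A wider processor chews through more of those 1's and 0's per instruction,
# but as the thread notes, HOW each chip processes them matters just as much.
```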
Mike D.

I understand what a bit is. I guess I just do not understand what the bits are defining in graphics terms. What are those 256 bits doing to make the graphics look as good as they are?
Then again, maybe I do not understand what a bit is...
Jason

The more bits a graphics processor can handle, the more information it can move. And that results in more detailed graphics.
But it doesn't always come down to the bits. Processor speed, memory, and memory bandwidth are important factors, as well as the graphics routines they support. Both the Gamecube and Xbox have Transform & Lighting built in. Basically, the graphics processor can do the geometry and lighting effects, freeing up the main CPU, whereas on the PS2 and Dreamcast, the programmer has to do it in software and it taxes the CPU.
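The Transform & Lighting work described above can be sketched in a few lines; this is the kind of per-vertex math that hardware T&L takes off the CPU. The matrix and light values are toy numbers for illustration, not any console's actual pipeline:

```python
# Per-vertex work that hardware T&L offloads: transform, then light.
# Matrix and light direction below are toy values for illustration.
def transform(vertex, matrix):
    """Apply a 3x3 matrix (rotation/scale) to a 3D vertex."""
    return tuple(sum(matrix[row][i] * vertex[i] for i in range(3))
                 for row in range(3))

def diffuse(normal, light_dir):
    """Lambert shading: brightness is max(0, normal . light)."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
v = transform((1.0, 2.0, 3.0), identity)
b = diffuse((0.0, 1.0, 0.0), (0.0, 1.0, 0.0))
print(v, b)  # (1.0, 2.0, 3.0) 1.0
```

Running this for every vertex of every model, every frame, is the CPU tax mentioned above; hardware T&L runs it on the graphics chip instead.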
Hope this helps!
Mike D.

Mike,
Thanks for the very informative discussion about bits. It is truly unfortunate that we have such loosely defined terms with ambiguous meaning - and more unfortunate that these terms are sometimes used as a benchmark to sell game machines to J6P.
Jason