G4 Macs ... A Super Computer?

I've heard from a friend that Apple G4s cannot be exported from the US because they are considered "Super Computers". Now, we've all heard Apple refer to them as that. You have to admit that 11.5 Gflops is pretty amazing for a computer under $5000. So, has anyone else heard this?

Yeah, it's probably garbage. Although it's really not possible to compare dedicated processing units like the nVidia GPUs, because they are not general purpose; they can't be deployed in such a way that the op/s would mean real computing power.

quote:Originally posted by Nikolai(km): Yeah, it's probably garbage. Although it's really not possible to compare dedicated processing units like the nVidia GPUs, because they are not general purpose; they can't be deployed in such a way that the op/s would mean real computing power.

I know it's comparing apples to oranges, but since that's what's done as part of the Apple propaganda, that's what I did as well. Ya might note the quotes I put around "operations".

The PS2 export limitation is real though, and there were even a few stories about this, especially with Iraq in mind.

If I'm not mistaken, the reason that Apple started that "propaganda" was that the old definition of a supercomputer was for the CPU to reach 1 gigaflop. And according to some US trade laws, a "super computer" is considered arms, and thus could not be exported. Of course, that's kind of stretching it, but nonetheless, I'm fairly sure that's how it came to be.

This article spouting the "Super Computing" abilities of a G4 wasn't from any particular Apple propaganda magazine, but from a PC Magazine or PC World blurb. Though I would consider it some propaganda, yes, they do have a point about the "pure power" of the chip. How many Gflops can an A4-1200 or P4-1800 do? How about $/flops?
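Since the question of $/flops came up: the comparison is just price divided by peak GFLOPS. A quick sketch of the arithmetic, where only the G4 numbers come from this thread; the PC system's price and peak figure are made-up placeholders, not quoted or measured values:

```python
# Dollars per peak GFLOP. The G4 numbers are the thread's claims;
# the P4 entry is an illustrative assumption, not a real benchmark.
systems = {
    "G4 (Apple's 11.5 GFLOPS claim)": {"price_usd": 5000, "gflops": 11.5},
    "hypothetical P4-1800 box": {"price_usd": 2500, "gflops": 3.6},  # assumed
}

for name, s in systems.items():
    print(f"{name}: ${s['price_usd'] / s['gflops']:.0f} per peak GFLOP")
```

Of course this inherits all the problems of peak-GFLOPS numbers in the first place, which is the rest of the thread's point.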

quote:Originally posted by Crackhead Johny: I think the PS2 does something like 7.2 or is it 7.8 Gflops.

IIRC, The GPU of Nvidia's original Geforce was 1.2Gflops and the Geforce 2 GPU was 8 Gflops.

So, as I was saying earlier, according to the supercomputer crap (and yes, Apple started it first, not PCWorld or PC Magazine, IIRC), we shouldn't be allowed to ship just the video cards to anyone either.

It's as bad as the crap about the Intel P3 series making the internet faster. The P4 making the internet faster is mostly crap too, but if it improves MP3 and video compression (how about decompression?), then you could vaguely say yeah.

quote:You have to admit that 11.5Gflops is pretty amazing for a computer under $5000.

You can forrrrrrrrrrrrrrrget about it! You're not going to get 11.5 useful Gflops out of a G4. You're not going to run any kind of normal app (or nuclear simulation or whatever) at 11.5 Gflops. You just don't have enough bandwidth.
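The bandwidth point can be made concrete with a back-of-the-envelope roofline-style model: sustained throughput is capped by memory bandwidth times arithmetic intensity (flops per byte moved). A minimal sketch, where the 1.3 GB/s bus figure is an assumed PC133-era number, not a spec for any particular G4:

```python
# Sustained FLOPS can't exceed memory bandwidth * arithmetic intensity.
PEAK_GFLOPS = 11.5       # Apple's peak claim for the G4e
BANDWIDTH_GBPS = 1.3     # assumed memory bandwidth in GB/s (illustrative)

def sustained_gflops(flops_per_byte):
    """Attainable GFLOPS for a kernel with the given arithmetic intensity."""
    return min(PEAK_GFLOPS, BANDWIDTH_GBPS * flops_per_byte)

# A streaming kernel doing ~1 flop per 4-byte float is nowhere near peak:
print(sustained_gflops(0.25))   # bandwidth-bound, well under 1 GFLOPS
print(sustained_gflops(16.0))   # compute-bound, capped at the peak claim
```

Under those assumptions, only code that does many flops per byte of memory traffic could ever approach the peak number.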

Take a dual P4, count full SSE2, and its marketing numbers will waste the G4. Intel, however, realises that nothing will >>>EVER<<< use every single instruction SSE2 is capable of simultaneously, so they don't publish that number.

Apple does.

More interesting are actual benchmark numbers. If you want to see something, check my system benchmarks: www.geocities.com/deedlitcryogenic/geekclit.htm Add the raw FPU and the raw 3DNow! numbers to get the max real-world "flops" my CPU has been tested at. That's a $120 CPU. My system (minus the dual RAID arrays) costs well under $1k. A dual 1.2 Athlon MP will show real-world performance in the "11.5 gigaflops" range and cost thousands less. (No data on Xeons, sorry.)

Now, as any person in the know will tell you, ops vary from processor to processor, so even "gigaflops" isn't a good judge of processor speed. There's a lot more involved.

Geez.. why are people so bitter about Apple? Did you guys buy the stock at $155 before it fell or something?

The peak sustained performance of the 866.6 MHz G4e chip is something like 11.5 billion floating point operations per second. THIS IS A FACT.

US Export restrictions define anything over some arbitrary number of GFlops as a 'Supercomputer'. THIS IS A FACT.

So yes, the G4 (even the slowest 400 MHz model) is a Supercomputer by US Export Regulations. DEAL WITH IT. It has nothing to do with whether or not you happen to like the computers that use the G4 chip.

So why shouldn't Apple use this fact in their ad campaigns? It sounds good, and it's not a lie. It's called capitalism and the free market. There are truth in advertising laws, but their statement breaks none of them.
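For what it's worth, the arithmetic behind any peak-FLOPS claim is just clock rate times operations counted per cycle. A small sketch backing out what the 11.5 GFLOPS figure above implies at 866.6 MHz; the 8-flops-per-cycle comparison line assumes counting a 4-wide AltiVec single-precision multiply-add as 8 flops, which is my assumption about how one might count, not Apple's published breakdown:

```python
# Back out the "operations per cycle" implied by a peak-FLOPS claim.
CLOCK_HZ = 866.6e6       # G4e clock from the post above
CLAIMED_FLOPS = 11.5e9   # Apple's claimed peak

ops_per_cycle = CLAIMED_FLOPS / CLOCK_HZ
print(f"{ops_per_cycle:.1f} ops/cycle implied by the claim")   # ~13.3

# For comparison: counting a 4-wide single-precision multiply-add
# as 8 flops per cycle (an assumed counting convention) gives only:
print(f"{CLOCK_HZ * 8 / 1e9:.2f} GFLOPS")
```

Which counting convention you pick is exactly where the marketing lives.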

quote: Technically speaking, a GeForce 3's GPU performs more "operations" per second than a Motorola CPU

I seem to recall a lot of people raving about the GPU of nVidia. This, too, is a buzzword that nVidia is using to help sell their product, yet I don't hear people complaining about their use of the "word".

If it's technically a supercomputer, great. Show off! However, don't berate Apple because their marketing department is resourceful.

quote:Geez.. why are people so bitter about Apple? (snip) So stop whining. Please.

It's not whining. Apple is benchmarketing, and people are calling them on it--if someone came in here asking if a PIII really made the Internet faster, Intel would get the same treatment. If there's any special animosity for Apple, it's because they complain about the meaninglessness of MHz while holding up FLOPS as an accurate measure of speed.

As pauli said, it's friggin retarded. Apple does better ads when they focus on people using their computers rather than trying to explain why a superultrafast processor is better than a megahyperfast processor.

quote:I seem to recall a lot of people raving about the GPU of nVidia. This, too, is a buzzword that nVidia is using to help sell their product, yet I don't hear people complaining about their use of the "word".

That's because it's being used in relation to the term "CPU". If someone in A/V claimed "GFORCE IS BETTAR IT SI A *GPU*", it would be another matter.