
Chris Vaughan writes "The 25th edition of the TOP500 list of the world's fastest supercomputers was released today (June 22, 2005) at the 20th International Supercomputing Conference (ISC2005) in Heidelberg, Germany. The No. 1 position was again claimed by the previously mentioned BlueGene/L system. At present, IBM and Hewlett-Packard sell the bulk of systems at all performance levels of the TOP500. The U.S. is clearly the leading consumer of HPC systems with 294 of the 500 systems installed there (up from 267 six months ago)."

PowerPC is _based_ on POWER. The G5 is basically a modified and scaled down POWER 4 chip.

Apple has concerns other than just raw computing power, and they don't need the features that let you put more than 4 or so processors in one system. POWER itself isn't designed for small applications - engineering workstations are about as low-end as it gets.

BlueGene/L is also much smaller than Earth Simulator. At 65536 processors you get 32 cabinets (2048 per cabinet), while Earth Simulator needs 320 cabinets for the CPUs alone, not including the 65 cabinets for the interconnects. Construction of BlueGene/L is not complete; it will have 131072 processors when fully built.

And here's a link to the actual list [top500.org]. Also interesting is the historical chart of the TOP500 by manufacturer [top500.org], which tells a story in itself -- the decline of Cray and rise of IBM and Hitachi, for one.

It would be great if we could verify Moore's law through some simple stats using the historical data from this Top500 list. For example:

- How many years, on average, did it take for the number ones to drop off the 500 list?

- How many years after the list was first published did it take personal computers to make it into the top 500? To make it to the number 1 spot?

- How many transistors did these computers have? Did that verify Moore's law?
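The first stat is easy to sketch if you have a table of (year, Rmax) pairs for the number-one systems. Here's a minimal back-of-the-envelope version; the figures below are the #1 Rmax values as I recall them (in GFLOPS), so treat them as approximate rather than as actual TOP500 data:

```python
import math

# Approximate (year, Rmax in GFLOPS) for the #1 system -- figures from
# memory, not pulled from the actual TOP500 archive.
number_ones = [
    (1993, 59.7),      # CM-5
    (1997, 1338.0),    # ASCI Red
    (2002, 35860.0),   # Earth Simulator
    (2005, 136800.0),  # BlueGene/L
]

# Fit a single exponential between the endpoints:
# Rmax ~ 2^((year - y0) / doubling_time)
(y0, r0) = number_ones[0]
(y1, r1) = number_ones[-1]
doubling_time = (y1 - y0) / math.log2(r1 / r0)
print(f"Average doubling time: {doubling_time:.2f} years")
```

With those figures, top-end Linpack performance doubled roughly every year - faster than the classic 18-24 month Moore's-law figure, since the top systems gain from more CPUs as well as faster ones.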

Maybe we should measure computer performance in more practical terms. Maybe it should be a function of input and output. For how much work the user puts in, how much work does the computer put out? That is the real point of computing, having a machine do work for us, so perhaps it would help to measure power in more concrete terms.

How do you measure the "output" of computers in terms that actually make sense from a practical perspective, though?

That's opening a whole can of worms really. A computer is just a tool, and this kind of measurement is simply not possible. As another example, take a screwdriver - pretty much anyone would agree that it's a tool that makes life easier, but you simply can't measure the output of a screwdriver.

Computers are similar, and the reason for that is that while computers are machines that process dat

OK, I do realize that you're just joking, but it's actually an interesting question, so here are some thoughts:

1. The PS3 isn't out yet.

2. The top500 list does NOT go and look for systems that should/could/might be included - rather, you have to submit benchmark results yourself. I assume they do check them, of course, but as long as you don't approach them about it, you could have the fastest system in the world, and you'd still not get listed if you didn't submit benchmark results (incidentally, this me

Actually that would be an interesting measure of performance for future computers, to see how quickly they could fully render the planet Earth like the Earth Simulator. 2 planet Earths a second would mean that for every second two full renderings could be completed, which seems like a pretty good rate (real-time for many practical purposes).

What's surprising to me is that Cray used to be synonymous with supercomputers and they now have comparatively few entries.

Why is that surprising in any way? At one time, Ford was synonymous with cars, but today the news is of Ford laying off managers. IBM used to be synonymous with the desktop PC, but with the sale of their laptop division they are now completely out of that market. The Sony Walkman was synonymous with portable music, but now everyone has an iPod.

Cray is just another company that had a great product for a while, but couldn't keep innovating and couldn't keep up when the competition joined the market. Nothing at all surprising about it; it happens all the time.

While Linpack may be a bit more parallel than some supercomputer apps, it's made up of real workloads. This means that for some cross-section of the supercomputer market Linpack is a good measure of a system's performance. And of course the people who are given millions to buy a supercomputer generally know the differences between architectures and why they would want one system over another.

It all depends on the system architecture and the type of problem being solved. Certain problems will adhere better to certain architectures and thus allow for a smaller gap between the theoretical and actual performance. The gaps can also be inherent in the architecture itself (e.g. communications bandwidth like you said).

Uh, for purposes of Top 500 List classification, they are all solving the same problem -- the High Performance Linpack benchmark (solving a dense system of simultaneous equations via Gaussian elimination with partial pivoting). Granted, I believe some variance in the size of the arrays is allowed, giving more massively parallel machines that can handle larger arrays an advantage.
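For anyone who hasn't seen it, the kernel of what the benchmark times - Gaussian elimination with partial pivoting, then back substitution - looks roughly like this tiny sketch. (Real HPL is blocked, distributed across nodes, and built on tuned BLAS; this is just the textbook algorithm.)

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.

    A is a list of row lists, b a list of floats; both are modified
    in place. Returns the solution vector x.
    """
    n = len(A)
    for k in range(n):
        # Partial pivoting: bring the row with the largest |A[i][k]|
        # to the pivot position for numerical stability.
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        # Eliminate the entries below the pivot.
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x
```

For example, `solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])` returns the solution of 2x+y=3, x+3y=5. The O(n^3) elimination loop is why Linpack is so flop-heavy and relatively friendly to parallel machines.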

That fact combined with the large number of IBM-based systems on the top 100 list really makes it look like IBM is dominating this sector of the market.

You know what data is always missing from this list that we'd all like to see? The cost of the systems. Although, I suppose if you're looking at building the most powerful computer system on the planet, cost might not be your first consideration...

Remember, the goal of BlueGene is to build very dense systems. Not only do you have to factor in the cost of the system, but also the cost of the facilities. This includes construction or renovation of the facilities to handle the power and cooling requirements of these behemoths. BlueGene/L in its current incarnation uses 32 cabinets for its processors, while Earth Simulator comprises 320 cabinets for the CPUs (plus an additional 65 for interconnects).
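The density gap behind those cabinet counts is striking. A quick back-of-the-envelope check, using the 65536-processor figure for BlueGene/L quoted elsewhere in this thread and Earth Simulator's commonly cited 5120 processors (so take the exact numbers as approximate):

```python
# Processors per cabinet, from the figures quoted in this thread:
# BlueGene/L: 65536 processors in 32 cabinets.
# Earth Simulator: 5120 processors in 320 CPU cabinets.
bluegene_l = 65536 / 32   # 2048 processors per cabinet
earth_sim = 5120 / 320    # 16 processors per cabinet

print(f"BlueGene/L: {bluegene_l:.0f} CPUs/cabinet")
print(f"Earth Simulator: {earth_sim:.0f} CPUs/cabinet")
print(f"Density ratio: {bluegene_l / earth_sim:.0f}x")
```

That's roughly a 128x density advantage for BlueGene/L - though of course an Earth Simulator vector CPU does far more work per clock than a BlueGene/L core, so density isn't the whole story.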

Here's a list of things I would do if I had access to one of the systems on that list:

- See how long it takes Windows ME to boot
- See how long it takes pico to open
- run 'top'
- play a wicked ass game of pong
- bitch about having so many CPUs and only 2 USB ports
- see if I could get a video card with dual display support
- fire up a spreadsheet and make a wicked ass multiplication table going really far (like 10X10!)

/had an original IBM PC
// bored

With all the more pressing issues for which supercomputers can be used, I don't believe that China is using the 18th fastest computer [top500.org] for weather forecasts. At least not the ones they publish in Xinhua [xinhuanet.com], anyway. Is there any verifiable way to tell what that machine really does?

[...] I don't believe that China is using the 18th fastest computer for weather forecasts.

Why not? Weather forecasts *do* require lots of number crunching power, and when you've got a big country with more than a billion citizens, I'd say there are sound economic reasons to want good and accurate forecasts, too.

Or do you know something we don't? I wouldn't be *surprised* if it turned out that China (a dictatorship, after all) really did use the system for more sinister purposes, but I'm

Because they don't have a supercomputer listed higher, or anywhere in that range, applied to nuclear research. Either weapons, or power. Which is a higher priority for the Chinese government than weather research. In fact, I can't find any Chinese supercomputer which looks like it's the one (or among the ones) they use for nuclear research. That's the kind of paranoia trigger known as "conspicuously absent".

That's a good point. However, looking through the FAQ for the top500 list, it seems as if the list editors only include systems that data is submitted for, anyway.

So, in other words, I'd say it's likely that China (as well as several other countries, most likely including the USA) *does* have faster systems which simply aren't included in the list at all, which in turn makes it more likely again that the one that *is* included is indeed used for the purpose given - weather forecasts / research.

No, your analysis is reasonable. My analysis is paranoid: there are more people thinking about secret Chinese supercomputers, and their security threat, in this thread, than in Bush's cabinet, or anywhere in the White House - or Congress, for that matter.

Actually, I don't think Dubya is protecting us from China any more than required for appearances. His policies have boosted China's economy at our expense. His blown Korea policies have strengthened China's hand. He's let Sudan off the genocide hook, while Sudan sells all its oil to China. Even little details show Dubya to be in China's corner: the Republican eVoting programmer in Florida, who blew the whistle on the FL Republican Representative who requested the eVoting security cracks, says the go-between

These rankings are based on Linpack doing traditional operations like solving linear equations, so supercomputers like the Cray MTA [cray.com] aren't even listed, even though for some grand challenges they destroy everything else - for example when doing dynamic mesh weather simulations. Each processor on the memory grid has 128 hardware threads, and the active thread switches every cycle, so the huge latency of a memory fetch is tolerated. This lets it have a unified memory model and still sustain extremely high throughput.

And it happens plenty on "traditional" supercomputers. That's why you stick a fast interconnect on them, like myrinet or infiniband and don't use just ethernet (which some of the high machines there do).

Given that your mesh isn't going to morph that fast in most physics codes (that may not be true for weather codes), you can afford to just run static and then pause every few (minutes, hours, days) and re-work your mesh to adapt to changing conditions.

Supercomputer #72 [top500.org], at the Chinese Academy of Science, comes from Lenovo. I wonder how far ahead IBM's sale of its "PC" business to Lenovo has put China's computing industry. And I wonder just what kinds of simulations [sorgonet.com] they actually run on the beast.

Does anyone else think it's weird that a graphic design company like Animal Logic [top500.org] has 2 of the top 500 supercomputers? Or whatever they call Gaming Company B in China, with 2.2 TFLOPS at each of #150, 151 & 153, or their Taiwanese counterpart at #152?

No, and I'm puzzled by the "Gaming Company [B]" name of the 4 machines, apparently identical, in China and Taiwan - I didn't think that kind of partnership was possible, under their political antipathy. Which makes me suspect that the machines are being used for something other than just games - PC gaming, that is.

So, unlike five years ago most of the large supercomputers (published on the list) are used for scientific research rather than making and maintaining big bombs. Personally I'd say that's real progress, but I have to thank the government for keeping the industry going through what were otherwise so

I didn't bother going any further than the top 10, but as a co-worker put it: "wow... none are running Windows and most are using Linux." Not to sound like a total Linux geek or Windows basher, but none of the top 10 are running Windows. Reason: Microsoft charges much more for multiple-processor support in its OS.

God, this makes me miss *lisp and my CM-2.
With supercomputing being taken over by Big Blue and the like, there seems little room anymore for the smaller, more flexible players like Thinking Machines [wikipedia.org].

Games can easily be programmed in java. I remember when that was all you could find on the web. Nice, simple, easy to play games based on java.

Now everyone thinks that Flash is the way to go because they can throw in more eye candy. Apparently the numerous comments on game playability that come up when talking about game design only apply to console or pc games but not Flash games.

While yes, I do dislike Flash, I have seen one or two pages which use it in a great manner to enhance. Unfortuantel

Personally, I don't think that human brains are binary-based, logic-gate-controlled computation machines, and this difference accounts for why we have so much difficulty developing strong AI on our computers.

I do believe, however, that we will eventually "crack the code" of the fundamental architecture of our brains, and once we do, we will re-design our computers accordingly and finally achieve strong AI.

I also believe that our currently architected computers will play a key role in helping us crack that code.

Despite all this computing power, computers still can't think like humans. They can perform calculations faster, but can't perform optimized heuristics or even form optimized heuristics like humans.

That's probably because brains use a completely different architecture than digital computers. Neurons connect in a highly parallel fashion, with trillions of simultaneous connections arranged in 3D directly between various parts of the brain. Even with the 1000000X speed advantage of computer logic, the number

Wrong, Alpha is still fourth [top500.org]. With half the processors of the 3rd competitor (5120 vs 10000) it achieves more than half of its Rpeak. I guess there are a lot of factors here (interconnect use, etc.), but it'd be interesting to see a performance vs. number-of-CPUs chart.
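That per-CPU comparison is easy to compute from the published Rmax and processor-count columns. A sketch using two figures mentioned in this thread (approximate values, in GFLOPS):

```python
# (name, Rmax in GFLOPS, processor count) -- figures as cited in this
# thread for the June 2005 list; treat them as approximate.
systems = [
    ("BlueGene/L", 136800.0, 65536),
    ("Earth Simulator", 35860.0, 5120),
]

per_cpu = {name: rmax / cpus for name, rmax, cpus in systems}
for name, gflops in sorted(per_cpu.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {gflops:.2f} GFLOPS per processor")
```

The vector machine comes out several times ahead per processor, even though the massively parallel machine wins on aggregate Rmax - which is exactly why a performance-per-CPU chart would tell a different story than the raw rankings.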

Actually, it is really hard to figure out. The top 500 list is only of those people that have bothered to register their systems with the top 500 list in the first place. If you have a system that beats all of these systems, but don't register with the group, you don't get listed.

What good is syndicated news if there are no links to the syndicated information? What it has led to is a bunch of people who should be modded redundant, all providing links to the info that should've been included in the summary. Oh well, it's free news and entertainment. I can't complain. I can laugh at the irony of a site that often lacks logic even though its content is mainly about science and technology, for both of which logic is pretty much the foundation -- but I can't complain.

A good business model for news corporations is to give away the video free, as marketing for premium bundles. Like timecode-correlated transcripts, and hyperlinks to other coverage. Then news analysis corporations can buy the structured multimedia data. And come up with their own business models for funding those purchases. Like subscriptions for political organizations, research institutes, or Michael Jackson fanclubs.