
It seems to me a 75x increase in power efficiency should be worth much more than $20M to nVidia (or any competitor), so why does DARPA need to fund this? This seems like exactly the kind of work that doesn't need DARPA money. DARPA should spend money where it is not clearly economical for others to do so.

Maybe DARPA also wants the thing to withstand radiation. Maybe they want it to have so little computational power that it would not sell in the market. Maybe they want every component to have been made in the US which the market won't care about. Maybe they have other restrictions. You don't know.

Maybe they want it to have so little computational power that it would not sell in the market.

That's an interesting point. How much power would a 300 MHz Pentium of 1998 consume if built with modern process technology? That thing could do quite a bit of computing but, as you say, wouldn't survive in today's PC market. For embedded applications, though, it would be great, not least because you get the comfort of a full computer.

A 75x increase is absolutely needed if they are going to continue to make faster supercomputers. IBM is trying to make a 1,000-petaflop supercomputer before the end of this decade. At a gigaflop per watt, that supercomputer would require a gigawatt of power to run. Now one is talking about a nuclear power plant to generate enough power to run the computer. At a dollar per watt, one is also talking about spending a billion dollars on the power plant, which is a lot more money than they plan on spending on the computer itself.
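A quick sanity check of those figures (using the numbers in the comment above, not anything from TFA):

```python
# Back-of-the-envelope check: a 1,000-petaflop (1 exaflop) machine at
# 1 gigaflop/watt, with power capacity costing roughly $1 per watt.
# All three inputs are the comment's assumptions, not measured values.
target_flops = 1_000 * 1e15          # 1,000 petaflops = 1e18 flops/s
efficiency_flops_per_watt = 1e9      # 1 gigaflop per watt
cost_per_watt = 1.0                  # dollars per watt of plant capacity

watts = target_flops / efficiency_flops_per_watt
plant_cost = watts * cost_per_watt

print(f"Power draw: {watts / 1e9:.0f} GW")       # 1 GW
print(f"Plant cost: ${plant_cost / 1e9:.0f}B")   # $1 billion
```

So the arithmetic holds: without a big jump in flops/watt, the power plant costs as much as (or more than) the machine.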

It seems to me a 75x increase in power efficiency should be worth much more than $20M to nVidia (or any competitor), so why does DARPA need to fund this? This seems like exactly the kind of work that doesn't need DARPA money. DARPA should spend money where it is not clearly economical for others to do so.

You were under the assumption that we live in a purely capitalistic society? My mistake. Even countries that practice extreme capitalism shift back to subsidies and support when it comes to certain things.

Though, usually those things involve essential services like fire-fighters, road maintenance, and policing...

DARAPA should spend money where it is not clearly economic for others to do so.

Well, that's good advice, and I'm sure they'll take it over at DARAPA. On the other hand, DARPA has certain goals, whatever they might be, and if this is the most economical way to achieve them, then it's money well spent from their perspective. If you're going to have a thing like DARPA, then you need to permit it to do things like this if you want it to be efficient. On the other hand, if you're going to do things like this, you need substantial oversight in place to prevent abuse. And on the gripping hand

There is no way anyone can make an intelligent decision on what to purchase today. There are way too many variables, so one can only guess. I am sure that a gigaflop per watt is much better than the computer I am typing on now. So if they can get to 75 gigaflops per watt, why would anyone buy a computer again, when a dumb terminal could be better? But that decision would take a lot more knowledge than I have, and I know a lot more about computers than the average person.
I contribute to a project sponsored

There is no way anyone can make an intelligent decision on what to purchase today.

Why not? If I'm shopping for an Android tablet, say, there's only a small handful of credible processors, and there's only so many screen technologies and manufacturers, and I can gauge a manufacturer's past quality and hope that it will serve as a useful predictor of future performance. I can look at their financial statements and find out if they have been purchased by vulture capitalists. And you can do all of this from a free or nearly-free computer. (I've given away computers more than adequate to the

Practicality. When talking about energy consumption, it's usually given in watts because the practical implications are time-dependent. You've got to account for the time it takes to run the calculations (which may be time-critical -- you don't want your amalgamated radar data on a five-minute delay) and you need to know the wattage to calculate cooling requirements. While operations/joule and flops/watt are equivalent, it's easier to think in terms of the latter.
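To see why the two units are the same quantity, note that a watt is a joule per second, so the "per second" cancels. A quick demonstration with made-up numbers (a hypothetical 1-teraflop device drawing 250 W):

```python
# Unit check: flops/watt equals floating-point operations per joule.
# The numbers here are illustrative assumptions, not measurements.
flops_per_second = 1e12      # a 1-teraflop device
power_watts = 250.0          # 250 W = 250 joules per second
duration = 10.0              # any duration works; it cancels out

flops_per_watt = flops_per_second / power_watts
ops = flops_per_second * duration    # operations done over the run
joules = power_watts * duration      # energy used over the run
ops_per_joule = ops / joules

print(flops_per_watt == ops_per_joule)  # True
print(f"{flops_per_watt:.1e} flops/watt = ops/joule")  # 4.0e+09
```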

I work with high-performance computing in physics -- all of my peers know the difference between energy and power. Sometimes people use "flops" as an abbreviation for "floating point operations" ("It takes XYZ flops per site to compute the Wilson Dirac operator" or "The flops/bytes ratio describes the balance between processing and communication in the algorithm") without the "per second".
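The flops/bytes ratio mentioned there is easy to illustrate. Here's a generic example using single-precision SAXPY (y[i] = a*x[i] + y[i]) rather than the Wilson Dirac operator, whose actual counts aren't given above:

```python
# Arithmetic intensity (flops per byte) for single-precision SAXPY.
# This is a stock textbook example, not the Wilson Dirac operator's
# numbers; it just shows what a flops/bytes ratio measures.
flops_per_element = 2            # one multiply + one add
bytes_per_element = 3 * 4        # read x[i], read y[i], write y[i] (float32)

intensity = flops_per_element / bytes_per_element
print(f"{intensity:.3f} flops/byte")  # 0.167: heavily bandwidth-bound
```

A low ratio like this means the kernel's speed is set by memory bandwidth, not by how fast the ALUs are -- exactly the processing-vs-communication balance described above.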

We passed 1e+07 operations per kWh in 1965.
We passed 1e+08 operations per kWh in 1971.
We passed 1e+09 operations per kWh in 1976.
We passed 1e+10 operations per kWh in 1981.
We passed 1e+11 operations per kWh in 1987.
We passed 1e+12 operations per kWh in 1992.
We passed 1e+13 operations per kWh in 1997.
We passed 1e+14 operations per kWh in 2001.
We passed 1e+15 operations per kWh in 2008.
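You can back out the doubling time from the endpoints of that table:

```python
import math

# Doubling time implied by the first and last entries above
# (operations per kWh vs. year).
y0, e0 = 1965, 1e7   # passed 1e+07 ops/kWh
y1, e1 = 2008, 1e15  # passed 1e+15 ops/kWh

doublings = math.log2(e1 / e0)            # ~26.6 doublings
doubling_time = (y1 - y0) / doublings
print(f"{doubling_time:.2f} years per doubling")  # 1.62
```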

Energy efficiency has consistently doubled approximately every 1.6 years, so if we are at ~16 GFLOPS/watt right now, then we will blow past DARPA's target early in 2016... just a little over 3 years from now.
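Taking the comment's figures at face value (the ~16 GFLOPS/watt starting point is its assumption, not TFA's), the projection works out like this:

```python
import math

# How long until we pass DARPA's target, if efficiency keeps
# doubling every ~1.6 years? Starting point is assumed, per above.
current = 16.0        # GFLOPS/watt today (assumed)
target = 75.0         # GFLOPS/watt (DARPA's goal)
doubling_time = 1.6   # years per doubling

years = doubling_time * math.log2(target / current)
print(f"~{years:.1f} years")  # ~3.6 years
```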

Energy efficiency has consistently doubled approximately every 1.6 years, so if we are at ~16 GFLOPS/watt right now, then we will blow past DARPA's target early in 2016... just a little over 3 years from now.

It's not guaranteed, and that's the whole point of this contract: to ensure that we actually reach that point. Also, the article talks about chips for sensor systems, not GPUs or similar.

Out of curiosity (and I ask because I genuinely don't know), how many flops/watt do modern smartphones manage? What about the GPU coprocessors in them? Modern GPUs are great, but they're not even optimized that strongly for power consumption.

I would think GPUs actually get more work on peak efficiency, because top cards have been consuming hundreds of watts, and workstation and compute cards in particular will often run at 100% for a big render or compute job. Smartphones are much more about dynamic power: adjusting clocks and voltages, plus tons of sleep modes. If you're doing 100% load on all cores, none of that has any effect. Sure, they care about power usage at peak too, but I don't think more than GPUs do.

As things usually do, the results of this research will eventually trickle down to desktops, laptops and mobile devices, resulting in either lower power consumption or the same power consumption at higher performance -- either way, it's a plus. I just wish the contract had been given to someone other than NVIDIA; it would be nice if the results of the research were released completely for free to the public instead of being patented up the wazoo. But alas, NVIDIA has so much experience in these things that it just makes sense to hand it to them if you expect results.

In the spirit of the flame war you may have begun, you do know that AMD generally has faster chips?

I don't know that. I haven't seen any extensive research on the topic, and I don't have the time or money to reach such a conclusion myself. Also, I am not taking any stance whatsoever on which of the two would've been better suited for the task at hand; I'll leave waging such silly flame wars to you.

re: I prefer it was spent on computing, rather than explosions..
Don't forget that they can use the improved computational power and that improved computational power efficiency to simulate and design better explosions! But look at how much innovation comes about from war and war/defense funding. (It's not hard to search for it). Heck, even canned food had its research and development funded by Napoleon to help the French military.
And it cuts both ways: any innovation can be put to use in the aid of defense.

I'm no genius in development or marketing, but if that could have been done, it would have already.

I don't see in TFA where it says how long they have to complete this project. That makes one wonder whether (based on Moore's Law) they'll have it out just one week earlier than all competitors for that small lump of change.

Note that these cards are slightly different from consumer graphics cards. They have more double-precision pipelines, because scientific computing cares more about that kind of math. They are also much more expensive than consumer cards. The underlying chip design is similar to the 600-series graphics cards; you can think of it as a modified version optimized for math, since the 600 series came out first and is produced in higher volume.

My theory is that by putting investment capital into the tech, they get a "mob-like" hand in the technology. Yes, it doesn't seem like a good investment of DARPA's money now, but the favor WILL be repaid by nVidia at some point, probably to the tune of much more than $20 million. GPUs have incredible potential for processing power even today, and DARPA is one of the government divisions I would expect to need such power for various projects.