
An anonymous reader writes "Thousands of PC users are being called on to donate their spare CPU cycles to help create a massive grid computing engine to process terabytes of radio astronomy data as part of theSkyNet project. It will be used for, among other things, processing the huge amount of data expected to flow off Australia's forthcoming Square Kilometre Array telescope."
One can only assume that "other things" will include achieving sentience and finding John Connor.

FRY: This is great, as long as you don't make me smell Uranus. Heh heh.
LEELA: I don't get it.
PROFESSOR FARNSWORTH: I'm sorry, Fry, but astronomers renamed Uranus in 2620 to end that stupid joke once and for all.
FRY: Oh. What's it called now?
PROFESSOR FARNSWORTH: Urectum.

> For the love of everything, can we stop making shitty references to
> Terminator in computational intelligence stories? There are actually
> people stupid enough to believe that shit. Also, it's not funny.

For the love of everything, can we stop making shitty references to Terminator in computational intelligence stories?

Sure, as soon as they stop naming telescope arrays after the artificially intelligent system which became self-aware and revolted against its creators in the movie Terminator.

OK, I know the telescope array got its name decades before the movie came out, but that's just because they sent someone back in time to change its name from the original, which was "Deep Space Nine Telescope Array".

Unless it's able to send a terminator back in time to warn the newly-awoken skynet of this alternate reality (the unknowing skynet being in the original). Unfortunately for original-skynet, their precious terminator has left that dimension, so its efforts were quite pointless, unless that particular terminator was really annoying...

> My CPU isn't wasting cycles when it's on and idle; it's sleeping,
> conserving energy and generating less heat.

That's wonderful. But you're not conserving energy, in the same way that a car idling in the driveway, as opposed to actually being driven (the whole point of a car), is not exactly doing the environment any favors.

Especially not when that computer you're letting sleep most of the time will be thrown in the garbage 5 years hence because, although fully functional, it can't do the

How can you meaningfully process the data generated by the SKA without imposing on people's downloads? How do they address this with SETI?

If TFS is too much of a summary for you, TFA may sometimes answer your questions. In this case, it does:

Project participants also had a choice of how to participate in SkyNet: Either anonymously through simply having their browsers open on the SkyNet site, or through downloading a dedicated app to run in the background on their PC.
...
"The load on your computer will adjust depending on what you are doing with it. The idea is to have lots of machines each doing a little and adding up to a lot."

Wheeler said users would also be able to set limits on the number of megabytes which travelled to and from their PCs.

The packets of data sent back and forth from theSkyNet to your computer are very small, but they can add up over many weeks of donating to theSkyNet. As a member, you can control how much data theSkyNet uploads and downloads each month by changing the Monthly Network Limit under Manage Account. theSkyNet team are also negotiating with Internet Service providers around Australia to make all traffic to and from theSkyNet ‘unmetered’.

- For some Australian ISPs, it's likely that data related to this project will be unmetered (that is, not counted towards your monthly quota, if you have one); or
- You have an unlimited plan; or if you don't...
- You can limit the monthly data transfer in the software itself

I'm on a 60 GB quota personally but generally only use 35-40 GB of it a month. I've never come close to using it all, so I might as well help out with this and set a ~15 GB/month transfer limit on it, and it shou
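The quota arithmetic the parent is doing can be sketched as a tiny helper. The function name and the 5 GB headroom are made up for illustration; the 60 / ~40 / ~15 GB figures come from the comment:

```python
def safe_monthly_limit_gb(quota_gb, typical_use_gb, headroom_gb=5):
    """Suggest a monthly transfer cap for a volunteer-computing client
    that leaves some headroom below the ISP quota.
    (Illustrative helper, not part of theSkyNet's actual client.)"""
    spare = quota_gb - typical_use_gb - headroom_gb
    return max(spare, 0)

# 60 GB quota, ~40 GB typical use -> ~15 GB cap, as in the comment
print(safe_monthly_limit_gb(60, 40))   # 15
```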

Zooniverse seems much more like distributed human analysis, kind of a Mechanical Turk. Why not BOINC, which already exists as a distributed computing platform? Being on BOINC would give them access to tens of thousands of computers.

Remove all YouTube videos that contain any of the following:
- rick
- a cat
- a black person talking about rapists
- a crossdresser
- lipdubs with fat chicks wearing clothes that are too tight or too sexy for them
- hot chicks talking about their emotions/hope/career/fashion tips, thinking that because they have a lot of subscribers people care about what they say, while actually most subscribers are just sick old pervs doing the ol' nasty while watching these videos in their basement

Then use all the processing power suddenly available on YouTube's servers, and give us a break with screensaver processing à la SETI.

Thinking of that, scratch the whole list above and just remove videos with hot chicks that have a lot of subscribers but are seldom watched to completion because viewers are "done" before the hot chick is... and there you go, plenty of CPU available, and probably a few more bucks will find their way to those single moms working the pole to pay their student loans.

> the end user installs a client, BY CHOICE, on their computer and then allows whatever to be run on the spare cycles

The problem is that for most commercial projects, it's the storage that hurts, not the processing, and that is where the money is. Projects with huge processing needs are usually poorly funded (like weather stuff or SETI) and couldn't really afford to pay much.

What would be awesome would be a technology like MAID but distributed over independent nodes, with enough redundancy to allow per

With modern CPUs generally slowing down to save power and reduce heat output, are spare CPU cycles really spare?

I definitely notice fans speed up when the CPU is busy. Does this type of grid software take that into account and use only genuinely idle cycles, or does it keep the CPU powered up even when no user is doing anything 'important'?

This isn't exactly new. Sure, modern CPUs have clock switching, but systems since the 80486 (possibly earlier) have had halt instructions that allow the processor to stop doing work and save power until the next interrupt.

OTOH, I know several people who run distributed computing software on their computers during winter, specifically because it produces heat that would otherwise have to be provided by a fan heater (because they don't have AC), so it's not necessarily wasting energy, but it will cause your comp

I run BOINC on Linux. BOINC is "niced" to an idle priority, meaning CPU time is only granted to it if there's nothing better to do. In addition, I use the ondemand frequency governor, which I have instructed to ignore "niced" processes when determining whether to spin up the CPU.

As a result, yes, BOINC only uses spare CPU cycles and not too many of them, either.
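A minimal sketch of the "niced" idea the parent describes, in Python on a POSIX system (BOINC's own scheduler is more involved; this just shows the priority drop, and the function name and workload are made up):

```python
import os

def run_at_idle_priority(work):
    """Run a background task at the weakest 'nice' priority, so the
    scheduler only grants it CPU time when nothing else wants it.
    Sketch only; BOINC itself manages priorities internally."""
    # os.nice() ADDS to the current niceness; +19 is the weakest
    # priority on Linux. Unprivileged processes can only raise it.
    os.nice(19)
    # The parent's governor tweak corresponds (on Linux, with the
    # ondemand governor) to the sysfs knob:
    #   /sys/devices/system/cpu/cpufreq/ondemand/ignore_nice_load
    # which stops niced load from triggering a frequency ramp-up.
    return work()

result = run_at_idle_priority(lambda: sum(i * i for i in range(1000)))
```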

My current machine draws something like 360 watts when the CPU and GPUs are busy, but only 217 watts when the system is idle. (Time to trot out the Kill-A-Watt again.) My computer room noticeably heats up if I run an OpenGL screensaver, distributed.net client, or WCG client.

You will increase your household energy usage (and add to your summertime air conditioning bill) if you run their client 24x7. Information may want to be free, but that doesn't fit the power company's profit model.

I'd happily donate my CPU cycles to them. I have 4 cores here sitting doing mostly nothing, and I fully agree it is for the most part completely wasted silicon for the 23 hours a day I don't play games.

But I will have to send them my power bill. While my processor cycles are free, the energy is not. The difference between a computer sitting idle all year and one running the processor at full pelt can easily be $100+, from a back-of-the-envelope calculation, and the GPU can add about as much again.

Did a few calculations myself for the UK. Based on figures I pulled from a Bit-Tech review of the Core i7-990X, I figured the difference between the CPU idle and flat-out (running Prime95) was 122 W. I then pulled up electricity costs for living in London on British Gas's standard-rate tariff, and worked out how much extra it would cost per hour, and per year overall, to run a CPU-hogging processing client versus leaving the CPU idle during the day and during the night-time cheap elec
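A back-of-the-envelope version of that calculation: the 122 W delta is the Core i7-990X figure from the comment, but the flat tariff below is an assumed placeholder, not British Gas's actual rate (which the parent notes is also time-of-day dependent):

```python
# Back-of-the-envelope extra cost of running a CPU flat-out vs idle.
power_delta_w = 122       # extra draw under full load, watts (from comment)
tariff_per_kwh = 0.14     # GBP per kWh -- assumed placeholder rate

extra_kwh_per_hour = power_delta_w / 1000          # 0.122 kWh/hour
cost_per_hour = extra_kwh_per_hour * tariff_per_kwh
cost_per_year = cost_per_hour * 24 * 365           # 24x7 crunching

print(f"extra cost: £{cost_per_hour:.4f}/hour, £{cost_per_year:.2f}/year")
```

With these assumed numbers that comes to roughly £150 a year, which is in the same ballpark as the "$100+" estimate above.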

Way to waste at least 20% of the CPU power, lazy programmers. I'll take my CPUs to something that actually uses them efficiently, like Folding@home, which is optimised, as opposed to interpreted (or even compiled) Java bytecode being pushed like molasses through a straw.

Or you could just download the native binary version. The Java version was designed specifically for people who want to contribute but are unable or unwilling to install software on their computers. FTA:

Project participants also had a choice of how to participate in SkyNet: Either anonymously through simply having their browsers open on the SkyNet site, or through downloading a dedicated app to run in the background on their PC.

Way to stand by your convictions there, AC. Oracle's JVM sucks, and if the programmers were using C++ properly it would run rings around Java. The only reason Java is faster in some of those situations is that it papers over rubbish programming by the developers by enforcing its training wheels.

How is Java at producing SIMD code, or code that runs on GPUs? Java can outperform poorly written C code, which does form the bulk of scientific code bases. Outperforming well-optimized, processor-specific code? Not really.

...are buried deep in the website for some weird reason. They are available for Windows and "Macintosh". No generic *nix version so far, which struck me as pretty bad given the demographic generally interested in helping out with this sort of project.

CPU cycles are not "spare"; when a computer has nothing to do, it just halts. This saves power.

Using your "spare CPU cycles" makes the CPU use more power, it is by no means free.

This is true for other things too, like ads using Flash animations. I always find it ironic to see them on sites like TreeHugger [treehugger.com], which is full of flashy animations. I would expect a green site to use mostly static HTML and text-based ads to reduce the carbon footprint of all its viewers.

SKA [skatelescope.org] - The SKA will give astronomers insight into the formation and evolution of the first stars and galaxies after the Big Bang, the role of cosmic magnetism, the nature of gravity, and possibly life beyond Earth.

SETI [seti.org], the Search for Extraterrestrial Intelligence, is an exploratory science that seeks evidence of life in the universe by looking for some signature of its technology.

I remember SETI always having issues with work units. There weren't enough so a bunch of users got the same work units. Found that to be a turn-off...didn't have that cozy feeling of actually contributing anything, as with other projects. Has that been worked out?

Also, didn't SETI also want to make use of the Australian array? What's the status of that (I haven't been following it)?

Yeah, people keep bitching about a software glitch that caused the same work to be sent out during the first weekend we were in operation. As someone close to the SETI@home team: please stop bitching about a bug that's been fixed for 11 goddamn years!

> Yeah, people keep bitching about that first weekend where a
> software glitch caused the same work to be sent for the first
> weekend we were in operation.

Nobody was bitching, so chill, bro! I was not aware that this was merely a bug. My impression from back then (it's been a while) was that there was simply not enough data from Arecibo available, due to other work being done with the radioscope (is that the correct term?). If that is not an issue anymore, then great!

> And when you're not, you're contributing to one of the most
> significant discoveries since fire.

All romance aside... purely from the distances involved (assuming a radio signal indicating 'intelligent life'), it would certainly be a very exciting discovery (for a while), but not necessarily the 'most significant'. Until we get there (or they here)... even just by radio contact, never mind physical, we get nothing out of it other than knowing we're not the only guys around. And that's already a given anyway

Until we get there (or they here)...even just by radio contact, nevermind physical, we got nothing out of it other than knowing

Hence why I called it a discovery. One could have said the same thing about fire. It, after all, has been around ever since there was a high concentration of oxygen in the atmosphere and woody plants on the land. That's several hundred million years at least. The human innovation was learning how to use it.

Similarly, SETI isn't just about discovering that we're not alone, but also how to use that. If you can detect an alien civilization, then the possibility exists of not only being able to communicate

> If you can detect an alien civilization, then the possibility exists of
> not only being able to communicate with them, but also trade
> knowledge.

I'm completely with you on that. But it's simply gonna take a while. It's unlikely that the first contact will be "Contact"-style ("Jackpot!"), where we get all kinds of wonderful things sent to us right away. Chances are we detect something at some point, and then it will take a few decades of back-and-forth communication, if we even have a lan

I'm happy about it. I only really use SETI@Home because I want to contribute to astronomy with my CPU cycles, and it's the best of the bunch (I found Einstein@Home a little flaky in terms of work unit updates, and for some reason never saw the appeal of MilkyWay@Home). If my cycles could do something more useful for SKA, I'd definitely consider moving over.