Background: Most of the time I have low CPU needs, but sometimes I need more CPU power. For example, when I record TV programs with my DVB-T card, converting the recorded .mpeg to a smaller .mkv with Handbrake takes a lot of time (several hours). I could buy better hardware. I bet many others are in a similar situation: they normally have low CPU needs but occasionally need more. We could all buy better hardware. But I also bet that many of us often have a lot of spare bandwidth.

Idea: this is where cloud processing would make a lot of sense! Programs like Handbrake could have an option to upload segments of the input file to a cloud of powerful CPUs that process it quickly and send the data back. That could make economic sense: instead of each of us buying an expensive new CPU that is seldom fully used, we'd pay a small metered amount for cloud processing only when we need it. It would also save time, since the user can rent a lot of CPU power for short periods.

All this leads me to some questions:
1. Don't you all want this too?
2. Are there already any programs for end users that have cloud processing like sketched above built in?
3. Are there already any fairly easy and fairly inexpensive manual ways for users to temporarily set up something like cloud processing? For example, to rent, at low cost, a box with Ubuntu, a lot of CPU power, and popular CPU-intensive programs like Handbrake installed? I'd use a VPN to upload a file, quickly process it on the cloud box, download the output, and then terminate the cloud box.
4. If there's nothing like that for end users today, are there any such projects on the horizon? I think some generic cloud processing service would be most useful: a standardized module of some sort that different applications could implement. The user would then only need one single cloud processing account and would only need to log in to it and allow the specific application to use up some of the "cloud processing credits" the user has previously paid for.
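The manual workflow in question 3 can be sketched end to end. This is only an illustration of the steps, not a tested recipe: the host name, user, file names, and the HandBrakeCLI flags are placeholder assumptions.

```python
import subprocess

def commands(host, src, out):
    """Build the three remote-processing steps as argv lists.
    host/src/out are placeholders, e.g. "user@cloudbox.example"."""
    return [
        ["scp", src, f"{host}:{src}"],                              # 1. upload recording
        ["ssh", host, f"HandBrakeCLI -i {src} -o {out} -e x264"],   # 2. transcode remotely
        ["scp", f"{host}:{out}", out],                              # 3. download result
    ]

def offload(host, src, out):
    """Actually run the steps (requires the rented box to exist).
    Afterwards, terminate the box via the provider's console/API."""
    for cmd in commands(host, src, out):
        subprocess.run(cmd, check=True)
```

The upload/transcode/download steps are separated from their execution so the plan can be inspected (or dry-run) before anything touches the network.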

I realise that this is perhaps only a small part of what you're envisioning, but in the current climate it's unlikely that any business in its right mind would provide the ability to transcode media files.

Renegade: Video transcoding was only one example. I can see many other uses: image manipulation, 3D rendering tasks, complex OCR tasks for a lot of documents, and so on. Basically, any task where (upload time + download time + cloud processing time) < local processing time.
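That inequality is easy to turn into a quick back-of-the-envelope check. A minimal sketch; all the numbers are made up for illustration, and it pessimistically assumes the output is as large as the input:

```python
def offload_pays_off(size_gb, up_mbps, down_mbps, cloud_s, local_s):
    """True if upload + cloud processing + download beats local processing time."""
    bits = size_gb * 8e9
    upload_s = bits / (up_mbps * 1e6)
    download_s = bits / (down_mbps * 1e6)  # worst case: output same size as input
    return upload_s + cloud_s + download_s < local_s

# A 4 GB recording on a symmetric 100 Mbit line, 10 min in the cloud vs 3 h locally:
print(offload_pays_off(4, 100, 100, cloud_s=600, local_s=3 * 3600))  # → True
```

On those assumed numbers the transfers cost about 320 s each way, so ~21 minutes total against 3 hours locally; on a 10 Mbit line the same job would be much closer to break-even.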

Bandwidth might be an issue, but it shouldn't be exaggerated, even for video transcoding. A fairly large number of people have had 100 Mbit connections at home for some time. Far from everyone, of course, but enough, I'd think, for this kind of service to take off.

Cloud transcoding might pose some legal (copyright) problems, but I'm not too sure about that. Couldn't a generic cloud processing service claim to be mere infrastructure? Recording digital TV broadcasts with a TV card is legal where I live, and so is transcoding those recordings on my PC for personal use. And a system could perhaps be built so that the local computer only uploads obfuscated calculation tasks to the cloud. Then the cloud processing service could sincerely say that it can't know or control what the processing is for.

Anyway, if legal risks explain why no company tries it, then a P2P version of the same idea could still be possible.

We already have P2P file sharing and P2P proxies (Tor). I've also seen attempts at P2P cloud storage (though I can't mention a specific example). So why not P2P processing?

We actually already have a large P2P processing system at work in Folding@home. But there, users only donate their CPU cycles to science.

We also already have many examples of limited cloud processing for end users: web apps that let us upload a file for a malware check, sites that convert PDFs to other formats, and so on. But those are limited to very specific tasks that the user can't modify much.

And a system could perhaps be built so that the local computer only uploads obfuscated calculation tasks to the cloud. Then the cloud processing service could sincerely say that it can't know or control what the processing is for.

Don't know where you are, but where I sit the law is pretty definite that you won't automatically be held (criminally) liable for something if you were innocently unaware of it. (Intent is a major factor in criminal proceedings after all.) But that's not an absolute rule. And you are also not allowed to create a deliberate blind spot and then use that as your defense against being charged as an accessory to a criminal act.

The simple fact that you were deliberately obfuscating data streams would be enough to convince the average judge that you were attempting to evade responsibility and culpability by use of a technical dodge. That alone could leave you open to prosecution for a variety of "conspiracy to commit" charges, even in the absence of an actual violation.

I'm not so sure about those legal worries, given that we live in a world where ISPs, commercial proxies, AWS, and P2P projects like Tor and a lot of file-sharing software persist despite the fact that some of their users do unlawful things through those services. I'm not sure why a "processing service provider" in the cloud should be on shakier ground than an ISP.

Anyway, let's put legalities aside. I was more intrigued by the basic tech idea and why we don't have it built into applications and operating systems already (it would make sense in desktop Linux, I'd think, sharing CPU cycles with the community).

To focus on a perhaps better example, imagine a thousand people, all with high-speed bandwidth. Each of them now and then dabbles in CPU-intensive 3D rendering. When they render, their CPU is maxed out, but the job still takes many hours. Then their CPU and bandwidth idle a lot until weeks later, when they need the CPU for rendering again. Here it would make sense for them to band together in a P2P processing system: each user donates spare CPU cycles and in return can use a lot of CPU power in short bursts.
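The arithmetic behind such a pool is worth making explicit. A toy model, with made-up numbers: if each of N users keeps their CPU busy only a small fraction of the time, the idle remainder is what the pool can hand out as burst capacity.

```python
def burst_speedup(n_users, duty_cycle):
    """Rough upper bound on the parallel speedup one user can borrow from a
    pool where every machine is busy `duty_cycle` of the time. Ignores
    transfer time, scheduling losses, and tasks that don't parallelize."""
    idle_machines = (n_users - 1) * (1 - duty_cycle)
    return 1 + idle_machines  # own machine plus the borrowed idle ones

# 1000 renderers who each render ~5% of the time:
print(round(burst_speedup(1000, 0.05)))  # → 950
</n```

So in the ideal case a render that takes ten hours locally could finish in well under a minute; in practice, upload/download time and coordination overhead would eat a large share of that.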

What you're describing sounds very much like something David Gelernter discussed in his book Mirror Worlds.

Gelernter proposed a mechanism whereby all the computing power on a given network could be harnessed as a sort of distributed supercomputer. This anticipated the now-common notion of clustering.


This went beyond a theoretical proposal when Gelernter developed a "coordination" language called Linda to accomplish exactly that. There's a NYT article entitled "David Gelernter's Romance With Linda" that discusses what Linda brings to the table:

What made Linda different from basic clustering (and far more interesting) was that each member of the Linda assemblage could "negotiate" processing availability or demand with the other devices on the network, rather than have it statically assigned by a human scheduler. As Gelernter characterized it, you just toss your problem to Linda, and Linda figures out how best to run it based on what else she currently has on her plate resource-wise.
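Concretely, Linda's coordination primitives are a shared "tuple space" with operations like out (publish a tuple), in (atomically take a matching tuple, blocking until one appears), and rd (read without removing). A toy single-process sketch of the idea; real Linda implementations distribute the space across machines, and the "task"/"result" tuples here are just an invented example:

```python
import threading

class TupleSpace:
    """Minimal Linda-style tuple space. None in a pattern matches any value."""
    def __init__(self):
        self._tuples = []
        self._cond = threading.Condition()

    def out(self, tup):
        """Publish a tuple into the space."""
        with self._cond:
            self._tuples.append(tup)
            self._cond.notify_all()

    def _match(self, pattern, tup):
        return len(pattern) == len(tup) and all(
            p is None or p == v for p, v in zip(pattern, tup))

    def in_(self, pattern):
        """Take (remove) the first matching tuple, blocking until one exists."""
        with self._cond:
            while True:
                for tup in self._tuples:
                    if self._match(pattern, tup):
                        self._tuples.remove(tup)
                        return tup
                self._cond.wait()

# Any idle worker can grab any pending ("task", ...) tuple and post a result:
space = TupleSpace()

def worker():
    _, frame = space.in_(("task", None))
    space.out(("result", frame, f"rendered-{frame}"))

threading.Thread(target=worker).start()
space.out(("task", 42))
print(space.in_(("result", 42, None)))  # → ('result', 42, 'rendered-42')
```

The point of the sketch is the decoupling: the producer never names a worker, and workers pull tasks when they have spare capacity, which is exactly the "negotiation" described above.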

Renegade: Video transcoding was only one example. I can see many other uses: image manipulation, 3D rendering tasks, complex OCR tasks for a lot of documents, and so on. Basically, any task where (upload time + download time + cloud processing time) < local processing time.

Bandwidth might be an issue, but it shouldn't be exaggerated, even for video transcoding. A fairly large number of people have had 100 Mbit connections at home for some time. Far from everyone, of course, but enough, I'd think, for this kind of service to take off.

What I mean is that we have cloud computing for business right now, and businesses typically have lots of bandwidth, with servers in data centers.

The thing is, for consumers it's a different story. While a lot of consumers have bandwidth that can handle large tasks like that, I don't believe the consumer market has reached a critical saturation point where the business model for consumer cloud computing for bandwidth-intensive tasks makes sense.

Hmmm... that's a mouthful. Let me simplify it.

I don't think there are enough consumers (with enough bandwidth) for a company to justify creating those services. It wouldn't be profitable right now. Given time, better Internet infrastructure can be rolled out, which would then make the business model practical.

Any company that jumps into the market now would likely be trying to get a first-mover advantage, counting on increased bandwidth in the consumer sector to drive their growth in the future.

Bandwidth might be an issue, but it shouldn't be exaggerated, even for video transcoding. A fairly large number of people have had 100 Mbit connections at home for some time. Far from everyone, of course, but enough, I'd think, for this kind of service to take off.


40hz: I hadn't read about the Mirror Worlds book before. Putting it on my to-read list.

Renegade: The number of consumers might be small, but it's not as if a completely new service is needed. Services that already cater to business could aim at end users too with some modification. The one I found above, zencoder.com, actually seems open to individual customers even though they're clearly geared towards companies (see the list of customer cases on their site). They have a pay-as-you-go option: $0.05 per minute of output video, which would mean $3 for a one-hour video. AFAICT all that's needed is to set up an FTP server on the home computer for input/output and use curl or something similar to create an encoding job. They even have a free test account (you only get a few seconds of video back), so I'll test it out when I have more time. The AWS alternative is also open to individuals, I think.
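Creating such a job would come down to one authenticated JSON POST. A sketch of the curl-equivalent in Python, built but deliberately not sent: the API key, home IP, and FTP credentials are placeholders, and the endpoint and field names reflect Zencoder's v2 API as I understand it, so double-check their docs before relying on this.

```python
import json
import urllib.request

# Placeholders: real values would come from a Zencoder account and a home FTP server.
job = {
    "api_key": "YOUR_ZENCODER_API_KEY",
    "input": "ftp://user:pass@my-home-ip/recordings/show.mpeg",
    "outputs": [
        {"url": "ftp://user:pass@my-home-ip/encoded/show.mp4"},
    ],
}

req = urllib.request.Request(
    "https://app.zencoder.com/api/v2/jobs",
    data=json.dumps(job).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req) would actually submit the job; it's left out
# here because the credentials above are fake.
print(req.get_method())  # → POST (urllib infers POST from the data payload)
```

At the quoted $0.05 per output minute, a one-hour recording would come to the $3 mentioned above, plus whatever the upload and download cost in time.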