Make sure you're using 1.7.0, as the bucket server for older versions is being shut down. My connection is 1.5 Mbit/s, and that's the fastest available here from QWorst. I've called them a dozen times; if I even try to use all of that, they send threats. Long ago, ex-President Clinton announced the "Information Superhighway". Still, all we have in the US is an information cowpath. It seems that privatizing the phone companies here was a very bad idea.

Too bad. Just to rub it in: I recently had fiber run to my house, and the fastest speed I can get is 400/400 (Mbit/s). The reason I don't have that particular subscription is that it costs roughly $1000/month and I haven't won the lottery, so I'm sticking to the slowest option for now.

Okay, I have 1.7 installed on two machines that use that lab link. I set them to 2 Mbit, 60% down, 80% up, 30 workers, 15 buckets.
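If I'm reading the percentage settings right (my assumption: they cap a share of the line speed, nothing official), the effective limits work out like this, in a quick Python sketch:

    # Quick sketch, assuming the down/up percentages cap a share of the 2 Mbit/s line
    LINE_MBIT = 2.0
    down_cap = LINE_MBIT * 0.60   # 1.2 Mbit/s (~150 KB/s) for crawl downloads
    up_cap = LINE_MBIT * 0.80     # 1.6 Mbit/s (~200 KB/s) for uploading buckets
    print(f"down cap ~{down_cap:.1f} Mbit/s, up cap ~{up_cap:.1f} Mbit/s")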

I need to know how to join the team. I could not figure that out from poking around in the app and on their site.

Just as an FYI: on previous versions of the application, I've had to set workers to around 75-80 to achieve ~5 MB/s crawling (that was across 3 machines). That could have been related to the class of machine I was running it on, so just make sure you're actually seeing the download speeds you need in the UI at those levels. Also, I usually kept 100 buckets in play at any one time, because (again, back when I was running it) the bucket servers went down constantly, and having that reserve came in handy numerous times.
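If it helps anyone size their own setup, here's the per-worker throughput those old numbers imply (rough Python back-of-envelope; the 2 MB/s target is just a made-up example, not a recommendation):

    # ~75-80 workers across 3 machines gave me ~5 MB/s total
    workers = 78                     # midpoint of 75-80
    per_worker = 5 * 1024 / workers  # ~66 KB/s per worker
    target_mb_s = 2.0                # hypothetical target crawl speed
    needed = target_mb_s * 1024 / per_worker
    print(f"~{per_worker:.0f} KB/s per worker; ~{needed:.0f} workers for {target_mb_s} MB/s")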

Well, it's a moot point. I got flagged by Websense, so I am done. I even changed the settings to only crawl .orgs, but I was still showing up as hitting too many restricted sites. I thought about leaving it on overnight and saying I forgot to kill it, but I already told the guy I would kill it. He's cool; what I don't need is his senior manager coming down on me. So I am going to pull the plug. Sorry, guys!

I will, however, probably drop this on the file server at home when I get it rebuilt. It will run at a crawl, but I know that builds up over time.

I started it up last night, but my router must not be robust enough to handle that many connections. I've had problems with this router and lots of simultaneous connections before, so it's definitely the culprit. Any recommendations for gigabit routers that handle this well?

This sounds interesting... I seem to recall another company trying this years ago (they did not succeed and eventually shut down). Anyway, I have an unlimited-bandwidth cable connection at home, so I'll give it a try. Joined the Ars Technica team.

Andrew, welcome to the team! It's great to see people jumping in to help out.

One thing worries me about this project, though... What exactly are we getting out of it? The search engine supposedly powered by it appears to be down. All I see that works is their paid SEO subscription service. If all we're doing is powering their paid site, then I don't think I'm going to participate. The whole point of distributed computing is that the community/research community/world/etc. gets something out of it (like work towards cancer cures, new prime numbers, or maybe even aliens), not for some private company to just get free bandwidth...

We generated those results, why don't we get to see them without paying?

That makes it a commercial project, and whilst I'll try running it a bit more for the sake of the team, I think we need to think long and hard about whether we should support it or not at this juncture, regardless of what it does to our DC Vault standing (although I'm of the opinion it's too commercial to be worthy of it as well).

Don't know if anyone will agree, but it's out there. Thanks for the heads-up, Andrew. We had an argument about MJ-12 and commercialism before. At that time they were not selling their data and promised to share profits (if there ever were any) with the crunchers. The "charter" MJ-12 presented was vague at the time. So here we go again.

My take is that Majestic 12 is basically a commercial company that buys very cheap bandwidth off of crunchers in exchange for a nominal amount of shares and some stats. Our work units - the URL barrels - build their commercial product. Yes, there's a limited amount of free stuff, but unless you fork out £250 per month, you ain't getting all the results.

If Amazon decided to release stats for users of their EC2 service, would that constitute a DC project?

"Hey 99designs have used 148 days of CPU time and 100 GB of disk space, but Yelp.com have passed them with 150 days of CPU time and 150 GB of disk space! See the stats!"

It's actually not that different. There's no research going on with Majestic 12; it's just our work being turned directly into a commercial product. I've read the forums, and this is a business, not a DC project. Contributors are just building the MajesticSEO backlink database, and any payout basically makes them investors. So unless we all start tracking our pension pots and getting points for who's going to have the cushiest old age, I find it hard to justify why this actually counts as a project.

Originally Posted by Razor_FX_II: Commercial projects are not breaking the DC-Vault acceptance rules, and since there are steady work units and the project is open to new members, I think it should stay in the DC-Vault.

This won't be the first time that the issue of changing the Vault rules has been brought up. If that is your hang up then *I* vote for amending the Vault rules to disallow commercial projects like this.

We shouldn't just quit until the hammer comes down on this project. Bump, also...

I've decided to try and lay some smack down on this. I actually received payment for my past contributions (a whopping ~$6 in total), which from memory covered around 850-900 GB of bandwidth. In the past 5 days I've managed approx. 150 GB, so I reckon that if the payment rate is similar, the cheapo VPS I hired will pay for itself.
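For anyone who wants to check the maths, a quick Python sketch (the payment rate is just extrapolated from my one payout, and the VPS price is a placeholder, not my actual bill):

    # ~$6 for ~875 GB (midpoint of 850-900 GB) from my past payout
    rate = 6.0 / 875.0        # ~$0.0069 per GB
    gb_per_day = 150.0 / 5    # 150 GB over the past 5 days
    monthly_payout = rate * gb_per_day * 30
    vps_cost = 5.0            # placeholder monthly VPS price
    print(f"~${monthly_payout:.2f}/month earned vs ${vps_cost:.2f}/month for the VPS")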

If anyone else is interested in doing something similar, I'm happy to link you up n' stuff.

I got a T-Shirt some time last year ... and it's still running, though at pretty low bandwidth and still getting DNS errors, but it does serve the purpose of keeping my connection alive and (mostly) unconstipated.