LHC@home 2.0 is a volunteer computing platform for physicists working at the Large Hadron Collider - the world's largest particle accelerator - at CERN, the European Particle Physics Laboratory. This platform will host several sub-projects for different LHC physics groups.

LHC@home 2.0 is an extension of the LHC@home platform, launched in 2004 to help physicists simulate proton beam dynamics in the LHC. At that time, doing full-fledged simulations of particle collisions was beyond the scope of volunteer computing. But the evolution of computer software and hardware, and in particular the use of virtual machine technology, has enabled a breakthrough...LHC@home 2.0!

As a result, you can now be part of a global effort to generate simulated data that physicists will use in their analysis of real LHC data, by running simulations of particle collisions on your home computer. The first project to run on the LHC@home 2.0 platform - currently in test phase - is called Test4Theory.

Fri, 08/12/2011 - 19:19 — Daniel Lombraña... Interest in LHC@home 2.0 has been overwhelming, following the huge press coverage that a brief mention in a CERN press release got us. Thank you, everyone! Yesterday we reached nearly 8000 registered volunteers, which saturated our server. So we had to put further eager participants on hold while we sorted out how to handle this huge amount of support.
We're going to open gradually to more participants in the coming days. In the meantime, if you are new to the field of volunteer computing, we warmly encourage you to browse here some of the many other exciting science projects you can contribute to, using the same BOINC platform that LHC@home is running on.
The LHC@home 2.0 Team

So, at present they are not accepting new registrations, and registration will only reopen gradually. I'll keep checking on this project ... just because it sounds really neat.

I suppose the real question (or one of them) is: Is T4T suitable for inclusion? Is work consistent enough for the Vault? Has anyone been running the project, and what are their thoughts? And lastly, if T4T is suitable, should T4T and LHC be combined as a single project or kept as two? I'd lean in the former direction myself, at least at first thought.

As for adding T4T:
Yes, the project ticks all the requirement boxes (kinda hard not to). It's stable (as long as you don't update to the latest VirtualBox version), there is work, and the project admins are active.

However, I am against adding it to the vault for two reasons:
1. Officially, the project is still in BETA.
2. I'm not a fan of the client. It's a VirtualBox VM, so besides having BOINC installed you also need to manually install VirtualBox. Then the massive download starts, and you need at least 9 GB of free space that BOINC can use.
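On the disk-space point: anyone who wants to cap how much disk BOINC may use, rather than skip the project, can do so through the client's computing preferences, or via a global_prefs_override.xml file in the BOINC data directory. A minimal sketch - the limit values here are just illustrative, pick your own:

```xml
<!-- global_prefs_override.xml, placed in the BOINC data directory -->
<global_preferences>
   <!-- cap total disk space BOINC may use, in GB -->
   <disk_max_used_gb>10</disk_max_used_gb>
   <!-- always leave at least this much free space on the volume, in GB -->
   <disk_min_free_gb>2</disk_min_free_gb>
</global_preferences>
```

After saving the file, tell the client to re-read preferences (BOINC Manager: Advanced -> Read local prefs file). Note that setting the cap below what T4T needs (~9 GB) will just stop the client from fetching its work.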

As for combining stats:
I see no reason why you would combine them: T4T and 6T are two separate projects that do their own thing and have different project organisations/admins.

They could be seen as two applications from one project (like WCG is one project with a lot of different applications). But since CERN itself keeps them separate, why bother with the extra work, and make the vault's stats incomparable with all the other stats sites in the process?


+1

__________________
Proud Member of Team [H]ard|OCP & the [H]ard DC Commandos

In looking over the project website, I must say I am opposed to adding it to the vault, for two reasons:

- It requires the VirtualBox software. That complicates things for new users and creates a dependency on specific versions (they now advise against using 4.2.0, for instance). I also need VMware on my workstation and don't really want to install another hypervisor, as they might interfere. (I did install it on a test box and it seems OK, but what about future versions?)
- If I understand correctly (reading their FAQ), all tasks run for exactly 24 hours - WITHOUT checkpoints. That, for me, is a showstopper. In general, I think checkpointing should be a requirement for inclusion in the vault, or at least for tasks running longer than an hour or so.

As for combining stats: No. They are two separate projects. Apart from that, what if one of them runs out of work or otherwise fails to meet vault requirements? Would that mean removing both of them from the vault, even though one still qualifies?