Umm, this about sums it up in a nutshell. We have maybe 6 months of GPU units left.

As quoted by knreed:

We are still assessing the impact of the acceleration on the end date of the HCC1 project. Our rough estimate with 'known' work based on this acceleration is about 100-130 days until end of project. BUT my use of the word 'known' simply means the number of batches we are aware of to run. What we are aware of is not necessarily equal to what the researchers want to run, so we are discussing this further with them at this time.

However, we do not yet have any research project beyond HCC1 that we know can run on a GPU. While this has definitely shown the power of GPU computing for an application that clearly fits the profile of being suitable for a GPU, the researchers are still the ones who provide the code, and they will have to determine whether a GPU would work for their project and then make the modifications.

It is important to us to have GPU-capable projects, but we cannot promise them. Even then, we could find ourselves with a project that was written for CUDA and not be able to run it on AMD graphics cards (or written for AMD but not CUDA). As a result, we ask that people not make purchasing decisions without understanding these caveats.

Both rigs are crunching again after a hard weekend. Waiting for FedEx, maybe Friday; I'll add an i5 3570K and 3x 7850 after that. My next dedicated crunchers are scheduled for mid-November (2 AMD FX-8150s, without GPUs for now).

Thanks!

So it looks like we have 3-4 months left for sure, maybe more. I guess I'd be a bit more hesitant about more ATI cards in the future; I think I'd be better off with nVidia ones that also do FAH well.

Yes, so maybe waiting a little bit to build a full GPU cruncher might be the best thing right now. At least till we get more info on where this is going and whether other projects are going to adapt to GPU crunching.

Well, some of the things to consider with a new build should be:
- at least 2 PCIe x16/x8 slots (3 or 4 even better)
- a PSU that can run 2-4 high-powered GPUs
- a 6-8 core/thread CPU
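As a rough sanity check on the PSU point above, here's a back-of-the-envelope sizing sketch. The TDP figures are illustrative assumptions, not measurements of any specific card; check your actual components' specs.

```python
# Rough PSU sizing for a multi-GPU cruncher.
# All wattage figures below are assumptions for illustration only.
def psu_estimate(gpu_tdp_w, gpu_count, cpu_tdp_w, other_w=75, headroom=1.25):
    """Return a minimum PSU wattage with ~25% headroom over estimated draw."""
    total_draw = gpu_tdp_w * gpu_count + cpu_tdp_w + other_w
    return total_draw * headroom

# Example: three ~130 W cards (HD 7850-class) plus a ~95 W CPU.
print(psu_estimate(130, 3, 95))  # 700.0 -> roughly a 700 W unit
```

Under full crunching load the cards sit near peak draw continuously, which is why the headroom factor matters more here than for a gaming box.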

Those are the strategies I'm trying to follow. That's why I'm happy with my 650 W Antec PSUs and (hopefully) the EVGA 3X SLI board + i7. That way, I could theoretically run 3 HD 7850s with two tasks per card (or a mix of two and three tasks per card) and it would be fine.

Although running so many GPU tasks makes it harder to get badges in the new projects
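For anyone wondering how to get two or three tasks per card: BOINC does this via an app_config.xml in the project's folder under the BOINC data directory. A minimal sketch is below; the app name here is an assumption, so verify the real one in the client's event log or task list before using it.

```xml
<!-- app_config.xml, placed in the WCG project folder under the BOINC data dir.
     gpu_usage 0.5 lets two tasks share one GPU; 0.33 would allow three. -->
<app_config>
  <app>
    <!-- "hcc1" is an assumed app name; check your client's task names -->
    <name>hcc1</name>
    <gpu_versions>
      <gpu_usage>0.5</gpu_usage>
      <cpu_usage>0.5</cpu_usage>
    </gpu_versions>
  </app>
</app_config>
```

After saving, re-read config files from the client's Advanced menu (or restart BOINC) for it to take effect.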

"Badges? We don't need no stinking badges!"

* From the movie Blazing Saddles. Sorry, I couldn't resist.

P.S. FreeDC's latest update is ready... Kiex may break 100k on his SR-2/7770 rig at the pace he's going, since it's already over 40k right now

Oh damn. This GPU tweak is really amazing... why did I have to set my buffers so high?

I'm going to start dropping the size of my buffer as I run out of WUs (GPU and CPU); otherwise I will end up running for days without any of them. It's going to take longer, but I'll at least be able to run the WUs as the buffer drops off. The buffer on my main rig is at 6 days atm...

My only other option is to transfer the 7870 into the 1045T rig (that buffer is only set for 0.5 days), but I can't/don't want to run my main rig with an 8400GS

An 8400GS can be slow even for Aero on a 1080p display; no, you don't want that.

What I did is set the buffer to 0.05 days, but did not check "No New Tasks". That way, it will decrease the buffer of CPU WUs but will keep downloading the GPU ones as needed.
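The same buffer setting can also be pinned locally with a global_prefs_override.xml in the BOINC data directory, which overrides the website preferences on that one machine. A sketch, using BOINC's standard work-buffer preference elements:

```xml
<!-- global_prefs_override.xml in the BOINC data directory:
     keep only ~0.05 days of work buffered, with no extra days,
     so the CPU cache drains while GPU work keeps flowing in. -->
<global_preferences>
  <work_buf_min_days>0.05</work_buf_min_days>
  <work_buf_additional_days>0.0</work_buf_additional_days>
</global_preferences>
```

Re-read config files (or restart the client) and the scheduler picks up the new buffer on the next work fetch.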

I didn't try it yet - my GTS 450 runs at 98% and my 260 about the same. Even my 680 runs between 80 and 95%, BUT I did order a red card (7770), so in a day or two the 260 or 450 will go back to being a backup GPU.

Does WCG on the Radeon make your desktop choppy?

Hmmm.... that sounds like the ticket right there

The GPU WUs barely affect the desktop with the 7850 and 7870s. I only see a slight frame drag on the TV tuner once in a while. I even ran a Unigine Heaven benchmark while GPU crunching; the score was a bit lower than normal.

The 6870 drags a bit more when GPU crunching, but the rig is still completely usable

Awesome. Crunching doesn't make the system quite as unusable as folding does, but it's still maddening, to the extent that I mainly use the laptop because the desktop is too slow to use.

I am going to drop my buffer back to about 0.005, check "No New Tasks", and let it run out of work. Then reset it and move the buffer up some.

One of those where EVERYTHING has to be dismantled to get it out. And I couldn't get the motherboard out of the plastic because the DC jack cable just wouldn't budge (besides being in a most frustrating location).

And then I realised I had to remove the heatpipes to get the plastic shroud off, and didn't feel like doing that, since that could spoil whatever is transferring the heat from the chips to the heatpipe.

And I was actually surprised to find 2 heatpipes in my laptop; I totally expected only 1.

Got an old Dell that I gave up on. I wanted to clean and re-seat the heatsink but found out I had to take everything apart. My newer no-name laptop has a combined heatpipe, and since it's using both the GPU and the CPU for crunching now, it will get an overhaul in the very near future. The 1' fan I use to help keep it below 80 C is good, but I remember it was 62 C when it was new (CPU alone).