Curious about DC computing power over time

I was just wondering about this today. I feel like distributed computing (DC) was really popular when it first appeared 16-17 years ago, back when fewer people worried about the power consumption of their PCs, or perhaps just because the idea was new and exciting. Obviously DC is still going, but it seems to have become a more niche interest. At the same time, computing power and core counts have increased, and there are now projects using GPU computing as well.

I was curious to know whether the amount of computing power devoted to DC has been steadily growing, or whether it peaked at the start and dropped from there.

"Back in the day" a Pentium 4 Willamette drew almost the same wattage idle as it did under load, so running a DC project had no real impact on power consumption. That is very much not the case today: Skylake, or whatever replaced it, draws very little power at idle, so a DC project radically changes the power draw of the PC.

Even though the number of users has diminished, the PetaFLOPS keep climbing as CPUs and GPUs become more and more capable.

Currently, the most common x86 FAH_core (A4) uses SSE code on the CPU (I think SSE dates to the Pentium III). "Real Soon Now" a new core (A8?) will use AVX2 instructions. So as the hardware gets brighter, the software follows along. (There is a good chance A8 will only run on Haswell-or-newer i3/i5/i7 CPUs and on AMD Carrizo CPUs, so A4 will continue to find work.)

"Back in the day" a Pentium 4 Willamette drew almost the same wattage idle as it did under load, so running a DC project had no real impact on power consumption. That is very much not the case today: Skylake, or whatever replaced it, draws very little power at idle, so a DC project radically changes the power draw of the PC.

Interesting. I didn't think of that but that does make a lot of sense.

The latest generations of GPUs prioritize energy conservation, so for projects that use the GPU, users will suddenly discover there are substantial costs to DC. Previous generations drew almost the same wattage idle as computing, so once you had bought a graphics card for gaming, you might as well compute. Modern cards will winnow out users who find the cost of operating them too high.

GoofyxGrid@Home is a BOINC server for a project we would like people to examine.

The Monkeys application checks some aspects of the infinite monkey theorem:

monkeys_v1 - draws words and checks whether each drawn word exists in a Polish or English dictionary for now (I am considering adding more dictionaries)
monkeys_v2 - draws words until one exactly matches the word in the WU
monkeys_v3 - draws whole texts and checks whether each is a fragment of, for example, some of Shakespeare's works
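The v1 check described above can be sketched in a few lines of Python. This is only an illustration under assumed details: the tiny word set, the letter-by-letter drawing scheme, and the word-length range are all placeholders, since the project's actual parameters aren't given here.

```python
import random
import string

# Hypothetical tiny dictionary; the real project would load full
# Polish/English word lists.
DICTIONARY = {"cat", "dog", "to", "be", "or", "not"}

def draw_word(length):
    """Draw a random lowercase 'word' of the given length, letter by letter."""
    return "".join(random.choice(string.ascii_lowercase) for _ in range(length))

def monkeys_v1(attempts, max_len=6):
    """Count how many randomly drawn words appear in the dictionary."""
    hits = 0
    for _ in range(attempts):
        word = draw_word(random.randint(1, max_len))
        if word in DICTIONARY:
            hits += 1
    return hits
```

Distributing this across BOINC clients amounts to splitting the attempts into work units and merging the hit counts server-side.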

Is this really just trying to see whether randomly sampling words from a dictionary can recreate literature? Because not only is the answer yes, but it's pretty easy to calculate the probability (and hence the expected number of attempts) that this will happen... it's really, really low -- so the project is likely to go on for a while. But what's the merit of such a project?
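The back-of-the-envelope calculation alluded to above: if each draw picks uniformly from a dictionary of V words, the chance of reproducing one specific n-word passage is V to the -n, so the expected number of attempts (geometric distribution) is V to the n. A sketch, where V and n are purely illustrative values, not the project's actual parameters:

```python
# Probability of randomly drawing one specific n-word sequence from a
# dictionary of V equally likely words, and the expected number of attempts.
V = 100_000  # assumed dictionary size (illustrative)
n = 18       # length of the target passage in words (illustrative)

p = V ** -n                 # probability of success on a single attempt
expected_attempts = V ** n  # geometric distribution: E[attempts] = 1/p

print(f"p = {p:.3e}")                      # about 1e-90
print(f"expected attempts = {expected_attempts:.3e}")
```

Even for an 18-word passage the expected attempt count is on the order of 10^90, which is why such a project would effectively run forever.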

It solves that most important of all eternal questions: how to move the Ars team up in the Vault stats.