Bitcoin Currency and GPU Mining Performance Comparison

GPU Performance per Dollar and per Watt

The most basic (and most fundamentally important) information we can provide is the Mhash/s rate of each graphics card we are testing. For dual-GPU graphics cards we ran two instances of our poclbm kernel under the GUIMiner client and added the rates together to get a total maximum output per card.

Again, the first thing we notice is just how poorly the NVIDIA cards stack up against the AMD offerings. Put the GeForce GTX 580, a $469 graphics card, against the Radeon HD 6850 at $209, and the AMD card still holds a commanding 35% performance advantage. The Radeon HD 5830, which can often be found for around $100, is easily one of the best deals for Bitcoin mining and is even faster than NVIDIA's dual-Fermi GTX 590! The dual-GPU AMD cards are even more impressive, and we can see from the HD 6990 that the Cayman architecture brings a lot to the table, pulling about 340 Mhash/s for each GPU. The overclocked ARES card takes the top spot with over 800 Mhash/s (a 38% boost over that card's stock settings).

Also take note of the AMD A8-3850 APU: with a rate of 80.7 Mhash/s it actually computes faster than the GTX 560 Ti, the GTX 460, and even the dual-GPU (but aging) GTX 295. It looks like, if you want to maximize your Bitcoin mining experience, building a system around an AMD APU could be a good way to supplement discrete cards.

Sure, you can build the fastest supercomputer if you have an unlimited slush fund, but what is the most cost-effective way to mine Bitcoins?

The larger the bar, the better for your wallet here, and we can see why the Radeon HD 5830 cards have been flying off Newegg's shelves! With a current rate of 1.544 Mhash/s/$ it comes in well ahead of anything from NVIDIA and about 20% better than its closest competitor in our testing, the HD 5750. No NVIDIA card even comes close to the AMD lineup here, as even the A8-3850 is able to best the GeForce cards. Keep in mind that in that case you are also getting a quad-core processor along with the GPU computing power for mining, increasing the part's overall real-world "value".
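For anyone wanting to reproduce the metric, Mhash/s/$ is just hash rate divided by street price. A minimal sketch in Python, using hypothetical inputs (244 Mhash/s at a $158 street price happens to land on the quoted 1.544 figure, but the article's exact test prices are not restated here):

```python
# Hash-rate-per-dollar sketch; the rate and price below are hypothetical
# examples, not the article's measured test configuration.
def mhash_per_dollar(mhash_s, price_usd):
    """Mining value metric: megahashes per second per dollar spent."""
    return mhash_s / price_usd

# e.g. a card doing 244 Mhash/s at a street price of $158:
print(round(mhash_per_dollar(244.0, 158.0), 3))  # ≈ 1.544
```

The same division explains why cheap midrange Radeons dominate this chart: hash rate scales more slowly than price at the high end.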

Even though the ARES card and the HD 6990 took the top performance rankings, in value they come up short against less expensive Radeon HD cards like the HD 6850. It appears that, as with gaming performance, Bitcoin mining performance sees the law of diminishing returns as GPU prices increase.

If we take cost out of the picture and you just want to build a more power efficient Bitcoin miner, what cards will make the most sense?

Somewhat surprisingly, the dual-GPU Radeon cards make the biggest splash here, with the overclocked ARES card (which used 50 watts or so more power than the stock ARES settings) taking the overall crown at 1.584 Mhash/s/watt. The HD 5970 and HD 6990 also land close to the stock ARES, showing that even though they use more power, the single-card scenario definitely helps with efficiency in this case.
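The per-watt metric works the same way as the per-dollar one; a quick sketch with assumed numbers (the ~505 W draw below is back-computed from the quoted 1.584 Mhash/s/watt and 800 Mhash/s, not a measured value):

```python
# Efficiency metric sketch; the 505 W figure is an assumption derived
# from the article's quoted numbers, not a wall-meter reading.
def mhash_per_watt(mhash_s, watts):
    """Mining efficiency: megahashes per second per watt drawn."""
    return mhash_s / watts

print(round(mhash_per_watt(800.0, 505.0), 3))  # ≈ 1.584
```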

For single-GPU cards the Radeon HD 5830 is the winner yet again, making it the easy choice as the best card (so far) for Bitcoin mining enthusiasts. NVIDIA cards, which have carried the stigma of being power hogs for years, substantially showcase that drawback here.

Any reason the AMD 6950 & 6970 cards were left out of the experiment? As the flagship AMD single-GPU cards, I think this data would be really salient. Is there another card on the list from which we could easily extrapolate 6950/6970 performance?

In my personal testing, the 6950 gets somewhere around 340 Mhash/s with a few optimizations. Overclocking and unlocking can get you around 400. You can approximate 6950/6970 performance by dividing the 6990's result in the graph in half (one GPU's worth).

My understanding is that the GPUs used were based on what was available in house for testing.

More specific results (please keep in mind that I am using different settings than Ken so they are not necessarily comparable):

My 6950, unlocked to 6970 shaders at an 840 MHz core, gets 372.7 Mhash/s using GUIMiner, two tweaks to the poclbm kernel, and the AMD Cat 1.7 drivers with whatever version of the Stream SDK comes with them. I'm further using the following flags, which are graphics-card-version specific: -k poclbm VECTORS BFI_INT AGGRESSION=9 WORKSIZE=128

I'd be interested in a little more information. I'm running a Sapphire 6950 2GB with unlocked (6970) shaders (but not flashed to 6970 speeds; I just OC when I need the boost). I'm also sporting a Core2Duo E8400 OC'd to 3.6 GHz. I started mining last night, following the guides Ryan mentioned, and I'm consistently getting 320 Mhash/s, not the 340 you mentioned was possible with a few "optimizations."

Do you know if the optimizations you mentioned (the flags) should work for my 6950; you said they are gfx card version specific - did you mean vendor specific, or just 6950 specific? Is it possible to use those flags when I'm using the GUIMiner, or do I need to be using a console? Thanks for any input!

Hi Adster, I am running an XFX 6950 2GB card with an edited BIOS to have unlocked shaders but not 6970 speeds (though the card is capable of running at them, I didn't want to risk running the memory at the higher speed full time).

The flags that I mentioned will work for your 6950; they are specific to the version of card you have, and in this case are best used with AMD 6xxx series cards. You can set the flags in the GUIMiner extra flags area; however, you will need to edit the poclbm kernel file for the other optimizations. You can find those by searching the Bitcoin forums for kernel optimizations.

I hope it helps; let me know if you need any help in squeezing all the Mhash possible outta that card :)

Honestly, we just didn't test it because we skipped some cards. Looking back, we should have done one of them. You can see in our screenshot of "The Beast" that we eventually plugged one in and got about 344 Mhash/s.

This is a great article, and it pushed me over the edge to start mining. The only big question I have (aside from my earlier question about 6950/6970 performance) is how the cost of electricity factors in.

Obviously we are all subject to different utility rates, so you couldn't give a cost-breakdown that would apply to everyone. However, I am curious how much the average cost of electricity would deduct from the profits in your chart?

I have a dedicated mining machine which runs 24/7 in the closet (no, really -- it sits in the closet). It has the cheapest AMD CPU I could find (a Sempron processor), 1GB of RAM, and a flash drive used as the hard drive, running Ubuntu 10.04 on a headless (monitorless) system. The only thing really going on is the 2x5850 Xtreme graphics cards pumping out ~700 Mhash/s. When I bought this rig, it ran me $530 after rebates from Tiger Direct.

I think it is your responsibility to deter readers more actively from investing in hardware to conduct bitcoin mining, and to distance yourselves from those activities. It is easy for people to understand that they can make money from computing power, but it takes some very careful reading to understand that, by design, this whole enterprise will become less and less profitable over time. So I think it would be better to put the emphasis of the article on parallel computing performance and to use Bitcoin merely for illustrative purposes. At the very least, you should factor energy costs into your profitability analysis, but in my opinion, calculating projections is misleading and even deceptive, given the facts about Bitcoin (see below).

So my warning here:
!!! WARNING !!!
===================================================
Investing in hardware in order to engage in bitcoin mining is a highly risky and quite possibly loss-making idea!
The calculations of "Days to payoff" and "1 year profit" in this article are misleading: not only is the rate of bitcoin creation deliberately being slowed as the total number of bitcoins approaches 21 million, it is also getting more and more difficult to accumulate enough computing power as the number of participants in bitcoin mining increases (as people reading this article and others start setting up their own mining operations). The only effect countering this deterioration in profitability would be an increase in the dollar value of the bitcoin, which is uncertain and unpredictable.
====================================================
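To make the warning concrete: days-to-payoff is just hardware cost divided by net daily income, and it stretches out fast as rising difficulty erodes revenue. A sketch with entirely hypothetical numbers (none of these figures come from the article):

```python
def days_to_payoff(hardware_cost, daily_coin_revenue, daily_power_cost):
    """Days until mining income covers the hardware, or None if it never does."""
    net = daily_coin_revenue - daily_power_cost
    return hardware_cost / net if net > 0 else None

# Hypothetical rig: $500 of hardware, $4.00/day in coins, $1.20/day in power.
print(days_to_payoff(500.0, 4.00, 1.20))  # ~178.6 days
# Double the network difficulty (halving coin revenue) and payoff
# stretches to 625 days:
print(days_to_payoff(500.0, 2.00, 1.20))  # 625.0 days
# If revenue drops below the power bill, the rig never pays off:
print(days_to_payoff(500.0, 1.00, 1.20))  # None
```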

"Please keep in mind that we understand that these values will change over time not only because of the exchange rate differences but because your ability to mine Bitcoins will slow down over time as the algorithm to find coins becomes more and more complex as the network hashing power increases. Read over the first two pages of the article again to understand WHY this happens but just know the results you will see below are based on an instance in time during this writing process!"

Nothing really. Plus, a virus which specifically attempted only GPU mining would be a lot easier to hide in the Windows environment, since most users are unlikely to be monitoring GPU usage levels when simply web browsing, etc.

A virus which intelligently slowed its mining attack if the user was trying to do something GPU intensive (gaming), in order to hide the system use and keep the user from noticing massive in-game slowdown, could likely mine away unnoticed.

I do not fully understand the setup in regards to mining as a pool, though, which is what you would ultimately want all your zombied systems to do. I guess it is probably not 'that' difficult to set up pooling given how many pools continue popping up; plus, presumably someone writing a virus specifically targeted at hijacking GPU cycles is probably a decent enough coder.

There have actually been some botnets (since shut down) mining for pools; however, they were thousands of computers using their CPUs to mine, as access to the GPU hardware is more difficult and would require more end-user cooperation to get that botnet software installed and running, AFAIK.

What price did you use for power in your profit calculations? At 500W for a simple 6990 system, total power consumption in a year of 24/7 use will be 4380 kWh. Your profit after one year will be negative if your price for power is more than about 35 cents, assuming constant difficulty. All Nvidia cards will operate at a loss unless your power is very cheap or free.
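The commenter's arithmetic can be checked directly. The annual revenue figure below is an assumption back-computed to reproduce the ~35 cent break-even (it is not a number published in the article):

```python
# Break-even electricity price for a hypothetical 500 W mining system
# running 24/7, assuming constant difficulty.
watts = 500.0
annual_kwh = watts / 1000.0 * 24 * 365      # 4380 kWh per year
annual_revenue_usd = 1533.0                 # ASSUMED coin income/year
break_even_usd_per_kwh = annual_revenue_usd / annual_kwh
print(round(annual_kwh), round(break_even_usd_per_kwh, 2))  # 4380 0.35
```

Pay more than that per kWh and the rig loses money even before the hardware cost is counted.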

Difficulty is about 1000 times higher now than half a year ago, by the way. Power cost has become the most important factor in mining profitability.

For European readers, power use is a bit more important: 1 kWh of power costs on average around 0.25 euro.
That means a system like The Beast (using 1 kW of power) will cost you 0.25 * 24 * 365 = 2,190 euros per year in electricity.
The Beast yearly produces the equivalent of $3,637 in bitcoins, which is about 2,584 euros.

That means it will effectively only net 394 euros. And that is not counting the cost of buying the system.
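Those figures check out; a quick sketch (the USD-to-EUR rate is simply the one implied by the commenter's own numbers, roughly 0.71):

```python
# Annual electricity cost vs. coin income for a hypothetical 1 kW system
# ("The Beast") at average European power prices.
price_eur_per_kwh = 0.25
system_kw = 1.0
annual_power_cost_eur = price_eur_per_kwh * system_kw * 24 * 365  # 2190 EUR
annual_coins_usd = 3637.0
usd_to_eur = 2584.0 / 3637.0          # rate implied by the figures above
annual_coins_eur = annual_coins_usd * usd_to_eur                  # 2584 EUR
print(round(annual_coins_eur - annual_power_cost_eur))  # 394
```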

A 6990 in the default BIOS position should generate 330 MHash/s PER core. That's 660 MHash/s per card. I'm not sure why, but your card is showing a much slower speed on one of the cores. (~285.3 MHash/s)

Switch the BIOS switch to position 2 and you'll be at 360-375 MHash/s per core.

You also seem to be missing the most basic flags for GUIMiner running poclbm: -v -w128

Your CPU usage is silly!
It's prolly the GUIMiner interface or something.
My 'rig' runs 350 Mhash/s on an i7 2600K and HD 6970 and rarely hits 4% CPU usage.
And that is while I run an active Minecraft server and use the rig to watch videos and stuff (about 8% for SD video).
I'm using the Phoenix miner, btw.

It's 4 integer operations/instruction x 3 instructions/clock x 4 cores x 3.4 GHz = 163.2 billion integer operations/second. AVX does 4 integer operations/instruction or 8 floating-point operations/instruction. Each clock can issue up to 3 instructions if they don't depend on the results of previous instructions. The nice thing with AVX over SSE2 is that AVX has non-destructive instructions of the form a = b *operation* c versus a = a *operation* b for SSE2.

So you're telling me you put a virus on your computer that helps criminals launder money,
you let it operate through your GPU because there is no security there,
and you spend hundreds on hardware and power for an experiment in social engineering?
Then you think that, because three places are taking the hype of the bitcoin as a COUPON to sell you shit at three times the normal cost, the bitcoin is therefore a currency?

what do you think your GPU is really processing?
or does anyone think?

You just don't get it:
the GPU is processing YOU!
It is internally, cyclically, redundantly pre-processing your own non-transactions into a multilevel advertising, purchasing, and marketing scheme.
If they do not enable the user with a journey, then there is no game to be played. There is no correlation to alternative universal dimensional shifting of exchange goods in virtuality, when there still is nothing but virtuality in existence.
How do you perceive that something exists when one person tells you that it exists, and masses of people join that ONE person to confirm that it exists?
That is a singularity of the black hole variety.

Issue/problem: GUIMiner with a dual-GPU card, the PowerColor HD 6870 X2. After creating a new worker for the second GPU, it still doesn't work: 0 Mhash/s, while the first GPU runs at 304 Mhash/s, 970 MHz clock, 60% fan speed, 74 degrees Celsius.
Flags: -v -w128 -a4. Does anybody know how to set this up correctly so that both GPUs work at the same time? Thank you for helping me out.

Everybody in this Bitcoin thing is obsessed with hash performance and whatnot, while everyone forgets the actual essence: which GPU actually makes more bitcoin (money) per month? How much money do you think those CrossFire HD 6990 GPUs will ever give you in a month?

I do online forex and earn $300-$400 PER DAY using only a laptop. Can these expensive "rig" setups, with all those crunched numbers/"stats" and huge energy consumption (carbon emission contributions), give you better than that?