Bitcoin Currency and GPU Mining Performance Comparison

Testing Configuration and Software Setup

Software Configuration

True ease of use is something the Bitcoin ecosystem doesn't really offer yet, though it is steadily improving. You'll need a couple of different items up and running on one or more machines to really get started with your mining experience.

The first thing you'll need is the Bitcoin client application, which acts as your wallet and accesses your wallet.dat file. While it doesn't necessarily need to run on the same hardware that is doing the mining, you'll need it running to get the key information to share with the mining apps.

For your mining application there are several options, including both command-line and graphical apps. For the quickest setup and configuration time we liked GUIMiner, seen above. The interface you use does not necessarily determine the kernel you use for computing Bitcoins, and which kernel you use can alter performance pretty dramatically. In its infancy the Bitcoin community ran CPU-based kernels until the difficulty rose to a point where they were incredibly inefficient, leading to the creation of several GPU-based designs.

For our testing we went with the poclbm kernel, which is built around OpenCL and works with AMD Radeon HD 4000 series and above and NVIDIA GeForce 8000 series and above graphics cards. There definitely are other options out there for Bitcoin mining, and many enthusiasts argue that some perform better than others across different ranges of CPUs and GPUs, but in terms of popularity today poclbm seems to be the winner.

The above image shows us actually running a pair of the kernels, one for each GPU on a multi-GPU graphics card. If you have more than one GPU in your system, whether on a single card or multiple, you need only assign a kernel to each available processor to max out your processing performance.

(Side note: because it is built on OpenCL, you can actually run this on CPUs that have compliant OpenCL stacks. However, it is not the most efficient on that class of processor by any means.)

When running a Bitcoin mining application be prepared for a lot of GPU utilization but not much on the CPU side of things.

Here you can see our Core i7 Sandy Bridge based processor is not getting a heavy workout while running the GUIMiner application with our OpenCL-based client focused on the GPU. Looking at the graphics card workload however...

The ASUS ARES card (dual Radeon HD 5870 GPUs) is working hard with a 99% GPU load reached and temperature slowly rising. If you have a GPU with a loud fan or are sensitive to the heat created by your graphics card then this mining process might not be for you!

Hardware Configurations

For our testing we ran the Bitcoin clients on our standard GPU testing bed built out of the following:

Testing Configuration

ASUS P6X58D Premium Motherboard

Intel Core i7-965 @ 3.33 GHz Processor

3 x 2GB Corsair DDR3-1333 MHz Memory

Western Digital VelociRaptor 600GB HDD

Corsair Professional Series 1200w PSU

NVIDIA Driver: 275.33

AMD Driver: 11.6

Our graphics card selection was based on trying to compare some current options to some previous generation cards that are likely to already be in the hands of potential GPU Bitcoin Miners. Here is the lineup with a few curveballs tossed in:

GeForce GTX 285 - $300

GeForce GTX 295 - $289

GeForce GTX 460 - $160

GeForce GTX 560 Ti - $319

GeForce GTX 580 - $469

GeForce GTX 590 - $749

Radeon HD 4890 - $240

Radeon HD 5750 - $115

Radeon HD 5830 - $129

Radeon HD 5970 - $620

Radeon HD 6850 - $159

Radeon HD 6990 - $750

Radeon HD 5870 x2 (ASUS ARES) - $1100

Radeon HD 5870 x2 (Overclocked) - $1100

AMD A8-3850 APU - $139

"The Beast" - $1710

We have covered the bases of the last several years by starting with the HD 4890 and GTX 285 cards of yesteryear. We included a range of modern cards including the very popular GeForce GTX 460 and the lower-end Radeon HD 5750. Dual-GPU cards make a frequent showing with the GTX 295, GTX 590, HD 5970 and HD 6990 as well as the ASUS ARES in both standard and overclocked settings. The standard clock rate on the ASUS ARES is 850 MHz and our overclocked setting pushed that to 1005 MHz - an 18% increase. Basically, we just wanted to see how high we could push that $1100 graphics card.

The big outlier is the new AMD A8-3850 APU released this month, which combines a quad-core CPU and a "discrete class" GPU on a single processor. The Radeon HD 6550D GPU on that die (as it is branded) has 400 stream processors and uses a DDR3 memory interface that is shared with the x86 cores. Because the computing process at work in Bitcoin mining is not memory dependent, we kind of expected the APU to do well for its price and position.

The pricing listed here is used throughout our performance review to judge value and profitability. Keep in mind that some of these numbers were hard to really nail down especially for cards like the GTX 285, GTX 295, HD 4890, HD 5970 and ASUS ARES that are hard to find anywhere but eBay and very small online stores. The prices here are my best estimates at what you would have to pay (on average) to acquire a card like this today.

You might also be wondering what "The Beast" is in our list above. That is a mega-crunching machine we put together after doing all of our other card testing to see just how much we could push out of a single system. Using the same base test bed, we installed the Radeon HD 6990 4GB and Radeon HD 5970 2GB (both dual-GPU cards) and the single-GPU Radeon HD 6970 2GB. While we wanted to include the ASUS ARES in this configuration, we weren't given that option: it requires three PCIe power connections and our Corsair AX1200 power supply only supplied us with six of them. You will have to wait until later in the article to see the results of that setup, as I decided to leave it off the single-card result graphs since it tended to skew the scale quite a bit.

What to look for

The first thing you are going to notice is that the AMD graphics cards solidly outperform the NVIDIA GPUs, for reasons we are still diving into. The VLIW architecture at work on the 4000/5000/6000 series of cards is seeing very high utilization by the poclbm kernel, and this is definitely one of those few applications that comes close to the theoretical TFLOPS limits claimed by AMD over the years.

What else is there to evaluate?

Pure Mhash/s rates - how fast is each GPU at computing the math required for Bitcoin mining? The higher the Mhash/s rate, the faster the card and the quicker you will get to finding the next coin in the currency.

Performance per Dollar - Mhash/s/$ - This is probably the most important factor for users that might consider Bitcoin mining as a way to make money and pay for things they want to buy. Which card is going to bring the most "value" to the mining experience?

Performance per Watt - Mhash/s/watt - If you value your air conditioning bill more than most, or want to cram as many cards into an enclosure as possible for a mining powerhouse, you'll want to know which cards and GPUs are the most power efficient.

Dollars per day - Step 1: Mine. Step 2: ?? Step 3: Profit. How much money can you make on a given card on a daily basis? This metric will fluctuate from day to day based on the actual exchange rate of a Bitcoin with USD (or your own currency) but we will evaluate it based on the numbers as of this writing.

Time to Graphics Card Payoff - Based on the amount of money you can earn per day, how long will it take you to pay off the card you purchased for this purpose and start making the aforementioned profits? Obviously for cards that are either end-of-lifed or just plain hard to find this is going to be a rough estimate (go ahead and find me an average price for a GTX 285 today) but it provides another useful data point for professional miners.

One Year Profit - If you took that daily earned amount and could apply it perfectly for one year (which we know you can't really, because of the changing difficulty) and subtracted the cost of that graphics card, how much could you possibly MAKE in a year? The numbers might surprise you!
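The metrics above all reduce to simple arithmetic. Here is a minimal sketch of the calculations, using entirely hypothetical figures for hash rate, card price, power draw and earnings per Mhash/s (the real values shift daily with difficulty and the exchange rate):

```python
# Hypothetical inputs for illustration only -- real values change daily.
MHASH_S = 350.0           # card hash rate in Mhash/s
CARD_PRICE_USD = 300.0    # purchase price of the card
POWER_WATTS = 200.0       # power draw under mining load
USD_PER_MHASH_DAY = 0.01  # assumed earnings per Mhash/s per day

mhash_per_dollar = MHASH_S / CARD_PRICE_USD        # performance per dollar
mhash_per_watt = MHASH_S / POWER_WATTS             # performance per watt
dollars_per_day = MHASH_S * USD_PER_MHASH_DAY      # daily earnings
days_to_payoff = CARD_PRICE_USD / dollars_per_day  # time to recoup the card
one_year_profit = dollars_per_day * 365 - CARD_PRICE_USD

print(f"{mhash_per_dollar:.2f} Mhash/s/$, {mhash_per_watt:.2f} Mhash/s/W")
print(f"${dollars_per_day:.2f}/day, payoff in {days_to_payoff:.0f} days, "
      f"1-year profit ${one_year_profit:.2f}")
```

Note that this simple model ignores electricity costs and assumes constant difficulty, which is exactly why the one-year figure should be read as an upper bound.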

So there you have it - let's jump into the results and see what our testing brought forth with more details and explanations along the way!

Any reason the AMD 6950 & 6970 cards were left out of the experiment? As the flagship AMD single-GPU cards, I think this data would be really salient. Is there another card on the list from which we could easily extrapolate 6950/6970 performance?

In my personal testing, the 6950 gets somewhere around 340 Mhash/s with a few optimizations. Overclocking and unlocking can get you around 400. You can approximate the 6950/70 performance by dividing the 6990 results in the graph in half.

My understanding is that the GPUs used were based on what was available in house for testing.

More specific results (please keep in mind that I am using different settings than Ken so they are not necessarily comparable):

My 6950 unlocked to 6970 shaders at 840 core gets 372.7 Mhash/s using GUIMiner, two tweaks to the poclbm kernel, and AMD Cat 1.7 drivers with whatever version of the Stream SDK comes with that. I'm further using the following flags, which are gfx card version specific: -k poclbm VECTORS BFI_INT AGGRESSION=9 WORKSIZE=128

I'd be interested in a little more information. I'm running a Sapphire 6950 2GB with unlocked (6970) shaders (but not flashed to 6970 speeds; I just OC when I need the boost). I'm also sporting a Core2Duo E8400 OC'd to 3.6 GHz. I started mining last night, following the guides Ryan mentioned, and I'm consistently getting 320 Mhash/s, not the 340 you mentioned was possible with a few "optimizations."

Do you know if the optimizations you mentioned (the flags) should work for my 6950; you said they are gfx card version specific - did you mean vendor specific, or just 6950 specific? Is it possible to use those flags when I'm using the GUIMiner, or do I need to be using a console? Thanks for any input!

Hi Adster, I am running a XFX 6950 2GB card with an edited BIOS to have unlocked shaders but not 6970 speeds (though the card is capable of running at them, I didn't want to risk running the memory at the higher speed full time).

The flags that I mentioned will work for your 6950; they are specific to the version of card you have, and in this case are best used with AMD 6xxx series cards. You can set the flags in the GUIMiner extra flags area; however, you will need to edit the poclbm kernel file for the other optimizations. You can find those by searching the bitcoin forums for kernel optimizations.

I hope it helps, let me know if you need any help in squeezing all the mhash possible outta that card :)

Honestly, we just didn't test it because we skipped some cards. Looking back, we should have done one of them. You can see on our screenshot of "The Beast" that we eventually plugged one in and got about 344 Mhash/s.

This is a great article, and pushed me over the edge to start mining. The only big question I have (aside from my earlier question about 6950/6970 performance), is how the cost of electricity factors in.

Obviously we are all subject to different utility rates, so you couldn't give a cost-breakdown that would apply to everyone. However, I am curious how much the average cost of electricity would deduct from the profits in your chart?

I have a dedicated mining machine which runs 24/7 in the closet (no, really -- it sits in the closet). It has the cheapest AMD CPU I could find (a Sempron processor), 1GB of RAM, and a flash drive used as the hard drive, running Ubuntu 10.04 on a headless (monitorless) system. The only thing really going on is the 2x5850 Xtreme graphics cards pumping out ~700 Mhash/s. When I bought this rig, it ran me $530 after rebates from Tiger Direct.

I think it is your responsibility to deter readers more actively from investing in hardware in order to conduct bitcoin mining and distance yourselves from those activities. It is easy for people to understand that they can make money from computing power, but it takes some very careful reading to understand that by design, this whole enterprise will become less and less profitable over time. So I think it would be better to put the emphasis of the article on parallel computing performance and to use bitcoin merely for illustrative purposes. At the very least, you should factor in the energy costs in your profitability analysis, but in my opinion, calculating projections is misleading and even deceptive, given the facts about Bitcoin (see below).

So my warning here:
!!! WARNING !!!
===================================================
Investing in hardware in order to engage in bitcoin mining is a highly risky and quite possibly loss-making idea!
The calculations of "Days to payoff" and "1 year profit" in this article are misleading: Not only is the rate of bitcoin creation deliberately being slowed as the total number of bitcoins approaches 21 million, it is also getting more and more difficult to accumulate enough computing power as the number of participants in bitcoin mining increases (as people reading this article and others start setting up their own mining operations). The only effect countering this deterioration in profitability would be an increase in the dollar value of the bitcoin, which is uncertain and unpredictable.
====================================================

"Please keep in mind that we understand that these values will change over time not only because of the exchange rate differences but because your ability to mine Bitcoins will slow down over time as the algorithm to find coins becomes more and more complex as the network hashing power increases. Read over the first two pages of the article again to understand WHY this happens but just know the results you will see below are based on an instance in time during this writing process!"

Nothing really. Plus, a virus which specifically only attempted GPU mining would be a lot easier to hide in the Windows environment, since most users are unlikely to be monitoring GPU usage levels when simply web browsing, etc.

A virus which intelligently slowed its mining attack if the user was trying to do something GPU intensive (gaming), in order to hide the system use and keep the user from noticing massive in-game slowdown, could likely mine away unnoticed.

I do not fully understand the setup in regards to mining as a pool though, which is what you would ultimately want all your zombied systems to do. I guess it is probably not 'that' difficult to set up a pooling arrangement given how many continue popping up, plus presumably someone writing a virus specifically targeted at hijacking GPU cycles is probably a decent enough coder.

There have actually been some botnets (that have since been shut down) mining for pools; however, they were thousands of computers using the CPUs to mine as access to the GPU hardware is more difficult/would require more end user cooperation to get that botnet software installed and running, AFAIK.

What price did you use for power in your profit calculations? At 500W for a simple 6990 system, total power consumption in a year of 24/7 use will be 4380 kWh. Your profit after one year will be negative if your price for power is more than about 35 cents, assuming constant difficulty. All Nvidia cards will operate at a loss unless your power is very cheap or free.

Difficulty is about 1000 times larger now than half a year ago, btw. Power cost has become the most important factor in mining profitability.
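The break-even point quoted above can be checked directly. A rough sketch, assuming the commenter's 500 W system figure and a hypothetical daily revenue (the real number moves with difficulty and the exchange rate):

```python
# Break-even electricity price for a mining rig (illustrative numbers).
SYSTEM_WATTS = 500.0      # wall power of a simple single-6990 system
DAILY_REVENUE_USD = 4.20  # hypothetical earnings per day at constant difficulty

kwh_per_year = SYSTEM_WATTS / 1000.0 * 24 * 365  # 4380 kWh over a year of 24/7 use
yearly_revenue = DAILY_REVENUE_USD * 365
breakeven_price = yearly_revenue / kwh_per_year  # $/kWh at which profit hits zero

print(f"{kwh_per_year:.0f} kWh/year; break-even at ${breakeven_price:.2f}/kWh")
```

With these assumed inputs the break-even lands right at the ~35 cents/kWh figure mentioned: pay more than that for power and the year ends in the red even before counting the hardware.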

For European readers, the power use is a bit more important. 1 kWh of power costs on average around 0.25 euro,
which means a system like The Beast (using 1 kW of power) will cost you 0.25 * 24 * 365 = 2190 euros per year in electricity.
The Beast yearly produces 3637 dollars' worth of bitcoins, which is about 2584 euros.

That means it will effectively only produce 394 euros. And that is not counting the cost of buying the system.

A 6990 in the default BIOS position should generate 330 MHash/s PER core. That's 660 MHash/s per card. I'm not sure why, but your card is showing a much slower speed on one of the cores. (~285.3 MHash/s)

Switch the BIOS switch to position 2 and you'll be at 360-375 MHash/s per core.

You also seem to be missing the most basic flags for GUIMiner running poclbm: -v -w128

Your CPU usage is silly!
It's prolly the GUIMiner interface or something.
My 'rig' runs 350 Mhash/s on an i7 2600K and HD 6970 and rarely hits 4% CPU usage.
And that is while I run an active Minecraft server and use the rig to watch videos and stuff (gets it to about 8% for SD video).
I'm using the Phoenix miner, btw.

It's 4 integer operations/instruction x 3 instructions/clock x 4 cores x 3.4 GHz = 163.2 GigaOperations/second. AVX is 4 integer operations/instruction or 8 floating point operations/instruction. Each clock can issue up to 3 instructions if they don't depend on the answers of the previous instructions. The nice thing with AVX over SSE2 is that AVX has instructions like a = b *operation* c vs. a = a *operation* b for SSE2.
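The peak-throughput figure quoted above is just the product of those four factors:

```python
# Peak SIMD integer throughput for the quad-core 3.4 GHz chip described above.
OPS_PER_INSTR = 4    # 128-bit SIMD: four 32-bit integer operations per instruction
INSTR_PER_CLOCK = 3  # up to three independent instructions issued per clock
CORES = 4
GHZ = 3.4

giga_ops = OPS_PER_INSTR * INSTR_PER_CLOCK * CORES * GHZ
print(f"{giga_ops:.1f} GigaOperations/second")
```

This is a theoretical peak; real hash kernels fall short of it because of instruction dependencies and non-SIMD overhead.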

So you're telling me you put a virus on your computer that helps criminals launder money,
you let it operate through your GPU because there is no security there,
and you spend hundreds on hardware and power for an experiment in social engineering?
Then you think that because 3 places are taking the hype of the bitcoin as a COUPON to sell you shit at 3 times the normal cost, that the bitcoin is therein a currency?

What do you think your GPU is really processing?
Or does anyone think?

You just don't get it,
the GPU is processing YOU!
It is internally cyclically redundantly pre-processing your own non-transactions into a multilevel advertising, purchasing and marketing scheme.
If they do not enable the user with a journey, then there is no game to be played. There is no correlation to alternative universal dimensional shifting of exchange goods in virtuality, when there still is nothing but virtuality in existence.
How do you perceive that something exists when one person tells you that it exists, and masses of people join that ONE person to confirm that it exists?
That is a singularity of the black hole variety.

Issue/problem: GUIMiner with a dual-GPU card, the PowerColor HD 6870 X2. After creating a new worker for the second GPU, it still doesn't work: 0 Mhash/s, while the first GPU runs at 304 Mhash/s, clocked at 970 MHz, 60% fan speed, temp 74 degrees Celsius,
flags -v -w128 -a4. Does anybody know how to set this up correctly so that both GPUs work at the same time? Thank you for helping me out.

Everybody in this bitcoin thing is obsessed with the hash performance and whatnot, while everyone forgot the actual essence: which GPU actually makes more bitcoin (money) per month? How much money do you think those CrossFire HD 6990 GPUs will ever give you in a month?

I do online forex and earn $300-$400 PER DAY using only a laptop. Can these expensive "rig" setups, with all those crunching numbers/"stats" and huge energy consumption (carbon emission contributions), give you better than that?