
How many months is it going to take to pay for the CPU price difference?
$100 / $0.65 = 153 months

How many years is that?
12.75 years
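The break-even arithmetic above can be sketched in a few lines of Python (the $100 price gap and $0.65/month electricity saving are this thread's assumed figures, not measured data):

```python
# Break-even time for a CPU price premium, given a monthly electricity
# saving. Both figures are this thread's assumptions.
price_gap_usd = 100.0      # assumed Intel-vs-AMD CPU price difference
monthly_saving_usd = 0.65  # assumed electricity saving per month

months = price_gap_usd / monthly_saving_usd
years = months / 12
print(f"break-even: {months:.0f} months (~{years:.1f} years)")
```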

But wait, there's more... Intel motherboards are generally more expensive than AMD boards as well.
Maybe an extra 50 dollars. So we need to add another 6 years onto that....

Of course, I love advanced fab chips, but when companies like Intel intentionally make their chips and motherboards overpriced by so much, any kind of power savings goes flying out the window instantly.

I don't think you're understanding what you're reading.
Peak power is worthless for anything other than determining what power supply to buy...
You cannot use peak power to make comments about how much power the PC uses over *ANY* period of time.
TDP, by definition, is the amount of power the CPU is expected to dissipate over a sustained period of time.

Legit Reviews says that the average load power of a PC with the 8350 is 219 watts, compared to 163 watts for the Intel chip, while running CPU benchmarks.
Which is exactly what the TDP difference between the chips predicts. Running games, the difference is even less.
Source link here: http://www.legitreviews.com/article/2055/13/

You know it takes a *LOT* of power to turn over a car engine but that doesn't mean you drive your car with your starter running 24/7. Peak power is an almost worthless measurement, especially for a CPU that can ramp up and down within time frames as small as 1/1000000 of a second.

Show me your numbers... Right now....
57 dollars for 1000 kWh (1 MWh).... and that's assuming I use it during the worst time of day.

Nice attempt, but it's flawed. You assume both Intel and AMD take the same time to complete the tasks at hand. They don't; the AMD chip will burn power for longer in most cases.
You also simplified by assuming continuous full power; if we consider a mix of idle, light load, and heavy load, we'd have something more realistic (and quite probably beyond the arithmetic anyone on this forum is willing to do).
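A weighted duty-cycle estimate along those lines is easy to sketch. The 56 W load delta comes from the Legit Reviews numbers cited above; the idle delta and the hours-per-day split are purely illustrative assumptions:

```python
# Weighted extra power draw of the FX-8350 system versus the Intel one.
# The load delta is from the cited review (219 W - 163 W); the idle
# delta and the duty cycle are illustrative guesses, not measurements.
load_delta_w = 219 - 163       # measured average-load difference, watts
idle_delta_w = 6               # assumed idle difference, watts
hours_load = 4                 # assumed heavy-load hours per day
hours_idle = 24 - hours_load

kwh_per_month = (load_delta_w * hours_load
                 + idle_delta_w * hours_idle) * 30 / 1000
cost_per_month = kwh_per_month * 0.057  # $57 per 1000 kWh, per the thread
print(f"extra energy: {kwh_per_month:.2f} kWh/month, ~${cost_per_month:.2f}/month")
```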

The i7-3770 is essentially Intel's flagship processor, and the FX-8350 beats it in some benchmarks? Even if it does lose overall (which it does) and gets trounced in others (which it also does), this is excellent news for AMD. Coupled with a much better GPU, it's a viable option again.

Indeed, this does put AMD in a better position than before. But, and this is a big but, AMD's wins are based on a higher clock speed, a higher TDP, and a die that is twice the size of Ivy Bridge. The latter means that while AMD may be pricing these aggressively, their margins are still hurting.

This is true, but it also shows that the new architecture is not the failure it originally seemed, and that once the growing pains are taken care of, they might have a competitive platform again.

I'm guessing that moving to a smaller process is really what they need now. How realistic that is, I don't know. Didn't GlobalFoundries launch a 28nm fab this year?


That calculation assumes idle use is the same - and we see very good progress here, kudos to AMD!
Because of that we can discard idle running costs - the AMD machine draws only about 10% more than the Intel one when idling (which is OK) - and count only the extra load costs.

The previous generation (I have one - an Athlon II X4) had a HUGE idle consumption difference (120 watts compared to 60).
Lowering idle draw to roughly the competition's level matters more in the desktop market: a system that constantly draws 60 watts more whenever it's on hurts far more than one that draws 60 or 100 watts more only at full load. For the high-performance market it's the opposite - there, full load plays the vital role, and currently AMD isn't shining.

And none of this includes the GPU/APU, since we assume them to be the same, of course.
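The idle-versus-load point can be put in rough numbers: a constant 60 W idle gap (like the previous generation's) compared against the same 60 W gap appearing only under load, at the $57/MWh rate quoted earlier in the thread. The four load hours per day are an assumption:

```python
# Annual cost of a constant 60 W idle-power gap versus a 60 W gap that
# only shows up under load. Load hours per day are an assumption.
rate_usd_per_kwh = 0.057  # $57 per 1000 kWh, from earlier in the thread

idle_gap_kwh = 60 * 24 * 365 / 1000  # 60 W extra, every hour of the year
load_gap_kwh = 60 * 4 * 365 / 1000   # 60 W extra, 4 load hours per day

print(f"idle gap: {idle_gap_kwh:.1f} kWh/yr (~${idle_gap_kwh * rate_usd_per_kwh:.2f})")
print(f"load gap: {load_gap_kwh:.1f} kWh/yr (~${load_gap_kwh * rate_usd_per_kwh:.2f})")
```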

Also, a small unrelated note: higher clock speeds and higher power consumption mean higher wear-out rates. AMD runs at higher clocks and higher consumption for the same performance, which means an AMD system will fail at a higher rate than an Intel one, assuming the build quality and process maturity are equal when it comes to wear-out.