
Energy consumption is important on portable devices. On desktops I'd rather have more processing power than more efficiency.

Originally Posted by soldant

Well, to be fair, that big research gap could explain why AMD weren't able to keep up, but then Intel have released their fair share of crap over the years too (the Atom line up until now has been a joke, for example). AMD pushing for more cores isn't necessarily a bad move either; it's just ahead of widespread software support (particularly in games, and since we're on a gaming forum...). Of course, none of that excuses releasing something that's only useful in heavily multi-threaded applications that actually use those extra cores; that's a bit foolish in the general consumer market, since it isn't meeting people's needs.

But you can't criticise AMD! You're a fanboy if you do that!

AMD has better GPUs at the moment. I'd much rather buy an HD 7950 than a GTX 660 Ti, but I'm not sure how well that's going for them in terms of actual sales.
My post was all over the place; I didn't mean to bash AMD. The main point was that they had enough time to come up with an answer to Intel, and instead they chose to rely on their cheap mid-range CPUs. Intel eventually countered with the 2500K, and AMD has been struggling ever since. I really hope they can find the financing to put their CPU division back on track.

Yep. Most PCs draw less than ordinary light bulbs (although most bulbs in this country are the power-saving kind now). It's only worth worrying about for offices and companies with thousands of PCs (then the bill runs into the millions ;P ).

I think you should worry more about heat emission than electricity bills. Cooler usually means better performance.

You realise that heat comes from electricity don't you? :)

There's no magic to this: the hotter your PC gets, the more power it's using. It doesn't "use" electricity to make your games work; it's ALL turned into heat (well, minus a bit of light from your LEDs and some sound you can barely hear, but it's mostly heat).
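To put rough numbers on what that heat output means for the bill, here's a back-of-the-envelope sketch. The wattages and the $/kWh tariff are illustrative assumptions, not measurements of any real machine:

```python
# Rough estimate of what a PC's power draw costs per year.
# All figures below are illustrative assumptions, not measurements.

def annual_cost(watts, hours_per_day, price_per_kwh):
    """kWh consumed per year multiplied by the tariff."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# A gaming PC under load vs. an idle office box, at an assumed $0.15/kWh.
gaming = annual_cost(watts=350, hours_per_day=4, price_per_kwh=0.15)
office = annual_cost(watts=80, hours_per_day=8, price_per_kwh=0.15)
print(f"gaming: ${gaming:.2f}/yr, office: ${office:.2f}/yr")
# gaming: $76.65/yr, office: $35.04/yr
```

Which is why, as noted above, the savings only add up once you multiply by thousands of machines.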

I think you should worry more about heat emission than electricity bills. Cooler usually means better performance.

Yeah, but you can only exploit that extra thermal headroom with an Intel CPU if it's overclockable. And all their budget CPUs are multiplier-locked.

If something like the Core i3-3220 was overclockable... that would blow a great big hole in the market. But I think Intel would lose as much as AMD doing that, from people deciding not to get a Core i5 after all.

And here you come full circle, the same move you pulled at the beginning with that "I'll pick the superior package" line, comfortably ignoring that that package only existed because Intel had to scrap NetBurst ahead of time, and you'd still ignore it even after it was explained to you. Now AMD is foolish because they're trying to change the game? Breaking news: that's exactly what they have been doing for years on end.

Except they aren't changing the game. Software support is still quite limited outside specific cases that aren't common usage (or aren't important enough to justify the performance loss in other applications). When did I ever say Intel never made mistakes? I didn't; during the P4 era I was using AMD CPUs. The argument that CPU tech drives absolutely everything isn't accurate either. Hypothetically, in a world with no Intel (to swap shoes for a second) where AMD was running the show, the lack of alternatives might actually force a move to better multi-threading support. But instead there are better options. It's pointless getting on a soapbox to say AMD are trying to change the game when they're clearly doing a bad job of it. Not all innovation is good; there are plenty of attempts that turn out to be mistakes.

During the 90s, when Intel was on top until the Athlon turned up around '99, did technology stagnate? No. Between then and now there is a range of other factors besides competition that determine pricing and innovation; to state that competition alone is responsible for innovation and price drops is blatantly incorrect. It's an absurd oversimplification of reality.

Originally Posted by alms

Kicking the underdog because it's fallen to the ground is nonsensical, to say the least.

Supporting an underdog who can't compete is nonsensical too. We don't reward failure. AMD was the choice in the early 2000s: cheaper, faster, better. The Pentium 4 was bad. Intel hit back, AMD went off with a design that had little in the way of common software support, and paid for it.

Originally Posted by Wulfgar

AMD has better GPUs at the moment. I'd much rather buy an HD 7950 than a GTX 660 Ti, but I'm not sure how well that's going for them in terms of actual sales.

Perhaps, I'm not 100% convinced of that (I've had too many bad experiences with AMD/ATI drivers, particularly on the release of new games) but I'm only talking about AMD's CPU arm, not their GPU line.

Business sales matter, but the majority are bog-standard machines you don't need to create anything special for. The consumer market is important because it drives high-end development (along with the workstation market, though that tends to follow a slightly different path).

The slump in PC sales over the last three years is undoubtedly behind this: people are holding onto PCs or buying smartphones and tablets instead. Just before that there was the netbook boom, which the industry thought would be "new sales" but which mostly just cannibalised laptop sales. It's been a poor few years for PC makers in general, I reckon.

If AMD do leave the CPU and GPU markets, it will be a VERY miserable few years ahead for sure, though...

Business sales matter, but the majority are bog-standard machines you don't need to create anything special for. The consumer market is important because it drives high-end development (along with the workstation market, though that tends to follow a slightly different path).

But most consumer PCs aren't high-end gaming PCs or anything special either. High-end development is usually in gaming or other specialist areas. The majority of average PC users don't need 8-core CPUs or high-end GPUs.

But most consumer PCs aren't high-end gaming PCs or anything special either. High-end development is usually in gaming or other specialist areas. The majority of average PC users don't need 8-core CPUs or high-end GPUs.

The majority of average PC users are easy to get on board via marketing though, since they have no clue.

The "mid-range gaming system on the cheap" is nonsense. If you're going to be into PC gaming you need to be in the upper mid-range, which is not the sub-$500 market. Otherwise you're barely punching above a console, and if that's the case you might as well just get a console.

I have nothing to add to the AMD yay/nay discussion, but can we stop this elitist lunacy? There are plenty of good reasons to be a PC gamer and there are far, far more games that don't require a high end PC than there are games that do. What about people who like strategy games, MMOs, simulators, retro games, indie games, etc? Are they not PC gamers?

There are plenty of good reasons to be a PC gamer and there are far, far more games that don't require a high end PC than there are games that do. What about people who like strategy games, MMOs, simulators, retro games, indie games, etc? Are they not PC gamers?

Actually, MMOs are like the #1 reason to get a better CPU. I've been PC gaming my whole life on shitty systems because I never thought going high-end would be worth it. Recently I had some extra dosh, and because my CPU had been bottlenecking my GTX 570 ALL THE TIME, I thought "what the heck" and replaced my AMD Phenom II rig with an Intel i5-3570K one, keeping the same GPU. And I just have to say... it's SILLY how big the difference is. I seriously doubled my fps in GW2 (from 30-40 with AMD to 60-90 with Intel), and I can finally run all Source and CoD games at 120 fps, which works great with my 120 Hz monitor. Oh, and I probably QUADRUPLED, or at least tripled, my fps in Minecraft, and I'm sure other CPU-hungry games like Dwarf Fortress also benefit.

TL;DR: If gaming is your main hobby, a high-end CPU (Intel today) will always be worth it; it makes a bigger difference than you might think. Take it from someone who used AMD his whole life and recently got his first Intel CPU.

I have nothing to add to the AMD yay/nay discussion, but can we stop this elitist lunacy? There are plenty of good reasons to be a PC gamer and there are far, far more games that don't require a high end PC than there are games that do. What about people who like strategy games, MMOs, simulators, retro games, indie games, etc? Are they not PC gamers?

It's not elitism, and we're not going to get into a pointless debate about who is or isn't a PC gamer (or I could pull out Facebook games and hold them up too, I guess). But the sub-$500 sector is quite low-end, and one of the major benefits of PC gaming that people like to bring up is increased graphical fidelity and support for resolutions that don't look like Vaseline has been smeared all over the screen. Even though consoles cooled the hardware wars quite a bit and hardware lasts longer now, that still puts you in the upper-mid end of the hardware sector.

Originally Posted by Sakkura

Well, I think Dell are Intel-only, so there...

Also, they happily shove an Intel Core i7 in the face of computer-illiterates. With DDR3-1333 in some cases, just to make computer-literate heads explode.

Actually, Dell have had AMD systems, though the range is fairly limited. There are quite a few AMD pre-built systems from smaller local retailers too... though even those have mostly swapped to Intel around where I live. I haven't seen any AMD Dell systems for a while now; I think the last ones I saw were low-end laptops, but with the ultrabook form factor all of that is gone.

And yes, Dell will put an i7 in front of the ignorant public, but that's just marketing, and every single company is guilty of it.

Actually, Dell have had AMD systems, though the range is fairly limited. There are quite a few AMD pre-built systems from smaller local retailers too... though even those have mostly swapped to Intel around where I live. I haven't seen any AMD Dell systems for a while now; I think the last ones I saw were low-end laptops, but with the ultrabook form factor all of that is gone.

And yes, Dell will put an i7 in front of the ignorant public, but that's just marketing, and every single company is guilty of it.

Well, I couldn't find any AMD-based systems when I looked, but I didn't check every nook and cranny. But certainly, prebuilt systems from other retailers do feature AMD processors. But they can easily be 8-core processors just like when Dell shoves a Core i7 at ma and pa. Even more so when an FX 8350/8320/8150 etc. costs less than a Core i7.

Well, I couldn't find any AMD-based systems when I looked, but I didn't check every nook and cranny. But certainly, prebuilt systems from other retailers do feature AMD processors. But they can easily be 8-core processors just like when Dell shoves a Core i7 at ma and pa. Even more so when an FX 8350/8320/8150 etc. costs less than a Core i7.

As I say, this was before the ultrabooks started to hit; I haven't seen them for a while now, but Dell did offer AMD systems if you looked hard enough.

And you're right that they could shove an 8 core AMD CPU at the public, but again it comes back to software support, and unless you're constantly working with apps that are heavily multi-threaded, it probably wouldn't be a good choice.
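The "extra cores only pay off if the software is multi-threaded" point can be made concrete with Amdahl's law: the speedup from n cores is capped by the fraction of the program that actually runs in parallel. A minimal sketch (the parallel fractions below are illustrative guesses, not benchmarks of any real application):

```python
# Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
def speedup(p, n):
    """Serial part (1 - p) runs as-is; only the parallel part p is divided by n."""
    return 1 / ((1 - p) + p / n)

# A mostly serial workload vs. a heavily threaded one, on an 8-core CPU.
print(round(speedup(0.30, 8), 2))  # ~1.36x -- the extra cores barely help
print(round(speedup(0.95, 8), 2))  # ~5.93x -- the threads actually pay off
```

So for the typical lightly threaded consumer workload, the 8-core part delivers far less than its core count suggests, which is the crux of the software-support argument.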

It's not elitism, and we're not going to get into a pointless debate about who is or isn't a PC gamer (or I could pull out Facebook games and hold them up too, I guess). But the sub-$500 sector is quite low-end, and one of the major benefits of PC gaming that people like to bring up is increased graphical fidelity and support for resolutions that don't look like Vaseline has been smeared all over the screen. Even though consoles cooled the hardware wars quite a bit and hardware lasts longer now, that still puts you in the upper-mid end of the hardware sector.

Saying that you need a mid-to-high-range system or you might as well get a console is a statement about what PC gaming is, and it's a very silly one. Many of the PC-exclusive games that have attracted a lot of attention in the last couple of years require significantly less powerful systems than a console.