I hope people are starting to sit up and take notice. The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself. Games are just about as good as they are going to get without new display technologies. The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

So now things are going for lower power, lower operating temperature and all that. What sort of things benefit from that? How about "embedded systems"? Things that people don't want or need to reboot? The current versions of Windows are too bloated and too power- and memory-hungry to fit within that framework, so it'll have to be another OS. We know this because of the horrible failure "Netbook" computing has been. People wanted it, but expected it to run Windows. Windows couldn't really do it effectively. (I know... people are still doing it... I've still got two netbooks running XP and going strong... but is anyone still selling XP?) Microsoft shows no remorse over their architectural choices and shows no signs of slimming down and getting lighter. So nothing points in Microsoft's direction... not even Microsoft. They are raising prices to make up for the lack of interest in what they are doing now.

AMD has huge advantages in the server market; I'm really surprised people are so stuck on Xeons.

You can't cram 64 Xeon cores into a 1U. Not to mention Intel is spotty on their hardware virtualization extensions.

Intel has the lead in power consumption, sure. But if you're looking into running anything Xen, KVM or VMware in production, the cost savings AMD brings to the table makes them a competitive contender.

I'm in the market for a new workstation. I've been looking at an Opteron instead of the desktop models. Primary reason being 16 cores on one chip, at a lower power consumption than the 8-core desktop model.

The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself.

Not sure I follow. Transistor density has kept on increasing. It's been a little slower recently, I think, but several manufacturers are now sub-30 nm across a variety of different process types.

Games are just about as good as they are going to get without new display technologies.

Really? Seems unlikely.

The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

Are you sure? Have you looked at the recent CPU benchmarks? More and more programs are taking advantage of multiple cores. All sorts of things that people actually do, like file compression, web browsing, media transcoding. Certainly the things I do benefit from multiple cores.
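File compression is a good example because it's embarrassingly parallel: split the file into independent chunks and compress them on separate cores. A quick Python sketch of the idea (my own illustration, not taken from any particular tool):

```python
# Sketch: compress independent chunks of a file in parallel.
# Each chunk is handed to a worker process, so the work spreads
# across however many cores the machine has.
import zlib
from concurrent.futures import ProcessPoolExecutor


def compress_chunk(chunk: bytes) -> bytes:
    """Compress one chunk independently of the others."""
    return zlib.compress(chunk, 9)


def parallel_compress(data: bytes, chunk_size: int = 1 << 20) -> list:
    """Split data into fixed-size chunks and compress them in parallel."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ProcessPoolExecutor() as pool:  # defaults to one worker per core
        return list(pool.map(compress_chunk, chunks))


if __name__ == "__main__":
    blob = b"slashdot" * 500_000
    compressed = parallel_compress(blob)
    print(len(compressed), "chunks compressed")
```

Each compressed chunk can be decompressed independently, which is the same trick tools like pigz use to get near-linear speedup with core count.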

We know this because of the horrible failure "Netbook" computing has been.

Netbook computing was fine until Microsoft moved quickly to kill it. Then the manufacturers seemed bent on suicide after that for inexplicable reasons. Oh, and Intel came up with bizarro licensing for the Atom restricting manufacturers, yet they haven't (with few exceptions) switched to the faster and cheaper Bobcat CPUs, which lack such bizarre licensing restrictions.

Why can't I buy a machine at the low price point and low weight of the Eee 900? That machine sold many millions. Netbooks used to be sub-1 kg in the beginning. Now the lightweight ones are 1.5 kg. What happened?

Venduhs are strange. Why did they drop all the high-res screens from laptops 10 years ago, only to scramble to play catch-up after Apple decided to bring in high-res displays? Makes no sense.

That said, there's still a quite decent range of cheap netbook machines around, but they're just not as good as they were.

The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself.

While software has been hampered by web "technology" over the last decade, we are hardly at the pinnacle of software and computing... it's more like the Dark Ages, actually. Some stuff is being done elsewhere (GPUs, mobile), but we're still mired in fundamentally stagnant and backwards principles on the desktop (and server, really).

Games are just about as good as they are going to get without new display technologies.

Laughable. Let's assume anything video-related counts as "new display technology": we certainly have a long way to go to realtime radiosity and raytracing at extremely high resolution on a mobile device, then toss in 3D for good measure, so that's a given. But in terms of gameplay, all the computing and RAM you can get can be eaten up for a very long while. Simulation in games, today, isn't anything like what it could be. If I can't build a city at the SimCity level, zoom in and rampage through it at the GTA level, walk up to each and every person on the street and learn their personal history and daily routines at an RPG level, then go into every structure and demolish it bit-by-bit with full soft-body dynamics, games still have quite a long way to go.

The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

This is true to some extent, but "resorting to multi-processor and multi-core" means the desktop isn't maxed out. The primary OS (and software) may not be taking advantage of these things, but they are there and we're far from done yet.

Microsoft shows no remorse over their architectural choices and shows no signs of slimming down and getting lighter. So nothing points in Microsoft's direction... not even Microsoft. They are raising prices to make up for the lack of interest in what they are doing now.

Microsoft is irrelevant. They have been for a long time. They may not be going away anytime soon, but they've been irrelevant since Google used the web to effectively route technology around them (due to earlier attempted lock-in). Of course, this has resulted in aforementioned Dark Age of Software, but at least we're not stuck on one platform. We're at the point where Valve is looking to seriously move gaming away from Windows, and there are alternatives for everything else, so what happened before doesn't really apply to what can happen in the future.

The i7 3770K has a TDP of 95W. And the FX-8350 is a very good chip and much cheaper than the i7. The benchmarks relative to the i7 are all over the place. In most cases it sits somewhere between the i5 and i7. In some cases it is destroyed by the i7; in other cases, the reverse is true. The single-threaded performance is quite weak and usually substantially less than the i5, but then the i5 to i7 difference isn't enormous. The difference from the FX-8350 to the i7 seems to be around 20-50% in most cases.

Curiously the AMD processors tend to stack up better on the Linux benchmark suites.

Anyway.

This thread is about the Opteron processors, which (a) are still competing against SB, (b) benefit from substantially cheaper full-system costs, and (c) are aimed at buyers of 4-socket servers, who aren't terribly sensitive to single-thread performance.

Not to mention you're investing into a platform with little future

What does that even mean? It's all x86, so even if AMD vanishes tomorrow you can keep using the servers and then transition to Intel when you need new ones. The whole point of having more than one vendor is that, no matter what, you're not investing in a platform with no future.

I know that, at least in the past, Intel used to issue TDP numbers that represented "typical" heat, while AMD used to issue TDP numbers that represented worst-case heat (which is what TDP ought to be IMHO). I have read here on Slashdot that more recently, AMD has started playing those games as well.

But according to NordicHardware [nordichardware.com], in this case Intel is under-promising and over-delivering, and the chips really do dissipate only 77W despite being rated for 95W. (But how did they measure that? Is this a "typical" 77W? I guess it's not that hard to run a benchmark test that should hammer the chip and get a worst-case number that way.)

Curiously the AMD processors tend to stack up better on the Linux benchmark suites.

This is probably because the Linux benchmarks were compiled with GCC or Clang rather than the Intel compiler. The Intel compiler deliberately generates binaries that run poorly on non-Intel processors: the compiled code checks the CPU ID at runtime and has two major branches, the good path, which Intel chips get to run, and the poor path, which every other chip runs.
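The dispatch pattern being described looks roughly like this (a hypothetical Python sketch of the idea only; the real compiler emits this check in generated machine code, and the function and path names here are made up for illustration):

```python
# Hypothetical sketch of vendor-based code-path dispatch. Real x86 code
# would read the vendor string ("GenuineIntel", "AuthenticAMD", ...)
# via the CPUID instruction; here it's just a parameter.
def pick_code_path(vendor: str) -> str:
    if vendor == "GenuineIntel":
        return "fast"      # vectorized SSE/AVX path
    return "baseline"      # generic x86 path, even if the CPU supports SSE/AVX


print(pick_code_path("GenuineIntel"))   # fast
print(pick_code_path("AuthenticAMD"))   # baseline
```

The key point is that the check is on the vendor string, not on the feature flags CPUID also reports, so a non-Intel chip that fully supports the vectorized instructions still gets routed down the baseline path.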

The irony is that Intel, by investing heavily in fab technology, is about two generations ahead of everyone else, so they can make faster and/or lower-power parts than everyone else. This means they could be competing fairly and win.

But because Intel does evil things like making their compiler sabotage their competition, I refuse to buy Intel. They have lost my business. They don't care of course, because there aren't many like me who are paying attention and care enough to change their buying habits.

If you want the fastest possible desktop computer, pay the big bucks for a top-of-the-line i7 system. But if you merely want a very fast desktop computer that can play all the games, an AMD will do quite well, and will cost a bit less. So giving up Intel isn't a hard thing to do, really.

Oh lord, not this again. First it was the netbook that was gonna kill "the big bad M$", then it was the tablet, then the phone, and now you're gonna say embedded... really? Give it up, Sparky, nobody is giving up their desktops or laptops for some 1GHz ARM embedded in the TV, okay? Hell, one of the Apple fanbois tried giving up ALL x86 for a month, just one month, using nothing but his iPad and his iPhone... what happened? He gave up after a week and a half because it was hobbling him too damned much.

The ONLY thing you got right is that PCs have gotten insanely powerful, but you know what? Computers have been insanely powerful for most of the decade, hasn't stopped people from buying them. What HAS stopped people from buying them is the fact we are in the midst of a global recession (I would argue depression, but whatever) so people simply aren't spending money they don't absolutely have to and with their desktops sporting triples and quads, and their laptops sporting duals capable of 1080p? They really don't have to.

But the simple fact is even with an economy in the shitter we are talking 300 MILLION-plus computers being sold, and yes, nearly all of them running Windows. Why? Because that is where the software is. They don't want ersatz software, like Gimp for Photoshop or Tux Racer for DiRT; they want to use the billions of dollars worth of software they are sitting on, everything from Quickbooks to that God-awful EasyShare your grandma loves so much, and NONE of that shit is gonna run on some embedded ARM chip.

And I hate to break the news to ya, but ARM is about to slam face-first into the thermal wall, just as x86 did half a decade ago. This is why ARM Holdings has been talking about "dark silicon" in their last several press releases, and why Nvidia is now up to FIVE, count 'em, five cores in their Tegra chips. Only ARM hit the thermal wall with frankly shitty IPC, so they are throwing more cores at it; but as we saw with AMD, there is only so much you can make up for weak IPC by throwing more cores at the problem, because most software today still doesn't thread for shit.
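The "only so much you can make up with more cores" point is just Amdahl's law: if only a fraction of the work parallelizes, extra cores stop helping fast. A quick sketch of the arithmetic:

```python
# Amdahl's law: with fraction p of the work parallelizable,
# the speedup on n cores is 1 / ((1 - p) + p / n).
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)


# Even if 80% of the code threads well, 8 weak cores top out
# around 3.3x, nowhere near 8x:
print(round(amdahl_speedup(0.8, 8), 2))   # 3.33
```

And 80% parallel is generous for the consumer software people actually run, which is exactly why piling on cores can't fully paper over weak per-core performance.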

So x86 isn't going anywhere, Windows will be back up once Ballmer's fat ass is thrown out of the big chair and the abortion known as Win 8 is replaced by a much better Win 9 (Star Trek rule in play), and people will continue to buy hundreds of millions of x86 units every year, just at a slower pace, because grandma can't stress out that quad like she could that old P4. Does that mean ARM is gonna disappear? Nope, it means it's gonna have an insanely quick race to the bottom, and several corps will go broke selling Android units, because by this time next year you'll have 7-inch dual-core tablets with Android 5 selling for $50 at the Big Lots, and just like with x86, people will find they can't tell the difference between a dual-core and a quad, so they'll get the cheaper unit. Look at the financials of the companies selling Android: Samsung is barely making a profit, as is HTC, and the rest are bleeding money.

But to say embedded is gonna take out x86 is as stupid as saying mopeds are gonna take out the trucking industry. They are completely different units built for completely different tasks, and I have YET to see a single person, even one, replacing their x86 laptops and desktops with some cell phone chip. Sorry, ain't gonna happen.