I hope people are starting to sit up and take notice. The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself. Games are just about as good as they are going to get without new display technologies. The desktop PC has been maxed out and has been resorting to multi-processor and multi-core designs as the means to keep growing, but meanwhile the primary OS for most people running these systems is still not taking full advantage of even those advances.

So now things are going for lower power, lower operating temperature and all that. What sort of things benefit from that? How about "embedded systems"? Things that people don't want or need to reboot? The current versions of Windows are too bloated, power hungry and memory hungry to fit within that framework, so it'll have to be another OS. We know this because of the horrible failure "Netbook" computing has been. People wanted it, but expected it to run Windows. Windows couldn't really do it effectively. (I know... people are still doing it... I've still got two netbooks running XP and going strong... but is anyone selling XP?) Microsoft shows no remorse over their architectural choices and shows no signs of slimming down and getting lighter. So nothing points in Microsoft's direction... not even Microsoft. They are raising prices to make up for the lack of interest in what they are doing now.

Actually, all modern OSs do a fantastic job of taking advantage of multiple cores. It's the apps that fail to do so.
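To make that concrete, here's a minimal sketch (my own illustration, not from TFA) of an app splitting work across cores with POSIX threads. The OS will happily schedule each thread on its own core, but only if the app bothers to create threads at all:

```c
/* Minimal sketch: an app spreading work across cores with POSIX threads.
   Compile with: gcc -O2 -pthread sum.c */
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4
#define N 100000000L

static double partial[NTHREADS];

static void *worker(void *arg) {
    long id = (long)arg;
    double sum = 0.0;
    /* Each thread handles an interleaved slice of the range. */
    for (long i = id; i < N; i += NTHREADS)
        sum += 1.0 / (double)(i + 1);
    partial[id] = sum;
    return NULL;
}

int main(void) {
    pthread_t tid[NTHREADS];
    double total = 0.0;
    for (long t = 0; t < NTHREADS; t++)
        pthread_create(&tid[t], NULL, worker, (void *)t);
    for (long t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);   /* wait, then merge each slice */
        total += partial[t];
    }
    printf("harmonic sum = %f\n", total);
    return 0;
}
```

Single-threaded apps get none of this for free; the OS can't parallelize a loop the app never split up.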

As for OSs that take advantage of low power CPUs, you only mention MS - who (I suppose) has done a good job of this with Windows RT on the Surface. And maybe even a good job with whatever the hell Windows Phones run. It's just that consumers have not liked the apps. Of course Apple and Google both have solid contenders in the embedded space.

I think he's saying that CPUs bought several years ago are good enough for most people and the need to upgrade hardware every few years is not as pressing as it once was. One way to force upgrades is to bloat software like the OS so that you need new processors.

This leaves MS in a difficult place, as most consumers tend to buy new machines to get new Windows versions instead of upgrading. There are rumors that MS is switching to a yearly release to entice consumers to upgrade. It is nearly the same model that Apple uses.

A key difference is that while Apple might make some profit on OS upgrades, they make a lot more on hardware. Thus MS is trying to get into the hardware business as well.

If only that were true. The merging of CPU/GPGPU and then the HSA approach to ubiquitous computing is going to need OS-level tools and frameworks to make that huge leap, and to make it uniformly on a platform, so that application-layer development isn't spinning its wheels reinventing custom threading models and distributed design architectures because there doesn't exist a set of core APIs to aid it. Apple is way ahead in this regard. AMD is also way ahead in this regard. Ironic that both MSFT and Intel are behind when

Speaking a bit off-topic about selling XP, I think that there is a single measure that would make the SW and HW markets much healthier without drastic changes to copyright law: compulsory downgrade rights to SW products that aren't sold anymore.

This is a reasonable compromise for consumers and vendors: the former get to use the software they want, the latter continue to get their money, just without the ability to create additional artificial scarcity.

So you are saying that a company should be forced to sell software they no longer want to support?

What about the company's support costs? If the company is still selling the software, their customers are going to demand support, and the main reason that the company stopped selling said product is that the product's support costs were too damn high. So how does that get the company more money? All it gets the company is pissed off customers and frazzled tech support.

No, a company shouldn't be forced to support obsolete software; that would be unreasonable. But customers should have the right to obtain it legally with no support obligation from the vendor - and the company shouldn't get to restrict it artificially.

The keyword is "downgrade": for example, when I buy a boxed copy of Windows 8 Pro, under my proposed solution I am legally entitled to install a Windows 7, Vista, XP, or 2000 Pro box, or less expensive editions of these, instead of the 8. Here, Microsoft gets the money from me for its latest product, and I get to legally use its comparable obsolete product that I find better.

How EXACTLY are they "restricting it artificially", pray tell? If you want an XP license today you can get software assurance, an MSDN subscription, or just buy a retail copy; that is three different ways to get XP right there.

People simply aren't buying XP because more and more software has skipped Vista and moved straight to Win 7/8. As XP gets ever longer in the tooth (Good Lord, it's over a decade old folks, has patches on top of patches, and doesn't even support more than 3GB of RAM), all the remaining XP machines out there are legacy boxes that people for one reason or another have just not decided to upgrade. This is understandable, as most people don't want to spend money upgrading some 7 year old PC that requires more expensive parts than a new one, but that isn't any kind of restriction, that's just common sense.

And will you show us a reliable (i.e. not a "ZOMFG M$ is gonna kill the servers and burn babies ZOMFG!" blog post) source with a single quote from MSFT about killing the activation servers? The only quote I've seen from MSFT says that if they decide to kill the activation servers they will simply point the systems to a page where they can download a simple activation killer, no different than how you can still find the KBs and patches for Win2K at Microsoft; they just don't support it anymore, so if you run it you're on your own.

So I don't see how they can't obtain XP. There are several ways to buy it, and I'd be happy to provide links, but I figure most can Google. Hell, you can even use WSUS Offline and set up your own XP (or 2K3, or Vista, or 7) update server and keep installing XP all you want; you just won't be getting support after Apr 2014, which, since we are talking about an OS that came out when the average system was a 400MHz PII with 64MB of RAM, is really not unreasonable and frankly longer than anybody else out there. Does Apple still support OS X 1? Does Linux still provide patches to Debian 2 or whatever was released in 2001? Nope, so I really don't get why anybody is having a fit over this. Hell, they gave it twice the lifespan of Win9X and have made 10 years of support standard on ALL of their OSes, not just Pro and Enterprise like before, but even Basic and Home, so I honestly don't see what is up with the teeth gnashing and double standards.

I mean, is anybody REALLY taking that brand new i3 or AMD quad with 4GB of RAM and slapping XP on it? Because if so they don't need more licenses, they need a CAT scan; you are crippling the system with a creaky old OS that was never made to run on the specs we have now, and it's just nuts. Even the $200 netbooks are several times faster than the workstations were back then; it's just pointless to use XP now for anything but legacy apps.

How EXACTLY are they "restricting it artificially", pray tell? If you want an XP license today you can get software assurance, an MSDN subscription, or just buy a retail copy; that is three different ways to get XP right there.

1) I'm speaking generally, MS is just an example.
2) I'm not only talking about today but also about the future. Even if something is available today, tomorrow it may not be.
3) Re XP: (a) AFAIK, software assurance is not for individuals and I'm not even sure that XP will be available indefinitely vi

Then you don't need more licenses, you need a shrink. And how can it possibly be "less hassle" than "stick in discs, go have a sammich" with Win 7? MSFT provides all the tools for free to make your own custom Win 7 images, fully automated; you can install from disc or over the network, and there's no need for hacking SATA support into the OS like with XP.

The simple fact is, when you can buy kits with 8GB of RAM for less than $250 off Tiger, sticking to an 11 year old creaky-as-hell OS is just insane. You are crippli

Oh, I completely dig that idea. If it is of no use to you (i.e. you aren't selling it) then you have apparently exhausted its value to you as a business. It is now your responsibility, under the contract of copyright, to release it to the public domain. But no. "The value" is maintained by keeping it away from the public in order to ensure that they keep buying the same things over and over and over again. This is a public abuse which could only be enabled by copyright law.

So copyright went from the right to copy and distribute to the right to take it away from the public and to withhold information, arts and technology.

1. Some people are still running Win98SE
2. If WinXP were free, people would likely kick in with their own fixes and updates

Also, if the copyright for Windows XP were turned over to the public domain, I think the source code would ALSO be made available somehow. They could try to keep it to themselves... you know they wouldn't be obligated to publish it anywhere as far as I know... not sure how the Library of Congress works with regards to all of that, but I get the feeling the code would get leaked somehow.

But in general, I think when people are ready to give up on the new version of Windows, they will pirate an old one and/or move to another OS. The options are limited, but flavors of Linux like Ubuntu are extremely effective with new users. (I am not an Ubuntu fan. Not at all!)

I'm not predicting doom for Windows and certainly not setting a date. I am extrapolating from the more numerous and recent failures Microsoft has had, and they are huge. The primary cause is cu

I partly wrote my response the way I did and where I did because of a previous response to your post that complained that your proposal would force companies to continue to support software after they considered it obsolete. My response is the answer to that, somewhat legitimate, objection to your original proposal.
Perhaps a good solution would be to implement yours, but give companies the option of just releasing old versions to the public domain if they wish to avoid any support issues that they were afrai

No, I think we have only exhausted the demands of what you might call simplistic computing - fragile algorithms that efficiently follow a usually fixed number of steps to transform their input into some determined output, like drawing a rectangle. Given any imperfect input, they simply explode. Nothing in nature works this way. Living things are messier, rooted in pattern matching and open-ended searching. We still can't even really simulate a glass of water tipping over and spilling off the table. Co

The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself.

Not sure I follow. Transistor density has kept on increasing. It's been a little slower recently, I think, but several manufacturers are now sub 30nm for a variety of different process types.

Games are just about as good as they are going to get without new display technologies.

Really? Seems unlikely.

The desktop PC has been maxed out and has been resorting to multi-processor and multi-core designs as the means to keep growing, but meanwhile the primary OS for most people running these systems is still not taking full advantage of even those advances.

Are you sure? Have you looked at the recent CPU benchmarks? More and more programs are taking advantage of multiple cores. All sorts of things that people actually do, like file compression, web browsing, media transcoding. Certainly the things I do benefit from multiple cores.

We know this because of the horrible failure "Netbook" computing has been.

Netbook computing was fine until Microsoft moved quickly to kill it. Then the manufacturers seemed bent on suicide after that, for inexplicable reasons. Oh, and Intel came up with bizarro licensing for the Atom, restricting manufacturers, yet they haven't (with few exceptions) switched to the faster and cheaper Bobcat CPUs, which lack such bizarre licensing restrictions.

Why can't I buy a machine at the low price point and low weight of the Eee 900? That machine sold many millions. Netbooks used to be sub-1kg in the beginning. Now the lightweight ones are 1.5kg. What happened?

Vendors are strange. Why did they drop all the high-res screens from laptops 10 years ago only to scramble to play catch-up after Apple decided to bring in high-res displays? Makes no sense.

That said, there's still a quite decent range of cheap netbook machines around, but they're just not as good as they were.

I agree with the games thing. There are many ways they could use more CPU/GPU and still be useful. For example, when have you seen wind in trees in a game that actually looks like wind in real trees? Trees in games are always some sort of leaf pattern on a plane with holes in it. Any games with a good deformable environment? How about reflections in water ripples? Bullets that are actually computed using wind and the movement of the player? Actual gravity and friction?
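The bullet one is cheap to do, too. A toy sketch (my own, with made-up constants, not anything from a real engine) of a per-bullet step with gravity, wind drag, and the shooter's own movement folded in:

```c
/* Toy per-bullet simulation: gravity, linear wind drag, shooter velocity.
   Semi-implicit Euler integration; all constants are illustrative. */
#include <stdio.h>

typedef struct { double x, y, vx, vy; } Body;

int main(void) {
    const double g = -9.81;      /* gravity, m/s^2 */
    const double k = 0.05;       /* crude linear drag coefficient, 1/s */
    const double wind_x = 3.0;   /* wind speed, m/s */
    const double dt = 0.01;      /* timestep, s */

    /* Muzzle velocity plus the player's own movement. */
    Body b = { 0.0, 1.5, 400.0 + 2.0, 10.0 };

    while (b.y > 0.0) {
        /* Drag acts against velocity relative to the air (the wind). */
        double ax = -k * (b.vx - wind_x);
        double ay = g - k * b.vy;
        b.vx += ax * dt;  b.vy += ay * dt;    /* update velocity first */
        b.x  += b.vx * dt; b.y += b.vy * dt;  /* then position */
    }
    printf("impact at x = %.1f m\n", b.x);
    return 0;
}
```

One bullet is nothing; the point is that games could afford this per projectile, per frame, and mostly don't bother.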

The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself.

While software has been hampered by web "technology" over the last decade, we are hardly at the pinnacle of software and computing... it's more like the Dark Ages, actually. Some stuff is being done elsewhere (GPUs, mobile), but we're still mired in fundamentally stagnant and backwards principles on the desktop (and server, really).

Games are just about as good as they are going to get without new display technologies.

Laughable. Let's assume anything video-related is "new display technology," and that we certainly have a long way to go to realtime radiosity and raytracing at extremely high resolution in a mobile device, then toss in 3D for good measure, so that's a given. But in terms of gameplay, all the computing and RAM you can get can be eaten up for a very long while. Simulation in games, today, isn't anything like what it could be. If I can't build a city at the SimCity level, zoom in and rampage through it at the GTA level, and walk up to each and every person on the street and learn their personal history and daily routines at an RPG level, then go into every structure and demolish it bit-by-bit with full soft-body dynamics, you've got quite a long way to go.

The desktop PC has been maxed out and has been resorting to multi-processor and multi-core designs as the means to keep growing, but meanwhile the primary OS for most people running these systems is still not taking full advantage of even those advances.

This is true to some extent, but "resorting to multi-processor and multi-core" means the desktop isn't maxed out. The primary OS (and software) may not be taking advantage of these things, but they are there and we're far from done yet.

Microsoft shows no remorse over their architectural choices and shows no signs of slimming down and getting lighter. So nothing points in Microsoft's direction... not even Microsoft. They are raising prices to make up for the lack of interest in what they are doing now.

Microsoft is irrelevant. They have been for a long time. They may not be going away anytime soon, but they've been irrelevant since Google used the web to effectively route technology around them (due to earlier attempted lock-in). Of course, this has resulted in aforementioned Dark Age of Software, but at least we're not stuck on one platform. We're at the point where Valve is looking to seriously move gaming away from Windows, and there are alternatives for everything else, so what happened before doesn't really apply to what can happen in the future.

Oh lord, not this again. First it was the netbook that was gonna kill "the big bad M$", then it was the tablet, then the phone, now you are gonna say embedded... really? Give it up, Sparky, nobody is giving up their desktops or laptops for some 1GHz ARM embedded in the TV, okay? Hell, one of the Apple fanbois tried giving up ALL x86 for a month, just one month, and using nothing but his iPad and his iPhone... what happened? He gave up after a week and a half because it was hobbling him too damned much.

The ONLY thing you got right is that PCs have gotten insanely powerful, but you know what? Computers have been insanely powerful for most of the decade, and that hasn't stopped people from buying them. What HAS stopped people from buying them is the fact we are in the midst of a global recession (I would argue depression, but whatever), so people simply aren't spending money they don't absolutely have to, and with their desktops sporting triples and quads, and their laptops sporting duals capable of 1080p? They really don't have to.

But the simple fact is, even with an economy in the shitter we are talking 300 MILLION plus computers being sold, and yes, nearly all of them running Windows. Why? Because that is where the software is. They don't want ersatz software, like Gimp for Photoshop or Tux Racer for DIRT; they want to use the billions of dollars worth of software they are sitting on, everything from Quickbooks to that God awful EasyShare your grandma loves so much, and NONE of that shit is gonna run on some embedded ARM chip.

And I hate to break the news to ya, but ARM is about to slam face first into the thermal wall, just as x86 did half a decade ago. This is why ARM Holdings has been talking about "dark silicon" in their last several press releases, and why Nvidia is now up to FIVE, count 'em, five cores in their Tegra chips. Only ARM hit the thermal wall with a frankly shitty IPC, so they are throwing more cores at it, but as we saw with AMD there is only so much you can make up for IPC by throwing more cores at it, because most software today still doesn't thread for shit.
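The standard way to quantify that last point is Amdahl's law: if only a fraction p of the work parallelizes, the speedup from n cores is capped no matter how many you throw at it:

```latex
S(n) = \frac{1}{(1-p) + p/n},
\qquad
\lim_{n\to\infty} S(n) = \frac{1}{1-p}
```

So code that is 80% parallel (p = 0.8) tops out at 5x whether you have 8 cores or 800, and weak per-core IPC just lowers the starting point.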

So x86 isn't going anywhere. Windows will be back up once Ballmer's fat ass is thrown out of the big chair and the abortion known as Win 8 is replaced by a much better Win 9 (Star Trek rule in play), and people will continue to buy hundreds of millions of x86 units every year, just at a slower pace, because grandma can't stress out that quad like she could that old P4. Does that mean ARM is gonna disappear? Nope, it means it's gonna have an insanely quick race to the bottom, and several corps will go broke selling Android units, because by this time next year you'll have 7 inch dual core tablets with Android 5 selling for $50 at the Big Lots, and just like with x86, people will find they can't tell the difference between a dual core and a quad, so they'll get the cheaper unit. Look at the financials of the companies selling Android: Samsung is barely making a profit, as is HTC, and the rest are bleeding money.

But to say embedded is gonna take out x86 is as stupid as saying mopeds are gonna take out the trucking industry. They are completely different units built for completely different tasks, and I have YET to see a single person, even one, replacing their x86 laptops and desktops with some cell phone chip. Sorry, ain't gonna happen.

Hell, one of the Apple fanbois tried giving up ALL x86 for a month, just one month, and using nothing but his iPad and his iPhone... what happened? He gave up after a week and a half because it was hobbling him too damned much.

Do you have a link to that story? Sounds like an interesting read but my efforts at searching for it have failed.

I'm sorry, but my Google Fu does sucketh. It was El Reg, ZDNet, or C-Net; I get emails from all 3, which is where I read about it. In the end it was losing the keyboard; he even broke his rule of using JUST the iPhone or iPad by using a BT keyboard, and even then he said it was crimping him too badly, so he went back to his Macbook.

Games are just about as good as they are going to get without new display technologies.

Hah, you're totally making that up. 3D simulation has barely scratched the surface of what is possible, and the AI of today is just a lame joke. How about audio synthesis that doesn't need vactors? How about fully interactive worlds? How about interacting with proper physics? How about hair that looks and acts like hair?

Trust me, the games you are playing today will look just as dated in ten years as the games you played ten years ago.

One problem is that xorg is using 75%... but only of one core. Xorg is a notorious offender: a program that you'd think would be very well multi-threaded but is actually single-threaded (Firefox being another, although that is gradually changing).

A couple of years ago there may have been a reason to run XP on a netbook, but not anymore. Almost no sites require IE, and lots of apps are moving to the cloud. Do yourself a favor and install a lightweight Linux distro (such as Debian with LXDE) and Google Chrome. You'll immediately notice the improvement in boot times, responsiveness, and browser speed. And you'll be a lot less susceptible to malware.

I would just install Windows 7. It runs as fast as XP and there is more software available. Also, internet video (Flash, HTML5) is garbage in Linux and cannot be played smoothly on a netbook.

AMD has huge advantages in the server market; I'm really surprised people are so stuck on Xeons.

You can't cram 64 Xeon cores into a 1U. Not to mention Intel is spotty on their hardware virtualization extensions.

Intel has the lead in power consumption, sure. But if you're looking into running anything Xen, KVM or VMware in production, the cost savings AMD brings to the table make them a competitive contender.

I'm in the market for a new workstation. I've been looking at an Opteron instead of the desktop models. The primary reason being 16 cores on one chip, at lower power consumption than the 8-core desktop model.

AMD's advantage is lots of CPUs for cheap. For some workloads, that is worth it. I am using some for a VDI deployment. RAM is the limiting factor on how many desktops I can host, not CPU and not disk, because I went all SSD.

RAM is the limiting factor on how many desktops I can host, not CPU and not disk

Surely in that case it's also worth going for AMD, since you also get excellent value in terms of DIMM sockets. If CPU is really not the limiting factor, you could get a 4-way 6212 (are those the cheapest?). The processors are about £200 a pop, and you get 32 DIMM sockets giving you up to 512G of RAM using 16G DIMMs. 16G DIMMs are now at the point where they are sometimes less per GB than 8G ones.

- A core on an AMD system has about the same performance as a thread on a Xeon, so Opteron and Xeon are equivalent there.
- The E3 - the entry level Xeon, the cheapest class of Xeon chips - has hardware virtualization. I don't think it's on every chip, but it's really not hard to pull up Intel's site and see if a specific chip supports it.

Until recently, I've been buying 100% AMD for 15 years... but AMD is so far behind that for the first time, I bought several Intel-based servers.

Not sure what advantages you think AMD has over Intel... I would love to see a list. Because frankly, it's sad to see AMD where it is.

That's easy. Core density per dollar in the same n rack units.
For the same number of dollars I can buy more real estate on which to run my virtualization stacks with AMD processors than Intel processors. And the savings extend beyond the hardware too. With more cores per socket, my VMware licensing costs (per core or per VM, however you want to break it out) can be much lower with AMD processors in those sockets. So cheaper CAPEX (hardware and license costs) and cheaper OPEX (support subscription costs) ma

It doesn't, because the price difference is still large even if you compare twice the cores on a Xeon to the number of cores on an Opteron. With more than two sockets the clock speed is the same on both.

Mostly, not having to worry about whether the HVM extensions are turned on in a particular motherboard / CPU combination. Because *all* of the AMDs (from the lowly Athlon64 X2 chips and up) have it turned on.

And 45W parts at the desktop are *very* nice in terms of noise. Even with the stock CPU fan, it makes for a very quiet desktop.

I've got a mix: AMD for massively parallel stuff and Xeons for stuff with a low number of threads. It's commercial software from a large company, so GPU stuff is going to have to wait until the hype has been around for a decade and they pay some teenagers in India to code it. The number one AMD advantage to me was 64 cores and 128GB on a SuperMicro board in a decent chassis for $9k. The equivalent Intel machine does have more cores but costs more than around five AMD machines.

Tell it to Titan (http://en.wikipedia.org/wiki/Titan_%28supercomputer%29). Also, when the price difference is more than 2x, the price comparison still wins if you compare threads to cores, so I see your nitpick as misleading. Is it deliberately so, or do you just know very little about the subject and made a mistake?

I'm in the market for a new workstation. I've been looking at an Opteron instead of the desktop models.

Be careful. Strange workstation vendors have decided that if you're blowing $5k on a huge 2-socket workstation, then naturally you don't mind if it sounds like it's being powered by a gas turbine located inside the case. Oh, and the exhaust of the gas turbine is then fed into a flugelhorn or vuvuzela just in case you are hard of hearing.

I'd probably go for a single socket, 16-core Opteron on a Supermicro or Tyan standard ATX board I can plop in my existing chassis. Supermicro will need a breakout cable for the front panel buttons, but no big deal. In this situation I can fit most sized heatsinks; I just need to be sure it will fit on the socket.

Do your 'workstations' have Intel Xeon CPUs or AMD CPUs? Otherwise, it is quite unlikely that they have ECC memory, which is pretty much required of real workstations.
Real workstations are reliable (and quiet). Just being quiet doesn't count.

The loud workstation I'm referring to was an AMD one. The quiet GPU monster was an Intel Core i7, so it didn't have ECC, but neither did the 3 graphics cards which were being used for computation, so it was a bit of a wash, really. And it actually needed proper graphics cards, since it relied heavily on the texture sampling unit for the computations.

The other nice thing about AMD is that you can get cheap machines with ECC: the standard single s

I think they do now. Or at least the computation cards do. I don't think they have them on the consumer level graphics cards. It turned out that the consumer graphics cards were better for our needs since they had more and faster texture sampling units. Since the workload was image processing, that was necessary to get good performance. The computation cards tend not to be so good if you're doing graphics or graphics-like workloads as they dedicate more resources to floating point and fewer to pixel specifi

I don't think they have them on the consumer level graphics cards. It turned out that the consumer graphics cards were better for our needs since they had more and faster texture sampling units.

I recently bought a Radeon HD 7850 2GB card that I'm pretty sure has ECC. Cost was $200. It's pretty cool - if you overclock the memory too high, the card doesn't start crashing your games or anything, it just doesn't run any faster.

Can you imagine the amount of heat 64 cores of Bulldozers in a 1U case would create?!

I've been looking at custom building a server at home, and in the 64GB-256GB RAM range with single or dual sockets and 8-32 threads, Intel wins in price, performance, and power consumption.

Choosing between an Intel Xeon i5 3.3GHz quad with HT and a 65W TDP, and an Opteron Bulldozer 4-module/8-core at 2.8GHz with a 125W TDP and lower IPC, for an almost identical price, isn't even a choice.
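Back-of-the-envelope, using just the numbers above (a crude core·GHz-per-watt figure of my own that ignores turbo, HT and workload mix; factoring in IPC only makes it worse for the Opteron):

```latex
\text{Opteron: } \frac{8 \times 2.8\ \text{GHz}}{125\ \text{W}} \approx 0.18\ \text{core·GHz/W},
\qquad
\text{Xeon: } \frac{4 \times 3.3\ \text{GHz}}{65\ \text{W}} \approx 0.20\ \text{core·GHz/W}
```

The Xeon wins on raw core·GHz/W before you even discount the Bulldozer cores for their lower IPC.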

Can you imagine the amount of heat 64 cores of Bulldozers in a 1U case would create?!

That's why you put the noisy little buggers in a server room and keep the door closed. As a relatively early adopter, my 48 core AMD is in 5U, but you can get the newer 4 socket 16 core ones in 1U. I've got a pile of twin 8 core systems in 1U from a few years ago that put out about the same amount of heat as a recent 64 core system would, and they are like hairdryers.

It entirely depends on what the code is doing whether it acts as a complete core or not. The part in question acts as two floating point units for normal length floating point variables and as one for double length floating point variables. In a lot of situations it may act as two floating point units for the entire lifetime of the server. The hype makes it look like a cripplingly stupid hack like the NetBurst stuff, but the reality is a fairly simple tradeoff that's not going to make any difference to what

What AMD calls two cores is actually a single core with two integer units with two ALUs each, and two FPUs. What Intel calls a single core has two ALUs, 2 FPUs, and 3 SSE units. Their 'dual-core' modules are very similar to a single hyper-threading core.

Although having hot-swap bays in an ESX(i) server is nice, one of the whole points of the VMware infrastructure is that you can down a physical machine for maintenance without affecting any services.

You'd still have some sort of redundant disk (RAID-1 at least) on the ESX server, and if a drive fails, you just migrate all the VMs to a different server, replace the failed drive, and let the RAID rebuild while nothing is running on that machine (which means the rebuild can be given high priority and complete more

Did you actually read the article? It says the 8-core AMD FX-8350 has similar performance to the 4-core Intel Core i5-3570K. In most of the tests, AMD actually performed worse.

So where's the disconnect? If words are too hard for you, just go look at the graphs.

Wake up. Two years ago Intel went on a media blitz dismissing parallelism and GPGPUs. Behind the scenes, they were spending billions catching up in an attempt to cut AMD off at the knees, because they know AMD has a huge lead with their new architecture. Software from all walks of life is moving to parallelism and OpenCL. The latest updates to the OpenCL 1.2 API are a huge leap, and AMD with Apple are leading the charge. Intel is busting itself, and come next year Xeon Phi will be crammed down everyone's throat via ma

Yes, but the Intel multi socket capable stuff with a lot of cores was also stuck on 2GHz a couple of months back, and I haven't heard of anything since. That means at the top end of highly parallel systems an Intel core is around the same capability as an AMD one, but a lot more expensive. The extra expense may be worth it if you want 80 cores in a box instead of 64, but that 80 core machine is close to ten times the cost of the AMD 64 core machine at the same clock speed per core. For some things there

If I read what you wrote as a sarcastic statement intended to be read as the opposite of what you wrote, you come off sounding a lot more intelligent. Intel does have a leg up at the moment, but everything after that is incorrect and misguided in what you said. Here's hoping competition stays alive, socketed CPUs stay around, and AMD has a long life ahead of them (one they earned, of course).

Okay, so they're the only x86 CPU offering 1.25V DDR3 support, but the difference between a pair of 1.25V and 1.5V DIMMs is around 4 W [tomshardware.com], and you can save 3 of those 4 W by moving to the commonly available 1.35V DDR3. Meanwhile AMD keeps putting out 125W processors like the FX-8350 that don't really compete with a 77W processor like the i7-3770K, so this "major datacenter advantage" I think I'll file under "major wishful thinking". Not to mention you're investing in a platform with little future, since AMD wants to push ARM servers now. But I guess Intel has let AMD put a positive spin on continuing to deliver on old sockets.
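Spelling out the arithmetic in that comparison, using only the numbers quoted above:

```latex
\underbrace{4\ \text{W} - 3\ \text{W}}_{\text{1.25 V vs common 1.35 V, per DIMM pair}} = 1\ \text{W},
\qquad
\underbrace{125\ \text{W} - 77\ \text{W}}_{\text{FX-8350 vs i7-3770K TDP}} = 48\ \text{W}
```

Which is the point: the extra low-voltage-DIMM saving is noise next to the claimed CPU TDP gap.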

The i7-3770K has a TDP of 95W. And the FX-8350 is a very good chip and much cheaper than the i7. The benchmarks relative to the i7 are all over the place. In most cases it sits somewhere between the i5 and i7. In some cases it is destroyed by the i7; in other cases, the reverse is true. The single threaded performance is quite weak and usually substantially less than the i5's, but then the i5 to i7 difference isn't enormous. The difference from the FX-8350 to the i7 seems to be around 20-50% in most cases.

Curiously the AMD processors tend to stack up better on the Linux benchmark suites.

Anyway.

This thread is about the Opteron processors, which are (a) still competing against SB, (b) benefiting from substantially cheaper full system costs, and (c) aimed at buyers who aren't terribly sensitive to single thread performance, since that's not why you buy a 4 socket server.

Not to mention you're investing in a platform with little future

What does that even mean? It's all x86, so even if AMD vanishes tomorrow you can keep using the servers and then transition to Intel when you need new ones. The whole point of having more than one vendor is that no matter what, you're not investing in a platform with no future.

I know that, at least in the past, Intel used to issue TDP numbers that represented "typical" heat, while AMD used to issue TDP numbers that represented worst-case heat (which is what TDP ought to be IMHO). I have read here on Slashdot that more recently, AMD has started playing those games as well.

But according to NordicHardware [nordichardware.com], in this case Intel is under-promising and over-delivering, and the chips really do dissipate only 77W despite being rated for 95W. (But how did they measure that? Is this a "typical" 77W? I guess it's not that hard to run a benchmark test that should hammer the chip and get a worst-case number that way.)

Curiously the AMD processors tend to stack up better on the Linux benchmark suites.

This is probably because Linux benchmarks were compiled with GCC or Clang rather than the Intel compiler. The Intel compiler deliberately generates code that makes the compiled code run poorly on non-Intel processors. The code checks the CPU ID, and the code has two major branches: the good path, which Intel chips get to run, and the poor path, which other chips run.
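The check-and-branch pattern being described looks roughly like this (a minimal sketch of my own, for GCC/Clang on x86; the path functions are hypothetical stand-ins, not Intel's actual runtime):

```c
/* Sketch of vendor-based dispatch: branch on the CPUID vendor string
   rather than on the feature flags the hardware actually reports. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

static int is_genuine_intel(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 0;
    /* The 12-byte vendor string comes back in EBX, EDX, ECX. */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    return strcmp(vendor, "GenuineIntel") == 0;
}

static void fast_path(void)     { puts("vendor-gated optimized path"); }
static void baseline_path(void) { puts("generic baseline path"); }

int main(void) {
    /* Dispatching on the vendor ID instead of on SSE/AVX feature bits
       is exactly the behavior being criticized: an AMD chip with the
       same features still lands on the slow path. */
    if (is_genuine_intel())
        fast_path();
    else
        baseline_path();
    return 0;
}
```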

The irony is that Intel, by investing heavily in fab technology, is about two generations ahead of everyone else, so they can make faster and/or lower-power parts than everyone else. This means they could be competing fairly and win.

But because Intel does evil things like making their compiler sabotage their competition, I refuse to buy Intel. They have lost my business. They don't care of course, because there aren't many like me who are paying attention and care enough to change their buying habits.

If you want the fastest possible desktop computer, pay the big bucks for a top-of-the-line i7 system. But if you merely want a very fast desktop computer that can play all the games, an AMD will do quite well, and will cost a bit less. So giving up Intel isn't a hard thing to do, really.

I know that, at least in the past, Intel used to issue TDP numbers that represented "typical" heat, while AMD used to issue TDP numbers that represented worst-case heat (which is what TDP ought to be IMHO). I have read here on Slashdot that more recently, AMD has started playing those games as well.

At least in the past, Intel used to issue TDP numbers that ignored the chipset, while AMD would compare CPU+chipset, because intel's chipsets would suck down the power like nobody's business and AMD's wouldn't. In the early amd64 days, companies actually built (17") laptops around the desktop Athlon64 because CPU+Chipset had lower TDP than intel's mobile offering. When you're competing against Mobile P4, that's not a surprise. But the same situation persisted all the way through the Core 2 Duo days! Intel's

The Opterons in the new series have a TDP of 65W or 95W for the 8-core models, at the expense of being clocked lower than an FX-8350, but the performance per watt is still better than the FX-8350's.

Looking at the 4 core models, the 3350HE may be a worthy replacement for the Athlon II X4 910e I have in my current desktop: four cores like the Athlon, 2.8 GHz clock speed where the Athlon had 2.6 GHz, and only 45W TDP versus the 65W of the Athlon. In terms of pricing, the 3350HE seems to be similar to where the

No it doesn't, but I guess if you don't have facts, use FUD. Intel has kept a "segment TDP" on the retail packaging because they want all Sandy/Ivy Bridge motherboards, coolers etc. to support 95W processors - the maximum in the Sandy Bridge line - but the actual processor will never use more than 77W. This was explained here [nordichardware.com], but Intel's site and 99.9% of all reviews and online sites will list it as a 77W processor. In fact the 95W figure is so rare that the only reason to bring it up - particularly ignoring all

Did you really write about desktop CPUs and then use them to try to debunk the phrase "major datacenter advantage"? You did? Why? Do you really think the people reading this thread are that stupid and why do you want to manipulate them?

And on the desktop side, ZDNet reports that AMD will remain committed to the hobbyist market with socketed CPUs.

On a related note, I find it quite weird that Intel willfully hands their own future over to motherboard makers. It makes little sense to me to depend on third parties you've absolutely no control over to fix the price of the final product that your current customers -- computer makers -- end up buying, irrespective of the fact that Intel itself makes motherboards. I must be missing something besides the obvious (aka it's thinner, which incidentally ensures that AMD has to do this too for laptops). Slashdo

On a related note, I find it quite weird that Intel willfully hands their own future over to motherboard makers. It makes little sense to me to depend on third parties you've absolutely no control over to fix the price of the final product that your current customers -- computer makers -- end up buying

Intel brings out a new CPU socket every week, so nobody upgrades their Intel CPU because they can't, the new CPU takes a new socket. AMD brings out new CPU sockets only rarely, so people do sometimes upgrade their CPU. Intel processors appeal to the PHB market, and AMD processors appeal to the nerd market. Intel is known for being expensive. Cost-cutting makes sense, and the socket is expensive.

I was hoping AMD could release a faster workstation level Opteron 4300 to match the FX-8350. The top end 4386 is still 3.1GHz (turbo to 3.8GHz), while the fastest 8 core Opteron 6328 is 3.2GHz and goes to 3.8GHz turbo (though the Opteron 4386's TDP is 95W vs the Opteron 6328's 115W), and the FX-8350 is 4GHz, turbo to 4.2GHz, consuming 125W.