I can tell you why without reading a damned article split into four pages: because they're still used in all sorts of equipment, from parking meters to vending machines. The economies of scale saturated a long time ago thanks to PC demand alone, it's a very familiar technology, and it provides all the computing power needed for over 99% of products with any sort of "smart" functionality.

dragonchild:I can tell you why without reading a damned article split into four pages: because they're still used in all sorts of equipment, from parking meters to vending machines. The economies of scale saturated a long time ago thanks to PC demand alone, it's a very familiar technology, and it provides all the computing power needed for over 99% of products with any sort of "smart" functionality.

Uh Huh. Spoken with true authority, though.

I once used an x86 in a toaster design. Didn't even need a heating element!

1) Intel's attempts at making alternate microarchitectures sucked: terrible performance, and Intel couldn't make good compilers for them. 2) Nobody wants to ditch the architecture that runs all of their old programs.

Sounds more like they created experimental designs that commonly had issues because they were experimental. They worked out the issues and merged the new features into the stable design. Rinse and repeat.

I put all the blame on Microsoft. They and the app developers own the x86 architecture. While I can run Linux (and its apps) on anything from my phone to my router to my desktop, Windows only runs in a few places. Until Microsoft comes up with a migration path to a new architecture, Intel is going to be stuck with x86 for the long haul.

Oh, and cost. x86 is a commodity - ARM is pretty close for bang:buck ratio, but Windows RT is pretty much a flop. Why? No apps. There's no incentive to go to another architecture if the cost isn't going to go down in the near term.

Apple was able to get away with it by doing what they usually do: tell people to f*ck off and buy new hardware if they don't like it. They did it in the move to PPC and in the move to x86.

Over time, I think Apple is going to have the same architecture for iOS and OSX. They're already starting this by abstracting the applications using the app store. Microsoft should go that direction, but the landscape for their apps is way bigger.

enry:I put all the blame on Microsoft. They and the app developers own the x86 architecture. While I can run Linux (and its apps) on anything from my phone to my router to my desktop, Windows only runs in a few places. Until Microsoft comes up with a migration path to a new architecture, Intel is going to be stuck with x86 for the long haul.

Most properly designed x86 programs could be ported to a different architecture just by recompiling them. For example: a huge portion of Linux's codebase is generic code, with architecture-specific versions for a relatively small amount of architecture-sensitive code.

You might lose some optimization here and there if the program is heavily targeted toward x86 code, but it wouldn't be crazy.

Fubini:enry: I put all the blame on Microsoft. They and the app developers own the x86 architecture. While I can run Linux (and its apps) on anything from my phone to my router to my desktop, Windows only runs in a few places. Until Microsoft comes up with a migration path to a new architecture, Intel is going to be stuck with x86 for the long haul.

Most properly designed x86 programs could be ported to a different architecture just by recompiling them. For example: a huge portion of Linux's codebase is generic code, with architecture-specific versions for a relatively small amount of architecture-sensitive code.

You might lose some optimization here and there if the program is heavily targeted toward x86 code, but it wouldn't be crazy.

Distro vendors can recompile the code, and there is some commercial code that is stuck on x86. As for Microsoft, they'd have to do a good bit of development to get their code working on another architecture, and even then they'd have to convince their user base to use it (see Windows NT for Alpha and Windows RT).

Yes, but compatibility-wise it's no different than a different architecture that happens to also run old x86 code. There have been several other attempts at architecture changes that included x86 compatibility (but they ran slowly and poorly); this was the first to succeed in the market because it ran old x86 well.

enry:Fubini: enry: I put all the blame on Microsoft. They and the app developers own the x86 architecture. While I can run Linux (and its apps) on anything from my phone to my router to my desktop, Windows only runs in a few places. Until Microsoft comes up with a migration path to a new architecture, Intel is going to be stuck with x86 for the long haul.

Most properly designed x86 programs could be ported to a different architecture just by recompiling them. For example: a huge portion of Linux's codebase is generic code, with architecture-specific versions for a relatively small amount of architecture-sensitive code.

You might lose some optimization here and there if the program is heavily targeted toward x86 code, but it wouldn't be crazy.

Distro vendors can recompile the code, and there is some commercial code that is stuck on x86. As for Microsoft, they'd have to do a good bit of development to get their code working on another architecture, and even then they'd have to convince their user base to use it (see Windows NT for Alpha and Windows RT).

But all it would take for widespread adoption of 64-bit and the replacement of 32-bit... is everyday software having fully 64-bit builds. Browsers and their plugins and extensions, Microsoft Office, etc. etc.

That said, on the server end, where there's a crapton of memory in use... there's a reason SQL Server has a 64-bit version, and had one pretty early in the 64-bit cycle.

90+% of users wouldn't really see the native benefits (such as expanded memory allotment) of 64-bit. They simply don't do anything that requires it. Those areas where it does have a huge benefit pretty much have 64-bit releases already.

Quantumbunny:such as expanded memory allotment) of 64-bit. They simply don't do anything that requires it. Those areas where it does have a huge benefit pretty much have 64-bit releases already.

To be fair, being able to stick an unholy farkton of RAM on your windows box rather than the 32-bit maximum of a mere mortal farkton of RAM pretty much blasts most of your windows resource usage issues and memory leak issues right out the goddamned window.

Which is nice, even if all you're doing with your windows box is checking your e-mail twice a day. Gives you de facto infinite uptime barring power outages and major updates.

Quantumbunny:But all it would take for widespread adoption of 64-bit and the replacement of 32-bit... is everyday software having fully 64-bit builds. Browsers and their plugins and extensions, Microsoft Office, etc. etc.

That said, on the server end, where there's a crapton of memory in use... there's a reason SQL Server has a 64-bit version, and had one pretty early in the 64-bit cycle.

90+% of users wouldn't really see the native benefits (such as expanded memory allotment) of 64-bit. They simply don't do anything that requires it. Those areas where it does have a huge benefit pretty much have 64-bit releases already.

I'm using x86 to include amd64/x86_64, so to me this really isn't a 64/32-bit issue; it's more about the underlying instruction set.

Jim_Callahan:Quantumbunny:such as expanded memory allotment) of 64-bit. They simply don't do anything that requires it. Those areas where it does have a huge benefit pretty much have 64-bit releases already.

To be fair, being able to stick an unholy farkton of RAM on your windows box rather than the 32-bit maximum of a mere mortal farkton of RAM pretty much blasts most of your windows resource usage issues and memory leak issues right out the goddamned window.

Which is nice, even if all you're doing with your windows box is checking your e-mail twice a day. Gives you de facto infinite uptime barring power outages and major updates.

Sure, and your OS is exactly one of the areas that has been 64-bit since shortly after day 1 of 64-bit chips.

But each application/process doesn't need 4 GB, let alone more, for Joe Schmoe.

A handful of games and super-intensive photo or video editing programs need it; not much else.

enry:I put all the blame on Microsoft. They and the app developers own the x86 architecture. While I can run Linux (and its apps) on anything from my phone to my router to my desktop, Windows only runs in a few places. Until Microsoft comes up with a migration path to a new architecture, Intel is going to be stuck with x86 for the long haul.

Oh, and cost. x86 is a commodity - ARM is pretty close for bang:buck ratio, but Windows RT is pretty much a flop. Why? No apps. There's no incentive to go to another architecture if the cost isn't going to go down in the near term.

Apple was able to get away with it by doing what they usually do: tell people to f*ck off and buy new hardware if they don't like it. They did it in the move to PPC and in the move to x86.

Over time, I think Apple is going to have the same architecture for iOS and OSX. They're already starting this by abstracting the applications using the app store. Microsoft should go that direction, but the landscape for their apps is way bigger.

Actually, Apple did provide an upgrade path of sorts in the Intel transition. For a few years after they launched Intel Macs, they shipped both Intel and PPC versions of all of their software (so-called "Universal binaries"), and the Intel Macs came with an emulator ("Rosetta") that would run older PPC apps on an Intel Mac (albeit slowly). They also released developer tools for the Intel Macs at the same time as the Intel Macs themselves.

Admittedly, though, it wasn't that good of an upgrade path, since Apple completely dropped support for PPC hardware just two OS versions later, and dropped all support for legacy PPC apps one release after that. But that's still an improvement from how Microsoft is dealing with Windows RT.

I think they did something similar when they went from m68k to PPC, but I don't remember the specifics.

Yes, but compatibility-wise it's no different than a different architecture that happens to also run old x86 code. There have been several other attempts at architecture changes that included x86 compatibility (but they ran slowly and poorly); this was the first to succeed in the market because it ran old x86 well.

Let me rephrase that: x86-64 is a 64-bit extension of x86. It's still an x86 architecture, but with extra 64-bit registers and functionality. It's a misnomer to say that x86-64 replaced x86, because x86 is still an intrinsic part of x86-64. What I said was supposed to be a joke.

I guess this requires a bit of explanation. The reason that Intel and other chip makers want to get away from x86 is that it's a pretty badly designed instruction set (for various reasons), and in particular it contains a huge number of very, very legacy instructions that are difficult and expensive to support, yet are used very little by modern software. These instructions are a throwback to the days when everyone had to write purely in assembly, so having a single CPU instruction to do something like manage a stack was a godsend. No one does that these days: stacks are managed through primitive operations (arithmetic and dereferencing) because programs are written in high-level languages. But the instructions live on, both because they're part of the standard and because a very small but vocal minority would cry bloody murder if they couldn't run their 16-bit protected mode applications anymore.

There are two approaches to supporting legacy x86 instructions: natively or through emulation, and essentially it's the difference between expensive-and-fast and cheap-and-slow. If you're going to call yourself an x86 or x86-64 instruction set, you need to support the entire standard more or less natively, or else your processor comes crashing to a halt every time it encounters a legacy instruction. They've been trying to work around this for years (a decade?) through micro-op and macro-op decomposition, but this still adds huge overhead in terms of the chip structures needed to support it. Hence, modern x86 and x86-64 architectures try to mitigate the bad things about x86, but they don't succeed completely.

Everyone wishes x86 would just die completely. It was great in its time, but we've learned a lot about processor design since then.

Quantumbunny:But all it would take for widespread adoption of 64bit and the replacement of 32bit... is daily things having fully 64bit builds. Browsers and their plugins and extensions, Microsoft Office, etc etc.

The big thing standing in the way of full 64-bit adoption is legacy hardware and legacy drivers. A lot of old hardware doesn't have 64-bit drivers, and no one is willing (or able) to bring them into the modern age. The hardware manufacturers feel that they have to support 32-bit drivers for a long time to come, so there's little to no rush to deploy robust 64-bit drivers. This is worsened by the fact that modern processors will continue to support 32-bit legacy mode for some time, which reinforces the lackadaisical attitude toward really supporting 64-bit fully.

I just had a 64-bit driver problem not too long ago, and we've had this technology for years.


Yes, but x64 solves the migration problem. In 20 years (smirk), when nothing is running x86-32, processors can finally ditch the backwards support.

MindStalker:Yes, but x64 solves the migration problem. In 20 years (smirk), when nothing is running x86-32, processors can finally ditch the backwards support.

I doubt that's a viable strategy. Also, there is no x86-32; it's just x86. The term x86-64 is a generic name for AMD64, which was followed by Intel64. I guess, technically, you could say x86-64 is the subset of AMD64 and Intel64 that is compatible with both.

enry:Over time, I think Apple is going to have the same architecture for iOS and OSX. They're already starting this by abstracting the applications using the app store. Microsoft should go that direction, but the landscape for their apps is way bigger.

I think you're very much correct. iOS needs some major additions to get to this point (file system, anybody?), but as the power and utility of those devices increase, I can't imagine that Apple won't add to the convergence between the two platforms. There are things they could be doing now to make things better, but at some point there's no reason not to be able to run the same apps on your full-size computer that you run on your phone and/or tablet.

akula:enry: Over time, I think Apple is going to have the same architecture for iOS and OSX. They're already starting this by abstracting the applications using the app store. Microsoft should go that direction, but the landscape for their apps is way bigger.

I think you're very much correct. iOS needs some major additions to get to this point (file system, anybody?), but as the power and utility of those devices increase, I can't imagine that Apple won't add to the convergence between the two platforms. There are things they could be doing now to make things better, but at some point there's no reason not to be able to run the same apps on your full-size computer that you run on your phone and/or tablet.

You're forgetting that if you can run the same app on your phone/tablet that you run on your desktop, then they only get to sell it to you once.

Arumat:akula: enry: Over time, I think Apple is going to have the same architecture for iOS and OSX. They're already starting this by abstracting the applications using the app store. Microsoft should go that direction, but the landscape for their apps is way bigger.

I think you're very much correct. iOS needs some major additions to get to this point (file system, anybody?), but as the power and utility of those devices increase, I can't imagine that Apple won't add to the convergence between the two platforms. There are things they could be doing now to make things better, but at some point there's no reason not to be able to run the same apps on your full-size computer that you run on your phone and/or tablet.

You're forgetting that if you can run the same app on your phone/tablet that you run on your desktop, then they only get to sell it to you once.

/never joining the iCult

Desktop version, tablet version. You already see it on some Android apps at least (tablet/phone)

Yes, but compatibility-wise it's no different than a different architecture that happens to also run old x86 code. There have been several other attempts at architecture changes that included x86 compatibility (but they ran slowly and poorly); this was the first to succeed in the market because it ran old x86 well.

It is very different from other architectures that happen to run x86 code; those essentially have two processor front-ends, one for the new architecture and one for x86 (often a very slow firmware translation layer for x86). The instruction format for x86-64 is very much still x86-style: variable-length instructions that range anywhere from a simple ADD to complex MAC instructions with memory, register, and gigantic immediate operands. This is what makes x86 "ancient" in comparison to more modern architectures. x86-64 cleaned up the instruction format a bit, but it retains much of the same complexity that makes it a nightmare to decode.

Marine1:Personally... I like x86. Sure, it's old as fark, but when something is programmed to run on x86... it runs on x86 processors that meet the minimum requirements. I can't say that about ARM right now.

akula:enry: Over time, I think Apple is going to have the same architecture for iOS and OSX. They're already starting this by abstracting the applications using the app store. Microsoft should go that direction, but the landscape for their apps is way bigger.

I think you're very much correct. iOS needs some major additions to get to this point (file system, anybody?), but as the power and utility of those devices increase, I can't imagine that Apple won't add to the convergence between the two platforms. There are things they could be doing now to make things better, but at some point there's no reason not to be able to run the same apps on your full-size computer that you run on your phone and/or tablet.

For all practical purposes, iOS is just Mac OS X built for ARM, with the addition of a touchscreen UI and a highly constrained sandbox for third-party applications. It's almost certain that there is a real UNIX file system on iOS; it's just not accessible to the end user or to third-party apps.

What I think is more likely to happen is that Apple will come out with an ARM-based Mac laptop with insanely long battery life, but it will have an ARM version of Mac OS X rather than iOS, and it will retain the look and feel that's typically associated with Mac OS X. I don't foresee Apple rolling out a fully unified OS a la Windows 8 unless they can work out a way to make it not suck.

RangerTaylor:God, I remember the brain-melting speed when we went from the 386 to the 486DX. It was farking unbelievable. Star Control 2 went a HELL of a lot better on the 486DX.

This. My last year of high school, the school division bought our school a 486 DX4/100. Gaming on that was a thing of beauty. You'd finish a level of a game in the time it took people on the slower computers to load the level.