What were the benefits of Itanium over the x86-64 processors? I seem to have missed this line completely.

For bespoke applications designed for the architecture, performance could be significantly higher than on equivalent x86 hardware. Additionally, it's a much simpler architecture, with very little bloat - unlike x86. The biggest apparent difference, though, is that the onus was on the programmer (and the compiler) to produce a really efficient, fast program, rather than on the CPU as with x86.

Both approaches have merit; I would imagine military and scientific fields prefer something like Itanium - it's a very predictable CPU. But for general use, x86 is vastly superior, since you can get apps to perform at an acceptable level very, very quickly.

Itanium was never meant to move into the mainstream, though - but unfortunately for Intel, x86-64 (Opteron) rapidly expanded into the workstation/server market, and Itanium wasn't competitive enough at the top end. Cue Conroe-based Xeons and the decline of Itanium.

The Itanium took Intel's "because we can" attitude to a new level.*
Thank you, AMD, for pursuing the alternative x86-64 route.

*As in: why are we shipping hardware that is not backwards compatible and horrendously expensive?

At some point we will have to abandon the x86 architecture. New computing paradigms will arrive, and it simply won't be enough. Clinging to the old simply because it's comfortable only holds back progress. Why do we still have floppy/IDE ports on motherboards? Why do we still use PCI slots? They are certainly not better than any modern solution, but we keep them around, taking up space and increasing costs, just because a minority of people have old hardware they want to keep using. In some applications (normally where the hardware is specialised and expensive) this is fine, but why hold back the bleeding edge for them?

With increasing parallelism in processing, the x86 architecture will eventually run out of steam and be replaced. Itanium was better than x86 for many applications, and it is important to do things "just because we can" - that's how a lot of progress gets made. We went to the Moon "because we can"; we built the LHC "because we can". Trying new ideas is not a bad thing. Itanium may have failed, and it had its own problems of course, but it was just one product, and it's good that Intel made it, if only as a showcase product.

@Cobalt: But won't that death (at the higher end of the market) likely coincide with the inability to realistically push conventional chipmaking any further? Whatever succeeds the silicon chip will, at least initially, be pretty well out of the mainstream, so designing a new architecture around it seems a logical step. Meanwhile, we may as well stay with x86 (although Nvidia, at least, wants us, for the most part, not to).

What we need is a simpler processor instruction set. With GPUs' RISC-like instructions and ARM's success, I can really see that 50 years from now Intel is out of the market and ARM is the biggest player, pushing out a new architecture every 2 years.

Itanium is/was a purely 64-bit architecture, and most CPUs today are x86-64? Which means that they (x86-64) can run both 32-bit and 64-bit programs, but Itanium can only run 64-bit? I'm sure there's more differences than that, but most people are already switching to 64-bit OSes, so wouldn't it make more sense to eventually shift processors and programs completely to 64-bit?

Well, to keep compatibility, all current 64-bit operating systems (Windows, Linux and Mac) have a backwards-compatible mode; on Windows it's called WOW64. So normal Windows (as in 7, XP) only supports x86-64.

Itanium (IA-64) isn't really a version of x86-64 at all - it's a completely different design (EPIC/VLIW) that cuts out the x86 part entirely. So it has no backwards compatibility, and its assembly instructions are incredibly difficult to write.


Floppy disk ports are still on motherboards because, up until Vista/Server 2008, you needed a floppy disk to load storage drivers during Windows install (XP/2000/Server 2003).


Don't get me wrong - by "because we can" I meant that the server people BEGGED for backwards compatibility.
Most have specially written software that couldn't be changed quickly (or cost-effectively).
Intel didn't offer it to them (nor faster server processors that were 32-bit).
When Intel says the architecture changes, then it changes, and everyone has to comply.

Only this time it didn't work out because of Opteron.

Somebody has to set the industry standards, and Intel usually has the might to do so.
But sometimes they change things "because they can", never mind whether the market wants it or not. (Itanium, the P4 and Rambus come to mind, but AMD saved the day.)