Well, I was thinking more of a new socket (AM4?) where the newer features would live, but "Bulldozer" would still fit in AM3 sockets to retain "some" backwards compatibility. Or it might go the way of LGA1156/LGA1155 and be new-platform time.

Here's to hoping AMD will put up a fight this round. According to the information we have so far, it should be an interesting race in 2011/2012. Here is a glimpse of what the high-end market has in store from both companies.

Overall the AMD architecture looks like it may outperform Intel due to the higher native memory speed, and also because of price. The Sandy Bridge 8-core will be a 3.2GHz EE only and carry a price tag of $1000, whereas Bulldozer will carry a price tag somewhere in the $300-500 range. Also, there is the fact that AM3+ will be backwards compatible over the course of a few CPU generations, whereas socket 2011 will last 2-3 years, like the current LGA1366 platform.

Looks like my next build may be an AMD build. But for the love of god, please score a 7.9 on WEI. I need those kinds of bragging rights.

Price is determined by competitiveness. If Bulldozer carries a $300-500 price compared to Sandy Bridge's $1000, that means Sandy Bridge is still considerably faster than Bulldozer.

Memory speed also isn't very telling in terms of overall performance. The Core i7 920, for example, uses DDR3-1066 while AM3 processors use DDR3-1333. The 920 has tri-channel memory instead of dual, though, so it slaughters the AM3 chips in terms of sheer bandwidth. Even if you handicap the 920's memory, it's still a faster chip than Phenom II clock for clock.
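For what it's worth, the raw bandwidth gap is easy to sanity-check: each 64-bit DDR3 channel moves 8 bytes per transfer, so theoretical peak bandwidth is just transfer rate × 8 × channel count. A quick sketch (peak figures only; real sustained bandwidth is lower):

```python
# Theoretical peak memory bandwidth: transfer rate (MT/s)
# x 8 bytes per transfer (64-bit channel) x number of channels.
def peak_bandwidth_gb_s(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000.0  # decimal GB/s

i7_920  = peak_bandwidth_gb_s(1066, 3)  # tri-channel DDR3-1066
phenom2 = peak_bandwidth_gb_s(1333, 2)  # dual-channel DDR3-1333

print(f"i7 920:    {i7_920:.1f} GB/s")   # ~25.6 GB/s
print(f"Phenom II: {phenom2:.1f} GB/s")  # ~21.3 GB/s
```

So even at the lower DDR3-1066 rating, the third channel puts the 920 ahead on paper.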

AMD is throwing some new technologies out there and we just don't know how well they'll perform until they get benchmarked by 3rd party sources.

I remember reading something recently where one scientist said that the speed of light was just too damn slow. Amazing.


Then that scientist was a moron, which is not uncommon among scientists. It's a downside of being a specialist in a mentally rigid system. Like how specialist doctors always look at your symptoms in terms of their specialty, often giving you a wrong diagnosis; you need a generalist to get proper direction for treatment... but I digress. The speed of light isn't the problem, the architecture and materials used are. Switching from silicon could net a 10x increase in speed. Switching to an optical/laser architecture could bring a 10x-100x improvement. Ballistic deflection transistors could bring a radical change even sticking with silicon, maybe 1000x. The real problem is that the current architecture is just so shitty. Flipping switches on and off is so wasteful. It's hard for me to get excited about the 10-15% increase we get from a new CPU architecture because I know it's just so half-assed. A company like Intel could fast-track development of one of these techs at the expense of the tick-tock development pattern while still maintaining a performance lead with existing i7 products. Once they came out with one of those new archs, they'd dominate the market and easily become the most valued company in the world. They'd probably get in trouble for a monopoly, but it wouldn't be their fault the competition fell 20 years behind.
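To be fair to the scientist, the light-speed remark isn't crazy at the physical level. A back-of-envelope calculation shows how little distance even light in a vacuum covers in one clock cycle (signals in actual wires propagate slower still), which is one reason clock speeds stalled and chips went multi-core:

```python
# How far light travels in a single clock cycle at various frequencies.
C = 299_792_458  # speed of light in vacuum, m/s

def cm_per_cycle(freq_ghz):
    """Distance light covers in one clock period, in centimeters."""
    return C / (freq_ghz * 1e9) * 100

for ghz in (3.0, 10.0, 100.0):
    print(f"{ghz:5.1f} GHz -> {cm_per_cycle(ghz):.2f} cm per cycle")
```

At 3 GHz a light-speed signal covers roughly 10 cm per cycle; at 100 GHz it would be about 3 mm, i.e. less than the width of a die, so signal propagation really does become a hard constraint at high clocks.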


I wish I could remember the context. I just thought it was a funny comment. Not sure why you think it was moronic, given that you have no idea what the context was.

Anyway, that's not really relevant. However, some links for the stats you posted would be interesting.

Not that there's any guarantee it will continue, but Moore's law has met and conquered many obstacles that previously appeared insurmountable. The ongoing march of technological progress is not a steady process of gradually building up to higher and higher levels of power and sophistication, but a capricious beast that has periods of rest as well as unforeseeable breakthroughs.

Who knows ... optical and/or quantum computing may swoop in to save the day, or it may be something completely out of the blue that nobody has thought of. Moore's Law has been called a self-fulfilling prophecy because the industry strives to keep pace with it as a benchmark. They will not fail it without a fight.

^^ In a perfect world, price might be determined by competitiveness, but we are not so lucky as to live in a world so simple. First, pricing is not linear: generally, after a certain point, as you go up in performance, performance per dollar drops. And the question must be asked, "performance in what?" Different architectures and designs are advantageous for certain types of computing but not for others. Then there's brand loyalty, which may allow one company to generate better sales at a higher price point than a competing product simply because buyers in that segment are typically loyal to that brand. Finally, supply vs demand in a given segment and product differentiation play a big role in pricing.
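The non-linearity point is easy to illustrate with made-up numbers (purely hypothetical chips, not real benchmarks): performance per dollar usually collapses at the top of the product stack:

```python
# Hypothetical price/performance ladder (invented numbers, for
# illustration only): (tier, relative performance, street price in $).
lineup = [
    ("budget",     100,  150),
    ("mainstream", 140,  300),
    ("enthusiast", 160,  600),
    ("extreme",    175, 1000),
]

for tier, perf, price in lineup:
    print(f"{tier:>10}: {perf / price:.3f} perf per dollar")
```

Each step up the ladder buys a smaller performance gain for a much larger price, so perf-per-dollar falls monotonically, which is exactly why halo parts like a $1000 EE can't be judged on value alone.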

Just a general observation: this forum is generally anti-Intel (and, let me add, anti-Nvidia). I don't own any Intel hardware yet, so this is unbiased. If AMD starts outperforming Intel processors, I expect their prices to rise accordingly. I think that's what makes AMD look good: you pay for the performance you know you're going to get with Intel, but AMD gives you decent performance for much less. It usually shows with Intel in gaming or benchmarking. How does AMD sell their processors cheaper? Same product, but a massive difference in prices. I'm being serious. Are they using lower quality/cheaper parts?


Designing a CPU takes years and costs millions, whereas once actual production starts, each chip costs very little to make; the profit from selling them, even cheaply, covers the cost of design. Since Intel has the performance crown, they can sell a chip that only cost a few dollars to make for $1000 because it's the fastest. If AMD did the same, no one would buy it, so they sell theirs for much less, because at that price more people will buy them.
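That amortization argument can be sketched with toy numbers (all invented, just to show the shape of it): a big fixed design cost spread over volume, plus a small marginal cost per chip, means the break-even price falls fast as volume rises:

```python
# Toy model (invented numbers): break-even price per chip equals the
# amortized design cost per unit plus the marginal manufacturing cost.
def break_even_price(design_cost, units_sold, marginal_cost):
    return design_cost / units_sold + marginal_cost

DESIGN_COST = 500e6   # hypothetical $500M to design the chip
MARGINAL    = 40      # hypothetical $40 to fab/test/package each one

for units in (1_000_000, 10_000_000, 50_000_000):
    p = break_even_price(DESIGN_COST, units, MARGINAL)
    print(f"{units:>11,} units -> break even at ${p:,.0f} per chip")
```

Under those assumptions, the break-even price drops from $540 at one million units to $50 at fifty million, which is why a volume seller can price far below a halo part and still come out ahead.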

I'm neutral on both, but if that's true, that would help sway someone's mind. Well, I know for a fact that AMDs handle voltage a lot better than Intels, but Intels overclock high on the voltage they do take. I can't translate this into quality.