96 Comments

WinRAR's built-in "benchmark & hardware test" (in KB/s) should also be included here, since it can be treated as a real-life memory subsystem benchmark (and NOT as a data compression benchmark for the CPU, for example).

Didn't realize there was no hardware XOR. Thanks for clearing that up elecrzy.

xsilver, RAID 5 is a big deal and is a long way ahead of RAIDs 0 and 1. Most motherboards offer RAIDs 0 and 1, but only high end expensive ones offer onboard RAID 5. Without it, you need a SCSI or SATA RAID card which will run you a couple hundred bucks. To have that on a desktop board is a major deal. But again, since it's done by the CPU without XOR hardware, it's not that big a deal I guess.

#89: The chipset doesn't offer its own XOR processor for RAID 5, so it has to rely on the CPU to do the calcs. This basically means you lose a lot of performance (high CPU usage) compared to hardware RAID cards.
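For reference, the math the missing XOR engine would offload is simple to state: RAID 5 parity is just a byte-wise XOR across the data blocks of a stripe, and any one lost block can be rebuilt by XOR-ing the survivors. A minimal illustrative sketch (nothing like a real driver, just the arithmetic):

```python
# RAID 5 parity is the byte-wise XOR of the data blocks in a stripe.
# Without a hardware XOR engine, the CPU does this on every write.

def xor_blocks(blocks):
    """Byte-wise XOR of equal-length byte strings."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return bytes(parity)

# Tiny three-disk stripe (illustrative sizes only).
d0, d1, d2 = b"\x0f\xf0", b"\x33\x33", b"\x55\xaa"
parity = xor_blocks([d0, d1, d2])

# Lose any one block: XOR of the survivors rebuilds it.
assert xor_blocks([d0, d2, parity]) == d1
```

This is also why software RAID 5 write performance tracks CPU load: every full-stripe write burns CPU cycles on exactly this loop.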

RAID 5 and 10 is indeed a big deal for a built-in chipset. It is a little outside the scope of a desktop, but cool nonetheless. I would also have to give a win to NVIDIA for providing GbE on the chip. I guess Intel would rather people use their separate GbE chip.

Intel has a nice chipset, as usual. NVIDIA, as usual, is clueless about audio desires; better audio would add an insignificant price to the chipset at great gain to most consumers. I don't really see the NVIDIA recommendation at all unless you NEED SLI. Intel has more features, way better audio, the NCQ differences are essentially nil, and it's cheaper.

#88 -- I think it is old news... I think the older 9xx chipsets offered RAID 0/1 for free, so offering RAID 5 on the newer chip may not be so crash hot??

And Questar, talking to you is a bit like talking to a brick wall...
A lot of us here already explained that we are arguing about performance, NOT volume. What you describe as "qualifications" is due to the sheer volume Intel ships... most people are aware that AMD only has 15% of the market.
If IBM, HP, and Dell don't want to "qualify" AMD systems, it's their loss, not ours.
But no matter how you argue it, AMD has the performance advantage everywhere right now: high end, middle, and low end. The only performance advantage Intel has right now is the Pentium M in laptops.

Am I the only one that noticed that the Intel chipset supports onboard RAID 5?!?! That's amazing! No need to buy expensive RAID cards anymore. I'm surprised they didn't pay any attention to that in the article.

My case meets the standards for running Prescotts. My 3.2 ES and my 2.8@3.5GHz ran perfectly fine in the same case (the 3.2 ES also on the same motherboard, with all the same components), and neither my 2.8 nor my 3.2 ES had the heat issues of my 3.4GHz chip. Not all of them run too hot, but some seem to do so no matter what cooling you throw at them. The 3.4 is still running warmer with water cooling than my 3.2 ES (which I got from the chip loaner program) is with air.

I think Questar is cramitpal's nemesis...
Questar, I work for Intel; even Intel knows that the Prescott has heat issues, and that for most applications right now AMD is winning in price/performance. Having to use water cooling to get a chip to run at stock speed without throttling is a pretty major heat issue if you ask me. Even a Thermalright XP-120 couldn't keep my 3.4 Prescott from throttling at stock speeds. My 3.2 ES and 2.8 didn't have as much of an issue, but some of the Prescotts are running WAY too hot.

Well... it would be pretty stupid of a company not to roll with the punches. I'd rather see them shape their direction as they have than be rigid (which would mean they couldn't compete anymore).

The reason no one jumped at 64bits in x86 land is because the volume of 64bit processors out there wasn't worth the effort. So... 100,000 64bit systems are out there... compare that with over 1000X that number of 32bit only systems. Which would you target if you were developing software? Now that 64bit will be mainstream, the market will move that way.

In addition, mom/pop won't see any real benefit from 64bit. There will be marginal speed benefits to having more general purpose registers, but that would have been the case in 32bit land. I doubt they are doing things that require more than 4G of memory, either. The only other real speed benefits are when you are manipulating 64bit integers and there just aren't that many apps that mom/pop use that need that kind of range.
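To put rough numbers on the point above (a back-of-the-envelope sketch using round figures of my own, not numbers from the thread), the two things 64-bit actually widens are the flat address space and the integer range that fits in one register:

```python
# 32-bit vs 64-bit ceilings: the flat address space and the
# unsigned integer range that fits in a single register.
GiB = 2**30

assert 2**32 == 4 * GiB              # 32-bit ceiling: 4 GiB of memory
assert 2**64 == 2**34 * GiB          # 64-bit ceiling: 16 EiB

# Max unsigned value representable in one register:
assert 2**32 - 1 == 4_294_967_295
assert 2**64 - 1 == 18_446_744_073_709_551_615
```

Typical home workloads sit comfortably inside both 32-bit limits, which is the "mom/pop won't notice" argument in a nutshell.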

I like AMD as well (all 5 of my personal desktops have them) but I'm not a blind follower. It's nice to be excited about technology but don't let it become a religion. If Intel released a better processor than AMD, I wouldn't hesitate to buy those instead.

I'll stop you from having to post a link to Gateway, or somebody even smaller. They don't provide the services we contract for when we buy systems. Again, it's about the entire system, and everything that is wrapped around it.

And I assume that means that in your company's infinite wisdom AMD processors could never be qualified in such a way, and even though AMD has tried to convince your company otherwise, you don't think it is the right time to stop kissing Intel's ass.

You don't by any chance work for a little company called Dell, do you?

For example, I know exactly when my current desktop system is going end of life, and I know what product will replace it. We will have POC units in our labs for our desktop engineering people to work with about 90 days before GA. They will build the OS image for the systems, do extensive application testing, and standardize the BIOS version and config (our PC vendor ships our systems preconfigured to our specifications). Everything that's different about the system from a technician's standpoint will be documented (such as: if you replace a system board, this is how to program the BIOS with the system's asset tag). Our management systems will be updated with any changes that are needed to support the new systems.

In the interim, I'll stock up with about a thousand units of the old system to bridge the transition (a four week supply). This is SOP for any large corporation. Every Fortune 500 that has centralized IT functions does it this way.

This allows me to plan the resources that are needed to transition to the new system.

Oh and one other thing. Because Intel has such an influence on the IT industry, their constant indecision has cost us many delays in technology.

Only now that Intel has moved to 64 bits in the last 12 months has Microsoft, surprisingly, come out with WinXP 64. The industry won't move unless Intel moves. And when they move, everyone has to jump.

Programmers are now scrambling to optimize code for 64bits and multithreading because they can't get their 5 GHz singlethreaded 32bit Prescott anymore. AMD said it was going that way and that the rest of the IT industry needs to go to 64bits and dual cores. They said this almost two years ago when they released the K8 architecture.

No one jumped. No one changed their code. We are slaves to Intel's whims because no one will make a decision unless Intel tells them to. Innovation will always suffer at the hands of monopolies like Intel. Thankfully, AMD's processors were so superior to Intel's that the chip giant had to budge or face market share decreases.

If you think I'm a conspiracy theory crackpot, I'll let current events speak for themselves. Why does an 80%+ market share, $80 billion a year revenue industry leader have to adopt any tactics from a small, insignificant rival like AMD?

I am not done flaming questar after his last post about roadmaps (#54).

That is the biggest Intel loving bunch of bullsh*t I have ever heard.

To keep the record straight, Intel completely tore up its roadmaps last year when it cancelled plans for its 4GHz Prescott. They completely changed their entire marketing and engineering strategy. Their entire roadmap is different now.

I get so mad when people like him just see Intel's name and assume stable roadmaps. I've been following tech for many years now, so here we go.

Fact #1. Intel promoted Megahertz=performance for years.

Fact #2. Intel recently said that megahertz isn't everything and other factors need to be considered.

180 reversal.

Fact #3. Intel said they would never use model numbers to identify processors and that AMD was misleading their customers.

Fact #4. Intel now uses model numbers to identify their processors.

180 reversal.

Fact #5. Intel was going to take the NetBurst architecture up to 6 GHz+.

Fact #6. As I currently write this, no current Dell laptop models (Dell being Intel's largest distributor) have a Pentium 4 in them.

180 reversal

Fact #7. Intel said a year ago that 64 bits would not be ready for many years, and that they would not implement the technology until it was.

Fact #8. All currently released Intel processors except the 5xx series and the Pentium M support EM64T.

180 reversal.

Fact #9. There were no plans for dual core in Intel processors until AMD said that dual core had been planned from the beginning of the K8 architecture.

Fact #10. Intel's "fastest" processors are becoming dual core.

180 reversal.

Questar, exactly what 3 year roadmaps from Intel have remained unchanged? We have no idea if Intel will change on us again and go a completely different route. How do you know that the current 3 year roadmap will stay unchanged? Do you have a crystal ball? No one can foresee that far into the future in the IT industry.

3 years ago Intel showed roadmaps of the Pentium 4 to go to 6 GHz with no model numbers, no 64bits, no Pentium M, no dual core. I really need some clarification from you because I'm so confused.

And all you other loyal IT readers out there, please let me know if I left out any other 180 reversals by Intel. There have been so many it's hard to keep track.

Motley, I totally agree PCI Express is good for a lot of things: SCSI, iSCSI, Gigabit Ethernet, and others. However, you won't see any of them on mainboards with the 915, 925X, or similar chipsets. Those kinds of things appear on server-level chipsets, and some of those have had PCI-E for quite some time.

"It's pretty clear - Intel's last few products have been worthless in many cases."
You damn Americans... just because something is not the absolute number one, does that mean it is worthless?
I would like to have a Northwood processor at 3.6 GHz at the price of a Sempron 2200, but that doesn't mean Prescotts are worthless.

Anand, you were right to keep this test Intel-only, since that is what the test was about anyway: dual core Intel on either NVIDIA or Intel boards. When the dual core AMDs come out, then it would be great to see a test with both platforms on NVIDIA NF4 boards.

Anand, I hope that when you get the Athlon 64 X2, if you don't have it already, you will end up doing the most comprehensive benchmarking on the planet! The full server/desktop/multimedia/games package: testing single Xeon, single Opteron, dual Xeon, dual Opteron, dual-core P4, and dual-core Opteron. Also, I would recommend against using benchmarking suites; instead use real-world applications. In fact, that would be an interesting test in itself: do benchmarking suites really give accurate results compared to the real product?

JoKeRr, "And if we look ahead, Intel is making the dual core 65nm's thermal envelope the same as Northwood at 89W (don't quote me though). I'm impressed b/c Intel is actively addressing the problems."

They were responsible for the problems in the first place, and the only reason they are addressing them is that they now have some competition that would gain a lot of market share if they continued to ignore the interests of their customers. If there's something admirable about that, I must have missed it.

JoKeRr, "However, with such a great product, why couldn't AMD strike a deal with major OEMs like Dell? I know they're doing great on the server side with opteron, but why not desktop?"

You must be "joking". Dell may be the biggest PC OEM in the world, but not by much, and No. 2 HP, No. 3 Lenovo/IBM, No. 4 Acer, and No. 5 Fujitsu-Siemens all carry AMD desktop CPUs, from Sempron to Athlon 64.

Quote:-
Why not implement something like Hyperthreading when it's proven to work so well?

" Fred's response to this question was thankfully straightforward; he isn't a fan of Intel's Hyper Threading in the sense that the entire pipeline is shared between multiple threads. In Fred's words, "it's a misuse of resources." "

"A bit of that changed when Intel brought forth their dual core plans - assuming that they can actually guarantee availability, Intel is planning to ship more desktop dual core processors, at lower prices, than AMD this year."

I think it's unlikely that the 1.8GHz Athlon 64 X2 will cost more than $240. After all, a single core Athlon CPU @ 1.8GHz can be had for ~$125. Not to mention that it will likely show higher overall performance (note that the Pentium D has no HT), will fit into existing platforms, will have about half the power consumption, and probably better 64bit performance and power management. We'll see, I guess...

OK, I was going to flame Questar with an old Chinese saying:
It's always the idiot who has the loudest mouth and thinks he knows everything, and it's always the smart ones who only speak the truth and state the facts when necessary.

Anyway, I take that back b/c I think Questar has changed.
-------
On the flip side, this is how I view the whole processor debate:

Intel:
Great marketing power and OEM support, great fabs, and lots of engineering power. The P4 Prescott is hotter than it should be, but Intel has made a lot of progress from the C0 stepping to the newest N0 stepping. Plus you've got to give Intel some credit for being the first to bring dual core to the desktop (I know AMD was first in servers) and for being willing to spread the eventual benefit of dual core to everyone today at a very fair price, even though the cost of manufacturing is definitely higher. Sure, it's essentially two Prescott 1MB cores glued together, but it works. It takes guts for a company to push something so bleeding edge, and so much of it at the same time (DDR2, PCI-E, HD audio, etc). And if we look ahead, Intel is making the dual core 65nm's thermal envelope the same as Northwood at 89W (don't quote me though). I'm impressed b/c Intel is actively addressing the problems. And who's not impressed with the introduction of Centrino?

On the other hand, there are things I don't like about Intel either, such as the frequent socket changes (was 775 really necessary before dual core?? what about 423?), not supplying nearly as much 875P chipset as it should b/c 925XE/925/915 is not selling well, and now a whole new platform just to add dual core support, just to name a few. As for performance: other than gaming, Intel's P4 was only behind in 3 or 4 benchmarks when I last counted.

AMD: They do have a wonderful processor in the Athlon 64 right now, cool and fast, especially in gaming. They are also making good progress with dual core and the mobile platform (Turion 64), and they have great chipsets from NVIDIA and VIA. And props to AMD for being the first to introduce 64bit support, well done. However, with such a great product, why couldn't AMD strike a deal with major OEMs like Dell? I know they're doing great on the server side with Opteron, but why not desktop? Lack of marketing, in my own opinion, hurts a lot for companies like AMD. And I wish AMD could have addressed the issue of multitasking better before the coming of dual core: why not implement something like Hyperthreading when it's proven to work so well?
---------

One last question for Anand b4 I shut up:
DDR could run at 2-2-2-5 but fetches 2 bits per clock; DDR2 runs at 4-4-4-10, with twice the cycle latency but fetching 4 bits per clock. So essentially DDR2 at 4-4-4-10 is about the same (in terms of bandwidth and maybe even latency??) as DDR at 2-2-2-5?
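Roughly, yes. Assuming DDR-400 at CAS 2 (200 MHz command clock) against DDR2-800 at CAS 4 (400 MHz command clock), which are speed grades of my choosing rather than anything from the comment, the CAS latency in wall-clock time comes out identical while peak bandwidth doubles. A quick back-of-the-envelope check:

```python
# CAS latency in nanoseconds: cycles / command clock.
def cas_ns(cas_cycles, clock_mhz):
    return cas_cycles * 1000.0 / clock_mhz

assert cas_ns(2, 200) == 10.0   # DDR-400  @ CAS 2 -> 10 ns
assert cas_ns(4, 400) == 10.0   # DDR2-800 @ CAS 4 -> 10 ns

# Peak bandwidth on a 64-bit (8-byte) channel: transfers/s * 8.
bw_ddr400  = 400e6 * 8   # 3.2 GB/s
bw_ddr2800 = 800e6 * 8   # 6.4 GB/s
assert bw_ddr2800 == 2 * bw_ddr400
```

The caveat is that only the CAS portion scales this cleanly; the other timings and the memory controller keep real-world latency from being a perfect wash.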

I'll see if I can get Derek to do an article on the audio quality of the latest solutions, but a lot of that will vary from one motherboard to the next. If an article does end up shaping up, I'll post something about it.

The 2 single core CPUs vs. 1 dual core CPU comparison is an interesting one that I'd like to make and I'll do my best to fit some of those numbers in there, but I think there are other, more useful (from a purchasing standpoint) comparisons out there that you will see in the article.

As far as my reasons for not doing an AMD NF4 vs. Intel NF4 comparison, it has nothing to do with pleasing any manufacturer - as I've said many times before, I don't care about pleasing any manufacturers, I'm here to deliver what you all want. It doesn't matter that Intel supplied the CPU, they just send us the hardware and we do whatever we want with it. I was originally going to do an AMD vs. Intel comparison in that article, but a handful of readers responded that it wasn't necessary so I left it out - I agreed with them as I thought it would be redundant and after all, if you're looking at a comparison of Intel chipsets you've already decided that you want an Intel processor (if not, consult our CPU reviews first to figure out what CPU to buy, then read the chipset reviews to figure out what chipset, then what motherboard, etc...).

Just one correction that's worth it:
" Albeit not beeing unjustified. At least insom cases like HD audio. "
should sound more like this
" Albeit not beeing justified. At least in some cases like HD audio. "

You know, in my native language there could be even 3 to 4 negations in one simple sentence:
Nikdy som nepovedal ze nie si blbec. goes word for word:
I never didn't said that you aren't silly.

Rand: Yeah! So am I.
Actually this is the first time I noticed they have changed it ;-) Funny thing is, I only started to watch IT closely 2 years back...

Questar:
From the evolution of your posts throughout this discussion I see you HAVE changed your opinion. It's quite refreshing to see some valid arguments in your last post. That's what you should have started with.

In case you are who you claim to be, this flame was surely worth the paper (literally meant :).

Anand:
Could you please take a look at the audio quality of the new SB from NVIDIA (;-)? You know, nF4's AC'97 is nothing to sing about...
From the other keg: I hope to see 2x248 (848) versus a single 1x275 (875) compared in the SAME board. That would be waaaays more usefully spent time than any other comparison possible. PLEASE take note here, so those of us who have a current picture of the performance arena don't have to make indirect guesses (wonder if that word is spelled that way ;D).

To all who would like to see an nF4 AMD vs. Intel SLI roundup: please bear in mind that such an embarrassment would not please Intel (AND the provider of the tested HW) very much. It would also make no other sense than to sink the P4 platform even deeper into the mud. Albeit not beeing unjustified. At least insom cases like HD audio.

"The only defense I imagine he could possibly conjure up right now is currently in the market there is the "Nobody got fired for buying Intel" mentality where companies and such are wary of trying non-Intel products mainly because... Dell and other major manufacturers won't offer it in any quantity."

Actually there are two reasons:

1) Qualification costs. It can be easy to drop $150k qualifying a new platform.

2) Product longevity. Change is very expensive for large corporations. Anything we make a commitment to buy must have a lifespan of at least 18 months from the date we qualify the product. We also must be comfortable with the company's 3 year product roadmap. So far there are no tier one vendors with AMD product lines that meet these requirements.

I think you mean you disagree with his first statement, since his last statement was about DDR2. Personally, I assume reviews on this site are talking to me (a PC enthusiast) and not to businesses (except reviews which explicitly state otherwise, which are few and far between here). In that context, Anand has a point.

Perhaps it should have been worded differently, like... offered performance benefits that have yet to be realized. But as worded, it is misleading and incorrect. Obviously, I read your site often, and I have come to expect technical correctness in what you write ;-)

That said, I still would have to disagree with your last statement. Where companies purchase and keep PCs around for 3+ years (OMG, I wish we got rid of PCs in 3 years), the ability to purchase PCI-E when it came out, knowing that we could upgrade to iSCSI, etc. in the future, *IS* a very tangible benefit. At home, it's a different story, where my motherboard changes with every major change (or every other, as money permits).

You have to be leery of anyone who resorts to juvenile semantics in an argument. When Questar derided another person for using the word "worthless" to describe Intel, you had to ignore him. Obviously, "worthless" wasn't meant literally. That's one of the wonders of the English language, the way it evolves, with words taking on more subtle meanings through the gradual societal acceptance of colloquialisms and slang. Words like "worthless" also lose their qualitative and quantitative precision through this evolution... depending on how the word is used, of course. Generally, when people resort to literal semantics, they feel like they are losing the argument. Reminds me of when Bill Clinton questioned the definition of the word "is." Questar's back was against the wall, I guess.

Questar - This may have been said before, but I didn't read this whole thread.

Reviews are generally filled with opinion; it's the nature of the beast. If you wanted an Intel white paper, well, this isn't the place for it. If you've taken a high school level English class then you should be quite capable of telling opinion from fact in plain English.

#44 - I don't think it's *that* silly to say such a thing. DDR2 and PCI-E are still new technologies, and apart from newer mainboards coming with onboard PCI-E gigabit LAN, there hasn't been anything worthy of note for the mainstream user. Getting off the PCI bus is good, but it takes time for us to migrate to it, let alone when we're talking about technology that's barely penetrating a market that's already saturated with people who are perfectly happy with their current systems. Remember how long it took for us to get off ISA completely.

There are a lot of 2-3GHz PCI systems out there, and to Average Joe User (tm) you can spin PCI-E as much as you want, but unless they are in the market for a new computer they really don't give a damn. Same thing for the Athlon 64 or Pentium D. How do you convince someone who uses AOL that they need THAT much power?

I agree that there are huge benefits to PCI Express, but for pretty much the entire life span of the 925X/915 platforms none of PCI Express' potential was even remotely tapped into. So here we are today, where PCI Express devices are finally starting to appear and we are given a brand new chipset, one that supports dual core.

I didn't mean to come off as saying that PCI Express and DDR2 are bad technologies, but the 925X/915 platform as a whole was not aided by their inclusion during its life span. The 955/945 chipsets will succeed in those areas where the 925X/915 did not, although it is worth pointing out that while Intel remains on a 800MHz FSB - DDR2 continues to do nothing for performance, even on 955X.

"Honestly, Intel processors and even the platform haven’t been interesting since the introduction of Prescott. They have been too hot and poor performers, not to mention that the latest Intel platforms forced a transition to technologies that basically offered no performance benefits (DDR2, PCI Express)".

I find it absolutely disturbing that even AnandTech would say something as silly as this. Sure, if all you care about is graphics performance, PCI-E isn't that big of a deal. But drop in a Gigabit Ethernet card, a SCSI controller running a fast/wide RAID, or *gasp* iSCSI. You'll see the difference immediately. To say there is no performance benefit is just totally missing the point that PCI-E is an improvement over PCI. Get your head out of games for a minute and you'll see.

Granted, when you talk about how everyone who reads this site is a "bunch of idiots" and then talk about how vast your knowledge of IT is, all the while not being able to back it up, this just shows immaturity and makes it hard to believe what you claim is even true. And yeah, when you have to stoop to correcting minor spelling errors (when you yourself were in the wrong) to prove a point, it means you haven't got a leg to stand on.

Not to mention that fanboyism over anything is actually a BAD thing to have in the IT industry. If you are so wedded to a certain piece of hardware (say, all Intel and f*** everything else) then it's a dangerous situation where you won't be trying alternatives, possibly cheaper or better alternatives at that. Say you're in a company and you have a rigid mindset on something, and Joe Blow 2.0 comes in as a new hire. Joe Blow 2.0 pitches a cheaper, faster, better solution, but you diss it because there can't possibly be anything better than what you have. Joe Blow 2.0 wins a contract and you look stupid. Don't try this at work, kids.

The only defense I imagine he could possibly conjure up right now is currently in the market there is the "Nobody got fired for buying Intel" mentality where companies and such are wary of trying non-Intel products mainly because... Dell and other major manufacturers won't offer it in any quantity. All the systems here are Dell, and sometimes it's a blessing and a curse. Of course, even the "Nobody got fired for buying IBM" mentality slowly faded, so it depends on what the market wants.

Just don't criticize all PC enthusiasts because they want something other than the norm. Thanks, Questar.

So basically, if we were to take that piece of an ego boost for a fact, you're probably an arrogant executive who thinks he knows everything and nobody can prove him wrong.

In reality, you're probably just a 16 year old "know-it-all" who has to be a grammar Nazi to prove your stupid little points that don't really mean anything in the first place.

For the moment, NVIDIA and Intel have a cross-licensing contract, so it's basically eye for an eye. Intel gets SLI, NVIDIA gets to make chipsets for Intel based systems. Since none of us actually know the exact specifications of the contract, I guess we can't make any comments on that, can we? But if NVIDIA eats up Intel's marketshare in chipsets, it's definitely a problem for Intel.

#34: By worthless I do mean that they are not worthy of consideration for high-end performance. I was referring more to the enthusiast community than anything else.

#35: It's quite unfortunate that you work in the industry. So, let me guess: all of these servers/laptops/desktops will have "Intel Inside". I guess it was quite stupid of your company's management to give you such a huge responsibility, since you obviously don't know anything about it.

And if you own your own company then I can only wonder when your company will go down.

whoa whoa, I'm no deity here, just a normal guy like everyone else - I can make mistakes and I encourage everyone never to blindly follow something I, or anyone else, say. That being said, Questar, I've got a few things that you may be interested in reading:

That graph shows exactly how hot Prescott gets, in fact, until the release of the latter 5xxJ series and 6xx series with EIST, Prescott systems were considerably louder than Athlon 64 systems. "Too hot" may be an opinion, but it's one echoed by the vast majority of readers as well as folks in the industry - who, in turn, are the ones purchasing/recommending the CPUs so their opinion matters quite a bit.

3) I can't go into specifics as to how the Intel/NVIDIA agreement came into play, but know that Intel doesn't just strike up broad cross licensing agreements with companies like NVIDIA so they can make money on NVIDIA's chipsets. The Intel/NVIDIA relationship is far from just a "you can make chipsets for our CPUs" relationship; it is a cross licensing agreement where Intel gets access to big chunks of NVIDIA's patent portfolio and NVIDIA gets access to Intel's. That sort of a play is not made just to increase revenues. I can't go into much further detail, but I suggest reading up on patent law and how it is employed by Intel.

4) Also remember that Intel not manufacturing silicon isn't necessarily a cost saver for them; a modern day fab costs around $2.5B, and you make that money back by keeping the fab running at as close to capacity as possible.

I think that's it, let me know if I missed something. I apologize for not replying earlier, I've been extremely strapped for time given next week's impending launch.

Houdani

I haven't played around with all of the multitasking tests, but I'd say that the lighter ones (I/O wise) have around 8 - 10 outstanding IOs. I believe NVIDIA disables NCQ at queue depths below 32, but I don't think Intel does (which is why Intel shows a slight performance advantage in the first set of tests).

Interestingly enough, in the first gaming multitasking scenario, Intel actually ends up being faster than NVIDIA by a couple of percent - I'm guessing because NVIDIA is running with NCQ disabled there.

#29 segagenesis: He isn't going to believe the popular sites because he thinks they are bought out and their editors have no knowledge of the industry. And if you find a smaller site, he still won't believe you because smaller sites know nothing either.

Questar: Do you think you are the only one with industry knowledge? I can only hope that you are not working for an IT company.

#28 - Just to keep things balanced here: Intel has a large portion of the OEM market because it can produce products in volume compared to AMD, and most people don't really care what's "Intel Inside" their computer. Just because AMD may have a technologically superior processor doesn't mean it's going to do wonders overnight; you just have to cite Betamax vs. VHS. On the other hand, Intel has the Pentium M, which is a good piece of hardware, yet is limiting its market penetration with high prices/low production.

#26 QUESTAR: I see you can't handle the proof, eh? After you couldn't come up with a counter-argument you decided to bash Tom's. Sure, Tom's may not be as in-depth as Anand and they could be biased, but they aren't that blatant about it.

#26 - What, I can't provide links outside this site because they don't count? Oh wait, this site doesn't "count" either, does it?

Regarding the infamous AMD video: that was a long time ago, and Tom's doing such a video actually made something HAPPEN in the industry. AMD responded and added thermal protection to its newer CPUs. The P4 heat problem is *now*!

#25 QUESTAR: "Let me explain it to you:
Intel gets a cut of the money from every chipset nVidia sells. What part of that don't you get?"

Is it better for Intel to get a cut of NVIDIA's profits, or to hog the entire market with their own chipsets and take all the profits for themselves? What I don't get is how stupid you are.

"Ummm...yeah right, go right on thinking that."

Yet again, we have a moronic statement from AnandTech's very own dumbass. Maybe Anand should hire you to post stupid comments throughout the site to generate more discussion. Then again, even he would get tired of seeing your stupid comments.

Intel surely doesn't have a chance against AMD with their Prescott CPUs. The only reason Intel is still the number one chipmaker is because it has signed exclusive contracts with Dell and Sony, and there are quite a few people out there who couldn't care less whether they have an Intel or AMD CPU.

Once again, it's your own ignorance that's blocking your thinking passages. Neither AMD nor Intel is strong enough to take the other out of business, but AMD CPUs do perform better in many scenarios against Intel CPUs. This includes desktop as well as server level CPUs. If you remember the article on AnandTech, the Opteron kicked Intel's ass. And with the new Opterons coming soon, you will get a confirmation yourself.

It's pretty clear - Intel's last few products have been worthless in many cases.

#24 - I don't mind, because I deal with people like him every day. I have used AMD myself for the past 5 years, but I will admit that Intel has held the performance crown lately when it comes to content encoding... however, at a price. I have also preferred AMD due to pricing and gaming performance, where it continues to do fairly well.

Working in labs maintaining them, as well as desktops (I am responsible for 500+ computers), I have noticed that with the newer P4s the heat output is actually noticeable. As I said, a whole room full of them really raises the temperature, and that's just sitting there idle. A friend of mine has a 3.8GHz P4, and that thing is at its thermal limit with an X850 XT PE in the same case. Ouch!

#20 QUESTAR: "Too hot" is merely a figurative comment. Don't try to be a smart ass. We can all clearly see through your Intel favoritism. You are definitely not as knowledgeable as Anand or some of the people here, so get lost.

If you don't like what AnandTech has to say, stop reading the site. People like you only waste valuable bandwidth; plus, it will be one less troll on the Internet.

Questar: Please, your comments are quite stupid. You are not the only one with Power, Intel, and AMD CPUs, you know.

Just to let you know, AnandTech has a reputation for being the best of the best, and Anand is a pioneer of hardware reviewing, so he has been in this business for a long time. That means he has seen quite a bit of hardware in his 8 years running AnandTech. It's pretty ignorant of you to question him.

I agree with everyone. You are pretty stupid if you think Intel has a chance against AMD. Prescotts are illogical and rather poorly designed CPUs.

"Intel probably makes as much net profit off the licensing of the nVidia chipset as they do selling thier own - after all thay don't have to design, build, ship or sell anything. So why would they be worried?"

Once again, your opinion. Can you please get Intel to leak these numbers to you, so we can have a reason to believe you?

Licensing prices are fixed. Sure, Intel could be making more money from licensing their technologies to NVIDIA, but what happens in the future when PCIe and DDR2 start to pick up the pace? Then Intel would want to sell as many of its own chipsets as possible for maximum revenue, and with a strong chipset maker like NVIDIA around, that would be pretty hard, don't you think? In the future, NVIDIA's licensing fee wouldn't cut it.

It's pretty logical: if company A makes chipsets and company B makes chipsets with the same technologies, the market will surely divide between the two.

I guess your brain is sealed somewhere, which is why you probably can't think straight. I hate stupid people and I think you are one of them.

So are you avoiding having to make sense? Quit making up stories when you call for proof yet provide none yourself. I gave you proof; now it's your turn.

Here, I'll help you decipher it because you seem to be ignoring posts in favor of your own flawed logic. Here is a snippet of one of your own.

> "Honestly, Intel processors and even the platform haven’t been interesting since the introduction of Prescott. They have been too hot and poor performers, not to mention that the latest Intel platforms forced a transition to technologies that basically offered no performance benefits (DDR2, PCI Express)."

> Your opinion only, don't make this out to be fact.

The link I provided shows that in *fact* there is more heat output by modern Intel processors. Yes, this is a quantitative analysis. If it were qualitative you could have called it opinion, but it's not, eh? Try again.

Why are you here then? Hell, you can't even read, I guess. I gave you a link talking about heat output when you said it was "opinion" that Intel runs hotter. I can tell you from the 25 Prescotts sitting in a lab here that when they are all running, the A/C had better be running too!

Why don't you come up with some facts yourself instead of insulting both the site and others?

#11,
What makes you think I would care about "a huge landslide of flames regarding your post in this comment section"?

99% of the people on this site are ignorant cattle without the ability to think for themselves. They are here as part of a communal circle jerk over AMD CPUs. I care as much for their thoughts as I care about the thoughts of the cow that gets slaughtered for my dinner.

#10,
I see no information in your link that supports #8's argument. He specifically said that an A64 is faster than Intel in all applications except for video encoding. The link you provided actually proves him wrong, as there are other applications in which the comparable Intel CPU is faster, or the difference between the two is insignificant.

Go read all the reviews of Intel and AMD processors since the release of the K8 architecture two years ago. You have a lot of reading to do.

Don't give the editor-in-chief of a hardware review site with an 8-million-plus readership advice about getting his facts straight. AnandTech receives and reviews hundreds and hundreds of pieces of hardware that you will never even dream of owning.

If a statement is made in a review by a site as reputable as AnandTech, it is not made lightly. You have every right in the world to question it and seek a second opinion elsewhere, but it is COMMON knowledge among those reading CPU reviews over the last two years that AMD CPUs are faster in games and computational number crunching, whereas Intel excels in audio and video encoding. PERIOD.

You can easily explain these findings. Games and computational number crunching need low latency and high memory bandwidth to work well. Audio and video encoding need fast processor clock speeds.

Wait about five hours after the release of this review and you will soon be finding a huge landslide of flames regarding your post in this comment section.

Have fun! :)

PS. You must not read at all if you think anyone has found a performance advantage of PCI-E and DDR2 over AGP and DDR for current software applications. Neither Tom's Hardware, HardOCP, Xbitlabs, AnandTech, nor anyone else has found a performance increase between AGP 8X and PCI-E 16x, or between DDR400 and DDR2 400/533/667.

#5
"Honestly, Intel processors and even the platform haven’t been interesting since the introduction of Prescott. They have been too hot and poor performers, not to mention that the latest Intel platforms forced a transition to technologies that basically offered no performance benefits (DDR2, PCI Express)."

Your opinion only, don't make this out to be fact.

That is pretty much fact. In all areas except encoding, they were worse performers than their competition (the Athlon 64). The extra heat sure didn't improve that either. And as for forcing DDR2 and PCI Express on us when they didn't improve performance, you can't disagree with that.

"Honestly, Intel processors and even the platform haven’t been interesting since the introduction of Prescott. They have been too hot and poor performers, not to mention that the latest Intel platforms forced a transition to technologies that basically offered no performance benefits (DDR2, PCI Express)."

Your opinion only, don't make this out to be fact.

"at the end of the day, Intel would still be happier if there was no threat from companies like NVIDIA"

nVidia (please print it correctly) is not a "threat" to Intel in the chipset market. They couldn't make a P4 chipset without a license from Intel. If Intel were threatened by them, it wouldn't sell them a license. The purpose of licensing is to give system builders more choice in design features.

"However Intel’s chipset team has reason to worry; motherboard manufacturers weren’t happy with the 925/915 chipsets, and with a viable alternative in NVIDIA, we may very well have an opportunity for NVIDIA to start eating into Intel’s own chipset market share in a way that no other company has in the past"

Intel probably makes as much net profit off the licensing of the nVidia chipset as they do selling their own - after all, they don't have to design, build, ship or sell anything. So why would they be worried?