The current situation with AnandTech's ATI reports and coverage is absolutely absurd, and so disappointing, that I do not even know where to start.

It seems like nothing is done by pros at AnandTech anymore.

From the endless 57xx driver bugs, to the flaky, incomplete, and undocumented DXVA features, to the high DPC usage in anything that isn't 3D/DXVA, all the way to the poorest 2D performance ever seen on the PC (this is not an exaggerated comment), NOTHING is discovered by AnandTech.

You have become a commercial, biased, unprofessional, overrated site.

So there, I have done the work for you:
go check these issues, and let's see when you will get staff professional enough to analyze, or even notice, all of the above.

According to Benchmark Reviews, the 4670 idles at 9 W. The way they come to this conclusion: they boot the PC without the video card and run the PSU cord through a Kill-A-Watt EZ P4460 wall-socket meter that reads the wattage draw, take that number as system idle power, then run it against the system at idle with the card installed for a base number; then they run it through some 3D titles for total draw.
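The subtraction method described above can be sketched in a few lines; the wattage numbers below are hypothetical placeholders, not figures from any review:

```python
# Sketch of the wall-socket measurement described above: measure the whole
# system with and without the card, then subtract the two readings.
def card_power_w(system_with_card_w: float, system_without_card_w: float) -> float:
    """Approximate card draw as the difference of two Kill-A-Watt readings.

    Note this ignores PSU efficiency: the delta is measured at the wall,
    so the card's actual DC draw is a bit lower than this number.
    """
    return system_with_card_w - system_without_card_w

# Example numbers (made up for illustration):
idle_without = 71.0   # W at the wall, card removed
idle_with = 80.0      # W at the wall, card installed, desktop idle
print(card_power_w(idle_with, idle_without))  # -> 9.0
```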

There is no marketing or business decision to continue producing the 4850. It's more expensive to make than the reviewed card, and it's making the new cards look like crap. I personally believe the Far Cry 2 data is not correct (it makes no sense), but in everything else the 4850 is significantly faster.

I'm actually surprised AMD would be stupid enough to continue producing (not just selling out of existing stock) the 4850...they are shooting themselves in the foot and making their "new" product lineup underwhelming.

Hint, hint: if you are in the market for a card in this price range and don't care about power requirements or small size (i.e., HTPC), get the 4850 NOW. I can't imagine it will be around next month unless AMD is completely clueless (which I believe they are not).

Hey, I'm a noob on GPUs :) Is the 4850 better than the 5670? I own an iMac :) with a Mobility 4850; if Apple upgraded to a Mobility 5750, would it be considerably faster? I tried asking Apple but as usual they never reply :)

It seems not a single card in the 5-series brings you more performance at a particular price point. There's always a card from the 4-series that beats the 5-series and costs less. Why this trade-off between performance and features? It's either a slower card with more features or a faster card with fewer features...

This is completely unlike the 4-series, which revolutionized performance at every price point.

Guess things will change in, oh, about a year, when Fermi-derived cards are out at all price points...

On the first page of the article it said launch volume would be around 50k units, and that this is expected to be sufficient.

Is that figure for the US only? If it's for the whole world, it works out to about one card each for all the retail stores that sell graphics cards. Even with the price set as high as it is, I would think a much greater supply is needed.

Unless you already have an HTPC, why would anyone get this card? If building a new HTPC, you could get a Clarkdale to bitstream the audio codecs.

Also... why do we care if it is bitstreamed? I have a receiver that can decode this, but it doesn't matter whether the digital information is converted to PCM before or after the HDMI cable. The only advantage is seeing those lights on the front of my receiver...

The 8800 GT data was originally collected for past articles, where we started at 16x10. The 8800 GT isn't part of my collection (it's Anand's), so I wasn't able to get 12x10 data in time for this article.

It's probably fair to point out that, in most tests, the 5670 is very close to the 8800, and as such listing it may not mean anything. However, the 1280x1024 tests are also without AA; it might be nice to see the effect of turning AA on with this oldie-but-goodie compared to the more modern competition, so including it may make sense. You might think the higher core clock of the 5670 would give it an advantage without AA, but if it goes anything like Batman, that would probably be an incorrect assumption as well.

By that definition, these cards were designed to run in systems that have 350 to 400 W power supplies, support HD-quality video, and support games at resolutions no higher than 1440x900 at medium quality settings with 2x AA and 8x anisotropic filtering. Testing them at settings most people will not run these cards at makes the results, for the most part, worthless.

I mean, who cares how these cards run at 1920x1200 at high detail settings, since we already know they're going to fail anyway? I'm more interested in how these run with all the details on at, say, 1440x900 or possibly 1680x1050, which are the more common widescreen monitors most people have.

For that matter, where are the details about how these cards compare when playing HD-quality video, whether the fan speed can be controlled via SpeedFan, or whether they have fixed some of the video quality issues like black crush when outputting via HDMI?

We traditionally run every game at 3 resolutions, particularly since some games are more GPU-intensive than others. Based on the 12x10 performance, I decided that 10x7 would be silly and instead went up a level to 19x12 - a few games were actually playable even at those high resolutions.

16x10 is accounted for, and 12x10 is close enough to 14x9 that the results are practically the same.

HD Video: All the 5000 series cards, except perhaps Cedar, are going to be exactly the same.

Fan speed: Can be modified (I use it to cool down the cards after running FurMark, before removing them).

Looks like a worthy replacement for the 4670 in my HTPC. It will still have some issues with certain games at 1080p, but since I play mostly Source games on my HTPC, it should stabilize some of the frame rate problems I get with my 4670.

Where are the 4770 and 9800 GT??? A lot of the data makes me skeptical because it doesn't mesh with what I'm seeing on other sites. I'm more inclined to trust the other reviews because they used common sense and compared this card to more cards in its own price range. Common sense is your friend. Notice how close the 5670 and 5750 are in terms of load power. That doesn't make sense at all.

The 9800 GT is the same as an 8800 GT. As for the 4770, it's here too, although I don't have any 19x12 data for it, since that resolution was a last-minute decision (I only had 30 hours or so with the 5670).

Common sense dictates that if a 4770 performs like a 4850 clone in most cases, you leave it out to avoid chart saturation. Same for the 9800 GT: the GTS 250 is a rebranded 9800 GT with near-identical performance.
Price point? The 9800 GT and 4770 are next-to-unavailable, so what price should be used for something that can't be bought (in the near future, when the 5670 is all that's left on the ATI side and the GTS 250 on the NVIDIA side, this review will still be sitting on this site)?

How can a power meter's measured numbers "not make sense"? A much-better-performing 5750 at near-5670 power usage just means inefficiency at 5670-level performance, which is common for lower-performance parts on a similar fab process.

An explanation of the above would be a bonus, but it's hardly required. Great review.

Also want to note that without real NVIDIA competition, it seems enough for AMD to just beat the weak NVIDIA lineup, and that's all.
Eyefinity is just plain stupid for games on cards at this level. Real DX11 games at these performance levels are questionable at best, too.
Both the 5700 and 5600 are VERY weak for the price they sell at. I wouldn't recommend them to anyone who owns a 4800 or 4600 series card, except in case they need a new card for a new machine.
The feature set of the new generation of cards, including audio bitstreaming, should be a MUST and not a priced upgrade built into the card's cost. I am quite disappointed with AMD's 5600 and 5700 series cards. The only real new cards are the Radeon 5800s.

I agree with Zool. The 5800s are the only true advancement among this 5000 series. They actually improved performance and price/performance (well, until pricing got out of hand because of supply) over their equivalent lineup predecessors and against their competition from NV. The 5700s and apparently 5600s are just spinning wheels performance-wise: they are feature upgrades not performance upgrades. That makes them mildly disappointing and not an easy purchase decision.

I had a 3850. Buying a 5770 was a fairly easy decision for me. I wanted the 5850, in fact, but to buy that one I would need to change my PSU too, and that would be too expensive. So the 5770 it was! And I'm pretty happy with the performance. Remember that not everyone has the latest card from the previous generation. ATI/AMD is doing a great job.

These lower-end series are not intended to run high-resolution monitors in "heavy" games at performance settings. For that, there is the 5800 series.
The 5600 series seems OK for every game at 19" resolutions and lowered quality, which makes them perfect for many people. They are a huge step up from integrated graphics :)

How many people actually buy 19" displays anymore? Widescreen isn't like the older 4:3 screens, so a 19" LCD is kind of small.

At $125~150, there isn't much reason to NOT get a 20~21" class monitor.

While the 5600s are a bit on the slow side, there is a NEED for low-end graphics cards that meet some standards, and having an entire product line support DX11 is still a good thing.

Once the price of the 5670 gets down to $75 it will be a good value card. But not at $100~120, which is the current price on Newegg. And remember, many people don't have the PSU (or budget) to support a 5700 series card. I think once 40nm manufacturing matures at TSMC, the pricing will come down more.

As an owner of a 4670, the 5670 is easily a faster card... but I believe AMD screwed up. The $100 4770 was almost on par with the 4850 and easily faster than the 4830. There is NO reason for the smaller-die 5670 to be ANY slower than the 4770. That is ALL the 5670 needed to be. But then again, the $135 (today) 5750 is starting to be consistently faster than the 4850 (good).

So the real problem is pricing. If the $100 5670 were almost as fast as the $135 5750, there would be no need for the 5750. Also, other than PSU requirements, it would be stupid to spend $120 for a 1GB 5670 when the 5750 is $15 more with almost twice the performance.

"They are a huge step up from integrated graphics :)"
Price-wise, the 5670 is a huge step up from integrated graphics too. I was mainly comparing the 5xxx and 4xxx series, and that's almost a zero jump.

The 5700 cards are on the same level as the 4800 cards, and the 5600 cards are very close to the 4600 cards. Now, if you enable DX11 in games, you will see both DX11 cards perform way below their 4800 and 4600 counterparts. That's downgrading, not upgrading.
And the X700-vs-X800 series naming trick and price range change is quite disturbing too.

How can game developers make better-looking games when performance/price sits at the same level with each generation? DX11 is very taxing if you want to do it properly. Those fancy new effects, and post-processing with DirectCompute, just eat much more shader power and bandwidth. Performance-wise, 4800 owners can upgrade only to the 5800 cards (DX11 speed with the 5700 is very weak), whose price level is another category.

But that can happen when your only competition is rebranding a 2006 card architecture because the GT200 was overdesigned. The disturbing part of this is that NVIDIA can't learn from its mistakes and is making another giant chip a second time with GT300, which this time is even late :).

It's quite strange that they downgraded the 5670's TMUs from 32 to 20. With 60+ GB/s of bandwidth, the 32 TMUs could be much more useful than with the 4670's bandwidth. All games have used multitexturing to some degree for quite some time.

Far Cry 2: the text states that the 5670 and the 4850 have the same amount of memory and that the 5670 beats the 4850.

However, looking at the test setup, the 5670 is the 1GB version and the 4850 is the 512MB version, and the test results support this. The gap between the 4850 and the 4870 is *way* too big to not be a memory size constraint.

As such, the only reason the 5670 "beats" the 4850 in this test is the memory size, and the supporting text is wrong.

Hmmm, OK, then the Far Cry 2 results are a bit peculiar. The 4850 has the same amount of memory but more of everything else, and it's 25% slower. The performance of the 5670 seems to fall in line with its compute resources, as if it doesn't have a memory bottleneck. This made me think you had a 1GB card. My apologies.

I bought a $99 HD 4830 more than a year ago, and it is much faster than this, especially when overclocked (it had a lot of OC headroom and performs a little faster than an HD 4850). Sad that the same amount of money a year later gets you a slower card.

Your 4830 is a partially defective 4850; that's what made it a nice value until the 4770 arrived (despite low availability back then). You'll have to wait for the already-rumored 5830 to get the same feeling again...

What made the 4670 an exciting card well over a year ago was that it was under $100 when it was launched ($80 avg) and it was almost as fast as the 3870, sometimes faster (as drivers matured). So when looking at some of these benchmarks that DON'T have the 4670, just look at the 3870 and count it the same. So at $80, it had replaced the $200~150 3870 and ran cooler, etc.

Anyway, the 5670 SHOULD have at least equaled the 4770 in performance! That would make the 5670 a very good value gaming card for the $90~100 price range. You can get 4770s for about $95~110 (until gone).

Hopefully in the coming months, the prices will start to come down naturally. But AMD should have a $100 card that *IS* equal to the 4850. Perhaps that would be a 5730 card, but its power should still be under 75 W under load.

Until NVIDIA comes out with something competitive, AMD has little reason to lower prices... ha, notice how things have changed? :)

Nice, thorough review. I'd be interested in some more results with less or no AA as well, though. While we all love AA, it's kind of silly to expect to run it well at 1920x1200, or sometimes even 1680x1050, on sub-$100 cards. Plus, it would give those who keep cards for a long time and just turn down features such as AA a better comparison.

Another nice thing about it is that it makes an inexpensive triple-head card that does not need external power, even if one of the heads needs to be DisplayPort. Even single-link HDMI/DVI can still support 1920x1200.

I'm sure I've seen it in two, including the Envy 15. Perhaps I shouldn't have said "a lot".

But it would be good to have a review comparing the mobile solutions out there. Not to mention the throttling problems in some notebooks.

I'd love to see AnandTech do a review of the problems the Dell XPS 16 with Core i7 has. On AC power, and only on AC, it cuts the multiplier to 7 and then uses clock modulation. Clock modulation tells the CPU to only do work on certain cycles, so you can have as many as 75% of your CPU cycles going to waste.

Full story here, and just for the record, I'd be willing to let AnandTech borrow my 1645 to test if Dell doesn't fix it with this next BIOS update, which I don't see how they can: a 90 W AC adapter is simply not enough.

I read through that thread yesterday. We support 50+ Dell E6500 laptops that have been problematic in other ways besides throttling, but it was nevertheless interesting to read and pass along to my fellow IT co-workers.

I've looked at the thread and sent Dell an email asking for comment. It's important to remember power supply (power brick in this case) efficiency, so if the brick can output 90W and it's only 75% efficient (which is probably higher than what it really achieves), power draw at the wall of up to 120W might be achievable without the need to throttle. So, it's possible that a BIOS update will indeed address the problem, but let's not jump to any conclusions just yet....
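To put numbers on that efficiency point (a back-of-the-envelope sketch; the 75% figure is the comment's assumption, not a measured value):

```python
# Back-of-the-envelope sketch: a power brick rated to output 90 W DC,
# assumed to be 75% efficient, must draw noticeably more at the wall.
def wall_draw_w(dc_output_w: float, efficiency: float) -> float:
    """AC draw at the wall for a given DC output and conversion efficiency."""
    return dc_output_w / efficiency

print(wall_draw_w(90.0, 0.75))  # -> 120.0
```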

I'd also say that if you're using FurMark to achieve the throttling, find something else instead. FurMark really pushes the envelope and many consider it a power virus. I understand others are saying it occurs with regular games, which is obviously a much bigger issue than with a test program that doesn't represent a real-world workload.

Anyway, if you really want to send us the laptop for testing, why not do the testing yourself and use that as the basis for an audition for AnandTech? If you go that route, I would make sure you really investigate when throttling does and doesn't occur, look at the various power profiles and try tweaking those, etc.

As a side note: with Win7 I noticed on at least one laptop that using the "passive" cooling profile caused video playback to stutter, and setting it to "active" fixed the problem. There are so many variables that you can never know 100% what might be causing a particular problem.

Jarred, thanks, I'm going to take you up on that; currently I'm doing a write-up on the XPS 1645 with RGB. I would love any suggestions, or if you would like me to include anything, please send it to SlyNine@hotmail.com with the subject: XPS 1645. If anyone knows any tools other than ThrottleStop to monitor the CPU modulation, that would also be helpful.

Yeah, I don't use FurMark at all; in fact, I made a post recommending they not use it.

With just UT3 and nothing else going on, the multiplier drops as low as 7, and with the brightness up halfway the modulation kicks in, bringing the CPU down to 25%; that's only 25 cycles out of every 100 willing to do anything. Even just doing a Prime95 run, the multiplier is below 10; correct me if I'm wrong, but isn't it supposed to be around 13?

But thanks a TON, Jarred, for acknowledging this. If a high-profile site like AnandTech did a story on it, I'd imagine Dell would have to respond. Really, this is an amazing laptop otherwise (other than this line I have through my screen, but obviously that is covered by warranty).

As far as the CPU multiplier goes: if you have the i7-720QM, the normal multiplier is 12X (133 MHz bus * 12 = 1.6 GHz). For the i7-820QM, the stock multiplier is 13 (1.73 GHz). Maximum Turbo mode on the 720QM is 2.80 GHz, so you could potentially see a 21X multiplier, while on the 820QM the maximum Turbo is 3.066 GHz, so you'd see up to a 23X multiplier. I don't know if ThrottleStop tells you max and min multipliers or not, but you could even run CPU-Z and just watch to see if the multiplier is changing a lot.
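The bus-clock arithmetic above is simple enough to sketch; the multipliers are the ones quoted in the comment, and the 133.33 MHz BCLK is the nominal value for these chips:

```python
# Sketch: deriving mobile Core i7 clock speeds from the bus clock (BCLK)
# and the multiplier, as described in the comment above.
BCLK_MHZ = 133.33  # nominal bus clock for these CPUs

def clock_ghz(multiplier: int) -> float:
    """Core clock in GHz for a given multiplier, rounded to 2 decimals."""
    return round(BCLK_MHZ * multiplier / 1000, 2)

print(clock_ghz(12))  # i7-720QM stock     -> 1.6
print(clock_ghz(21))  # i7-720QM max Turbo -> 2.8
print(clock_ghz(13))  # i7-820QM stock     -> 1.73
print(clock_ghz(23))  # i7-820QM max Turbo -> 3.07
```

A reported multiplier of 7 on a 720QM would therefore mean the CPU is running at roughly 0.93 GHz, well under its 1.6 GHz base clock.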

Yeah, I have been watching a few programs, including ThrottleStop, RealTemp and RealTemp GT, and i7 Turbo. They all show the max multiplier at 7-9 when gaming under load; even with an external monitor hooked up and this screen off, it doesn't go past 10. It's worth noting that with the screen brightness turned down and a CPU-only load, they stay at 12, but turn the brightness up and your multiplier falls to 8.

The biggest problem is the clock modulation, which I'm trying to test. But it definitely correlates with real-world performance: while Task Manager may show the CPU at 100%, ThrottleStop reports a 75% reduction in CPU usage. This also correlates with the delta between the CPU usage Task Manager indicates and the C0-state percentage shown by programs like i7 Turbo and RealTemp. Task Manager will show 100% while the C0% is at 25%, indicating a 75% reduction while under load.

Perhaps ThrottleStop just measures the difference between the C0% and what the OS reports.
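That relationship can be sketched as below; this is a hypothetical reading of how such a tool might derive the figure, not ThrottleStop's documented method, and the 100%/25% numbers are the ones from the comment:

```python
# Sketch: if the OS reports the CPU as busy but the C0 (active) residency
# is much lower, the gap can be read as the clock-modulation throttle.
def modulation_pct(os_reported_load: float, c0_residency: float) -> float:
    """Percentage of nominally-busy time lost to clock modulation."""
    if os_reported_load == 0:
        return 0.0
    return round((os_reported_load - c0_residency) / os_reported_load * 100, 1)

print(modulation_pct(100.0, 25.0))  # -> 75.0 (% of busy cycles gated off)
```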

I've custom-set all the settings in the advanced power options to be the same on and off battery. When you unplug it, the system runs a great deal faster, albeit at the risk of harming the battery. I've disabled SpeedStep as well, with no difference.

Excel isn't my strong suit (basically, I'm going to have to relearn how to use it), but I'm trying to correlate frame rate with the indicated clock modulation. I'm just unsure how to record a timeline of FPS. It does appear, though, that the FPS numbers accurately reflect when the clock modulation kicks in.

Hi, I would like to purchase an entry-level 1GB DDR3 Asus GeForce HD5450 graphics card, but considering the power requirements, I only have a 250 W PSU. Is it OK to buy a graphics card that requires a minimum of 400 W and connect it to my existing motherboard, or do I need to upgrade my PSU?? Advice required. If so, are there any consequences I could face in the future??