1) Ivy Bridge is a CPU. 2) Trinity, specifically the A10, is an APU. 3) This article would make more sense if you compared Ivy Bridge + 7970 to Bulldozer (dedicated CPU) + 7970 (dedicated GPU).

It takes time for texture data to be retrieved from RAM (which is actually the bottleneck with APUs). The CPU and GPU use the same memory bus.

Of course, dedicated solutions are far faster than integrated solutions. The APU, however, is a bit of a miracle for us portable gamers, allowing something that's far better than integrated, sans the cost of dedicated chipsets.

First of all, I'd like to say that you did a good job, and these types of video comparisons are a nice way to show off detail differences as well as frame-rate differences.

On the other hand, it could use some work.

First of all, YouTube limits your frame rate to 30 FPS. So in a video like this, the difference in frame rate can't be seen unless there is a significant dip (5-8 FPS) below 30.

Second, your microphone sounds awful. It's muddy and makes your analysis sound poor by comparison. You also have a lot of popping going on, which is stressful to listen to with headphones.

Third, and more of a suggestion than a real issue: you should do some full-screen transitions to compare the smoothness of gameplay. Likewise, keep constant tickers in the black area when doing side-by-side, so we can tell what the FPS is at during that moment.

My camera is limited to 24FPS at 1080p, so the use of YouTube isn't a problem there. Using FRAPS doesn't help either, as when you record video it changes the framerate of the game being recorded. I also need to mention that getting a clear, good image of a display with most cameras can be difficult at best -- I've got the Nikon D3100, and it has a tendency to overexpose LCDs when doing video. I shot this video sequence probably a dozen times before I got something I felt looked reasonable. But then, I'm not a videographer.

As for the audio/microphone, it's probably more positioning than anything, plus I have a bit of a cold right now. (My voice normally isn't that low -- I sing high tenor parts in our church choir! LOL) I ran the audio through some cleaning and equalizing of volumes, which probably didn't help either. I'll see about recording some other games when I get a chance, though I'm in the process of moving right now so it might take until post-Christmas.

I can give some frame rates of other games, though I was planning on keeping those for the full review. Basically, Batman is neither the best nor the worst result, particularly at Extreme detail. Civ5 and Skyrim are really bad on Trinity (about half the frame rate!), while Sleeping Dogs oddly enough runs faster at Extreme settings on Trinity than on IVB (but quite a bit slower at Medium) -- I'm not sure why that is. I've still got more games to benchmark, though, so I'll hold off on more commentary for now and maybe post a follow-up video over the weekend if I have time.

How much would mirroring the video output hurt frame rate, or running the screen's native resolution on the output display? If either option does not hurt you too much, then you can output to a recording device, and then out to a monitor running at whatever the laptop's native resolution would be to begin with. Cleaner video, higher frame rate, and a generally better end result.

For audio I always like to run things through Audition. The built-in audio cleanup tools are pretty good (good enough for something like this, anyways), and you can automate some leveling through a compressor. There is a bit of a learning curve, but once you get a system down it only adds ~10 minutes to the editing process. Can't always fix a cold in post, though, lol.

As a final note, I found it hard to tell which screen went with which text. Perhaps color the background of the right and left sides and put a colored border around each to more easily differentiate between them? Also, lots of dead space... something could be crammed in there.

I like the fact that MSI at least gave us the option, although it's clear they probably should have gone for a desktop A10-5700, or offered some OC possibilities in the BIOS -- does it have any options there?

But really, there are enough Intel-based gamer laptops that it would be nice to just have one choice besides them, and this is just one game. Not to mention you have some pretty sluggish SSDs running them, which IMO is more likely what's causing the very slow loading times. I'd like to see how it runs BF3, as it's one of the few games that actually takes advantage of 4 cores.

I've seen this in Norway only with the 750GB, but it comes at a decent price; obviously, it could use a little bump in the CPU.

But yes, it was a nice little video. I'd like to see more shots of the product, but I guess we'll see that in the full review.

Don't complain, nor tell them what they should have done. They slapped in the cheap AMD fanboy-route chip, and thus a few more AMD fanboys can enjoy 7970M gaming without their extreme Intel hatred ruining their laptop gaming experience.

Not at all, this is perfect, and is well deserved. It should wake up the haters; time for them to face reality.

What we really need is the same test but with the Intel CPU coupled with the 680M, so we can see AMD get severely spanked again. Then we can really hear the idiots raging -- but that's not the point at all.

The point is, BANG for the Buck - that is always the point here, and when it sheds a dour and sad gloom upon AMD, it should be known ANYWAY.

Obviously, things are so touchy with that fanbase that things did not work out with the nVidia top chip in the Intel laptop. One may say that is coincidence, or the reviewer's opportunities, but nonetheless, that is the way it works out, else the outcry would be unbearable.

It's really sad how true that is -- how the pressure from the real problem, the AMD hardcores, causes the truth to be hidden. And when this lesser truth is shown, one that only half hammers AMD where it needs to be hammered, it's a "real problem" for them anyway.

You know, for $1200, I would say that is probably the best gaming performance for your buck, so I am happy to see a choice like that. I actually really like the idea of an APU + dedicated GPU in a laptop. That gives you very good integrated graphics, so the dedicated graphics aren't quite as necessary. As an example, you could play Minecraft on just integrated graphics -- meaning you could do that realistically on battery power. (Even a simple game like Minecraft is not enjoyable on HD 4000, but works very nicely with A10 graphics.) Anyway, I like that option.

r4 eh, I like the m17x but I went with the m18x. I guess it all depends on what you use it for. My m18x is basically a LAN box, but I got a large laptop instead of a tiny desktop. I don't really use it on the go or anything; that's what the Galaxy Note 2 is for.

I think that kind of analysis is long overdue at Anandtech, and not just for laptops either. We don't necessarily need a video as proof of your findings each and every time, but I feel like AT's GPU analysis could really benefit from this rather than just having lowest/average fps for a couple of games.

I think that a bar graph showing time to render per frame during a resource-demanding scene, a la Tech Report, would be a much more useful way to show latency, and probably much easier to automate. Throw in an FPS/time graph while you're at it, so we don't rely only on averages for information. Until you're able to write all of this into your benchmarking stack, just adding the standard deviation of FPS alongside the average would be quite informative and cover this case.
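To make the suggestion concrete, here's a minimal Python sketch of those statistics computed from a per-frame render-time log; the frame times below are made-up stand-ins, not data from this comparison:

```python
# Minimal sketch: frame-time statistics from a per-frame log (times in ms).
import statistics

def frame_stats(frame_times_ms):
    """Summarize a list of per-frame render times (milliseconds)."""
    fps = [1000.0 / t for t in frame_times_ms]      # instantaneous FPS per frame
    times = sorted(frame_times_ms)
    pct99 = times[int(0.99 * (len(times) - 1))]     # 99th-percentile frame time
    return {
        "avg_fps": statistics.mean(fps),
        "fps_stdev": statistics.stdev(fps),
        "99th_pct_frame_time_ms": pct99,
    }

# Two hypothetical runs: one steady, one with occasional hitches.
steady = [22.2] * 100                    # ~45 FPS, no variation
spiky  = [16.7] * 98 + [100.0] * 2       # mostly ~60 FPS, rare 100 ms hitches

print(frame_stats(steady))
print(frame_stats(spiky))
```

Both runs look "playable" by average FPS alone, but the standard deviation and the 99th-percentile frame time immediately expose the hitching in the second run, which is exactly the information a bare average hides.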

You are correct; they really do first-class benchmarking analysis. For those who are not familiar with their testing methodology, here is an analysis of two cards which nominally (on average) perform similarly, but actually perform quite differently.

Based on their analysis, you would never recommend one of the cards because of its stuttering, but you might have if they had only used averages.

I agree -- I love the TR frame-latency testing. They just did three articles about the 7950 having frame-latency issues on the newest drivers. They even did a high-speed recording of Skyrim to show the latency is real and affects gameplay. IMO they have the best video card reviews on the net.

Interesting video, although more than a little gimped by the frame-rate limitations of the camera and YouTube. FRAPS is really needed to make a meaningful side-by-side comparison.

I don't think a comparison with an i7-equipped, 7970-toting Clevo is fair. People buying this machine will clearly be looking for the best bang for the buck they can get. In the UK, P150 Clevos with an i7 and 7970 cost around 1100ish, compared to 900 for the GX60.

Having just bought the GX60 myself, before buying I was looking for comparisons to the other systems at the same price, which are generally equipped with the GTX 660M and an i7 3630 or 3610. A set of benchmarks or A-B videos of the GX60 against these (MSI GE60 or Lenovo Y580) would be much appreciated.

FRAPS is running -- that's what the yellow frame rate counter in the corner is.

As for bang for the buck, right now it's $250 more to upgrade from a GX60 to the P170EM in the US, give or take. For what amounts to a 20% price increase, the performance in games at maximum detail tends to be anywhere from 0% (completely GPU limited) to 100% (highly CPU limited). Depending on the games you're playing, that's a pretty significant concern. I think an HD 7870M would be about as fast as the 7970M in many titles when paired with Trinity. You don't buy a top-end GPU and then saddle it with a processor that's half as fast; if that's what you're after, get a Core i3-3120M and for gaming it would still be faster than A10-4600M (when paired with HD 7970M).

At this point, no -- I'd have to wipe both laptops and reinstall all the applications/games, which is a couple days of work. I actually updated the MSI to Windows 8 because there was some odd glitch that I couldn't fix in Windows 7. I do have numbers from the Clevo before/after the Windows 8 update, though, and I can tell you that they're not appreciably different.

So why do you make it sound like the AMD A10 is so much worse for games if there's only an 11% frame-rate difference when you turn on the graphical details? OK, I've seen slightly shorter load times on Intel, but is that worth the extra $300+? In other, less CPU-intensive games the difference is perhaps even lower, so in most cases the A10 is close to as good for gaming with the 7970M. And if someone buys a gaming laptop, they buy it to apply the highest possible graphical details. As long as AMD does that perfectly for a few hundred dollars less, it might deserve some good words too!

We have to wait for the full test to see how it handles a much wider range of games, especially CPU demanding ones and more recent games such as maybe BF3, including multiplayer, maybe FC3, Metro 2033, Witcher 2, etc.

I don't feel, however, that Jarred was showing any anti-AMD bias at all. If he wanted to show AMD in a bad light, he could have just presented the low-res/low-detail results.

So how fair is it to pull out the few most CPU-heavy games, so the reader will see "yes, the A10 is really slower," when in most games it would perform equally with Intel? And I think this is the site which tested the Intel HD 4000 as almost as fast as the Radeon HD 7660G (Trinity). Yes, in a few exceptional games it's almost as fast, while in 90% of games the HD 4000 is a joke compared to the 7660G.

Well, I can't say what the results would be, because we haven't seen any results for more CPU-intensive games. But if the A10 is limiting in even some of the popular current CPU-intensive games, it will probably be even more limited in games in a year or two. And some of the CPU-intensive games, such as Skyrim, Civ 5, and BF3 multiplayer, as well as many MMOs, are among the most popular ones.

If you are so biased in favor of AMD that you would limit yourself to a partial selection of games now, and probably an even more limited selection in the future, to save a couple hundred dollars on a $1000-plus purchase, go for it. Not to mention that if you are GPU limited, you can run at lower settings or resolutions, while if you are CPU limited there are not many settings you can adjust to compensate for that.

I am glad you tested this laptop, as I have been wondering for a long time how limiting the A10 would be with this powerful mobile GPU. That said, I am not a big fan of video analysis or even walkthroughs for games. Guess I am old school, but I would prefer to read the information. Just seems quicker and more efficient than watching a video and you can get a lot more information faster.

I am eagerly awaiting a full test of this laptop. Personally, anything over 30 FPS seems adequate to me for most games, so I will be very interested to see if it can play more demanding games, maybe like Witcher 2 or Metro 2033, at high settings and 1080p. It would also be interesting to compare this machine with an Intel gaming laptop of the same price with the best video card for that price point, as well as a lower-end gaming laptop such as the 15.6-inch Lenovo Y580, which you can get now for less than 900 dollars with a GT 660M.

The real question is if Trinity suffers from the same problems with Frame latency as the 7xxx series does. I mean, who cares if you have all these frames if you have hitching that makes the framerate still look inconsistent?

It sounded very Intel-biased when I've seen totally good frame rates on the AMD A10-4600M too, and the difference was minimal between the two of them at extreme settings. So if a gamer laptop plays games very well on extreme, what is the problem with it for $1200?

Minimum frame rates of <10 FPS are horrible, particularly if they occur on a regular basis. Which would you rather have? A steady 45 FPS with no fluctuations, or 60 FPS 98% of the time and 10 FPS 2% of the time? I'd take the steady 45 FPS every time. Of course, that's not what's on offer here, but when you have periodic slowdowns to <30 FPS, it's very noticeable in the gameplay -- everything starts to stutter.
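The arithmetic behind that tradeoff is worth spelling out with a quick hypothetical calculation (illustrative numbers, not data from the article):

```python
# 60 FPS for 98% of the time and 10 FPS for the remaining 2%.
time_weighted_avg = 0.98 * 60 + 0.02 * 10   # what an "average FPS" counter reports
print(time_weighted_avg)                     # ~59 -- looks better than a steady 45...

# ...but during that 2% of the time, each frame takes 100 ms instead of ~17 ms,
# versus a constant ~22 ms for the steady 45 FPS case -- the hitching is plainly
# visible even though the reported average is high.
worst_frame_ms = 1000 / 10
steady_frame_ms = 1000 / 45
print(worst_frame_ms, round(steady_frame_ms, 1))
```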

I've seen a minimum of 30 FPS throughout the entire test, but if you're experiencing drops, why don't you try locking the CPU cores at 2.7GHz to keep steady performance? AMD has this option too, with PSCheck, and maybe with Overdrive as well! Also, when you think of future games, those will probably all support quad cores, because today's games are written for dual- or quad-core CPUs. Obviously a dual-core-optimized game will bottleneck more on the A10, because the per-core performance is low. Once a game uses all available cores, AMD will bottleneck less, or maybe not at all.

The problem isn't the CPUs causing the FPS to drop, but rather a change in the complexity of the scenes being rendered. Face one direction and frame rates might be 70+ FPS; turn and see a bunch of smoke, henchmen, a cityscape, etc. and even though your CPU and GPU are running at the same clocks, you get a drop in frame rate because each frame requires more work to compute.

As far as dual-core Intel vs. quad-core AMD, what you say would be true if they ran at the same clocks and accomplished the same amount of work per clock. Simply put, neither of those assumptions is remotely true. A10-4600M has a maximum core clock of 3.2GHz and a guaranteed clock of 2.3GHz; something like an i3-3120M has a steady clock of 2.5GHz, whereas an i5-3210M has a base clock of 2.5GHz and a max turbo of 3.2GHz. In tests, the i5-3210M is up to 2.5x as fast in single-threaded benchmarks (Cinebench 10 1CPU), and it's still up to 40% faster in multi-threaded workloads. So yeah, the "more cores is better" idea only works when the cores aren't totally gimped by comparison.

Considering that i5-3210M equipped laptops (new!) run almost $100 less than A10-4600M equipped laptops, that's not a very good starting point -- you can get Core i5 + GT 640M for the same price. A10-4600M really needs to be selling in laptops that cost $500, but the manufacturers seem to be banking on the interest in the iGPU so instead they start at $600. It delivers good enough performance, sure, but so does everything else -- even Celerons with 4GB RAM run everything outside of games and computationally intensive workloads fine.

The problem is that games remain primarily lightly threaded -- few use more than two cores, and even then it's more like using 1.5 cores -- so having two cores that are each twice as fast already puts you in the lead. Hyper-Threading and other enhancements only increase the gap. Batman is hardly the worst performance of Trinity, particularly at max settings. Unless you can promise me that no other games will ever be like, say, Skyrim (one of my favorite games) or Civ5 (another great game), I'd rather spend $150 more for the 2x increase in CPU performance to go along with the high-end gaming GPU.

Cinebench is a load of bull; I wonder why so many sites use that crap if it can't even utilize Trinity's cores. Other than that, what does Cinebench do for us? Run some x264 video encoding and see if the quad-core A10 is nearly as fast as an Intel i5. If Core i5 + GT 640M is really available for $600, that is actually a good deal and certainly faster than the A10. Unfortunately, power consumption, heat generation, and fan noise will double, which might not be so welcome for many of us. If I remember well, Civ5 was one of AnandTech's only tests where Trinity could show its huge advantage over Ivy Bridge?

Similar CPU performance to an i5, but in a quad-core arrangement, and a GPU as fast as the 630M: http://www.notebookcheck.net/typo3temp/pics/d5d886... http://www.notebookcheck.net/typo3temp/pics/a562dc... And no extra 35W TDP during gaming! Do you also see these tests? Just turn up the details in gaming and the difference between HD 4000 and HD 7660G immediately doubles. You're saying AMD has a better GPU but a much worse CPU. And I see the CPU performance as similar, but AMD's IGP as much better in the same price range! Because your i7 + 640M was $770. And then again, i7 + 640M = 45W + 45W TDP, while A10 + 7660G = 35W TDP.

I admit that the Intel configuration was posting higher frame rates, but there were way more glitches in both tests. Watch the video for yourself; the system with the AMD APU ran smoother. What do higher frame rates matter if it looks like crap?

I'm curious: at which point(s) in the video are you seeing "way more glitches in both tests"? Give me time points to look at, because in most cases I thought there was more stuttering on the APU than on the CPU.

Jarred, this is a great idea. I have one important suggestion to make -- can you please spell out the results of your findings upfront? As someone who is interested but not always up to date, I would love to read a first paragraph that quickly sets up the background of your experiment and a second paragraph that summarizes your findings and recommendations.

I'm also going to nitpick here -- you talk about the Intel GPU obviously walking away. Unfortunately, it was not obvious to me at all, and furthermore, I wasn't even sure what you meant by the phrase "walking away."

I love reading content that is structured pyramid style. Give me the most important bit upfront, then the next level of detail, and then the backing data and videos. I'm reading your articles because I already trust you, not because I want all the evidence upfront as proof.

Finally, I love the fact that you are focusing on the quality of the video and not just numbers and graphs.

Sorry if I came across as too prescriptive. I love the quality of content and level of detail, but the way you structure your content has a lot of scope for improvement.

On the desktop, where people are running GPUs that are roughly twice as fast, CPU bottlenecks are a lot more apparent. Here we're dealing with a down-clocked HD 7870, more or less, plus there are very few laptops shipping HD 7970M, let alone 7970M with Trinity. So basically I wanted to see how Trinity would compare to Ivy Bridge with this particular GPU.

We've already done that, back with the launch of Trinity. A10-4600M has a better iGPU than the HD 4000; there's no doubt about that. The problem I have is that the reverse is also true: Intel has a much better CPU than Trinity, without a doubt. So now you can choose Intel for mobility with not too much worry about gaming (and despite what many might like to say, gaming is really a minor concern for a majority of users), or you can get Trinity for moderate gaming with overall worse performance elsewhere ("fast enough", yes, but not fast), or for roughly the same price as the A10 you can get Intel + NVIDIA with switchable graphics to give you battery life, gaming, and overall better performance.

What I wonder is: is there any use for the IGP, and if so, how? After all, it could do something like physics, right? The IGP is the only thing the APU has over Intel, and it's unused in this scenario (which is of course why we're all so surprised anyone would put together a system like this).

I'm not aware of anything that really leverages an AMD iGPU this way when you have an AMD dGPU as well. I guess if you have certain GPGPU programs that run well on AMD, you can run and use both the 7970M and the 7660G, but most games that use the GPU for physics are using PhysX, and that's an NVIDIA-only affair for the foreseeable future.

This review does not make much sense. All conclusions have been based on just one game, and it is one that favors Intel processors. I'm a bit disappointed and frankly I expected more from AnandTech.

Did you actually read the text? Mostly this was me wondering if people would like more videos like this, e.g. before I spend the time required to create them. I'm more than a bit disappointed with a comment like yours, as I expect more of the readers than to assume that a three minute video is representative of all that has gone on behind the scenes.

Take this quote, though: "I should also note that there are some titles where the Trinity and Ivy Bridge notebooks are fairly close in performance (at maximum detail at least), while other titles are even more in favor of a faster CPU (e.g. Skyrim)." I've got benchmarks from about ten titles, with more to run still, so any opinions are not "based on just one game".

As for Batman favoring Intel processors, ALL games favor Intel processors, because they're substantially faster than AMD's processors; the closest we get is pretty much a tie. There is one title (Sleeping Dogs) where at maximum details the Trinity system is a few FPS faster, but step down a notch or two and the gap is once again firmly in favor of Intel. My best guess is that the bottleneck on the Intel platform in that specific title is the Enduro drivers, not the CPU -- with an AMD dGPU copying data to an AMD iGPU, potentially there are some optimizations present that help a bit.

Anyway, more to come... I've made videos of Sleeping Dogs and Skyrim, to show the real extremes. Now I just need to edit them and post, which will take a few hours each.

Stay tuned for more butt-kicked AMD -- the way it's going to be played, because that's how it plays out.

The A10 is limited to half the performance; what a shame. Once again I read the hope and change a-comin' for the glorious day when games make the A10 faster... it's coming, say the fanboys, it's a-comin', it's the reason for the season...

If the tool you use to measure the frame rates allows logging of the measured frame rates, I think it would be easier for you and more useful for the reader to provide statistics on the frame rates, rather than a video. For instance, you could provide:
- variance or standard deviation
- quantiles or percentiles
- a plot of the probability density function
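For illustration, here's a small Python sketch of those suggested statistics, applied to a hypothetical logged FPS series (stdlib only, so it could drop into a benchmarking script; the numbers are invented, not from this article):

```python
# Statistics on a logged FPS series: stdev, quartiles, and a crude
# text histogram standing in for a probability density plot.
import statistics

# Hypothetical per-second FPS log with two stutter dips:
fps = [58, 60, 59, 61, 57, 60, 22, 59, 60, 58, 61, 25, 60, 59, 60, 58]

print("stdev:", round(statistics.stdev(fps), 1))

# Quartiles (25th/50th/75th percentiles):
q1, q2, q3 = statistics.quantiles(fps, n=4)
print("quartiles:", q1, q2, q3)

# Crude text "density plot": bucket FPS into 10-wide bins.
bins = {}
for f in fps:
    b = (f // 10) * 10
    bins[b] = bins.get(b, 0) + 1
for b in sorted(bins):
    print(f"{b:3d}-{b + 9:3d} FPS | {'#' * bins[b]}")
```

Even this rough histogram makes the two sub-30 FPS dips visible at a glance, which a lone average FPS figure would completely hide.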

The video is great. It lets the reader really see what they will be getting. Stuff like "Intel frame rate = X, AMD frame rate = Y, THAT'S N% FASTER!!!!" is unnecessary. Believe it or not, I can tell when one number is larger than another. No matter who's faster, it makes the video look like an ad for that company.

Mostly I put the numbers in the video because the FRAPS result and other text is a bit hard to read -- especially if you're not at 1080p. Taking a video of a display is not exactly easy, sadly; I tried about ten times to get what I felt was a reasonable exposure, and even then I'm not happy with it.

I think it was just the way it was presented. Having the raw FPS numbers is ok. (I personally don't think it necessary, because we could still see the stuttering that happened and that tells the watcher all that's really needed to know.)

It was the "Intel is 45% faster!" part that made the video look like an ad to me.Reply

It's not so much an ad as a fact. If you were looking to buy a laptop with 7970M, the fact that there are games and settings where the Intel CPU drives the AMD GPU 45% faster (or 100+% faster in a couple, like Skyrim) is pretty darn important, hence the exclamation point. 45% because of CPU bottlenecks is huge, and we're not even breaking into the "frame rates are so high it doesn't really matter" range, since the Trinity system is averaging well below 60 FPS.

No, keep the FRAPS counter in there. It helps a lot to see what is actually going on. I enjoyed the video, and yes, you should continue to do such things; they definitely add to this site and make the "contact" with benchmarking feel real. I seriously enjoyed this and think you used a golden opportunity well and wisely. If there are lower CPU and GPU combos, the ball may bounce in another direction, so by all means show AMD advantages if that's what comes up with some of their combos. Can't win 'em all the time -- but sometimes AMD has a good thing going; their mobile 5660 gaming chip was very tempting and came in really nice packages with many choices at excellent pricing.

They are gaming laptops, and it's top-of-the-line AMD -- it doesn't get any better -- so if the test is unfair, AMD just plain sucks for driving games. Oh, by the way, AMD DOES JUST PLAIN SUCK compared to Intel for driving games at the high end -- PERIOD. Now, step a couple of tiers down and that changes -- or at least it can be said to be much more competitive for AMD, for anyone who already has their AM2+/AM3/AM3+ platform, which is pretty much all the AMD fanboys. So upgrading the CPU to get that game boost is a WINNER for AMD board owners...

So you have to temper your AMD fanboyism with reality -- NO WEBSITE uses AMD CPUs to do their benchmark testing. There are LITERALLY HUNDREDS that I go to, and they all use INTEL. They are not all incorrect.

I was curious how heavily FRAPS utilizes the CPU. Obviously, in the APU machine the CPU is the weakest link, and it seems that taxing it with an additional load could artificially make the AMD machine look even weaker than the Intel machine. I mean, if the AMD machine is at 95% CPU during intensive gaming and the Intel machine is at 60%, and FRAPS eats up 10% of the CPU cycles, well...

To me, the video didn't add a lot of additional value if it took a while to put together, although it would be of more use to me in cases where there might be a qualitative difference between machines rather than just frame rates. Say, a GTX 660M machine with a QM Intel processor (e.g. Y580 or G55VW) versus the AMD machine with the 7970M.

Also, which version of the Catalyst drivers were you using? Apparently the newer (December) drivers noticeably impact performance, or so it seemed from the NotebookCheck forums. I was looking at the GX60 to play GWII, but saw a rather bad review from NotebookCheck itself and got concerned. The users in the forums are saying that they're seeing much different (better, playable) frame rates in GWII.

Yes, you can. Thing is, the AMD fanboys don't want to go there; they just didn't like AMD losing on the CPU side. They can't stand it, but every benchmark site they go to has been using the Intel CPU platform for testing for YEARS now, and everyone here should know why -- but hate-filled whining and not facing the facts is much more "effective" for the AMD fans -- at least for their personal delusions, and thus delusional feelings and crackpot assessments.

I for one am VERY PLEASED to see an article like this, since it can prevent a thousand- (or two-thousand-) dollar MISTAKE.

The amd fans need to get over it, as they are the ones always clamoring for saving those pennies and getting that best bang for the buck, and constantly harping about ripoffs and the like when it comes to their perceived competitive enemy companies, namely intel and nvidia of course.

So now it's time they suck it up and face the music; the article writer has likely saved a few very enthusiastic AMD fans from severely blowing it.

I couldn't care less about video content. It takes two orders of magnitude more time to get the same info I could just read. I'd much rather the author spend the time on more hardware combinations or more reviews :-)

Have you tried driver 12.12? MSI GX60 owners happily experienced a boost over 12.11 with the new 12.12 driver. Someone said Skyrim ran at almost always 60 FPS on ultra settings. When are you going to post the test?

I just bought the MSI GX60 and have to say I'm fairly pleased overall. Sure, the CPU is a bottleneck, but if I wanted an Intel CPU I would have had to pay over 50% more, and for the same price with an Intel CPU I would only get a 660M or maybe a 670M.