We wouldn't dream of letting an Intel launch go by without complete and thorough coverage. Can you imagine the Nehalem launch without reviews on launch day? Or the release of Intel's first mainstream SSDs without an article going through the technical merits of the chip giant's design? Yet skipping a product release is exactly what we did with the introduction of one of Intel's latest chipsets: the long awaited G45.

When I visited CeBIT in Germany earlier this year I noted that the stars of the show were the chipsets. We had 780G from AMD, GeForce 8200 from NVIDIA and G45 from Intel. Nearly every single motherboard had either a DVI or HDMI port on it, and full Blu-ray acceleration was supported across the board by all of the vendors. We were about to enter the golden era of chipsets with integrated graphics and CeBIT was the first indication of it, but what happened?

In fact, of the three vendors, only AMD was able to deliver on its promises in a timely manner. Despite lacking one key feature (8-channel LPCM support), the 780G was honestly the most impressive chipset launch from AMD to date. Gigabyte's 780G board was good enough that, despite Intel's microprocessor superiority, it earned a place in my own personal HTPC. The G45 may have been the right chipset for me on paper, but the 780G was the chipset I could rely on right away, so I turned to it.

NVIDIA followed shortly after with the GeForce 8200, but the first motherboards weren't anywhere near as polished as what was available with 780G, and the integrated graphics performance was actually lower than AMD's. For a company that gave Intel such a hard time for delivering poor integrated graphics performance, NVIDIA was being nothing short of hypocritical; AMD's 780G put it to shame.

Time went by and there was still no G45; I was concerned. While both AMD and NVIDIA had plenty of experience building in support for full hardware H.264/VC-1/MPEG-2 decode acceleration, this would be Intel's first attempt at such a feature. I'd seen demos of G45 running just fine and was told that everything was OK, but I didn't have hardware. The first time I laid hands on G45 was actually in mobile form during the mismanaged Centrino 2 launch, and it wasn't good. Blu-ray acceleration technically worked, but the experience wasn't usable at all. I chalked this up to the early Centrino 2 platform, but once we got G45 boards in house, things weren't much better.

The first boards (available for sale, mind you) and drivers had issues with Blu-ray playback in certain situations, HDCP problems, and trouble with HDMI repeaters (e.g. AV receivers). It was so bad that in AMD's suite during IDF, one of the demos was a 780G vs. G45 comparison showcasing how broken the initial G45 release was. We struggled with G45 for much of the early weeks after its release, but the platform simply wasn't problem-free enough for a launch-day review.

Lately I get the impression that although Intel is quite successful and executing very well on multiple fronts, its less celebrated teams are nowhere near as strong as its chip designers. We all saw this with the Centrino 2 launch I just mentioned - something we'd never see from the Nehalem guys. But Intel does far more these days than make desktop CPUs: there are chipsets, soon-to-be graphics cards, solid state drives, wireless adapters, notebook platforms; the list goes on and on. The incredible revolution Intel pulled off in the desktop CPU space over the past 2+ years raises the bar for the rest of the company, and not all of the teams are able to perform quite as well.

What follows is the first part of a three-part series on the current state of chipsets with integrated graphics. This first part looks closely at Intel's G45, presently the only LGA-775 chipset with full Blu-ray decode acceleration, along with a quick comparison of the first G45 boards. Part two will widen our focus and compare G45 to competing Socket-AM2 chipsets with integrated graphics: mainly AMD's 780G and NVIDIA's GeForce 8200/8300. Finally, we'll bring the series to a close in part three with a comparison of IGP gaming performance against the cheapest well-performing add-in PCIe graphics cards. After this series we will dive into an AMD 790GX and NVIDIA 750a roundup to complete our look at the hybrid IG solutions.

A High Level Overview of G45

The G45 is very similar to the P45, with the addition of integrated graphics. These boards target essentially the same market, but G45 boards will likely not be as geared toward overclocking and may be a little cheaper. The major advantage of G45 is for people who don't need any serious 3D graphics on their desktop. While Intel continues to advance its graphics performance (as we'll see in the next article in the series), it still just isn't there for any but the simplest of 3D tasks. These boards should really go into systems that focus on 2D operations: office applications, internet, and general communication packages. That said, you can play certain games if they fall into the casual category; titles ranging from Barbie Fashion Show up to The Sims 2 will deliver acceptable performance at 1280x1024.

With the intended target market being those who do not wish to overclock or game on integrated graphics, we are looking at desktop computers aimed at business tasks and surfing the internet. The G45 supports full Blu-ray acceleration and multi-channel LPCM audio output, so it will work as an HTPC platform. With a mainstream target, we expect these boards to end up in plenty of systems over the coming months, especially from the OEM sector. It isn't overkill for those who don't need a lot of features and overclockability, and it offers enough capability to get by in the home and in the workplace.

Before we look at performance, we will go over the chipset's features and capabilities, as well as the southbridge. Understanding how this solution compares to Intel's previous generation integrated graphics chipset and to Intel's other 65nm chipset offerings will give us context for the performance comparisons to come.


53 Comments

Impressive how many people just rant on about the review being inadequate when they obviously didn't even read the start of it! If they had, they'd know that reviews of AMD and nVidia boards are coming up and that all will be compared eventually!
I get the feeling that the people talking about "Intel fanbois" tend to have the same kind of appreciation of another brand...
Stating the obvious isn't being partial. It just so happens that AMD doesn't even come close to competing with Intel in the CPU department! Sure, AMD might be cheaper, but there are cheap Intels out there as well. The whole platform tends to get a bit more expensive when you go with Intel, but you get what you pay for. I'm perfectly happy with my G35+E2140. Does everything a computer is supposed to do except gaming. I'm not a gamer, so that is a non-issue for me.

Very tempted to go mini-ITX with a 1.5TB HDD. Tiny box and lots of disk space!

Found a nice case for it as well, the Morex Venus 668. Not that I know much about it really, but it'll hold up to 3 HDDs and a full-size ODD and probably house decent cooling for the CPU while still being tiny (~8"x9"x13").

Does this site have an ounce of integrity left? I seriously doubt it. Nothing but Intel pandering left here. You "reviewers" have the gall to review this attempt at an IGP, yet fail to show any review of an AMD IGP, even if it proves how inferior G45 is. Are you seriously implying that people are so stupid that they aren't capable of seeing through this BS? I remember something about an SB750 promise somewhere around 2 months ago that never materialized, then a 790GX promise that never materialized, then another 790GX roundup that not only never materialized, but the DFI preview article seems to have actually vanished; then the AMD IGP part II looks to be delayed or something, probably vanished due to Intel's poor performance.

I am really, really starting to wonder if AT was purchased by Intel. All evidence points to it. If not, then call a spade a spade and don't make promises you can't keep. I'm sure you think none of this matters because you're so popular that people will read no matter what you write here. I wouldn't be so confident if I were AT.

I can tell you guys are really working on gaining that female readership. As everyone knows, women really go for that low-class, vulgar language.

Also, who would want to get rid of PS/2 ports? Whoever on your staff wants this had better have a reason beyond hating anything legacy. Where's the logic in adding two extra USB ports so you can remove the PS/2 ports? It's not like it's more flexible, really, because you pretty much always need the keyboard and mouse. When's the last time you said, "Oh, I won't be needing my mouse and keyboard today, and I'm so strapped for USB ports, it's a good thing I can use the ones I normally use for the keyboard and mouse for something else"? Doubtful you've ever said it, and if you have, you have issues deeper than I am capable of dealing with.

It's not like the keyboard or mouse works better in a USB port, or that it's somehow superior in this configuration. In fact, the PS/2 ports were made specifically for this and are perfectly adequate for it. Didn't you guys know that USB has more overhead than the PS/2 ports? I guess not. So you worry about fractions of a percent going from motherboard to motherboard with the same chipset, but you prefer to use a USB mouse and keyboard? I just do not understand that. USB was a nice invention of Intel's to suck up CPU power so you'd need a faster processor. It's a pity this has been forgotten.

Sure, let's replace the efficient with the inefficient, so we can say we're done with legacy ports and can all feel like we've moved forward. Yes, that's real progress we want. Good grief.

I really dislike the trend of recent reviews that go off on tangents about the state of the market or particular vendor performance gripes, and then the rest of the review doesn't even touch on relevant benchmarks or features to back up those rants. If you're going to complain about IGP performance from AMD or NVIDIA, you might want to back that up with at least ONE of their boards in the comparison charts. Who cares if Intel's G45 gets bad frame rates against itself (across the board, to boot)? Why not show how the three IGP chipsets from the major vendors stack up against each other in something mainstream like Spore? If it's a G45-only review, how about saving the side comments for a true IGP roundup? Sorry, but if you have the time to post a "(p)review" that brings up competitive aspects with no benchmarks to balance out those comments, it's basically single-vendor propaganda - nothing in the conclusions deals with whether an IGP in the same price range from another vendor would fill the void that G45 clearly does not.

Since when do issues at the release date mean you can't post the review? "We struggled with G45 for much of the early weeks of its release, but the platform wasn't problem-free enough for a launch-day review." - Ummm, might want to include that as disclosure in all your other post-launch-day reviews!?! Or do other vendors get brownie points for being problem-free when you can actually buy the product?

Unfortunately, the inconsistency across multiple reviews makes it difficult to compare competing products from multiple vendors, because the methodology varies between single-chipset and competitive benchmarks, even once you separate the irrelevant introductory comments and the particular author's bias from the rest of the review.

More authors obviously does not equal consistency or more relevant reviews.

Looking forward to your review of this board (if I understood you correctly), as I have been keeping an eye on it for a while now. Perfect for an all-around general-use board (minus gaming, of course), but it would have been really, REALLY nice if that 1x PCIe slot were a 16x PCIe slot with at least 8x bandwidth. Hell, I think I would settle for 4x PCIe speeds, just to have the ability to use an AMD/ATI 3650/3670 in this system. I think Jetway has a similar board with a 16x PCIe slot and slightly fewer features, at the cost of like $350 USD . . .

Now if someone reputable (meaning someone who can actually make a solid board from the START *cough*ABIT*cough*) would build one using a Core 2 mobile CPU, SO-DIMMs, etc., AT A REASONABLE PRICE . . . I think I might be in power consumption heaven. Running my desktop 'beast' tends to drain the battery banks dry ; )

Why, with a generational die shrink, do we only get 2 extra shaders instead of 4-6? Where did all the extra available die space go?

With the new Radeon HD 4x series, people consistently get single-digit CPU usage when viewing 1080p H.264 with an E7xxx-series CPU, or slightly more than 15% when using an old Celeron. That's 2-3 times better than G45! Even 780G is a lot better than G45. So why such a HUGE difference in the performance of so-called hardware accelerated decoding?