"It’s not a problem in areas where AMD is competitive, but what about areas in which they’re not?"

AMD chose not to be competitive in the netbook and low-power notebook market. There was a time when ATI's IGPs were very good, and today an overclocked 3200 can dish out enough fps to compete with NVIDIA's offerings. AMD had a lower-power CPU months before the CULVs, but it never became an integrated platform like Ion, even in the micro-HTPC market.

If a company has a super IGP, and there was a time when AMD had one, and the market has a new wave of low-power, low-cost netbooks, and this company decides not to join the wave, well, don't blame Intel when things go the way of the dodo.

The only thing I see is Intel paying AMD so that Intel keeps dominating the market, even in products where Intel's offerings are as crappy as in the IGP market.

AMD still offers the AMD Neo X2 with AMD 785G or HD4200 graphics, or with the embedded AMD 780E + SB710 chipset. But that's not a netbook offering; it's a consumer or ultra-portable low-power notebook offering. I don't really like that it's K8-based, but it's still not totally bad. I don't think AMD is interested in the MID/smartphone market Intel is slowly going after anyway.

There's a reason the AMD Neo platform didn't become an HTPC platform: first, it was only released with the 690E; second, it wasn't supported in XBMC on Linux. Then there's cost. Now CULV makes a lot more sense. GMA 4500 is good enough, and it's only now that support is really coming through, with Flash acceleration etc. Such a platform had very limited usability before. So soon Atom (with the Broadcom accelerator), CULV etc. really become useful and functional in the consumer space. AMD doesn't have any advantages even if it supports the same things. I think it's wise that they are designing wholly new mobile products; then there's gonna be a fair fight. Products will have a lower power envelope, be cheaper, and be on a newer node. More will be released into the market.

This could be a chance for the VIA Nano to get in on the netbook market. I fear, however, that most people buying netbooks won't know or care about the difference between the platforms. If it's cheap and can at least play 720p, they won't go out of their way to get something non-Intel.

I won't consider the Nano until they have shrunk it, made it less power hungry, made it dual core and released a good graphics chip with truly open source drivers. I'd rather have Pineview + Broadcom BCM70015 if not.

It's impressive what Centaur can do with ~100 employees, but I'm not impressed with the S3 Graphics or VIA chipset team.

That will never happen. It's Nano + a mobile ATI/NVIDIA GPU, tops. But I don't see the point. VIA's own GPUs are fine for DXVA in Windows etc., and Nano isn't that power-efficient to begin with. It's not like such platforms will be fast enough to game on either way.

If I remember correctly, you used a standard ATX power supply for testing the LF/LF2 Atom boards, which you speculated was part of the reason for such high power consumption. Did you retest with the same PSU as the Pine Trail system?

If I am reading correctly, the Intel platform with the Broadcom media accelerator (about $25) is cheaper than the Ion platform and will consume less energy. Some relevant performance testing may help determine whether the Intel platform is positioned to dominate the frugal HTPC market.

It is a bummer that you will need to spend chump change on a SATA port card to enable software RAID 5 for a frugal media server.

Likely, the incremental performance improvements are adequate for Intel to maintain its cachet in the much-dissed, real-world netbook market, with its surprisingly large volumes.

For it to be a solution for me, it needs to be on a mobo that has a PCIe slot and HDMI through the Broadcom chip. I'm not too hopeful though; I'll probably end up with an i3 solution. Too bad, I like the idea of going super low wattage.

The platform is standing still apart from getting more integrated. Hopefully that will mean cheaper. Maybe some netbooks will get cheaper, but otherwise the platform is a complete waste of time because of the lack of HD video capability. (For Intel to call the graphics "HD Graphics" verges on consumer misrepresentation; it can't even do 1080p output via HDMI/DVI. What is this, 2005?)

Why did the review concentrate on benchmarks of applications nobody would run on such a system?

These chips are a solution for nothing apart from the unwealthy Office user who needs 8 hours of battery instead of 6.

There could be a test with an underclocked and undervolted E5300 against the Atom D510. Maybe power vs. performance would be quite close, if not better, for the E5300. I think everyone could take another 10W for performance that's not crappy (even in netbooks).

10 more watts would mean going from 8-9 hours of socket-free operation down to 6 (or maybe less). Not acceptable for the people who want 8 hours of socket-free operation. As for a higher-performance CPU, you have the low-voltage processors (CULV), which better fit the bill.

Thing is, 10+ more watts of CPU power doesn't mean you use it all the time. For internet browsing and the like (not that you can use it effectively for much more), the CPU isn't working at 100% all the time, just sporadically.
The majority of the power in those scenarios goes to the rest of the hardware (display, GPU, mainboard), which runs all the time.
And that's before considering that a faster CPU finishes tasks sooner, so it spends less total time at 100%.
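The duty-cycle argument in that comment can be put into a quick back-of-envelope script. All the numbers below (battery capacity, platform idle draw, CPU active draw, duty cycles) are made-up illustrative assumptions, not measurements from the article:

```python
# Back-of-envelope: does a CPU with +10 W peak draw necessarily wreck
# battery life? Only if it is busy all the time. All figures below are
# illustrative assumptions, not measured data.

BATTERY_WH = 48.0  # assumed netbook battery capacity in watt-hours

def runtime_hours(idle_w: float, active_w: float, duty_cycle: float) -> float:
    """Estimated runtime for a workload that keeps the CPU busy
    `duty_cycle` of the time; display/GPU/mainboard draw is folded
    into idle_w since it runs constantly."""
    avg_w = idle_w + duty_cycle * active_w
    return BATTERY_WH / avg_w

# Slow CPU: low extra active draw, but busy more often to finish the work.
slow = runtime_hours(idle_w=5.0, active_w=3.0, duty_cycle=0.30)

# Fast CPU: +10 W active draw, but finishes the same work in a third
# of the time, so it is only busy 10% of the time (race to idle).
fast = runtime_hours(idle_w=5.0, active_w=13.0, duty_cycle=0.10)

print(f"slow CPU: {slow:.1f} h, fast CPU: {fast:.1f} h")
# -> slow CPU: 8.1 h, fast CPU: 7.6 h
```

With these assumed numbers the +10 W CPU costs about half an hour of runtime, not hours, because the constant platform draw dominates and the faster chip spends more time idle.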

Atom is the only one of its kind right now, and without competition it will stay this crappy for a long time.

I am tired of Intel forcing crap products on consumers while touting the great benefits. Clearly there is no benefit for the consumers. Not a single one, as the IGP is barely enough for Minesweeper.

Although I am very disappointed with nVidia these days, their offerings are much better in the mobile segments, both phones and nettops.

The best feature of this platform is getting back to a known, supported GPU. Intel damaged its reputation with Poulsbo; now there's a chance to get back to a supported graphics chipset with proper open-source drivers.

The difference in numbers (950 vs. 3100) made me hope there'd be some improvement in the graphics, but it was my fault for not having done enough research. Turns out they're basically the same. I forgot it was the X3100 that had DX10 graphics, not the 3100 (how nice of Intel to use such a clear naming convention).

Still, in other respects this seems like an okay upgrade. A little better performance and lower power. I wonder if it will make the GMA 500 solutions go away.

I'm a bit disappointed myself in the Pine Trail platform. To me, it seems more or less a way to bring costs down for Intel while masking it as a performance bump to consumers. Unfortunately, my perfect netbook, a dual-core Atom with an Ion chipset on a 10" screen, does not exist. There are 12" versions, but that somewhat defeats the purpose of the netbook (portability and battery life). I wish a 10" version would come out. It would be the perfect companion video player: I could watch Hulu in bed, take it to the gym and watch movies on the treadmill, have it beside my main PC to stream sports programs while working, play hi-def movies from my hard drive to a large HDTV, etc. Hopefully someone will create a 10" Ion netbook in the near future.

It's pretty sad to see Intel muscling nVidia out of this market when it's clear that a better product exists, and now we're getting a "new" product with worse performance across the board. If Flash isn't being hardware-accelerated, then the CPU is doing more work, and whatever gains Pine Trail might have (I'm not convinced they're even worth writing an article over) are negated by the dramatic improvement Ion brings to the table.

At first, I thought nVidia's CEO was blowing a bunch of hot air about Intel playing hardball, but it's pretty obvious that something is wrong with this picture. If Microsoft got smacked down for bundling in 1998, I don't see how bundling the CPU and GPU into one chip and then locking other hardware vendors out with expensive contracts is any different. And what's worse, apparently the laws this time around allow the federal government to look not just at Intel's past actions but at their effect on the future as well.

Why doesn't AMD just take one of their upcoming mobile 880-series northbridges and add a memory controller and a single Athlon core? It would be faster than Atom, more efficient than Ion, and could be binned for low power. Instead they just stand there with their thumbs up their butts while Intel shovels this garbage onto millions of unsuspecting consumers at even higher profit margins.

Do you want some whine with that? Where were you when chipsets were created by taking a bunch of smaller ICs on the motherboard and putting them all together into one IC? PCs became cheaper and faster. We thought it was great. Do you know anything about L2 cache? It used to be separate on the motherboard as well, until it was integrated into the CPU. PCs became cheaper and faster, and we thought it was great. Remember when CPUs were solo? They became duo and quad, making PCs faster and dropping price/performance. AMD and Intel integrated the memory controller and, whoa, guess what? Faster and lower price/performance, and, yes, we thought it was great. It's called Moore's Law, and it's all been part of the semiconductor revolution that's been going on since the '60s. GPUs are no different. They're still logic gates made out of transistors, and with new 32nm technology, then 22nm and 16nm, the graphics logic will be integrated as well. Seriously, what did you think would happen?

Intel has artificially handicapped the low-voltage sector in order to force consumers to purchase Pentiums. Right where they wanted you all along.

Since when is it OK for Intel to dictate what type of systems are created with its processors?

First it was the 1GB of RAM limitation; now you can't have a dual-core. When does it end?

"We have a mediocre CPU combined with a below-average GPU. According to our amortization schedule you could very well have it in the year 2013 (after the holidays, of course), by which time we should have our paws all over the video encoding and browsing standards, which we'll be sure to make as taxing as possible. Official release of USB 3.0 will be right up in a jiff!"

The historical examples you cite are not analogous, because Intel bundling their anemic GPUs onto the package makes performance *worse*, and bundling the two dies onto a single package (they're not on the same chip, either, so there is no hard physical limitation) makes competing IGPs more expensive, since you now have to pay for a useless Intel IGP as well as a third-party one if you were going to buy an IGP system.

And just because a past course of action was embraced by the market does not mean it was not anti-competitive.

You need to re-read the tech articles. Pineview does integrate both the graphics and memory controller into the CPU. It's the ICH that remains separate. Even if it didn't, what do you think will happen when this goes to 32nm, 22nm and 16nm ? As for performance, Anand says in the title "Pine Trail Boosts Performance, Cuts Power" so that's good enough for me.

Intel obviously created the Atom for a low-cost, low-power platform, and they're delivering. It'll continue to be fine-tuned with more integration to lower costs. The market obviously wants it. SoC (System on a Chip) is coming too, for even lower costs. Not the place for high-performance graphics, I think.

This is really about Moore's Law marching on. It's driven down prices, increased performance, and lowered power more than anything else on the planet. Without it, we'd still be paying the $5000 I paid for my first PC in 1980, an Apple II Plus. What you're saying, whether you know it or not, is that we should stop advancing process technology and stop Moore's Law. Personally, I'd like to see us not stop at 45nm and keep going.

Hector is right in one respect, and that is that if Intel is going to be dumb, we don't have to purchase their products. I especially like the sarcastic cynicism in the article when mentioning all the things that Intel's chip CAN'T do. They just don't know how to make a GPU without patent infringement. If they can't compete, they'll try using their big market share to hurt competition. Classic Intel move. They never did care about innovation, only about market share and money. But I guess that's what happens when you're a mega corp with lots of stockholder expectations and pressure. I'll give my three cheers to the underdogs! Reply