AMD’s new Hondo CPUs aren’t quite a perfect fit for Windows 8 tablets

New chips will compete with Intel's Atom, but may face hurdles to adoption.

AMD has just released some details about its newest tablet processor, codenamed Hondo, which will compete with Intel's Clover Trail Atom chips for use in low-end Windows 8 tablets. There actually isn't much new about the Hondo chip—it uses the same basic CPU architecture and integrated graphics technology as the current E-series APUs found in AMD-powered netbooks and low-end laptops. It's also built using the same 40nm manufacturing process. In general, AMD's chips have better CPU and GPU performance than the Atom chips in competing netbooks, but will this advantage extend to tablets?

The facts: decent hardware, OK battery life, no Linux support

The Z-60 APU combines a dual-core CPU based on AMD's Bobcat architecture with an integrated Radeon HD 6250 GPU. At 1GHz, the CPU is slower than the majority of the E-series CPUs you'll find in netbooks and low-end laptops. While the Bobcat architecture can usually beat Atom at the same clock speed, the Clover Trail Atoms will be clocked high enough to erase AMD's CPU performance advantage.

The integrated GPU is labeled as a 6000-series part but is architecturally a 5000-series part. The GPU is still pretty advanced compared to the PowerVR SGX 545 in Clover Trail and the Direct3D 9-class GPUs paired with most current ARM chips: you get Direct3D 11, OpenGL 4.1, and OpenCL 1.1 support as well as HDMI and DisplayPort output support, and support for resolutions up to 1920x1200 (which includes 1080p support for 16:9 tablets). The GPUs also offer application acceleration for programs that support it, as well as video acceleration for codecs like MPEG2, MPEG4, H.264, and VC-1 due to the inclusion of AMD's UVD3 video decoding engine.

The entire APU has a TDP of 4.5W, and according to AMD's estimates, tablets using its chip should trail Intel's Clover Trail tablets a bit in battery life: about eight hours during Web browsing and other general-use workloads, and about six hours during video playback (Intel promises about ten hours for Clover Trail, though it doesn't specify the activity being performed). This is decent, though obviously we'd want to do our own battery life tests to verify these numbers. AMD also promises tablets as thin as 10mm (or about 0.39 inches), which is just a bit thicker than Intel's 8.5mm estimate for Clover Trail.
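As a rough sanity check on those run-time claims, battery life is just capacity divided by average draw. In the sketch below, the 3W average whole-tablet draw is our assumption for illustration, not an AMD figure; real browsing workloads draw well under the APU's 4.5W TDP ceiling once idle states are factored in.

```python
# Back-of-the-envelope battery sizing from AMD's claimed run times.
# The average draw is a hypothetical whole-tablet figure (screen included),
# NOT a published AMD number.
assumed_avg_draw_w = 3.0   # hypothetical average draw during Web browsing
claimed_hours = 8.0        # AMD's Web-browsing estimate

implied_capacity_wh = assumed_avg_draw_w * claimed_hours
print(f"Implied battery capacity: ~{implied_capacity_wh:.0f} Wh")  # ~24 Wh
```

A pack of roughly that size is in line with what 10-inch tablets of this class typically ship with, so the eight-hour claim is at least arithmetically plausible under these assumptions.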

As for software support, the story with Hondo is much the same as it is for Clover Trail: the chip is made primarily for Windows 8, and AMD is not planning any Android or wider Linux support for the chip at this time. One difference between AMD's and Intel's tablet chips is that AMD is committing to Windows 7 support to cater to "some commercial markets that are still evaluating the different operating systems," according to AMD Product Marketing Manager Christopher Sutphen. This should make Hondo tablets marginally more appealing to businesses sticking with Windows 7, though the older version of Windows has not found much success on tablets to date.

But will OEMs actually use it in their tablets?

The biggest problem facing AMD will be getting this chip into tablets from major manufacturers. Of the x86 Windows 8 tablets we've seen at IFA and other unveilings throughout the year, most manufacturers seem to be offering high-end models using Intel's Ivy Bridge processors and low-end models running either Clover Trail Atoms or ARM-based chips. AMD wouldn't provide information about which OEMs will be shipping tablets based on Hondo. Hondo's predecessor, the AMD Z-01, appeared in only one shipping tablet we could find, MSI's WindPad 110W.

Hondo still requires a second chip, the FCH, to provide USB, SATA, and other functions. This second chip takes up extra space that ARM or Clover Trail chips wouldn't need.

AMD

One hurdle to Hondo's adoption may be that, unlike Clover Trail, the processor is not a system on a chip (SoC). The CPU and GPU are indeed one piece of silicon, but USB, SATA, and other functions are still handled by a separate chip called the Fusion Controller Hub (FCH). This particular FCH does include some features Clover Trail lacks (most notably native USB 3.0 support), but in tablets, space is still at a premium—having to use two chips instead of one takes up room inside the system that could be given over to a larger battery or shaved off entirely. The FCH also consumes extra power (according to AMD's own slides, between 0.55 and 0.68W during normal use) that has to be considered alongside the 4.5W TDP of the APU itself.
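To put numbers on that overhead, here is a sketch using only the figures AMD has published; treating TDP as a worst-case power ceiling (rather than typical draw) is a simplification:

```python
# Worst-case platform power budget: Z-60 APU plus the separate FCH.
apu_tdp_w = 4.5                      # Z-60 APU TDP, per AMD
fch_low_w, fch_high_w = 0.55, 0.68   # FCH draw during normal use, per AMD's slides

print(f"Platform ceiling: {apu_tdp_w + fch_low_w:.2f}-{apu_tdp_w + fch_high_w:.2f} W")
print(f"FCH adds up to {fch_high_w / apu_tdp_w * 100:.0f}% on top of the APU's TDP")
```

That extra roughly half a watt and change is power a true SoC competitor simply doesn't have to budget for.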

Hondo looks like a more or less viable tablet processor, but if the company can't get it into desirable tablets, it may not make much of a dent. To truly take on ARM and Atom and encourage adoption, AMD would do well to develop an SoC version of the chip manufactured on a 32nm or 28nm process. This would take up less space, consume less power, and could bring more GPU cores or ramp up clock speeds because of the extra thermal headroom. At least some of these improvements are currently rumored to be coming in "Tamesh," Hondo's replacement, which will supposedly launch at some point in 2013. Until that happens, AMD's tiny presence in the tablet market may further cement the chipmaker's slow slide into irrelevance.

We don't yet know exactly when Hondo tablets will begin showing up, but AMD says that availability is slated for later this year.

This is playing out my worst fears for AMD: http://www.slyman.org/blog/2011/02/arm- ... hitecture/

They are producing CPUs/APUs that are good, but not quite good enough to offer serious competition. It appears I wasn't wrong to suggest they just can't catch up in time with either ARM or Intel to avoid getting squeezed by the narrowing gap between ARM's and Intel's merging market segments (by virtue of Moore's Law, ARM's market segment now encompasses big computing devices, and Intel's market segment now encompasses mobile devices). As ARM becomes a serious option on low-end and ultra-mobile desktops and a serious contender in the general market for processors for consumer computing devices, the x86 market is shrinking relative to the size of the new total market, to the point at which there is only competitive space in the x86 market for ONE major player.

Based on the reports we're seeing of AMD's new designs (which will take a little while to come onto the market), my best-case realistic scenarios for AMD are either:

• AMD becomes a design house for ARM-based APUs (like nVidia), or
• AMD shrinks back into the role it played in the 1990s as a niche supplier of pin-compatible products (harder now, given Intel's IP shenanigans, which help keep the motherboard market segmented).

Clover Trail doesn't include SATA support either, AFAIK. Granted, it's not particularly useful on a tablet, so it's probably only there because AMD has been slimming down existing parts rather than starting from scratch.

Could make for a pretty cool gaming tablet though, if that becomes a thing.

What I think AMD should do is build their own x86 tablet, with their own hardware...and price it competitively. If they can market an x86 tablet for between $200-$300, I think they have something that Intel cannot compete with.

Anyone with knowledge of the exceptionally shoddy job Intel has done at supporting SKUs with PowerVR IP should be beating at the gates for AMD powered tablets.

The PowerVR architecture used for Intel Atom / GMA parts is more than up to the task, but Intel's drivers are slower than software framebuffer modes on EVERY OS (Windows, Android, Linux, etc.). Check out Intel's own forums; they are rife with discontent: http://communities.intel.com/community/tech/graphics

OpenGL and DirectX support does not live up to Intel's own promises on most parts, such as the GMA 600 and 3600. Take Intel's promises for the Z6xx / GMA 600 from two years ago: DirectX 10.1? Never delivered. OpenGL 2.1? Nope. H.264? Very broken; 720p sometimes works. This is the same GPU found in the iPhone 4/4S, iPad (1), and Galaxy S series, all of which play video and execute OpenGL 2.0/2.1 just fine. Even the GMA 500, the same IP at half the clock speed (200MHz vs. 400MHz), runs video, OpenGL, and DirectX better.

Regarding Android, there is even a technician at Intel who compiled a full Android x86 OpenGL-enabled driver for the GMA 500. His bosses turned around and told him they would not allow its release, whether closed or open source. ImgTec is a large part of the problem. The indications are, first and foremost, that PowerVR does not allow open-source use of their drivers. No problem, but then it would also appear that Intel's deal with ImgTec does not allow Intel to directly compile the driver code they have for PowerVR IP for shipping product.

Sure, I would be on board with an Ivy Bridge-powered tablet, but there is no way I would touch another Atom tablet. I love both that I have, but they would have been returned instantly if I had bought either new at full price only to find the graphics drivers so shoddily done.

Ultimately, it's all really too bad. Coupled with the moderately decent SSDs in x86 tablets, the Atom is actually more than competent for most workloads appropriate to the form factor: Office, web browsing, e-mail, and other productivity apps all run great and feel very responsive once launched (launch times are a tad long, in the 5-10 second range, for MS Office 2010 programs).

What I think AMD should do is build their own x86 tablet, with their own hardware...and price it competitively. If they can market an x86 tablet for between $200-$300, I think they have something that Intel cannot compete with.

Intel certainly doesn't want to compete at those prices, but between their advantages in scale and process tech, there's not really anything AMD can produce that Intel can't, if they choose to.

It's not the most attractive proposition, is it? Want to make a thicker tablet with less battery life and CPU performance than Intel? Go AMD.

Well, I don't think that is a good statement. The slightly lower battery life hasn't been tested on either system, so no one really knows for sure. Slightly thicker... is 2mm really that critical? More to hold is often better, especially for a tablet that goes mainly in a bag and not in a pocket like a phone. With ever-changing display tech, 2mm will be easy to find, and the extra depth leaves more room to fit good things like DisplayPort, SD card slot(s), etc.

The need for an SSD drive in a tablet is silly; as flash memory gets cheaper, it gets built onto the system board, so there will be no need for SSD interface/power hardware. And as SD cards get bigger and faster with lower power requirements, a tablet 2mm deeper has plenty of space for two or more SD card slots.

But AMD is wasting time with 32nm chips; they need to skip ahead, get the 28nm or smaller stuff running now, and make Linux/Android-capable chips, or they will be left behind.

I think Microsoft will price themselves out of the market at the end of the month, as they are a few years behind Apple and Android/Google. With Apple's new Mini with its fancy new shape and the Nexus 7 32GB coming soon, AMD and Microsoft will find themselves too little, too late. IMHO.

" space is still at a premium—having to use two chips instead of one takes up room inside the system that could be given over to a larger battery or shaved off entirely."

Have you ever opened up an ultrabook, tablet, or cell phone? There is still a lot of space to fit more stuff. A 1/4-inch-square, 1mm-tall chip is not going to make any difference in the space needed to fit the board; the extra chip is the size and thickness of your thumbnail. I find that argument incorrect. I think a more valid point is that it is one more thing they have to buy to make the boards support USB 3.0.

I get 9 to 13 hours on an old Sony VPCCW13. You get better battery life without losing much performance by changing a lot of settings; stock time is 3 hours. I am sure I will get a Surface tablet to beat that record when I order one.

" space is still at a premium—having to use two chips instead of one takes up room inside the system that could be given over to a larger battery or shaved off entirely."

Have you ever opened up an ultrabook, tablet, or cell phone? There is still a lot of space to fit more stuff. A 1/4-inch-square, 1mm-tall chip is not going to make any difference in the space needed to fit the board; the extra chip is the size and thickness of your thumbnail. I find that argument incorrect. I think a more valid point is that it is one more thing they have to buy to make the boards support USB 3.0.

For a tablet, I agree; not so much for cell phones. On motherboards, routing takes quite a bit of space. It may look like they can fit a dozen more chips, but in reality it's not that easy. There is also the issue of cooling.

I cringe every time AMD releases something, and I feel they are backed into a corner by the mighty Intel. They can't compete with Intel's expansive R&D and process technology, which leads to worse power consumption, CPU performance, profitability, scalability, and platform packaging. The only things they can really compete on are price/performance and GPU.

Intel's current Ivy Bridge desktops idle at around 40 watts at the wall, and that includes a discrete GPU. Intel is claiming the Haswell platform as a whole will consume about 1/20th the power of current Ivy Bridge systems. Assuming 40 watts for current Ivy Bridge, that puts Haswell right at about 2 watts idle... for a *desktop*.

If AMD's tablet platform can barely compete with Intel's desktop platform, how is Intel's tablet platform going to do?

Big assumptions here. Until we get benchmarks, we won't really know what limitations Haswell has.

Intel's current Ivy Bridge desktops idle at around 40 watts at the wall, and that includes a discrete GPU. Intel is claiming the Haswell platform as a whole will consume about 1/20th the power of current Ivy Bridge systems. Assuming 40 watts for current Ivy Bridge, that puts Haswell right at about 2 watts idle... for a *desktop*.

If AMD's tablet platform can barely compete with Intel's desktop platform, how is Intel's tablet platform going to do?

Big assumptions here. Until we get benchmarks, we won't really know what limitations Haswell has.

Yeah - Haswell could be a very interesting product. If Intel can do what it suggests with Haswell, then the Atom family becomes irrelevant (as does all of AMD's stuff). The GPU will be good enough for casual gaming (720p at full detail, 1080p at reduced settings), the CPU will blow the doors off ARM (and, unfortunately, AMD) from a performance standpoint, and it will just sip power when idle. However, I still wonder if Intel can do GPU drivers well enough - they have not had much success so far.

Intel's current Ivy Bridge desktops idle at around 40 watts at the wall, and that includes a discrete GPU. Intel is claiming the Haswell platform as a whole will consume about 1/20th the power of current Ivy Bridge systems. Assuming 40 watts for current Ivy Bridge, that puts Haswell right at about 2 watts idle... for a *desktop*.

If AMD's tablet platform can barely compete with Intel's desktop platform, how is Intel's tablet platform going to do?

Big assumptions here. Until we get benchmarks, we won't really know what limitations Haswell has.

The 1/20th-wattage Haswell is for the ULV laptop processors; Haswell desktop CPUs aren't going to save nearly as much power over current IB ones. Also, that 20x improvement is likely coming at idle, not load, thanks to the new sleep state, so it doesn't apply overall. Sure, they will be really cool for laptops, but it won't be the magic silver bullet people seem to think it will be. Load the thing up and it's going to eat battery same as always.

As ARM becomes a serious option on low-end and ultra-mobile desktops and a serious contender in the general market for processors for consumer computing devices, the x86 market is shrinking relative to the size of the new total market, to the point at which there is only competitive space in the x86 market for ONE major player.

1. I will get myself an x86 Windows 8 tablet, I'm not even considering an ARM one. Most of the people I know want to run their x86 apps on their tablets, so they won't go with Windows RT.

2. Please explain why there is "only competitive space in the x86 market for ONE major player" while there are dozens of ARM vendors?

3. AMD should also go the nVidia way and couple their GPUs with ARM. This would give them a strong revenue stream.

What I think AMD should do is build their own x86 tablet, with their own hardware...and price it competitively. If they can market an x86 tablet for between $200-$300, I think they have something that Intel cannot compete with.

What is it with people and pricing? You can bet that if companies already known for making cheap products, such as Acer, can't manage to do it, then there is no way AMD can either. Sure, Microsoft could, because, as with the Xbox, they could be willing to lose a couple of hundred on each sale. But everyone else has to profit, even if by just a little bit. Tablets using these chips won't be cheaper by much, if at all. Remember that AMD has to buy all the other parts from the same companies everyone else does, and they won't pay less, because volume determines that. They also have the same expenses as everyone else, in addition to parts and manufacturing cost.

You guys just come up with these numbers out of your ears, like a magic trick.

It's not the most attractive proposition, is it? Want to make a thicker tablet with less battery life and CPU performance than Intel? Go AMD.

Well, I don't think that is a good statement. The slightly lower battery life hasn't been tested on either system, so no one really knows for sure. Slightly thicker... is 2mm really that critical? More to hold is often better, especially for a tablet that goes mainly in a bag and not in a pocket like a phone. With ever-changing display tech, 2mm will be easy to find, and the extra depth leaves more room to fit good things like DisplayPort, SD card slot(s), etc.

The need for an SSD drive in a tablet is silly; as flash memory gets cheaper, it gets built onto the system board, so there will be no need for SSD interface/power hardware. And as SD cards get bigger and faster with lower power requirements, a tablet 2mm deeper has plenty of space for two or more SD card slots.

But AMD is wasting time with 32nm chips; they need to skip ahead, get the 28nm or smaller stuff running now, and make Linux/Android-capable chips, or they will be left behind.

I think Microsoft will price themselves out of the market at the end of the month, as they are a few years behind Apple and Android/Google. With Apple's new Mini with its fancy new shape and the Nexus 7 32GB coming soon, AMD and Microsoft will find themselves too little, too late. IMHO.

Since AMD, the master of performance-claim BS, seems to be saying it, I would hardly be willing to argue otherwise. Don't forget that AMD rarely meets its own performance expectations. The only thing I would wonder, therefore, is by how much the performance trails whatever they claim.

The other problem is that they are now dependent on an outside chip manufacturer that they have no control over, and other customers are gaining a bigger share of the output, giving AMD less influence. Apple is having that problem too, if the rumors were true about them testing TSMC last year and finding them wanting. But at least Samsung is a viable chip manufacturer, and the plant in Texas is mostly aimed at producing Apple chips.

" space is still at a premium—having to use two chips instead of one takes up room inside the system that could be given over to a larger battery or shaved off entirely."

Have you ever opened up an ultrabook, tablet, or cell phone? There is still a lot of space to fit more stuff. A 1/4-inch-square, 1mm-tall chip is not going to make any difference in the space needed to fit the board; the extra chip is the size and thickness of your thumbnail. I find that argument incorrect. I think a more valid point is that it is one more thing they have to buy to make the boards support USB 3.0.

I get 9 to 13 hours on an old Sony VPCCW13. You get better battery life without losing much performance by changing a lot of settings; stock time is 3 hours. I am sure I will get a Surface tablet to beat that record when I order one.

It depends on the phone. Apple's phones are crammed, with no space available, and I imagine there are others just as crammed. Ultrabooks may be different in some cases. But even MacBook Airs are crammed tight. I'm sure there will be Ultrabooks crammed like that as well.

What is it with people and pricing? ...Remember that AMD has to buy all the other parts from the same companies everyone else does, and they won't pay less, because volume determines that. They also have the same expenses as everyone else, in addition to parts and manufacturing cost.

You guys just come up with these numbers out of your ears, like a magic trick.

Agreed.

AMD would also have to develop a retail channel to sell those tablets, pay to advertise their new product, and establish their brand in the market, etc.

Slapping a tablet together might be easy for them, but getting traction in the current market doesn't come easy, or cheap.

Intel's current Ivy Bridge desktops idle at around 40 watts at the wall, and that includes a discrete GPU. Intel is claiming the Haswell platform as a whole will consume about 1/20th the power of current Ivy Bridge systems. Assuming 40 watts for current Ivy Bridge, that puts Haswell right at about 2 watts idle... for a *desktop*.

If AMD's tablet platform can barely compete with Intel's desktop platform, how is Intel's tablet platform going to do?

Big assumptions here. Until we get benchmarks, we won't really know what limitations Haswell has.

The 1/20th-wattage Haswell is for the ULV laptop processors; Haswell desktop CPUs aren't going to save nearly as much power over current IB ones. Also, that 20x improvement is likely coming at idle, not load, thanks to the new sleep state, so it doesn't apply overall. Sure, they will be really cool for laptops, but it won't be the magic silver bullet people seem to think it will be. Load the thing up and it's going to eat battery same as always.

Since a processor spends most of its time idling, that's not true; there will be a good drop in average power use. As far as I can tell, you read Anandtech's article on this, so you should have gotten that part as well.
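The duty-cycle argument is easy to make concrete. In the sketch below, every wattage and time split is an illustrative assumption rather than a measured figure; the point is only that when idle dominates the schedule, a 20x idle-power cut dominates the average:

```python
# Average power under a duty cycle: sum of (fraction of time) x (watts).
# All numbers below are illustrative assumptions, not measurements.
def avg_power(profile):
    """profile: list of (fraction_of_time, watts); fractions should sum to 1."""
    return sum(frac * watts for frac, watts in profile)

ivy = [(0.90, 10.0), (0.10, 25.0)]            # 90% idle at 10W, 10% load at 25W
haswell = [(0.90, 10.0 / 20), (0.10, 23.0)]   # idle cut ~20x, load barely changed

print(f"Ivy Bridge-style average: {avg_power(ivy):.1f} W")     # 11.5 W
print(f"Haswell-style average:    {avg_power(haswell):.1f} W")  # 2.8 W
```

Under these assumed numbers, the average drops roughly 4x even though load power barely moves, which is exactly the commenter's point.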

As ARM becomes a serious option on low-end and ultra-mobile desktops and a serious contender in the general market for processors for consumer computing devices, the x86 market is shrinking relative to the size of the new total market, to the point at which there is only competitive space in the x86 market for ONE major player.

1. I will get myself an x86 Windows 8 tablet, I'm not even considering an ARM one. Most of the people I know want to run their x86 apps on their tablets, so they won't go with Windows RT.

1. ...Personal taste! There's very little x86 software I run and use regularly that couldn't be replaced with ARM software. Even as a software engineer who currently needs MS SQL Server, I could get by with one x86-64 device plus a few ARM-based systems with locked-down, digitally signed OS/apps; and I'd prefer things that way, so I don't get viruses skipping about between a homogeneous array of Wintel devices. You say you prefer to continue running your x86 apps... Have you actually sat down, listed the apps you need, and looked at potential alternatives? Or is this based on a gut feeling that you don't want to lose x86 backwards compatibility?

Generally speaking, this entire class of questions is becoming less relevant as:
• Language and compiler technology improves, so that software can more easily be targeted at multiple platforms,
• Processors become faster, so that a larger back-catalogue of "other platform" software comes within reach of emulation,
• Standards of information interchange, such as those promoted by the W3C, become more popular. (With HTML5, the web is finally becoming a platform in its own right.)

Amongst other OSs, I've used Acorn Archimedes RiscOS (ARM), Amiga (68k), UNIX terminals (some with X-Windows, not sure what processors they had), MacOS (68k and PowerPC), OS/2 (x86), DOS/Windows 3.1–Win98 (x86), Windows NT–Win7 (x86), and Linux (x86). I think these platform lock-in limitations are now more imagined and habitual than real and absolute: the CPU is becoming commoditised, which favours ARM's business model. (My favourite OS? For ergonomics: RiscOS — way ahead of its time in 1990. For technical power: Linux...) Pick your personal favourite (it doesn't matter much which one), and you can guarantee that you will have access to all the basic apps you need for word processing, internet, email, spreadsheets, software development, etc.

JSawyer wrote:

2. Please explain why there is "only competitive space in the x86 market for ONE major player" while there are dozens of ARM vendors?

(BTW, nice smiley!) I wrote "major" player. Intel mostly tolerates AMD because of US anti-trust law, whereas ARM encourages partners to develop their own ARM-based architectures and SoCs! The character of the competitive landscape is completely different between these processor ecosystems. With Intel's famously paranoid "only the paranoid survive" approach, with ARM eating away at increasing portions of the bottom of the market, and with the rising technical barriers to entry in the semiconductor fabrication industry as we near the CMOS end-point, I'd suggest that in no more than two or three years there'll only be room for one big, highly profitable, mass-market, performance-competitive x86 vendor.

"VIA x86" doesn't count as that, does it? The entire traditional market segment for big, fat, power-hungry, high-profit-margin x86 processors is being cannibalised by ARM and by companies like VIA. AMD is likely to become the custodian of a shrinking market segment between VIA, ARM, and Intel, or return to its roots as a supplier to price-conscious, low-margin "bottom feeders". VIA's presence in the x86 market only makes things 10× worse for AMD, because there's basically nowhere in the x86 market for AMD to retreat to if they continue with their current dead-end strategy.

When the ARM and x86 markets start to look more like one market, AMD will have outlived their usefulness to Intel, and Intel will do what they can to muscle AMD out of the market so they can control ticket prices for entry into the medium-to-high performance x86 market...

John Ghast wrote:

kray28 wrote:

What I think AMD should do is build their own x86 tablet, with their own hardware...and price it competitively. If they can market an x86 tablet for between $200-$300, I think they have something that Intel cannot compete with.

Intel certainly doesn't want to compete at those prices, but between their advantages in scale and process tech, there's not really anything AMD can produce that Intel can't, if they choose to.

JSawyer wrote:

3. AMD should also go the nVidia way and couple their GPUs with ARM. This would give them a strong revenue stream.

Yes. This may ultimately be the only way for AMD — they should do this while they have money and a competitive GPU architecture.

Anyone with knowledge of the exceptionally shoddy job Intel has done at supporting SKUs with PowerVR IP should be beating at the gates for AMD powered tablets.

The PowerVR architecture used for Intel Atom / GMA parts is more than up to the task, but Intel's drivers are slower than software framebuffer modes on EVERY OS (Windows, Android, Linux, etc.). Check out Intel's own forums; they are rife with discontent: http://communities.intel.com/community/tech/graphics

OpenGL and DirectX support does not live up to Intel's own promises on most parts, such as the GMA 600 and 3600. Take Intel's promises for the Z6xx / GMA 600 from two years ago: DirectX 10.1? Never delivered. OpenGL 2.1? Nope. H.264? Very broken; 720p sometimes works. This is the same GPU found in the iPhone 4/4S, iPad (1), and Galaxy S series, all of which play video and execute OpenGL 2.0/2.1 just fine. Even the GMA 500, the same IP at half the clock speed (200MHz vs. 400MHz), runs video, OpenGL, and DirectX better.

Regarding Android, there is even a technician at Intel who compiled a full Android x86 OpenGL-enabled driver for the GMA 500. His bosses turned around and told him they would not allow its release, whether closed or open source. ImgTec is a large part of the problem. The indications are, first and foremost, that PowerVR does not allow open-source use of their drivers. No problem, but then it would also appear that Intel's deal with ImgTec does not allow Intel to directly compile the driver code they have for PowerVR IP for shipping product.

Sure, I would be on board with an Ivy Bridge-powered tablet, but there is no way I would touch another Atom tablet. I love both that I have, but they would have been returned instantly if I had bought either new at full price only to find the graphics drivers so shoddily done.

Ultimately, it's all really too bad. Coupled with the moderately decent SSDs in x86 tablets, the Atom is actually more than competent for most workloads appropriate to the form factor: Office, web browsing, e-mail, and other productivity apps all run great and feel very responsive once launched (launch times are a tad long, in the 5-10 second range, for MS Office 2010 programs).

This is to a large extent the reason why I am not willing to consider a Windows 8 tablet running an Atom processor until the next redesign. The substantial rumors of 22nm FinFETs, a new architecture allowing things like true out-of-order processing, and Ivy Bridge HD graphics (granted, at a supposed 4 EUs, with no idea on clock speeds), oh, and supposed clock targets near 2GHz and quad-core designs, have me very interested.

Ideally better graphics, but at the very least in-house graphics, probably faster and definitely with better support, are the key. The likely vastly improved compute performance within the same power envelope (give or take a little) also has me very interested.

That said, Broadwell, only about 18 months off, give or take, has me maybe even more interested. Haswell is already supposedly tackling 10W TDPs and tablets; what might Broadwell bring to a tablet form factor?

I don't need krogging compute and GPU performance in a tablet form factor. However, I'd love to see a 10" tablet capable of delivering roughly (I said roughly) the performance of my 3317U-equipped laptop, both compute- and GPU-wise, all with "10-hour battery life" in a thin and lightweight form. Hell, pack that into an 8" tablet (so long as the price isn't north of $500) and you have me sold.

Intel's current Ivy Bridge desktops idle at around 40 watts at the wall, and that includes a discrete GPU. Intel is claiming the Haswell platform as a whole will consume about 1/20th the power of current Ivy Bridge systems. Assuming 40 watts for current Ivy Bridge, that puts Haswell right at about 2 watts idle... for a *desktop*.

If AMD's tablet platform can barely compete with Intel's desktop platform, how is Intel's tablet platform going to do?

Big assumptions here. Until we get benchmarks, we won't really know what limitations Haswell has.

The 1/20th wattage Haswell is for the ULV laptop processors. Haswell desktop CPUs aren't going to save nearly as much power over current IB ones. Also, that 20x improvement is likely coming at idle, not load, thanks to that new sleep state, so it doesn't apply overall. So sure, they will be really cool for laptops, but it won't be the magic silver bullet people seem to think. Load that thing up and it's going to eat battery same as always.

Yes and no. Standard-voltage laptop parts actually see a TDP INCREASE, likely as a result of the much "hotter" graphics baked into Haswell. Keep in mind, the performance increase for ULV 17W parts is supposedly only around 30%, whereas standard-voltage GT3-equipped parts are set to see a 100% increase in graphical performance. That's likely where the slight step up in power draw comes from.

Based on AnandTech's review of ULV Ivy, the HD 4000 graphics already consume a fair amount more power than the CPU cores do at full burn, resulting in both the CPU and GPU cores having to be throttled. So the fact that actual graphical performance supposedly improves 30% over current Ivy Bridge graphics, and that compute performance possibly also improves roughly 10% (give or take, depending on the task at hand), is pretty impressive, fitting all that into the same TDP without a process change. There's also the question of CPU clock speeds; Intel hasn't said anything on that, other than that Haswell should deliver something in the 7-13% (I think) range of improvement in per-clock CPU performance. Haswell might even have lower CPU clocks in ULV parts than current Ivy Bridge processors for all we know so far. Or they could be higher. Rumor (fact? Not sure if Intel said it at IDF or not) is that Haswell will have both more aggressive turbo and more turbo bins, I think.

So ULV parts might see a pretty big increase in both graphics and CPU performance, especially when both aren't being hit hard at once (and even if they are, probably still fairly impressive improvements to both).
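To see how those rumored figures would combine, here is a hedged sketch; the 7-13% per-clock (IPC) range comes from the paragraph above, while the clock ratios are purely hypothetical, since Intel had not announced ULV Haswell clocks at the time:

```python
# Rough per-core estimate: relative performance ≈ (1 + IPC gain) × clock ratio.
def relative_performance(ipc_gain: float, clock_ratio: float) -> float:
    """Haswell-vs-Ivy per-core performance estimate under the stated assumptions."""
    return (1 + ipc_gain) * clock_ratio

# IPC gains from the rumored 7-13% range; clock ratios (lower, equal, higher)
# are hypothetical placeholders, not announced figures.
for ipc in (0.07, 0.13):
    for clocks in (0.9, 1.0, 1.1):
        print(f"IPC +{ipc:.0%}, clocks x{clocks}: "
              f"{relative_performance(ipc, clocks):.2f}x")
```

Even the pessimistic corner (minimum IPC gain, lower clocks) stays near parity, which is the commenter's point: the interesting question is the clocks, not the IPC.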

For actual power savings, ULV parts are looking at a 95% power savings in connected standby, compared to (I THINK) S1/S2 (system up, processor asleep). I don't believe the savings are as large as if the system were truly in S3 mode (suspend to RAM, RAM in slow refresh mode). However, power use should still be pretty close to S3 mode, and it would be CONNECTED! That is pretty damned cool. No more needing things like WOL or whatnot. The system will be there, present and connected, and can resume almost immediately while still basically being asleep, even processing some functions in that state.

Honestly, I want Intel to move this to full desktop parts pronto. Please do it in Broadwell even though you aren't doing it in Haswell, Intel! I want this for my home server. No more S3 mode to preserve power. Just leave it in connected standby on the network and it'll respond like it is really there, awake. Put a call in to the server, such as a disk I/O request, and a few milliseconds later the thing is active again and the disks are spinning back up. No more having the server disappear because it is in S3, requiring a WOL request to get it back on the network before you can even query it.
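For contrast, the WOL dance this setup would eliminate looks like the following; a minimal sketch of the standard Wake-on-LAN magic packet (the MAC address shown is a placeholder, and port 9, the discard port, is the conventional WOL target):

```python
import socket

def magic_packet(mac: str) -> bytes:
    """Build a standard WOL magic packet: 6 bytes of 0xFF, then the MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def send_magic_packet(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet over UDP so a sleeping NIC can wake the machine."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

# Hypothetical MAC address for illustration:
# send_magic_packet("00:11:22:33:44:55")
```

With connected standby, none of this is needed: the machine stays reachable on the network and wakes itself on the incoming request.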

Other things of note from AnandTech's review of the IDF Haswell info: the chips, even the standard-voltage parts with slightly higher TDPs, are likely to still use less power in typical use cases. In part the higher draw is from the heavier graphics (which wouldn't be used if you have a discrete graphics card running, well, depending). Even using the graphics, though, unless at full continuous burn, the chips have significantly more C-states, better voltage stepping, and apparently more aggressive voltage throttling. So between that and better IPC, it is likely that Haswell will use somewhat less power than Ivy.

I think Intel said at IDF that standard-voltage parts, even ones with a higher TDP, were expected to use on average 20% less power than Ivy. Not sure if that is in standby or in typical use cases. I suspect typical use cases, with the extra C-states, aggressive voltage throttling, better IPC, etc.

Intel did say that the "Haswell platform will consume 1/20th of the Ivy Bridge platform". Nothing about servers vs laptops other than "this will save a lot of power in the data center".

Anyway, there is nothing stopping desktop/server chips from throttling down as much as tablet CPUs while idle.

The CPU benchmarks were showing Haswell using about 60% less power than Ivy Bridge while at near the same performance.

If the real world is actually 60% less power at peak and 95% less at idle, then it will be "nasty"; and that isn't even their "low power" CPU.

Intel is running a very solid 22nm process which nobody else has been able to approach as yet. Selling the fab was short-sighted (in my hindsight-enabled but still myopic viewpoint) and has resulted in some good and bad possibilities. TSMC has had limited success in moving past 32nm (thus the 28nm node, a half-step between 32nm and the next full shrink), and GlobalFoundries seems to be experiencing the same level of problems with even worse yields than TSMC.

At least AMD is not bound to GlobalFoundries alone, and hopefully TSMC will get a solution in place. That would be good for many companies, not just AMD. A lot of them are ARM suppliers, although ARM designs don't necessarily need the latest process.

Hmmm, so AMD puts out an APU whose GPU mops the floor with Clover Trail's GPU and almost every ARM SoC GPU out there, yet it's "not quite a perfect fit" because it doesn't integrate everything into a single chip? Riiight. This writer realizes this isn't an ARM part, right? The purpose of this is a powerful tablet that can run the huge legacy of x86 code. This isn't for smartphones, but for x86 tablets and convertibles.

And those AMD-powered x86 tablets and convertibles are going to be bigger and have worse battery life. Clover Trail is better for the devices you mentioned.

Why is everybody talking as if AMD is reloading their last bullet? No one here has ever heard of AMD Embedded?

Never mind the usual Ars-writer bias.

If you aren't familiar with AMD's history, then it's understandable why you feel that way. AMD has had serious problems from the beginning, even during the period when Intel's NetBurst failed. Selling off their foundry was the first step in an attempted recovery, but so many missteps since have caused people to become skeptical. Embedded chips are cheaper chips, meaning less profit. A company like AMD, which relies on CPUs for most of its sales dollars, can't rely on embedded for its future unless it expects to become a much smaller company.

Embedded may be cheaper, but they outnumber the number of regular chips.

Intel's chips for tablets (especially Win8 ones) put the price points north of $800 ($799 for an Atom Win8 tablet), leaving nothing for the lower-end segments, i.e., the $350-$600 segment, which "wants" to use x86 CPUs rather than ARM. The way RT is marketed makes it look like a crippled product. It is, in some respects.

We are to expect that only ARM-based Win RT will hold the $300-$600 price segment, which is now crowded by very competent Android- and iOS-based tablets with a range of capabilities on their native OSes. So if AMD wants to populate the lower end of the Win8 tablet space, I say go for it. It looks like a redux of the CPU wars of years past, where AMD did very well to saturate a particular market that Intel abandoned. This will create momentum in volume and drive prices downwards as the volume spreads. Of course, it will apply pressure on the Android/iOS tablets with price erosion. Maybe that is sorely needed for Win8 to improve its quality in order to properly compete in the tablet space. It would be particularly interesting for manufacturers to include an mSATA slot in the tablet for decent SSD expansion. A few fractions of an inch thicker form factor is not going to hurt as long as prices remain low.

The new MS tablets should give them that edge - Intel tablets are definitely priced a significant chunk "above" their ARM equivalents.

Actually, not necessarily true so far. It remains to be seen where Dell, Samsung, and the Surface come in for their RT tablets, but ASUS's RT tablet is $599. Samsung has an Atom tablet at $649 that runs full Windows 8 (pricier software), Lenovo has one for $629, and Acer actually has an Atom tablet at $499. Supposedly the Atom chips are pretty close to ARM prices. But who knows.

Andrew Cunningham / Andrew has a B.A. in Classics from Kenyon College and has over five years of experience in IT. His work has appeared on Charge Shot!!! and AnandTech, and he records a weekly book podcast called Overdue.