Really wonderful interview. I'm really hoping that Intel is working hard towards building global LTE (heck, even multi-band LTE to cover all US carriers) so users don't have to think about whether their phone works on XYZ network.

Question though, how does a wireless standard have "secret sauces"? Doesn't that defeat the entire purpose of calling it a standard?

Unfortunately almost every standard has this... not just wireless. If you implement it to spec, it usually doesn't work. Yes, it is frustrating. I deal with it at work for automotive and aerospace all the time. It comes from specific vendors tweaking it, and then customers implementing that vendor's tweaks.

Some of the fancier, more specialized 802.11 radios can be set up as a normal access point or client, but there are a lot of out-of-spec features that would improve performance (802.11g at 40MHz wide, AirMax, etc.). Even consumer devices have often had a "Buy devices with this logo to enable extra features" sticker for out-of-spec functionality. Not that I'm an expert on the subject, but there are also features that don't break spec, simple things like automatically optimizing ack timings and such.

Of course there is "secret sauce". The spec gives you a range of different ways to communicate info; it doesn't tell you how to do everything. For example:
- How should a device choose which of the (ever-growing) possible PHY modes to use for communicating? (And you can modify that choice to optimize for throughput vs energy.)
- Should it use CTS/RTS or not?
- How should it balance aggregation vs latency?

You can also use better HW, for example your Viterbi decoder can be one or two stages longer than everyone else's. More HW, more power, better performance --- choose which you want.
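To make the PHY-mode question concrete, here is a purely illustrative Python sketch of one well-known approach (Minstrel-style rate adaptation, used in some open-source drivers): track per-rate delivery statistics and transmit at whichever rate maximizes expected throughput. The rates and stats below are made up, not from any real driver.

```python
# Illustrative sketch of throughput-based rate adaptation (Minstrel-style).
# Nothing here comes from a real driver; the rates and stats are made up.

RATES_MBPS = [6, 12, 24, 36, 48, 54]  # a subset of 802.11g PHY rates

def best_rate(stats):
    """Pick the rate maximizing expected throughput = rate * P(delivery).

    stats maps rate -> (attempts, successes). Laplace smoothing keeps
    rarely-probed rates from being written off (or trusted) too quickly.
    """
    def expected_throughput(rate):
        attempts, successes = stats.get(rate, (0, 0))
        p_delivery = (successes + 1) / (attempts + 2)
        return rate * p_delivery
    return max(RATES_MBPS, key=expected_throughput)

# A noisy link: the top rates drop lots of frames, the low ones waste airtime.
stats = {6: (100, 100), 12: (100, 99), 24: (100, 95),
         36: (100, 80), 48: (100, 40), 54: (100, 10)}
print(best_rate(stats))  # 36 -- the sweet spot between speed and loss
```

The interesting part is that the spec says nothing about this loop; two standards-compliant radios with different smoothing, probing, or averaging choices will perform very differently on the same link.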

Definitely an interesting interview. I want to be excited about future wireless technologies, but I get frustrated knowing that most laptops and devices are released with a 1x1 setup. Being able to connect to everything is a great goal if your connection is actually stable. I think this is less of a technology hurdle than an issue of device makers using connectivity as a way to cut corners and save money. Current LTE and WiFi N/AC standards should be more than enough for most workflows if they worked as advertised.

Speaking of which, Brian states that Miracast support is flaky and that he has never gotten it to work. Along with WiDi, are we ever really going to get wireless display off the ground? Hopefully Windows 8.1 is the push it needs to actually be successful, so I don't need an HDMI cable to display content on my TV.

Yeah, despite Brian's recent comments I ended up getting a Netgear PTV3000... At $50 with a $10 discount it seemed more sensible than paying $35+ for the SlimPort USB/HDMI adapter for my Nexus 7 (never mind that currently you can't even find those anywhere, though there are some $25 knockoffs on eBay).

Haven't had the time to try it yet, though it did seem like they've been putting out firmware updates left and right (going by the Amazon reviews). The PC/WiDi setup seems far more complicated (I guess Miracast is more of an it-either-works-or-it-doesn't deal). I know reviewing flawed stuff is like low-hanging fruit sometimes, but it seems that exposing the issues with at least one of those WiDi/Miracast boxes might be worthwhile.

My hope is that the Xbox One supports Miracast and will become the de facto standard hardware to test against. One hopes that Microsoft is implementing the standard correctly, with enough test cases, since it appears other vendors aren't doing the same.

Yeah. I really wish Intel would've put 2x2 dual band in the 2013 ultrabook spec. It'd be useful to a lot more people than WiDi will.

And while I know ultrabooks are only a minority of laptops sold, the platform is Intel's biggest soapbox for pushing minimum standards, and unlike most of the areas where OEMs blight their models, wifi is easy to solve via a spec check.

It's really nice to be able to go onto Newegg, hit ultrabook, sort by Haswell, and see pretty much all the viable options.

Personally, I'd like to see Intel say 1st gen, 2nd gen, 3rd gen, etc. I feel like the fact that I can sort by ultrabooks and perhaps pick something from two years ago, since many vendors still carry those, is unfair to a consumer who isn't as educated. Intel should require people to list what generation Ultrabook it is, as well as have a chart showing the benefits from one generation to the next.

Considering how nice Ultrabooks are, how nice Haswell is, and how the Ultrabook standard is pretty much for Windows/Intel only, it's amazing how these two companies haven't worked together, and harder, to push this platform for the back-to-school season.

An Ultrabook with Haswell would be a monumental upgrade for the MAJORITY of people I know who have PCs, yet there are no ads at all that I've seen on TV convincing me that I should want something new. I mean, I personally know through research on here obviously, but the average consumer is just clueless.

When will we see 3x3 802.11ac cards from Intel?
Can the wireless driver team get a reboot? There are serious known bugs causing BSOD crashes in every driver released for over a year.
Can we get an LTE-enabled chipset for Haswell?

Is Intel going to make any attempt to push for the MAC from 802.11ah to become part of "standard" 802.11, perhaps even as a rapid follow-up to 802.11ac?

With 802.11ah the IEEE seem to have finally bitten the bullet and admitted that the venerable DCF, even with all the patches that have been applied to it via successive specs, just isn't good enough. The question then is: why are we not allowing more standard WiFi use cases (especially the two very common extremes of a home with only one device and base station active, or a conference with 500 devices active) to utilize the better performance of the 802.11ah MAC (which as of right now is targeted only at a very limited and specific use case --- sub-1GHz, low-bandwidth, long-range operation)?

A question for Aicha: can you give us any updates on Rosepoint? It was one of the most exciting demos at IDF last year and, frankly, true technological innovation. However, the demo last year was on 32nm. While you may not want to share your roadmap with us (especially if it would act as a spoiler for the upcoming IDF), can you share anything that will keep the tech crowd excited? Even the potential possibilities this technology will enable would really help keep us excited.

I'm curious why Intel hasn't released a decent desktop WiFi adapter. It's fairly common knowledge that Intel Mini-PCIe cards are more or less the best in the business, but it's still nigh on impossible to find a consistently good desktop WiFi card manufacturer. I've resorted to putting Intel Mini-PCIe cards onto adapter boards and getting external antennae, but it's not a very tidy solution!

Intel makes great SSDs; combine that with wireless chipset capability, and I am wondering if there is a wireless hard drive on the horizon with 802.11ac and USB 3 that can be used for iOS or Android backup?

Fantastic interview, this is why I keep coming to AnandTech, keep up the great work. Aicha seems like someone to keep track of in this industry; love seeing people like that at the helm of tech.

Thanks Anand, I loved this. Were you flirting with Aicha a little? Intel is never to be underestimated; it will be fascinating to see how things play out over the coming years.

I might also note that I've frequented AT for well over 10 years now and this is my first post, specifically to thank you. I also really enjoyed reading Ian's recent interviews too. Fantastic stuff, keep it up. Only one thing would make what you guys at AT do even better and that's releasing podcasts more often/regularly!

As mentioned in the interview, many people view wireless as an afterthought. When ordering new systems, these people choose default single-stream options that offer inferior performance. Further, many people simply do not understand the differences between different wireless cards released by the same company. I have seen examples of Intel offering simple bars or charts demonstrating how powerful each type of processor is in comparison to the rest of the lineup. Are there any plans in place to simplify branding and promotion of Intel wireless solutions so the average consumer can differentiate between wireless products, similar to what is already in place for CPUs? Also, as there is already an 802.11ac dual-stream wireless card released by Intel, is there a timeline for when we can expect an 802.11ac triple-stream solution?

Have you gotten any calls from Charlie Rose? I think he wants his set back.

I unfortunately am not very well informed on the details of all the wireless standards available and their specific applications, but I agree with Ms. Evans that creating a unified wireless standard would be great. However, I see that as unrealistic, as standards bodies do not move fast enough to add new features to an existing standard before independent manufacturers create proprietary standards for themselves and those proprietary standards gain traction. The 802.11n draft was a prominent example of this, and there are numerous others.

However, I wonder about the issue of spectrum. Ms. Evans talks about having new wireless connections such as wireless displays, but I have trouble envisioning this all working properly because all these new connections will be operating in the existing public wireless spectrum. It's easy to ration spectrum when only one company has legal access to it, but public wireless is a tragedy of the commons. I live in a city and have trouble getting reliable connections on any 2.4GHz network, whether it be Bluetooth, Wi-Fi, or something else. 5GHz works well enough for me, but that spectrum will soon be just as congested.

In the increasingly crowded public wireless spectrum, what technologies does Ms. Evans foresee that will improve public spectrum usage and reliability? Also, how can you improve public spectrum usage while still maintaining compatibility with legacy devices that might not support the newest wireless standard and thus would not be spectrum efficient?

2.4GHz only has 60MHz to play with. 5GHz currently has 575MHz available in the US, with FCC proposals to add 120MHz and 75MHz blocks expanding it even more. 5GHz is also slightly less capable of penetrating walls, meaning in urban areas fewer of your neighbors' networks are interfering with your WiFi. 60GHz has even more spectrum available; and since it's a purely in-room technology (walls reflect it), the wireless home theater in your man cave won't interfere with the wireless home theater in your family room.
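The channel-count arithmetic behind this can be sketched quickly (band widths taken as stated in the comment; actual FCC allocations and channelization rules differ somewhat, and the names here are just illustrative):

```python
# Rough channel-count math for the band widths cited in the comment above.
# Illustrative only; real channelization also depends on regulatory rules.

BAND_2G4_MHZ = 60        # usable 2.4 GHz spectrum, per the comment
BAND_5G_MHZ = 575        # current US 5 GHz spectrum, per the comment
PROPOSED_MHZ = 120 + 75  # the FCC proposals mentioned above

def channel_count(band_mhz, channel_width_mhz):
    """How many non-overlapping channels of a given width fit in a band."""
    return band_mhz // channel_width_mhz

print(channel_count(BAND_2G4_MHZ, 20))                # 3 x 20 MHz in 2.4 GHz
print(channel_count(BAND_5G_MHZ, 80))                 # 7 x 80 MHz 802.11ac channels
print(channel_count(BAND_5G_MHZ + PROPOSED_MHZ, 80))  # 9 after the proposed additions
```

Three non-overlapping channels in 2.4GHz versus a couple dozen 20MHz channels (or seven-plus 80MHz 802.11ac channels) in 5GHz is the whole congestion argument in one line of division.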

I agree with your argument about short-range 60GHz wireless signals being perfect for display connections. But switching to 5GHz for local area networking would mean that a lot more access points would be needed to maintain the same coverage as 2.4GHz networks. I need only one 2.4GHz access point to get some signal in the house (even competing with neighbors), but I need three 5GHz access points to get complete coverage, and that is with nothing else I can detect on the 5GHz spectrum. The 5GHz propagation characteristics are not very good, at least for my usage.
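There is a physical basis for that coverage difference even before wall losses: free-space path loss grows with frequency, so 5GHz is down roughly 6dB versus 2.4GHz at any given distance. A quick back-of-the-envelope check using the standard FSPL formula (a sketch only; real indoor propagation is worse still for 5GHz because walls attenuate it more):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# At equal distance the difference depends only on the frequency ratio,
# i.e. 20*log10(5.0/2.4) regardless of how far apart the radios are.
delta = fspl_db(10, 5.0e9) - fspl_db(10, 2.4e9)
print(round(delta, 1))  # ~6.4 dB extra loss at 5 GHz
```

A 6dB deficit is a quarter of the received power, so needing roughly three 5GHz APs to match one 2.4GHz AP in a house with interior walls is not surprising at all.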

The more I think about it though, the more I realize that I may be thinking too narrowly. Maybe the access-point/client model is dated. Maybe the future involves mesh networks. Does Ms. Evans see the world switching to mesh networks for anything other than low-bandwidth devices (e.g. Zigbee)?

Thing is, if you want reliable networking in a semi-large house, you want to avoid walls and need more access points. It's pretty much unavoidable. What we need is a combination of 500Mbps powerline network adapters with built-in, auto-configurable wireless APs, so that it's easy to set up while providing the best performance to clients.

We've seen wireless technology go beyond connecting computers (like wireless I/O). What would be the technical limit on max bandwidth and min latency for wireless I/O? Wireless USB/eSATA/HDMI/PCIe integrated onto a SoC? Since Thunderbolt is still on copper and there is no real demand for more bandwidth, I hope a wireless solution is in the works.

Intel's LTE efforts are dead in the US no matter what they say. Qualcomm controls the IP needed for backwards compatibility with CDMA, IP Qualcomm used to force the likes of Nokia into a $2.3 billion USD settlement in 2008, after which Nokia eventually had to switch to using Qualcomm ARM SoCs and Qualcomm LTE baseband chipsets in their phones (running Microsoft Windows Phone as the OS). Very soon after Intel bought Infineon, Apple dumped Infineon for Qualcomm LTE baseband chipsets.

Read the following vision of Intel for WiMAX from 2007 very carefully:

What Intel was saying was that their plan with WiMAX was to disintermediate the major US carriers, to at best make them dumb pipes. Needless to say, the US carriers did not react well to this. Observe how the MID, Intel's vision of the mobile device of the future, never even got off the ground. The carriers imposed a special tax on Intel's devices with extra charges for tethering.

Your statement contradicts Aicha Evans' statement that Intel will pursue 4G worldwide, not just in the US. Also, just because Nokia had to settle with Qualcomm over one particular IP case doesn't mean it's impossible for someone else to develop 4G services. So what are you getting at here?

WiMAX didn't make it, so what? Just because one effort failed doesn't mean the next one will. Anyway, in 2007 Intel was still working with 100+ watt processors and really couldn't even compete in low-power processing, let alone communications standards or capabilities.

CDMA is mostly an afterthought these days. Verizon won't be requiring it for much longer on new handsets (likely next year), since their LTE network already overlaps almost 100% with their 3G CDMA footprint.

Do you have a citation for VZW dropping CDMA as a requirement, or is this just speculation?

I'm dubious about the claim, because while Verizon is on track to upgrade all of their infrastructure soon, there are still a number of areas where their data access is roaming on someone else's 2G/3G network. On their data coverage map these areas are marked as extended 3G or extended 1x coverage. This is especially true for international roaming, where most of VZW's partners are not as far along in their LTE network deployments.

Anand and Aicha, thank you. I am a physics graduate student and I have two questions.

Firstly, what is the progress on monolithic integration of RF and CPU onto the same die? Has a 4G modem and baseband been fully integrated with a Silvermont class CPU yet? More generally, what is Intel doing to advance the state of the art in wireless technology, like what Justin Rattner and Yorgos Palaskas discussed at IDF 2012?

Secondly, Intel has really done incredible work in enabling the high performance of modern servers and HPC facilities. However, most mobile communications technologies seem very consumer centric. They don't seem to offer the same utility for "serious computing" like for servers and HPC, or engineering software like ANSYS or MATLAB. So my question is how can the work in mobile communications be relevant at the highest levels of performance? Obviously, copper and fiber optics are probably going to be faster for the foreseeable future, so are the Xeon groups ever going to have to talk seriously with the communications guys? It seems like this is a pretty good counterexample to "everything that computes connects."

Don't look at Intel if you want to see what is possible today. And "serious computing" doesn't help a chip-making company like Intel if the market share for "serious computing" isn't big enough to refinance the fabs.

Sometimes it's hard to know the focus of AnandTech. The technical level is second to none among the mainstream review sites. Today it's sitting down with some honcho from Intel discussing the future; tomorrow it's reviewing a gaming case, i.e. a hunk of steel.

Great interview! Dear Aicha Evans, is Intel aiming to win the Google Nexus phone and tablet this year? I would really like to have an Intel Atom Silvermont in my next Google Nexus phone. Hope it will be widely available in Indonesia and all Asia Pacific countries too. :)

First, great interview. Captivating and insightful. Great to see a fellow French speaker doing a good job in English :-)

Of all the interesting potential developments, the one that interests me the most is inter-device connectivity and the many possibilities that might come with it. As a technology enthusiast, one of the (admittedly first-world) problems I find the most frustrating is the number of devices I own to fulfill different missions - phones, tablets, notebooks, desktops, consoles, home theater/entertainment devices - and how I can at best use one or two of them at the same time while the others - and their potential power - are "wasted". A standard for sharing the processing power of increasingly powerful small devices to make the different "terminals" I use come alive, with centralized data from using a central device as an added bonus, would be fantastic.

How far are we from wireless technology enabling users to rely on their now (or soon to be) powerful enough smartphones to provide processing power to an array of bigger devices, from tablets to TVs? How is Intel working to make such a dream a reality? How far away are we from a wireless Thunderbolt equivalent, for instance? Since such a development could mean customers wouldn't need to buy multiple gadgets anymore, is there a chance companies like Intel might want to hold back or limit features that might otherwise be feasible, so as not to cannibalise their market with products that would be too good at doing everything?