With Skylake, Intel wants to abolish all PC cables by 2015. What a terrible idea.

At Computex 2014, Intel indicated that its next-gen Skylake platform — the new microarchitecture tock after Broadwell — will enable new PC designs that are completely cable-free. Intel will use WiGig for wireless “docking” to peripherals and streaming media, and Rezence for wireless power transfer. Eventually, we might move towards a future where your PC’s only cables are inside the case. Obviously this is slightly different from the previous dream of using a single, awesome ultra-high-speed Light Peak cable to connect everything together…

Since prehistoric times, the humble PC has been inexorably ensnared in a tangle of wires. Because computers are electronic beasts that are powered by electricity, receive input as electrical signals, process data as electrical signals, and output data as electrical signals, conductive metal wiring has generally been the go-to material for connecting everything together. Copper wires are cheap, flexible, have huge bandwidth capacity, and can run incredibly long distances. Cables have served us very, very well.

But, as you all know, this over-reliance on cables can result in some pretty nasty nests of wiring that get inextricably and inexplicably knotted. Case in point, here are a few of my favorite cable fails:

The back side of a fairly large PC setup

This is what it looks like inside the average telecom box at the end of your road

In recent years, though, as wireless technologies have improved, it has become increasingly possible to do away with cables. The most obvious examples are cell phones (which replace the grandest cabling system in the world, the POTS) and WiFi, which, in the home at least, has almost made wired Ethernet a thing of the past. There are wireless mice and keyboards and headsets too, of course, and thanks to Bluetooth 4.0 LE more devices are being freed from their wired bondage every day.

But two areas have remained resolutely off-limits for wireless tech: Power delivery and outputting to a display. Until now. Kinda.

The thing is, we can wirelessly transmit power fairly efficiently with magnetic resonance, but the range is still very short (a few inches) and your device needs to be positioned fairly precisely (you can’t move around with the device; it has to sit on some kind of charging plate). We can beam data to a TV or monitor using WiGig — but with a max speed of around 7Gbps (in perfect conditions), there’s a pretty hard cap on resolution and frame rate; you won’t be playing a video game at 4K via WiGig any time soon, put it that way. WiGig uses the 60GHz band, which requires line of sight and only works over short ranges (a few meters).
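As a rough sanity check on that bandwidth claim, here’s a quick back-of-the-envelope calculation — a sketch that assumes uncompressed 8-bit RGB video and ignores protocol overhead, so the real-world numbers will only be worse:

```python
# Rough check: can a ~7Gbps WiGig link carry uncompressed video?
# Assumes 24 bits per pixel (8-bit RGB) and ignores protocol overhead.

WIGIG_PEAK_GBPS = 7  # theoretical maximum, perfect conditions

def uncompressed_gbps(width, height, fps, bits_per_pixel=24):
    """Bandwidth needed for uncompressed video, in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

for label, w, h, fps in [("1080p60", 1920, 1080, 60),
                         ("4K30",    3840, 2160, 30),
                         ("4K60",    3840, 2160, 60)]:
    needed = uncompressed_gbps(w, h, fps)
    verdict = "fits" if needed <= WIGIG_PEAK_GBPS else "does NOT fit"
    print(f"{label}: ~{needed:.1f} Gbps uncompressed -> {verdict}")

# 1080p60 needs ~3.0 Gbps (fits), 4K30 ~6.0 Gbps (barely fits),
# 4K60 ~11.9 Gbps (does NOT fit in 7 Gbps)
```

Lossy compression can squeeze 4K60 into a 7Gbps pipe, but that adds latency and artifacts, which is exactly what you don’t want while gaming.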

But, seemingly, none of this is going to stop Intel. At Computex 2014, the company showed off a range of prototype devices that use WiGig and Rezence (magnetic resonance charging) to achieve cable-free nirvana. There was a wooden table with Rezence tech built in, which could wirelessly charge an Intel tablet sitting on top. There were some WiGig displays. By the time Skylake rolls around, probably in 2015 or 2016, it seems that Rezence and WiGig will be built into the platform — or at least be an easy addition to the platform — allowing OEMs to push out lots of laptops that can be joyously marketed as “cable-free.”

A server rack, with beautifully tidy cabling. Not all cables are messy!

A tidy, wall-mounted PC

Personally, I would much rather Intel invested its time and effort into One Cable To Rule Them All — a high-bandwidth, multi-purpose cable. Something like Light Peak, or its bastardized offspring Thunderbolt. I think wireless displays and charging have their place — but more in the sense that it’s useful to send photos and videos from my phone to a TV, and I’d love it if power could be beamed to the smartphone in my pocket, rather than to a laptop on a desk.

Having said that, though, I’m sure there are a lot of people who are pissed off by having to plug a power cable into their laptop (and, of course, the added risk of tripping over it). If Skylake makes it cheap and easy to deploy cable-free laptops, then so be it — but just be aware that it’s more of a mom-and-pop thing than a power-user thing. Also, with decent battery life, isn’t a laptop already kind of cable-free anyway? In the photo at the top of the story, is Intel’s cable-free laptop actually just a laptop sitting on a table?

I would rather have the cables than a cancer-causing, brain-microwaving wireless setup.

Marc Guillot

Indeed, they are gonna need a massive amount of microwaves to feed that example PC setup with power and data. I’m not very sure I wanna be around there.

James Tolson

Have you ever seen the movie “Scanners”? I have seen secret Intel film footage testing wireless technology with the same head-exploding effect ;-)

Better be safe than sorry. Sitting in an area with tons of EM energy around can’t possibly be good for you in the long term. Microwaves heat up water, and our body depends on water to shape proteins & enzymes and for them to function properly. One day one of your cells might be dividing and one of the enzymes involved in the replication gets hit by some EM radiation and screws up, copies DNA a little incorrectly, etc. BAM, you’ve got a mutation and a possibly cancerous cell. If cancer manifests a few months or years down the line, there’s no way you could assign causality to that EM field you were sitting in. Like I said, better safe than sorry. Wiring up your devices is a tiny price to pay.

Joel Hruska

“Sitting in an area with tons of EM energy around can’t possibly be good for you in the long term.”

1). You are exposed to orders of magnitude more EM from the sun than you will ever receive from any other source.

2). All of the radiation in the home is non-ionizing radiation. That means it is incapable of producing the genetic effect you just described.

It’s not a matter of “better safe than sorry.” What you have just described does not happen. Cannot happen. The laws of physics do not permit it to happen.

I agree that for data transmission the needed power (mW) is so low we probably have nothing to worry about.

Lots of studies have covered this and we have used it for decades with no clear negative effects on human health.

However, I can’t yet say the same thing about power transmission. As the name implies, the power is several orders of magnitude higher (think 10 W) if it’s to be useful. We have not been using it for as long, and not as many studies have been done.

I’m not too worried, but we definitely should be more careful with power transmission than with data transmission.

Joel Hruska

I don’t think anyone is necessarily talking about wireless power transmission. Even wireless charging takes place across a gap of inches, not meters. Intel, as far as I’m aware, is talking about wireless video and wireless data, possibly with induction charging — but not wirelessly powering a Thunderbolt display from inside a laptop.

Magnus Blomberg

So Intel’s vision is: “Wireless, but only when inches away from a wired device.” I can’t wait!

What am I missing?

nick hawk

some people just hate innovation..

Magnus Blomberg

I have no problem with innovation. This is just not very innovative.

nick hawk

lol.. wireless not innovative? wireless technology is the future whether u like it or not.. it would be quite tedious if we still used the soon-to-be antiquated cables.. especially managing them.. it would also cut the cost of materials needed to build a computer or other electronic equipment..

with innovations and upgrades to wireless tech, I’m sure it would become much more efficient and its advantages would defeat cables.. and that will be the end of cables and wires..

Magnus Blomberg

I’m all for wireless too, but not with all the sacrifices this implementation requires: inductive charging range of a few inches, bandwidth not enough to get an 8K 120fps 10-bit color image through without lossy compression, sensitivity to interference, weakened security.

There are a lot of “kinda”s and “sort of”s in the article.

I’d take a foot of cable and a lot more certainty over all that any day.

Call me when any of the problems I described are solved.

foljs

The need to plug and unplug, and the need for a cable.

traderjoel

“The WHO/International Agency for Research on Cancer (IARC) has classified radiofrequency electromagnetic fields as possibly carcinogenic to humans (Group 2B), based on an increased risk for glioma, a malignant type of brain cancer, associated with wireless phone use.” – http://www.iarc.fr/en/media-centre/pr/2011/pdfs/pr208_E.pdf

Here’s my point. Look at the benefit vs. the cost of having all of your devices *unnecessarily* wireless. I say unnecessarily wireless because truly it is not a hassle to plug in my monitor, speakers, keyboard, etc. The benefit of wireless = no cluttered cables. The cost? Some of the cells in your body might randomly get cooked by microwaves. Microwaves are proven to cook flesh… put a steak in your microwave, turn it on, and you’ll see the effects. The effects are clear: the meat, proteins, etc become *denatured*, meaning, they lose their shape, functionality, etc because they are cooked. Take that scene to the next level: the enzymes replicating your DNA. If they screw up, it could cause a cancerous cell.

Now the power output from a keyboard’s microwave transmitter might not be much. But if your body crosses the beam of transmission, even if it may be 100 mW of power, you might have a few cells here and there cooked. You won’t know about it and it’s not likely to cause a problem, because your body’s regenerating cells all the time. Add a few more devices, and that 100 mW of transmission can add up to 500 mW and more.

Why bother with this risk, though it be tiny? Just hook up your devices with cables and avoid it entirely.

Regarding the sun… yes long term exposure to sunlight is known to cause cancer. Regarding cancer, it only takes one screwed up cell to cause cancer. Regarding risk, you can’t eliminate it from life entirely, but you can at least not expose yourself unnecessarily for little benefit.

Chris Bordeman

I doubt the cancer thing is a major risk.
However, another cost is much lower transmission rates and much higher latency, a hallmark of wireless (air will never conduct nearly as well as copper!). And that’s why IT pros still wire their homes, and the *main* reason every business that has an IT department still uses wires.

CyberAngel

You really believe that it’s the microwaves?
The educational system in your country must be abysmal…

Chris Bordeman

“1). You are exposed to orders of magnitude more EM from the sun than you will ever receive from any other source.”

We evolved with the sun’s radiation ever present. The copying process that takes place in the cells of all living things is extraordinarily well suited to avoiding damage by solar rays. The same cannot be said of other types of radiation, which is why this #1 is meaningless.

#2 could be correct (I didn’t read it).

Joel Hruska

We take plenty of damage from UV light emitted by the sun. That’s why darker skin tones evolved. People living near the equator need more protection from UV; people living in the farthest north need to absorb more Vitamin D.

Notably, the damage we take from UV light is *not* caused by ionizing radiation, because the UV band that reaches Earth is still non-ionizing. Ionizing radiation knocks off an electron from atoms.

So no, the sun absolutely can harm us, but it harms us through specific, well-understood mechanisms that do not apply to any conventional devices.

Chris Bordeman

Well only your first point relates to your initial first point (which is the only one I differed on).

“1). We take plenty of damage from UV light emitted by the sun.”

But not cancer causing genetic damage, which is my point. Our cells evolved to be incredibly good at replicating DNA with tons of solar radiation, because it was always there.

So saying these newer forms of radiation are not so bad because there’s a lot less of them than what we get from the sun is not correct. There may be other reasons cell and wireless radiation are harmless (you stated one of them), but that’s irrelevant to the incorrect logic of your first point.

Joel Hruska

The point I was originally making is that chalking up fears of possible damage to “EM radiation” is pointless because yes, you are exposed to far more EM radiation from the sun than any other source.

If you want to identify the sources of actual potential damage, identify them as something other than EM radiation. But since none of them are harmful anyway at the levels of exposure we actually have in real life, this thread is rapidly becoming pointless.

Chris Bordeman

And your initial point was incorrect. Our cells can handle massive amounts of sun radiation because we evolved with it ever present. So you can’t compare the sun’s radiation to other forms. Please tell me you understand this.

Long-Term Exposure To Microwave Radiation Provokes Cancer Growth
“In this review we discuss alarming epidemiological and experimental data on possible carcinogenic effects of long term exposure to low intensity microwave (MW) radiation. Recently, a number of reports revealed that under certain conditions the irradiation by low intensity MW can substantially induce cancer progression in humans and in animal models. The carcinogenic effect of MW irradiation is typically manifested after long term (up to 10 years and more) exposure. Nevertheless, even a year of operation of a powerful base transmitting station for mobile communication reportedly resulted in a dramatic increase of cancer incidence among population living nearby.”

Does c4st look like a credible source to you? It’s pseudo-science galore…

ncgh

Doesn’t make much difference. When the Luddites start framing their scare stories, logic or science won’t make a difference.

Neel Gupta

Bluetooth is Microwave frequency.
WiFi 5GHz is Microwave frequency.

Maventwo

But there must be a huge difference in effect between transferring electricity to a laptop, tablet, or even a large TV screen and transferring it to a large web server.
Intel is a member of Rezence (formerly A4WP, the Alliance for Wireless Power), alongside Qualcomm, Samsung, and some other electronics companies.
Rezence works with the same magnetic resonance tech as MIT spin-off company WiTricity (or in cooperation with WiTricity).

Marc Guillot

That Asian chick is hot.

About the news … no idea, I just came to comment on how insanely gorgeous most of the girls at the Asian events are. :-)

Jack

Unfortunately she’s an imported prostitute, so she’s not available, so don’t get any ideas :)

Justin

So you’re saying there’s a chance…!?

Oranji Juusu

How much money do you have?

eonvee375

AMD users usually don’t have a lot set aside for spending ^^
source: personal exp

Antoine Talbot

But she is wireless… That means you can’t plug any sort of male connector into her… Sadly. You will have to radiate your love at a distance through your WiDick.

Chris Bordeman

Hmm…you just gave me an idea for an app!!

Chris Bordeman

Yeah, they starve ’em good! :)

Jack

Lol that telecom box xD

Phobos

Sebastian, is that your pc?

Jack

Unfortunately no!
He currently has an i3 setup; he can only dream of such a setup, poor guy.

http://www.mrseb.co.uk/ Sebastian Anthony

Nice try! 4770K o/c to 4.5GHz tyvm!

Bryan_S

That’s cute, why are you on the plebeian socket?

http://www.mrseb.co.uk/ Sebastian Anthony

As opposed to one of those Ivy Bridge-E chips?

This thing was better for single-threaded stuff, I think. The only multithreaded stuff I do is video encoding — and that’s pretty rare. Couldn’t quite justify the insane cost of the top-end chips :)

(If Haswell-E had been out at the same time as Haswell, I may have got it though.)

Bryan_S

I was mainly teasing. It doesn’t make sense to go for the ultra high end unless it’s primarily a workstation. Gaming certainly doesn’t use it. Even BF4 doesn’t go past 12 threads. Although… there is something uniquely satisfying about watching CPU load hit 15% max playing BF4… I have been tempted several times to build a 1P box with HW and see if it is any snappier, but I honestly don’t think it will be.

I particularly like the empty roll of toilet paper sitting atop your speaker.

Is it overclocked as well? Or just the toaster?

http://www.mrseb.co.uk/ Sebastian Anthony

The toilet roll isn’t overclocked, no.

The rig is overclocked about as far as it’ll go on standard air cooling. There’s plenty of space for water cooling, but not sure I want to go down that route. I do a lot of LAN parties (it’s already pretty heavy)

massau

Why is your PC standing backwards?

brekinapez

My rolls are all overclocked, so I can generally process a “data transfer” with three sheets or less.

Phobos

ah, then it must be Joel’s.

Joel Hruska

The point to using wires is that they work really, really well. Sure, I like the idea of wireless charging and wireless playback, but it’s always going to consume less power (and allow more bandwidth) to shove a signal down a wire as opposed to through the air.

Wired networks consume about 1/3 the power of wireless networks, and while this report is from 2011, there’s no way to build fast wireless networks without consuming significantly more power to drive them. The closer wireless gets to wired network speed, the more power the wireless network uses.

There’s no solution for that, so long as we concentrate on ramping speeds up just as quickly as we can improve process technology and drive power consumption down.

Jack

No one cares we all love cancer :)

pelov lov

Didn’t Intel showcase this at one of their IDFs 2 or 3 years ago? They showed wireless charging of a notebook, WiDi, and even wireless charging of a smartphone via an Ultrabook. Seems like we might see some of it in a product.

massau

I think the only way to really make it power-efficient is to use a direct communication beam between the two devices instead of an omnidirectional beam.
The only problem is the devices finding each other when communicating. Also, I wouldn’t like to design an antenna for this system.

Marc Guillot

DockPort, which VESA has folded into DisplayPort, looks like a good candidate for a multi-purpose standard that connects everything over a single cable (and did I say cheap? It’s royalty-free and passive).

Angel Ham

Will this wireless technology be interrupted by the interference caused by turning on a blender?

Joel Hruska

Not if your house is properly wired and your appliances weren’t built in Communist Russia. ;)

johnqpublic

I lose my wireless every time I use my microwave (it’s a Panasonic).

Joel Hruska

No, I’m just kidding with him. I’ve seen this happen.

massau

Some microwave ovens use the same wavelengths as WiFi; if the oven is badly shielded then you will have trouble (maybe 1mW leaks out, which is enough to cause trouble).

massau

What if it comes from Communist China? Is that OK?

massau

Couldn’t they just make a USB 4 100W+ standard, so you plug in one USB port and get Ethernet and all your peripherals on it after it goes through a switch?

Glasskeys

Wireless charging is idiotic. Wireless chargers are inefficient, and paradoxically use more wire than regular cables do. Additional coils are needed not only in the charging surface, but also inside each of the devices that support wireless charging. Inductance & the inverse square law make it energy-inefficient, and are the same reason each device can’t charge more than a couple of inches away from the wireless charging surface.

All of this to shave an extra second off the time it takes to plug a device in — even less for setting a device on a well designed traditional type of stand that uses connectors.

If wireless charging isn’t a marketing gimmick created as an excuse to charge people extra for something totally unneeded, nothing is.

dc

Very good points. While I love new technology, it has to serve some value. If anything, this is a step backwards: increasing energy costs, decreasing reliability, and, as you say, paradoxically requiring more materials.

Joel Hruska

Clearly you have never accidentally yanked on a device while plugged in and damaged it.

>.>
<.<

Not that I've done it PERSONALLY. ;)

dc

Totally depends. If cables are still “optional” then it’s fine. I don’t want no ranged-charging nonsense though, and if they try to force me to buy that setup I’ll pass.

Mo Friedrich

That server rack just gave me a nerdgasm. Mine never look that beautiful :(

tricorn

That’s because, as soon as it looks like that, someone comes along and says “hey, can you switch the connection in room 203 to the other hub” and you have to move that one cable from the bottom row to the top.

ronch

Today’s PCs are fairly easy to set up. When I was a kid we had to deal with all sorts of ports, from serial ports (there were the 9-pin and 25-pin types, IIRC), parallel ports, Game ports, PS/2 ports. Now we have USB. Thing is, everyone just seems to want to create their own standard for their own benefit, not everyone else’s. So now that DVI works just fine for my monitor, there’s also DisplayPort, HDMI, and who knows what else they wanna put in to make plugging in a monitor more complicated than it is (people then wonder which video port is best to use). Then there’s eSATA, Firewire, etc. I’m ok with cables, but what people want, I reckon, is a port that works with everything. That would be USB, of course, sans all the other ports cluttering my rear I/O port cluster. And if you’re too lazy to even plug your laptop’s power brick, then I’ve got bad news for you.

disqus_oARMhnwJTA

Since the WHO International Agency for Research on Cancer has classified radiofrequency/microwave radiation (YES, WIRELESS EMISSIONS) as a ‘Possible Human Carcinogen,’ why the hell would any company go all-wireless? Inflicting this on consumers is a terrible corporate strategy. At least keep some products with cables (not wireless) so informed consumers can have a choice in the matter. And, for God’s sake, don’t let your kids use wireless devices of ANY kind.

Matt Hunn

I have to agree that going completely wireless is good in theory, but it’s perhaps a bit too soon for something like that to go mainstream. MDH Tech is a Dallas computer support company, http://mdhtech.com; we work in a lot of areas that are inundated with wireless access points. High-rise buildings, for example, where there are hundreds of tenants, all with wireless networks. We have found that dealing with the interference of so many wireless networks in such a small area can sometimes be problematic. There may need to be some due diligence in that regard before this will work well (reliably) in a business environment. It will be interesting to see what happens with this technology.

johnwerneken

I have yet to see ANY application of ANY form of wireless ANYTHING that was, is, or could be imagined to be as RELIABLE and ROBUST as the wired equivalent. Bandwidth alone tends to just rule it out.

Tooluka

Whatever you do with wireless in a standard concrete apartment, eventually you will wire everything with gigabit Ethernet. I gave up on “fixing” WiFi at home after buying a $200 dual-band SOHO router, an additional $100 access point, and a $100 repeater (the most useless thing I have ever seen), and playing with bands, frequencies, etc., all to cover less than 50 sq. m. Everything was useless, so I put copper in every room and I’m done with it. Phones and tablets will suffer, but whatever.

Olav

What happens if there is magnetic interference from other equipment? Wouldn’t that be equivalent to yanking out cables while the PC is powered on?
Some things should remain old-fashioned.

erlewis

Laptop? I saw a pretty girl …

Blastergamer

No extra microwaves, please. We have WiFi and 4G/LTE, which are not that powerful, but I don’t want the harm to keep increasing.

Laststop311

I don’t get it. WiGig already exists; Alienware had it on one of their laptops a couple of years ago. Your TV had the WiGig receiver plugged into its HDMI port and the laptop would beam its display to the TV. And wireless charging already exists. Both of these features work with current tech and don’t require Skylake to run. This can be implemented in anything right now. Tying this to Skylake makes no sense. Now, if you could wirelessly beam enough power so you don’t even need to plug your monitor or PC into the wall, that would be something to talk about. These features are as bad as the smell of my finger after I stick it in my belly button.

phredered

I don’t see how any of this helps me render even small animations in less than say overnight :) .

http://www.daha.co.uk/ Daniel Harris

Actually, what I don’t get is that many people here are just arguing about what is and what isn’t dangerous. I don’t care what anybody else thinks. I’m convinced that adding more radio waves is incrementally bad for me. That’s my choice! If you don’t think they are bad for you then fine – have everything wireless. That’s your choice. Just don’t force me to have everything wireless because I don’t want it. OK?! Ever heard of “freedom of choice”? :-/
