Posted by kdawson on Friday February 22, 2008 @10:42AM
from the can-you-say-giga dept.

mickq writes "The Age reports that Melbourne scientists have built and demonstrated tiny CMOS chips, 5 mm per side, that can transmit 5 Gbps over short distances — about 10 m. The chip features a tiny 1-mm antenna, a power amp that is only a few microns wide, and power consumption of only 2 W. 'GiFi' appears set to revolutionize short-distance data transmission, and transmits in the relatively uncrowded 60GHz range. Best of all, the chip is only about a year away from public release, and will only cost around US $9.20 to produce."

At first blush, it seems like this is a Bluetooth replacement, until you look at the cost of the chips: almost ten dollars per unit! Wowza, that means it'll cost $15 to put it in anything.

'Course, I don't know how expensive Bluetooth chips are per unit, but I expect they're cheaper than that, especially with all the tiny USB Bluetooth receivers you can find floating around for $19.99 and under these days.

True, but all USB 1.1 gizmos are backwards compatible with USB 2.0, and this is hardly backwards compatible with Bluetooth.

In this case you have a totally different standard that appears to be competing not so much in the PAN area but in the wireless-USB area, and in that respect I see it competing with UWB and WUSB. However, WUSB is only 480 Mbits per second...

That said, at the moment, WUSB seems to be a solution looking for a problem, which leads back to my original issue. Where is this going to come in handy at this price point? Nobody's going to pay upwards of $35 for a glorified USB cable.

At this data rate, this appears to be competing not so much with the keyboard/mouse/printer USB connector as with the DVI video connector. Now all we need is some of Tesla's magic to transmit the electricity wirelessly and we're home free.

No magic required! Just take apart a microwave oven, find a way to focus the microwaves into a narrow bundle, and use a microwave antenna to convert it back to electricity. There you have it, wireless power! Heck, why bundle the waves? Just put an industrial-grade magnetron in the middle of your room and you don't even have to aim the emitter towards the antenna.

Small print: the author of this comment is not responsible for any side effects occurring during this experiment.

I've been working on a totally wireless monitor for years, and I've almost got the solution - details here [wikipedia.org].

To make it the most efficient, I use a directed beam of energy. I also pre-convert that energy to photons before sending it, so that the monitor won't have to waste energy doing the conversion. I also pre-modulate the signal spatially so that I only send the energy needed -- again, another win for efficiency.

Where is this going to come in handy at this price point? Nobody's going to pay upwards of $35 for a glorified USB cable.

This would fit the bill for an idea I had for supercomputer connections. Depending on how it's implemented, a wireless connection 'fabric' between nodes would allow for ad-hoc connections between any two processors, with no central switch needed. While the wireless speed might be slower per processor than something like InfiniBand, the potential for 5000 simultaneous full-bandwidth connections

You're kidding, right? Stick one of these in your laptop, then have one that's a dongle (at first) that you can plug into a USB 2(+) port. Instant FAST wireless. Then these will start getting built into things like digital cameras, monitors, etc. Bluetooth is way too slow for any decent digital camera. USB is a pain in many cases. Personally I use a Firewire card reader, and frequently wish it would go faster. 5 Gbps? Yes please. Will I pay $30 for it? Or $50? Definitely. Not that the price won'

Since you brought up Bluetooth, let me throw my 2c in about the price. When Bluetooth was about to arrive, they mentioned 10-20 cent chips. Then the dongles came out and they were $60-80+. Any device sporting Bluetooth was WAY more expensive than others (mostly cellphones).

Now when they tell me $10, I wonder (1) how much a dongle will cost, and (2) how much an iPod/other player, cellphone, or wireless storage device is really going to cost.

Well, if an NSLU2 (linksys file sharing device which is capable of running Linux), is g

Yeah, but Bluetooth is only a couple of Mbps and in practice seems to be much more susceptible to interference. The few times I've tried to use it for large data transfers have been pretty slow. It's just easier to grab a USB cable.

Right now there's a sort of race to come up with a Bluetooth replacement. UWB, wireless USB, etc. are the things this product wants to compete with.

I would think household wireless routers could utilize this, since most small-to-medium sized houses will fit within a radius of about 10 m from the router. Or even businesses that would rather have indoor WiFi (GiFi) available to customers than broadcast outside their building.

Not a bad idea. But I wonder how much the 10 m range is affected by walls. I also wonder how much it's affected by interference from cordless phones and other wireless devices. Usually when they say the range is 10 m, the actual usable distance is half that, and only when there are no walls.

It's basic RF. The higher the frequency, the worse the penetration. 700 MHz and 900 MHz go through just about everything (except dirt and metal). 2.4 GHz (802.11b/g) can go through wood panels, drywall, and some forms of metal (not many). I don't know the mathematical description for the relationship between frequency and absorption/penetration, but it gets pretty bad at about 5.8 GHz (802.11a). I can't imagine what it's like at 60 GHz and only 2 W of output power.
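Wall absorption aside, even the free-space part of the story gets worse with frequency. A quick sketch using the Friis free-space path loss formula (this is my own illustration, not from the comment; it ignores wall absorption entirely, which is the bigger problem at 60 GHz):

```python
import math

C = 3.0e8  # speed of light, m/s


def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (Friis form), ignoring absorption."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))


# Loss at the chip's quoted 10 m range, across familiar bands:
for f in (900e6, 2.4e9, 5.8e9, 60e9):
    print(f"{f / 1e9:5.1f} GHz at 10 m: {fspl_db(10, f):5.1f} dB")
# 60 GHz comes out around 88 dB vs. roughly 60 dB at 2.4 GHz,
# i.e. ~28 dB worse before any wall even gets involved.
```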

... who cares? This isn't a WiFi replacement, at all. This is a wireless USB replacement and then some.

At 5 Gbps you'd have enough throughput to put a hypothetical smartphone on your desk, and not only use your desktop monitor/keyboard/mouse for comfort, but be able to use your desktop's processor and RAM to accelerate the apps that still basically 'live' on your phone.

So imagine a setting where work data is coming off the network, personal settings and user data are coming off your phone, and desktop wo

Coupled with the higher power consumption, you get a higher data rate. This means you'll be using the device for a shorter period of time while syncing your mobile device to your computer. It's entirely possible that it will be a wash overall, though I think Bluetooth uses around 1/20th of the power and has a listed data rate of only 3 Mbit/s.

Of course, the truth is that it's just too early to speculate on its performance, as real world performance rarely matches up with theoretical performance.
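The "it might be a wash" intuition can be made concrete by comparing energy per transmitted bit rather than instantaneous power. A back-of-envelope sketch (the 2 W / 5 Gbps figures come from the article; the ~0.1 W Bluetooth figure is my assumption based on the parent's "1/20th the power" guess):

```python
def energy_per_bit_nj(power_w: float, rate_bps: float) -> float:
    """Energy cost per transmitted bit, in nanojoules."""
    return power_w / rate_bps * 1e9


gifi = energy_per_bit_nj(2.0, 5e9)       # GiFi: 2 W at 5 Gbps
bluetooth = energy_per_bit_nj(0.1, 3e6)  # Bluetooth: ~0.1 W at 3 Mbit/s (assumed)

print(f"GiFi:      {gifi:.2f} nJ/bit")
print(f"Bluetooth: {bluetooth:.1f} nJ/bit")
# GiFi draws 20x the power but moves ~1700x the bits per second,
# so per bit it is far cheaper, for the duration of the transfer.
```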

Think of a [slightly] different market... Most home theatres have a common issue: a rat's nest of cables for the various components. RCA/HDCP/HDMI/optical/etc. to connect a myriad of components: Xbox, Wii, PlayStation, receiver, amplifier, DVR, speakers x7, television, HTPC, remote control. If you could increase the cost of each of these devices by $10 to eliminate the requirement for cables... you could simplify the installation procedure and improve the "ease of use" factor. Take it out of the box, and pr

It consumes two watts of power. It is not a Bluetooth replacement.
Using my phone for comparison: 1100mAh 3.7 V
3.7V / 2W = 1.85 A
1.1 Ah / 1.85 A = 0.59 Hours = approx. 36 Minutes.
I know it won't be transmitting the whole time, but essentially this will be useless in a mobile application.

Little problem with your math there: I = P/V, not V/P. 3.7 V * 1.1 Ah = ~4 Wh. If that were just powering the chip, that's 2 hours, not 1/2 hour.

Now, a pessimistic guess that a 5 Gbps link will actually get something like 500 Mbps of data throughput: that's ~62 MB/s. At that speed it will take about a minute and a half to copy a DVD image. 2 W * 1.5 min = 50 mWh, or roughly 1% of the phone's battery life. Seems like it would be perfect for use on a mobile phone.
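The corrected arithmetic from these two comments can be reproduced in a few lines (a sketch; the 500 Mbit/s effective throughput and the 4.7 GB DVD size are the assumptions the comments use, and the resulting per-copy energy lands in the same ballpark as the ~50 mWh estimate above):

```python
# Battery: 3.7 V, 1100 mAh; chip draws 2 W while transmitting.
battery_wh = 3.7 * 1.1            # ~4.07 Wh total capacity
runtime_h = battery_wh / 2.0      # ~2 h of continuous transmit

# Transfer: pessimistic 500 Mbit/s effective, 4.7 GB DVD image.
throughput_b_s = 500e6 / 8        # 62.5 MB/s
copy_s = 4.7e9 / throughput_b_s   # ~75 s per DVD image

# Energy for one copy, and its share of the battery.
energy_mwh = 2.0 * (copy_s / 3600) * 1000   # ~42 mWh
share = energy_mwh / (battery_wh * 1000)    # ~1% of the battery

print(f"{runtime_h:.2f} h continuous; one DVD copy: {copy_s:.0f} s, "
      f"{energy_mwh:.0f} mWh ({share:.1%} of battery)")
```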

Case in point: at home, we just ditched cable and DSL and switched to an optic fibre triple-play (internet/IP TV/telephone) offer, which is much cheaper. For technical reasons the main receiver box can only be located near our entrance door, while the TV sits at the other side of the house.

Out of three possible solutions, none works well:
- laying an Ethernet cable in the ceiling is possible, but a headache
- IP over the power lines is unreliable
- WiFi, regardless of the flavor, doesn't provide enough bandwidth (keep in mind that the box streams several HDTV channels at once, for instance when recording one while watching another)

So in our case, the proposed chip and protocol sounds ideal. 10m doesn't seem like a lot, but it's more than enough to cover most apartments / houses, and I expect it will be possible to get signal at much greater distances, with degraded signal. 2.5Gbps over 20m, wirelessly, would rock.

10m doesn't seem like a lot, but it's more than enough to cover most apartments / houses, and I expect it will be possible to get signal at much greater distances, with degraded signal. 2.5Gbps over 20m, wirelessly, would rock.

Yeah, some type of repeater would be nice. Although if placed centrally enough, 10m isn't that shabby.

Of course they're going to be expensive in small quantities, but if this takes off, that price will come down drastically, to something more like $1-2 per chip, which will only increase usage. $10 isn't particularly expensive for cutting-edge technology like this to begin with, so it really won't make much of a difference. I think you'll also find most Bluetooth receivers at the $19 price range are pieces of shit that aren't worth the money, and you'll have roughly 1000 times the speed or whatever? (I don'

I'd use it for wireless (ignoring power) LCDs, as the monitor-PC cable is the last holdup for wireless KVMs (although I currently just use a laptop as my KVM).

Wireless external discs would be another. It's not a huge hassle to plug in an eSATA cable, but it would be kinda spiffy to just stack another enclosure on top of your computer (or just in the same room) and have another TB show up in your RAID.

Yes, because "Gyro" has no alternate pronunciations. I've heard Gyro (in reference to the sandwich) pronounced in no fewer than 4 ways, including Jiro, Yiro, Giro (with a hard G), and Hiro (no kidding).

Obviously it is "Guy-fie" because this will be the next "guy toy" - hooking up your home-theater system with no cables, new kind of remote controls with little video screens built in, all kinds of potential for guy toys.

The fact that the folks in the article actually have something that works. Vubiq says they have something, but it's larger and costs $12,500 for a "development system" (whatever that means), versus $10 for the one linked in the original article. All the other links you provided are still working on designs and haven't proven any design at all.

I would hope that this drops the price of wireless routers from what they are now, about US$60? The only drawback I could see is how the signal is transmitted through materials, as I live in a three-story townhouse and I have a room in the furnished basement. I have a Wireless-G router that I have had no trouble with, but the article says it is for short distances (<= 10 m) at a 60 GHz frequency. I would assume this is a high enough frequency to penetrate most household materials, including any cement or cinderblocks. I'm all for it, since most routers today just create a lot of noise and/or interference and confuse the laptop I have for some reason.

This will have nothing to do with routers or wireless internet access of any kind. This will strictly be for unit-to-unit communication that is line of sight (since 60 GHz won't penetrate ANYTHING), can't use wires, and needs high speed. It is NOT a Bluetooth replacement or WUSB replacement. I'm trying to think of the applications for this, since line of sight will be critical and there are few things I can think of that would require 5 Gbps and still be line of sight. Bluetooth is still fairly expensive to

I'm trying to think of the applications for this, since line of sight will be critical and there are few things I can think of that would require 5Gbps and still be line of sight.

Home theater maybe? All of your equipment can be in a location other than the front of the room, leaving just the display and speakers in the general viewing area (as in seen from guests viewing positions), and the chip(s) could be used to transmit wireless HD audio and video to the display and speakers.

If that were the case then we wouldn't be able to get any new members. For most people, the first time they ever plugged in a USB cable was quite a special moment. It takes a little while for the memes to totally eradicate any manly primal urges and associations that were once within.

From "Hi-Fi" (High Fidelity) to "Wi-Fi" (Wireless, but the Fi sounds cool and people vaguely know what you mean) to "GiFi" as gigabit wireless, you've basically lost the actual underlying words.

True, but the WiFi Alliance (the ones behind the "WiFi" name, logo, and certification (as well as the "Wireless x" branding), and completely UNrelated to the IEEE) does it because they want to ensure compatibility between various products. You do, after all, want to be able to connect your Intel chipset to your Net

Yeah, GIFI stands for General Index Of Financial Information!
Created by the Canada Revenue Agency in 1999, the GIFI is a system which assigns a unique code to a list of items commonly found on income statements, balance sheets, and statements of retained earnings.
The purpose of the GIFI is to allow the CRA to collect and process financial information more efficiently; for instance, the GIFI lets the CRA validate tax information electronically rather than manually.
Unpleasant...

So what if the "fi" suffix takes on a new meaning, different from its original one? Languages change, that's what they do. The only languages that don't evolve are _extinct_ languages. Get over it.

You know, the school of thought that language evolves badly and shouldn't be commented on is just as annoying as the thought that language doesn't and shouldn't evolve, which you incorrectly attribute to me.

they're just groups of letters people put together when they discover a new concept they need a new word for

They do that every time we discover one of their rules. It keeps us on our toes, as well as providing them with hours of entertainment as they describe our latest fuckup to all their friends and coworkers.

...and will cost $500 to get in your grubby paws. That is until the amazing powers of supply and demand take effect and the price drops over an unjustifiable period of time. The demand for 5G wireless will be huge...

You might not need it to your mouse, but would you want it to your billion-inch HD plasma hanging on the wall so you don't need to run cables? There are plenty of uses for this. No, none of them are necessary, and it's certainly more expensive than cables, for now, but more uses will present themselves in the future too. People didn't use to have any need for a home computer, but that didn't stop the industry from developing such that it's considered a standard appliance in most houses (in developed countries).

I, for one, would love to get rid of the massive collection of cables in my home theater. I, for one, would love being able to stop playing the 'do I have an open component/HDMI/optical/coax/whatever port' game. I, for one, would love to be able to buy a new piece of kit, plug it in, stick it on its shelf, and pair it with the video display and audio receiver, Bluetooth-style, and that's IT. It just works at that point.

My company is putting 600mW through 9mm^2, (switched through on-die mosfets) so 2000mW through 25mm^2 is high but not actually delusional. However, that 9mm^2 is the actual die size, not the package size. I don't know which they're talking about. If they're shady about package size, they'll quote die size, but if they're actually quoting package size then they're a big step closer to delusional.

I'm about as far from being an RF engineer as possible, while still holding the title of electron-rassler. What we're making are switching power supply chips, so this is the control circuitry and FET that are switching into a big inductor and capacitor, so there is some excitement past just resistive loads. But, still, we don't ever have the FET in between 'off' and 'on' for more than a nanosecond. I don't know anything about digital RF, but it's hard for me to imagine they can switch a big FET that fast

Short for "GirlFriend"? Ok, I was joking there, but I'm still wondering what in the hell the "fi" is for. With WiFi the Wi is "wide," and with GiFi the Gi is obviously "gigabit." The old "HiFi" stood for "high fidelity."

I was told "Wired Fidelity" at one point, touting the reliability of WiFi and comparable speed (initially) to 10 Mbit hardline ethernet..."it's like ethernet except no wires!". Take that for a what you will.

Here's some more info. Yep, it's just a brand name. The WiFi Alliance referred to it in slogans as "wireless fidelity," though, even though it doesn't mean anything, and apparently now they're trying to backtrack on it.

While they are the first ones out of the gate with an all-in-one CMOS solution, I doubt they will be the only ones. Look for Intel to have something available later this year (with the marketing power to make it successful). What we need now is someone like Sony or Toshiba to jump on board so that TVs (er, should I say monitors now) and audio receivers are integrated as well.

I mean WOW... $10 for something that has the transceiver and antenna on ONE single CMOS chip is awesome. Prior technologies requi

CardBus, PCI, local bus. Put the device on a laptop's mainboard or on a device's internal bus. Your question is like asking if you can run a USB host controller over your USB and FireWire 400 ports. No, there isn't enough bandwidth for the overhead, and no, it doesn't make sense to connect it that way.

The 10 m range means it's not really a substitute for 802.11. Also, the 60 GHz range is, to the best of my knowledge, not very effective at going through walls.