Posted
by
Zonk on Thursday July 19, 2007 @01:56PM
from the now-where-is-my-hud dept.

Anonymous Howard passed us a link to the Press Escape blog, and a post about the future of ultra-fast wireless connectivity. Georgia Tech researchers unveiled plans to use ultra-high frequency radio transmissions to achieve very high data transmission rates over short distances. In a few years, the article says, we'll have ubiquitous multi-gigabit wireless connectivity, with some significant advances already under their belts. "The GEDC team has already achieved wireless data-transfer rates of 15 gigabits per second (Gbps) at a distance of 1 meter, 10 Gbps at 2 meters and 5 Gbps at 5 meters. 'The goal here is to maximize data throughput to make possible a host of new wireless applications for home and office connectivity,' said Prof. Joy Laskar, GEDC director and lead researcher on the project along with Stephane Pinel. Pinel is confident that very high speed, point-to-point data connections could potentially be available in less than two years. The research could lead to devices such as external hard drives, laptop computers, MP3 players, cell phones, and commercial kiosks that could transfer huge amounts of data in seconds, while data centers could install racks of servers without the customary jumble of wires."

These speeds are basically marketing hype, like the need to declare 54 Mbps at half-duplex. We all know marketers love big numbers, but what they don't tell you is that it runs at half-duplex, so you're lucky to get even half of the rated speed. Although, at higher bandwidth rates (and hopefully increased throughput!) this will become less and less of a problem.

But the point is that if you throw a hard drive to a friend, say containing 40 GB of data, and the throw takes 4 seconds, then you have just achieved 10 gigabytes per second of bandwidth wirelessly!
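For what it's worth, the joke's arithmetic checks out. A quick sketch, using only the numbers quoted above:

```python
# "Sneakernet" bandwidth: a thrown hard drive carrying 40 GB of data
# that lands in your friend's hands after a 4-second flight.
payload_gb = 40        # gigabytes on the drive
flight_time_s = 4      # duration of the throw, in seconds
bandwidth_gbps = payload_gb / flight_time_s
print(bandwidth_gbps)  # 10.0 -- i.e. 10 GB/s, or 80 gigabits per second
```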

now if only we could throw faster!

Transferring data is one thing, but transferring it in a way that makes it usable is another. Sure, you can achieve over 10 GB/s by throwing a hard drive to your friend, but if you want to transfer "live" data that your friend needs to use, you also need to factor in the time it takes for the data to be read off the drive.

while data centers could install racks of servers without the customary jumble of wires

Somehow I don't see "whole data centers" using a data transmission method where any device can potentially intercept the data going to and coming from any other device. Might make your hosting clients a bit nervous.

Just because some encryption mechanisms are crappy doesn't mean you can't put together good encryption. SSL with a reasonable key length is essentially unbreakable unless there's an exploit in the encryption software.

Some day someone may come up with a trick to defeat it, but there are lots of other encryption mechanisms that are just as secure.

Honestly, I don't get why wireless uses WEP or WPA. We've had VPNs and SSL secure enough for very valuable financial data for a long time; they are proven in the field.

True, just saying that given the choice between leaking data all over the place and taking for granted that it's secured by some unassailable algorithm vs. having a fiber-optic cable transmitting the same data, sufficiently security-conscious people aren't going to opt for wireless...especially an unproven wireless technology.
Good encryption means different things to different people, and it also means different things depending on the type of data being transmitted and its value to someone who wants to get it.

Honestly, I agree - for a long time to come, reputable datacenters won't (and shouldn't) use wireless except in limited circumstances. The common acceptance of the sad state of wireless security gets to me sometimes, and I tend to be reactionary when people claim it as a real problem rather than something easily solved if the standards boards and/or manufacturers would just raise the bar. In a datacenter, though, there is no reason to go with an unproven technology, especially when the alternative is to run a cable.

I didn't read TFA, but I find it hard to believe that there's enough spectrum available to permit a dozen or so racks of 1U servers to communicate with UHF signals. Especially if (as is becoming common) they're hooked up to both a SAN and a router. Couple the bandwidth required for both signals with frequency separation requirements (so signals don't interfere with each other) and pretty soon you've got signals spread across more spectrum than one antenna can handle effectively. Then what? Do you install more antennas?

It doesn't make any sense to make a network card emit microwaves at intensities similar to a microwave oven, because not only would you get huge power consumption, it is also massive overkill unless you plan to search the sky for stealth bombers. The FCC (or local equivalent) would probably have a few things to say about it as well. The scaremongering about radiation from communications equipment is simply unbelievable. You are more likely to get hurt from tripping over a Cat5e cable.

While people do go crazy about the effects of "radiation" from electronics, there can be problems. In the VHF range, for instance, about 200 W of energy can give you a severe burn. Different frequency ranges have different absorption levels in the body. The damage from electromagnetic radiation is somewhat cumulative, so a lot of RF at high power is not a good idea; they say that RF burns are like sunburn under the skin.

Even if you had a 2.4 GHz wireless device powerful enough to actually harm you (and you don't -- your cell phone is orders of magnitude more powerful than your WAP), you'd have to be unconscious for it to do any real damage, simply because you'd feel your body being heated by it, and get the fuck away from it long before it heated you to a point where it could actually hurt you. So, yeah, if standing in front of your Linksys feels like you're in a 400 degree oven, then you have a problem. Good news is, it's p

Maybe some lower security data centers might enable wireless, but I doubt it. Being that we're a financial institution (a small one, mind you), there's no way in the h to the e to the double hockey sticks that I'd ever enable any kind of wireless anything in our data center.

I'd rather deal with a network cable gone sentient and whipping around like a snake and attacking people, than go wireless at the data center.

Only an idiot thinks there's a wireless transmission that's invulnerable to being intercepted. Heck, wired communications aren't 100% secure, either, but my boss's business is about minimizing risk, and wireless networks even inside a data center is not minimizing risk.

I figure you could put it in a Faraday cage of some sort. Still, I'd prefer a little planning and cable management to several hundred machines and peripherals transmitting wirelessly any day. Especially since I have to spend days on end in there every so often.

My little cage at the colo doesn't have 5 servers. It has hundreds. I'm also sharing that datacenter with many many other companies that have cages with hundreds of servers. We deal with SAN / iSCSI, NAS, backups over networks, etc. With the noise and limited bandwidth available in a shared frequency space, I seriously doubt any type of wireless will be very useful in a datacenter - especially since everything is already connected via hard-wired connections.

It also won't be very useful in my home, where wires are already easy to run for the short-distance devices, and noise / distance prohibits the use in cases where I could really use and WANT high-speed wireless.

So it does sound like a neat trick, but what is a valid, viable use case for it?

I could REALLY use something much different. I want to get rid of the 20 or so wall-wart power supplies under my desk. I want one larger power supply that I can run small cables to all the devices. Why can't devices negotiate for how much voltage / current they need?

I want something different too. I don't want higher speed, I want more range. I want one or two megabits at 30 miles NLOS. Either simple point-to-point, with many different 'channels' for separation, or point-to-multipoint. Of course, the question of whether such a thing is technically possible is irrelevant, because the telcos would kill it in its crib anyway.

I want something different too. I don't want higher speed, I want more range. I want one or two megabits at 30 miles

Thirty miles is an alright start, but I'd like at least a 100 or 200 mile range. I love hiking and photography and would like to be able to transmit my photos wirelessly to a server. While it may be possible to do so with a 30 mile range, that would require a lot more tower transceivers.

My little cage at the colo doesn't have 5 servers. It has hundreds. I'm also sharing that datacenter with many many other companies that have cages with hundreds of servers. We deal with SAN / iSCSI, NAS, backups over networks, etc. With the noise and limited bandwidth available in a shared frequency space, I seriously doubt any type of wireless will be very useful in a datacenter - especially since everything is already connected via hard-wired connections.

I've seen security rooms inside datacenters that had copper cloth over the windows, etc etc. What if every cage in the colo were a faraday cage? In theory, wouldn't that permit this? Or, how about UWB? Isn't UWB supposed to allow an effectively infinite number of transmitter/receiver pairs to operate together? If the whole building were shielded so that it wouldn't penetrate, it would eliminate interference issues.

I still think that fiber is more desirable. I wish it were cheaper (although it's getting cheaper).

I wouldn't be surprised if someone proposes a standard for low-voltage DC distribution in the home. You'll wind up with dual-socket outlets, with your standard AC socket and two to four 12V sockets. Maybe use a multi-bladed plug to determine how much current the device can sink (each blade signifies 500 mA, so a four-bladed plug can sink 2A). Somebody else has probably already thought this out in detail, so I'll just wait for someone to post a link to a complete spec...

That's exactly why, when I first read about this, I thought that the appeal of high speed wireless would mostly be on the consumer end. Most businesses are bound to see the potential security risk of wireless and stay away from it, regardless of how fast it is. As I don't manage any data centers, I'd love it. Mostly because the wife has forbidden me from running CAT5 through the house and I'm stuck with 802.11g connections. It's annoying to try to transfer a large file from the office upstairs to, say,

there's no way in the h to the e to the double hockey sticks that I'd ever enable any kind of wireless anything in our data center.... my boss's business is about minimizing risk, and wireless networks even inside a data center is not minimizing risk.

Your network is on the internet. That and any non-free software you have are bigger threats than sftp over wireless.

Even better than that, if he'd linked to his own post [slashdot.org] like he normally does, we'd all have seen that I posted a link to the comment list where the employee in question admitted he was in the wrong:

Agreed, but it's less about security and more about speed and troubleshooting, I would think. Sure, my home datacenter (a NAS and an Xbox 360) might like to use wireless, but tell that to a guy trying to get 10-40 Gbps out of his servers. I don't think that 15 Gbps is going to do it across his datacenter.

Only an idiot thinks there's a wireless transmission that's invulnerable to being intercepted. Heck, wired communications aren't 100% secure, either, but my boss's business is about minimizing risk, and wireless networks even inside a data center is not minimizing risk.

Only an idiot thinks his copper connections aren't radiating his data on RF frequencies that can be picked up outside his building with the right gear. This tech is decades old.

I see my comment wasn't clear - using secure communications over any media is much more important for minimizing risk than whatever that media happens to be. E.g., if you want to minimize risk and have to choose between TLS over wireless or in-the-clear over wired, go with the wireless. If you're using TLS on both, I'm not sure that wired gets you any more security, though wired has plenty of other advantages, but I'm not sure the 'never wireless in the data center because of risk' rationale is necessarily true.

Pinel is quick to point out that a multi-gigabit wireless system would present no health concerns, as the transmitted power is extremely low, in the vicinity of 10 milliwatts or less, and the 60 GHz frequency is stopped by human skin and cannot penetrate the body.
The team admits that the fact that multi-gigabit transmission is easily stopped means that line-of-sight is essential, and this could be a stumbling block in practical settings.

Doesn't this make it being wireless kinda pointless? It's like a wired connection where you can't step over the cable or drill a hole through the wall!

I always thought it would be cool to have a pad that was nothing more than a screen and input device that you could carry around the home instead of a full-fledged laptop. You would be actually "running" your powerful desktop off basically a second screen that you could carry around with you in the house.

Well, let's do some math. Let's say we've got a 1680x1050 display at 24 bpp and an update rate of 60 Hz. That's 1680*1050*24*60 bits per second -- in other words, about 2.54 Gbps (or 2.37 if you divide by 1024^3). So, yes, a connection like this could conceivably run a remote display.
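The paragraph's arithmetic, written out as a quick sanity check:

```python
# Uncompressed bandwidth for a 1680x1050 display at 24 bits per pixel,
# refreshed 60 times per second (the figures used in the comment above).
width, height = 1680, 1050
bits_per_pixel = 24
refresh_hz = 60
bps = width * height * bits_per_pixel * refresh_hz
print(f"{bps / 1e9:.2f} Gbps")  # 2.54 Gbps -- within a 15 Gbps link's budget
```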

I believe the max data rate for 802.11n is 248 Mbps, so it's unlikely. I've rarely seen PNG get better than 2-to-1 compression on photo-quality images -- not to mention that compressing that much data in real time would take a significant amount of CPU time.

You could cut the bandwidth down a lot with interframe compression. Unless you're playing an FPS, you're unlikely to be updating every pixel 60 times a second. Even something relatively simple like VNC could probably run quite fast over a 200 Mb/s network. Of course, it would make more sense to have an X server in your display, so you would just send the high-level drawing commands over the absence-of-wire, including OpenGL commands if the display supports GLX. You can buy ARM CPUs with on-board 3D for very little.

What you're talking about is a remote (aka "dumb") terminal -- those already exist, and you can build one yourself, if you want. However, they require more expensive (and bulky) hardware than a simple monitor, since you'll actually be responding to X events. On top of that, such a solution wouldn't work with any operating system that didn't use X (or whatever other windowing system it was based on), and you wouldn't be able to use it as a primary display before your operating system had loaded (or had started X).

I don't know how mice work, either, but let's think of a way to do it. Let's say that our mouse is an optical mouse that will take 200 samples per second (that's actually quite a lot), and it transmits back to the computer how much its X and Y positions have changed. Surely a 4-byte int is enough precision for each of those. We can assume that actual clicks happen incredibly infrequently compared to that, and they only take up a few bits, so they're insignificant. So, that's 1,600 bytes per second, or about 12.8 kbps.
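The estimate above can be sketched in a few lines (the 200 Hz sample rate and 4-byte deltas are the comment's own assumptions):

```python
samples_per_sec = 200     # optical mouse report rate assumed above
bytes_per_axis = 4        # a 4-byte int for each of the X and Y deltas
bytes_per_sec = samples_per_sec * 2 * bytes_per_axis
print(bytes_per_sec)                 # 1600 bytes per second
print(bytes_per_sec * 8 / 1000)      # 12.8 -- i.e. about 13 kbps
```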

It's a great idea/product, but unfortunately it bombed in the marketplace. I am writing this reply with mine. I love it! All it really is is a remote desktop slave, but I can administer my whole network from my living room while watching a movie, lounging on the couch.

When it first came out ~2002, they were a little less than a full-featured notebook computer. I got mine in early 2004 from a company that buys pallets of discontinued tech products for $400.00 (that price included AL

That looks interesting, but I was thinking more of just a 2nd display rather than a remote desktop on a lightweight computer. The original price on that thing was the same as a laptop - no wonder it failed. At $200 it might be okay for some things, but $400 still seems a little steep. I don't imagine the video card was too great, while in the case of a pure 2nd display it would be utilizing whatever video card your desktop has.

At $200 it might be okay for some things, but $400 still seems a little steep

The latest iterations of these go for about $1200.00, which is just crazy in my estimation; however, the $400.00 to me is reasonable, just for the freedom of not having a smoking-hot notebook in my lap. The thing runs cold as a stone. Also, I have been using it for 3 years and there's really nothing to wear out 'cept the battery and the stylus.

Since I am not a video/gamer, the video quality is OK for what I use it for. The active matrix is neat (use your fingernail as a stylus), and at 600 x 800 it is just about right.

You'll be able to watch pr0n through your neighbors open wireless network *and* fry up a steak by positioning the frying pan between the access point and your notebook. Don't worry, the sunburn should fade in a few weeks.

I can't see any real application for this in a data center. They'll always use wires, switches, and routers.
One simple reason is that one bad wireless transmitter could jam a whole bunch of nearby servers, which probably wouldn't be good.
Wires have their uses. Sometimes it's good to keep your data flow contained and controlled.

Don't get me wrong, I don't see this happening any time soon. But to go so far as to say words like "always" or "never" is just begging your foot to be inserted into your mouth at least sometime down the road.

They'll always use wires, switches, and routers.

Well, two out of the three you just mentioned (and I guess conceivably even the wires too) are subject to failing. So just because having a physical connection makes you feel all warm and fuzzy inside (and rightly so with the current state of wireless

It is closed-mindedness like this that can keep good tech from even having a chance. You really don't have to defend your stance from what is currently available, but to say that nothing will ever be good enough to replace those good old fashioned, tried and tested wires is simply ludicrous.

Wireless technologies cannot, ever, provide as much bandwidth as a wired (copper/fiber/whatever) connection can, simply because wires allow for a higher signal-to-noise ratio. Additionally, wireless is a shared medium, equivalent to using a hub instead of a switch/router.

There are two ways to increase the amount of data that can be sent: increase the carrier frequency or increase the bandwidth. What these people have done is increase the carrier frequency. Wireless today runs on 2.4 GHz; these devices run up to 60 GHz. What does that mean? Well, it'll take more energy (higher frequency means higher energy), and it also attenuates more, meaning shorter range. Not only that, but it will be more readily absorbed by things like bricks, desks, your foot, etc.

The alternative to this is to increase bandwidth, say use 2.1 GHz through 2.6 GHz for one signal. The obvious downside to this is that you can't run many concurrent streams.

All in all, wireless data transfer has a very real ceiling on the amount of data that can be transferred: lower frequency means longer range and the ability to go through obstacles, at the cost of reduced data-carrying capacity. I guess the point of this post is that there is only so far we can go with wireless data transfer. I don't think it will be able to keep up (over the long run) with the increasing size of traffic to be a viable alternative to cables when it comes to things like computer networking. Anyone have any thoughts on this?

The use of multiplexing codes has not been fully exploited, yet. MIMO and others are used extensively in cellular networks (which are, let's face it, wireless networks too) but are less common in 802.11 and similar networks.

Perhaps the next generation of wireless will include UWB/CDMA based transmission.

Even with multiplexing there is still a very real limit to the throughput of a certain frequency. I suppose my point is that there are clever ways to allocate bandwidth to users depending on how much they need, or to combine a bunch of frequencies to get the throughput you need, but it just isn't realistic to think that one day everything can be wireless, with everyone sending movies to and from each other no problem. Basically, with wires you can do intelligent switching, but wireless requires you to broadcast and take turns.

You can run at 2.45Ghz, and instead of keeping constant power of a few milliwatts, instead, say, modulate the power output from, you know, 1000 watts to 1.21 gigawatts, you can use the resulting modulation to carry more information per wave. This would be really hot new technology, and really start the economy cookin'.

The relevant parameters are bandwidth and signal-to-noise ratio

Right, but center frequency (roughly the same as carrier frequency) affects available bandwidth for a few reasons.

1: The obvious limit: there is no such thing as 1 MHz of bandwidth centered on 100 kHz.
2: Antenna capabilities: you couldn't easily (if at all) design an antenna to go from 10 kHz to 1.01 MHz, but you could easily design one to go from 10 MHz to 11 MHz.
3: Spectrum crowding: it's pretty crowded around the few-GHz spectrum; the only way to get a wide, uncrowded chunk is to go higher in frequency.

Terms used in this post -- bandwidth: available channel bandwidth, measured in Hz; bitrate: number of bits that can be transmitted; QAM: quadrature amplitude modulation; symbol: a voltage level encoding one or more bits.

Channel capacity is measured in bits/second, not Hz. I can transmit 1 Mb/s using a 100 kHz channel

You probably can. With appropriate modulation schemes (such as QAM) you can get 2 symbols per Hz. That means you need to cram 5 bits per symbol. That means 32-level encoding. That is feasible.
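The symbol arithmetic in that reply, as a quick sketch:

```python
import math

bandwidth_hz = 100_000     # the 100 kHz channel from the quote
target_bps = 1_000_000     # the claimed 1 Mb/s
symbols_per_hz = 2         # the post's rule of thumb: 2 symbols per Hz

symbol_rate = bandwidth_hz * symbols_per_hz     # 200,000 symbols/s
bits_per_symbol = target_bps / symbol_rate      # 5.0 bits per symbol
levels = 2 ** math.ceil(bits_per_symbol)        # 32 levels (e.g. 32-QAM)
print(bits_per_symbol, levels)  # 5.0 32
```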

Changing the carrier frequency has no effect, except that there's more room for higher-bandwidth signals at higher frequencies. 2.400-2.422 GHz seems like a smaller chunk than 400-422 MHz, but it can carry the same data.

The formula for how many bits you can send and receive error-free is the Shannon-Hartley theorem [wikipedia.org], and spectral efficiency is typically stated as a percentage of the theoretical.
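As a rough illustration of the theorem (the 22 MHz channel and 30 dB SNR below are made-up example values, not figures from the article):

```python
import math

# Shannon-Hartley: C = B * log2(1 + S/N), with S/N as a linear ratio.
bandwidth_hz = 22e6                # e.g. an 802.11-style 22 MHz channel
snr_db = 30                        # assumed signal-to-noise ratio
snr_linear = 10 ** (snr_db / 10)   # 30 dB -> a 1000x power ratio
capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)
print(f"{capacity_bps / 1e6:.0f} Mbps")  # ~219 Mbps theoretical ceiling
```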

Yeah, I was going to include that in my post but it didn't seem relevant, as I see improving signal-to-noise (directional antennas, shutting off other equipment nearby) as "cheating". For example, assume my cellphone has the best electronics available; how can I increase the signal-to-noise ratio? Standing at the focal point of a dish aimed at the nearest tower? I just sort of assumed the S/N ratio is going to be essentially constant, or probably worse in the future. You could increase the number of symbols (OFDM

For example, assume my cellphone has the best electronics available, how can I increase the signal to noise ratio? Standing at the focal point of a dish aimed at the nearest tower?

One way to do this is to use an array of smaller antennas, and change through software how signals are timed through each. Say I have two dipoles 30 cm (1 nanosecond) apart. If I transmit something at exactly the same time from both, a receiving antenna broadside to the two transmitting antennas will see a 3 dB boost. If I delay

Tin foil hat act like antenna and capture all of multi-gigabit signal and route all of data direct to cerebral cortex, where corpus medula hippocampus cerebellum act like giant "Google" and put all "byte" into main storage. some time Often cause all of sound to "ears" like bad technical translation Chinese goto English, like bad video game of the cheap PC accessories.
when All of signal "Scramble" brainwave, error message to help tech support gets to you responding quickly.
Zipping all of signs to Brain

This technology could be used in applications besides just strict data transfer. 15 Gbps should be fast enough to drive a display, as well. The proverbial rat's nest behind your computer could completely disappear with this technology. Keyboards, mice, displays, network - just about every cable plugged into the back of your computer could be replaced with wireless this fast.

But if only it were so simple. Of course now the problem we have is with security. Never mind TEMPEST [wikipedia.org]. If you had a big enough antenna and you could decrypt (it IS encrypted...heavily...right?) the datastream emanating from this technology from a distance - you could see the display, keystrokes, data transfers, everything. Obviously, strong encryption is very important - But the overhead from strong encryption will reduce the theoretical bandwidth because of the extra baggage on the packets, and increase costs significantly because of the very specialized ASICs that will likely be required to encrypt a stream at that speed. And they'd have to be standard across all devices. AND an exploit had better not be discovered in the algorithm. Then there's the issue of the 60 GHz band. A frequency that high is very unforgiving of obstructions, even at the short ranges we're talking about. If you have a metal desk, forget it. And what about jamming from computers in close proximity? What about from a "l33t hax0r" with some time on his hands and an inclination to make trouble?

If you had a big enough antenna and you could decrypt (it IS encrypted...heavily...right?) the datastream emanating from this technology from a distance - you could see the display, keystrokes, data transfers, everything. Obviously, strong encryption is very important - But the overhead from strong encryption will reduce the theoretical bandwidth because of the extra baggage on the packets, and increase costs significantly because of the very specialized ASICs that will likely be required to encrypt a stream at that speed.

The real problem here, therefore, is one of cost. You can have as much bandwidth as you can pay for (because this is the kind of problem that responds well to parallelism). The penalty for that parallelism need not be all that significant. You can have no encryption cheaply, but uh, yeah. Next.

I don't suppose anyone out there knows of any properties of physics that would allow for linked "random" number generating systems that were consistent?:)

...when it said wireless in the data center. Yes, I've heard the theoretical figures for wi-fi. Try dropping a bunch of access points and various clients in tight proximity and see what it's really like. In a datacenter you can run 10x 10 Gbps wires right next to each other without problems. Can you do that with wireless? Hell no. I imagine the speeds quoted are ideal, with free line-of-sight and no interference; good luck trying to achieve that in that bunch of wires. Personally, I was fed up with wireless when I realized one AP couldn't even cover the ground floor of my parents' house. It'd take probably three to cover the whole house. Great... not.

Unless your parents live in a copper mesh manufacturing facility there is no reason that an access point wouldn't cover their floor. Did you buy the AP at a flea market? Did you place it inside of the microwave? MY GOD MAN WHAT IS THE PROBLEM!?

i suppose that you could always combine one of these [hyperlinktech.com] with one of these [hyperlinktech.com] and use the combo to cook burritos...

The inverse-square law at 60 GHz means that even if the spectrum were available (it's not), you'll need both line of sight (reflections won't help and will slow the data rate considerably) and the will to gulp content that fast. Of course, a shared fixture like an access point in WiFi suffers from duty-cycle problems, and raw bandwidth will help. But we could also use spread-spectrum and/or advanced coding techniques like n-pole modulation to accomplish the same thing. Therefore, with all due respect

I don't mean to be a wet blanket, but all of the advantages of the latest whiz-bang technology don't amount to a bucket of warm spit unless and until the major carriers adopt it. If I live to be a hundred, I'll never see Gigabit data service where I live in the St. Louis MetroEast area of Illinois because no one will force our regulated monopoly (AT&T) to provide it. Until Universal Service is expanded to include broadband, and regulatory bodies set the definition of the term broadband to be 2 Mbits/sec

First, a successful lab demonstration of multi-gigabit speeds with mass-market-capable technology is still missing. Call that at least 5 years to a real product. Then deployment. Who needs this stuff enough to deploy it immediately? Right, almost nobody. Also, the first product generation will not really be usable. Call it another 5 years to wide-scale deployment. That gives me an estimate of at least 10 years, but more likely 20 years. The 3 years are a direct lie, plain and simple. I hope these ethically ch

I expect the line-of-sight requirement is a dealbreaker for 'personal area network' type situations. I've got my computer underneath my desk, and all the gadgets that could possibly benefit from high-speed wireless links are above the desk. Reconfiguring my desk to provide LOS for everything (including keeping the desk clean, no stacks of paper between the computer and the gadgets) would be a major PITA. I'll stick with wired connections, thank you. High-speed wireless could be useful for 'last mile' connections

99% of all the CO2 in the atmosphere is natural, and we chalk up a change in climate to our 1% fluctuation, as if that vast lion's share of 99% doesn't fluctuate on its own. So why worry about radio waves in a radioactive universe?

but that's not really the same as saying that we will now saturate the biosphere with radiation of our own making.

As opposed to all that radiation saturating the biosphere not of our own making? You do realise that light is radiation right? Also, in case you're worried about all the terrible WiFi access points, your average 60 watt bulb puts off far more energy (radiation) than any WiFi AP in use. Now, admittedly, not all radiation has the same effect on everything (such as UV), but the key thing with EM radiation like light and radio waves is the total power and the distance from the source. Remember, power dissipates with the square of the distance, so if you're anything but sitting on top of the transmitter, and even then if it's relatively low power, you've got more to worry about standing outside on a sunny day. The fact that they're talking about such short distances with this tech leads me to believe this will probably be a very low power device, much the same as bluetooth and RFID are.

My concern is that we lack the science to even understand the implications of all of this radiation we're creating upon our environment. Sure, you can put a frog in a box next to a wireless system and say, "oh, the frog lived", or jack up the energy by 100 times as some sort of a proxy for exposure over time, and say "the frog did not get cancer", but that's not really the same as saying that we will now saturate the biosphere with radiation of our own making.

Microwave and millimeter-wave frequencies cannot cause cancer. The photon energy is not high enough to break chemical bonds in biological tissue.

When a chemical bond is formed (say, in DNA), a certain amount of energy is released. To break that bond (and cause cancer), you need to put that energy back. The catch is, because of quantum mechanics, the energy can't be accumulated. You can't pile in more and more photons until it finally snaps; you have to get one big photon to come in and snap it. When you state the frequency of a photon source (e.g. 60 GHz), that indicates the energy of each individual photon (0.00025 eV). Typical bonds in DNA are on the order of a few eV. It's physically impossible for this to cause cancer.
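The photon-energy figure is easy to reproduce from E = hf:

```python
# Energy of a single 60 GHz photon, in electron-volts.
h = 6.62607015e-34       # Planck constant, joule-seconds
eV = 1.602176634e-19     # joules per electron-volt
f_hz = 60e9              # 60 GHz
photon_ev = h * f_hz / eV
print(f"{photon_ev:.5f} eV")  # 0.00025 eV, versus a few eV for chemical bonds
```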

Even if you put your cat in a microwave oven, it won't get cancer (though it will die a pretty horrible death).

The danger with electromagnetic waves is heat and depth. Microwave-range electromagnetic waves have far less energy per photon than visible light (~2.5 eV), but they have much greater depth penetration. They go deeper before they collide with your molecules, so they deposit heat deeper into your flesh than visible light or UV radiation. This is why putting your cat in a microwave is very bad; it essentially gets "cooked from the inside out". But the energy output by wireless devices is barely enough to cause even measurable changes in the temperature of human flesh. How much heat can you apply to a glass of water with a 1.5 V AA battery? Not much. Now spread that out spherically over a 100 meter radius. Almost zero.
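The "spread that out spherically" point follows from the inverse-square law; here's a sketch using the ~10 mW transmit power quoted earlier in the thread:

```python
import math

# Power density of an isotropic 10 mW source at a few distances.
power_w = 0.010
for r_m in (0.1, 1.0, 5.0):
    density = power_w / (4 * math.pi * r_m ** 2)  # watts per square meter
    print(f"{r_m} m: {density * 1000:.3f} mW/m^2")
```

Even at 10 cm, that works out to under 80 mW/m^2; for comparison, bright sunlight delivers roughly 1,000,000 mW/m^2.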

Even then, biological organisms are very good at regulating their temperature; humans live across a wide variety of climates all across Earth, and yet still manage to balance their internal temperature.

I have this weird feeling that the pervasive, high-frequency radio needed to make wireless work is going to wind up with some unforeseen bad side effect, the same way every other technology that we used too much has.

Even if for no reason other than having security of communications, it would be preferable if data were communicated via fiber-optic cable. Bonus points for creating optical transceivers that don't broadcast their signals all over the RF spectrum as a side effect of operation.

Dear Luddite, the data is not with you. Test after test has dispelled the general myth that all pervasive radiation, regardless of characteristics, must be bad. If you have something more than general hand-wringing and whining about our fallible nature, then please post it.