Would allow for high-speed links between aircraft over hundreds of kilometers.

Despite all the advances in commercial wireless networking, even the most industrial-strength radio frequency links can't come close to the speed and reliability of wire and fiber. While industry groups such as the WiGig Alliance strive toward providing two gigabit-per-second wireless connections at short range, longer-range wireless links such as the directional microwave systems used on some cell towers top out at around 250 megabits per second—a small fraction of what can be pushed over a fiber backbone.

Of course, you can't run a fiber backbone through the air or summon one up at will on the battlefield. That's why the Defense Advanced Research Projects Agency has launched a program to create technology that can act as a backbone for an airborne network with the same sort of bandwidth as fiber optic backbones—100 gigabits per second. If successful, the program could mean not just faster data connections on the battlefield, but better broadband for people in remote areas and cheaper expansion of cellular networks.

The effort, called the 100 Gigabit-per-second RF Backbone (or 100G in DARPA shorthand), seeks to do more than just overcome the physics that limit current radio-based data connections using the Defense Department's Common Data Link (CDL) standard protocol. The initiative is searching for a solution that can be deployed both on the battlefield and aboard aircraft—and work at distances of over 200 kilometers.

The goal set by DARPA isn't just pulled out of the air, so to speak. The amount of data collected by sensors on aircraft, such as synthetic aperture radars, is so vast that only a small fraction of it can be pushed back to commanders on the ground—which is why the military has command and control aircraft like the E-3 Sentry Airborne Warning and Control System (AWACS), filled with crewmembers who can interpret the data close to the sensor. But AWACS are expensive. And with more and more drones carrying sophisticated radar systems to track targets on the ground—along with optical and infrared sensors—the DOD needs a way to beam all the data back at higher fidelity, either to an AWACS, another aircraft, or to a command center on the ground a hundred miles away.

The most likely route to creating this sort of Skynet is to use the same sort of technology used to collect much of the data in the first place—synthetic aperture antenna technology. There have been a number of efforts to turn the Active Electronically Scanned Array (AESA) radars of fighter aircraft into dual-purpose systems capable of acting both as a radar and as a data link. Raytheon, L-3 Communications, and other companies working on previous DARPA-funded projects have demonstrated the creation of airborne mobile ad-hoc networks by connecting a data modem to an AESA radar, turning part of its transmission array into a multiplexed transmitter and establishing network connections of over 4.5 gigabits per second.

DARPA sees the next leap in data throughput coming from improvements in extreme high frequency (EHF) radio technology. Using wavelengths measured in millimeters, EHF frequencies—such as the 60 gigahertz frequency used at the top end of the WiGig standard—are typically only effective for communications at short range and within line of sight. But DARPA believes that by using techniques in the modulation of signals, including quadrature amplitude modulation (QAM), the millimeter wave band can be used over much greater distances, through cloud cover, and to achieve even higher throughput. In a statement on the program, DARPA program manager Dick Ridgeway said the project "plans to demonstrate how high-order modulation and spatial multiplexing can be synergistically combined to achieve 100 Gigabits per second."

While DARPA is looking at this from a purely military perspective, 100-gigabit wireless connections could have much larger ramifications for wireless carriers. They may allow for the creation of temporary network backbones in response to disasters. They could also create an opportunity for an artificial intelligence to create a centrally controlled network of cybernetic killing machines (but that's not within the scope of this research project).

Why they need that much data throughput is a mystery to me, because I have a working knowledge of their networks and the data demand doesn't even come close to that.

Reading articles before commenting is so old school, glad you fall for this cliche.

Sounds like they want to put more data on these networks than the current ones allow for. Also, a high-throughput network will allow for much more tactical and logistical flexibility than current technologies offer. Combine that with the potential for use in the civilian sector, and we have something that is not a particularly bad idea.

... Well, it's really kind of simple: you use FDMA. Sure, sure, I know we already use FDMA, but we use FDMA over a single radio link in order to multiplex data. Use it for what it's intended for: use multiple transmitters and receivers on the same data stream. One radio can tx/rx 100 Mb, okay, so let's use 1024 radios. Ah, now you're moving large amounts of data. ...

Physics would like to slap you in the face. There's only a limited amount of spectrum available for radio use, and it doesn't matter whether you call it one highly capable radio or 1024 less capable radios, you've still got a limited spectrum across which to send those bits...

Ooops, I posted before seeing his second post -- sorry for feeding the troll. Hopefully others may learn from my mistake?

I have to say the title might have been a bit of nerd click-bait... but the article was cool so I really can't complain. I like how crazy ambitious DARPA will get with some of this stuff. 200 km at 100 Gbit through the air... just damn.

You don't understand: the current amount of data isn't even in the same class. It's not remotely close; your home computer has a faster connection to the internet than deployed military systems, with ping times lower by orders of magnitude. Not only that, the plans for advancing these systems over the next 15 or so years don't even scratch the 100G barrier. There is something else, currently unknown, going on here, and it's not drone planes or man-portable battlefield computing systems.

What I'm trying to say is that this hints at something much bigger on the horizon.

Did you read the whole article? Sean mentioned synthetic aperture radar images. Exact sizes are classified but they're pretty large. Wikipedia reports resolutions down to 10cm; multiply that by 10,000 square km and you get a big image.

There are plenty of other uses too -- streaming video from high-res FLIR cameras, real-time targeting data, etc. Then you factor in that there could be multiple collectors sharing a link, and you can see how a multi-gigabit-per-second pipe could get full.
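Rough numbers, as a sanity check -- a quick Python back-of-the-envelope, using my own assumptions (one pixel per 10 cm x 10 cm ground cell, 2 bytes per pixel), not figures from the article:

# Back-of-the-envelope SAR image size (illustrative assumptions, not article figures)
area_km2 = 10_000          # imaged area
cell_m = 0.10              # 10 cm ground resolution
bytes_per_pixel = 2        # real products may use more (complex samples, multiple looks)

pixels = area_km2 * 1_000_000 / cell_m**2        # 1e12 pixels -- a terapixel image
size_bytes = pixels * bytes_per_pixel            # ~2 TB uncompressed
seconds_at_100g = size_bytes * 8 / 100e9         # ~160 s to move over a 100 Gb/s link

print(f"{pixels:.1e} pixels, {size_bytes/1e12:.1f} TB, {seconds_at_100g:.0f} s at 100 Gb/s")

Even with heavy compression, a handful of collectors producing images like that will keep a 100 Gb/s pipe busy.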

If the technology to have a wireless mesh with 100Gb/s throughput between any mesh nodes is developed, it can revolutionize broadband, especially in emerging nations that do not yet have a ponderous investment in infrastructure. For the US, it would be harder to implement, but even if it can be used for individual back-bone point-to-point links, it can greatly reduce the cost of bringing real bandwidth to remote locations, or even suburbs, not to mention the benefit to the cellular industry in helping feed data to towers.

In a statement on the program, DARPA program manager Dick Ridgeway said the project "plans to demonstrate how high-order modulation and spatial multiplexing can be synergistically combined to achieve 100 Gigabits per second."

It wouldn't be a dog & pony show without somehow shoe-horning the word synergy into the language.

It's all about transformationally allowing commanders to get all the data from air, space, and cyberspace into a single collaborative environment which will synergistically enhance combatant commanders' capabilities for near- and real-time ISR data.

I don't know how this is better than the existing EUTELSAT KA-SAT 9A, already covering Europe for 2 years with 70Gbit+ capacity.

Quote:

EUTELSAT KA-SAT 9A :: 9° East

EUTELSAT KA-SAT 9A, the first High Throughput Satellite (HTS) in Europe, marks a new generation of multi-spotbeam high-capacity satellites.

Built for Eutelsat by EADS Astrium, KA-SAT’s revolutionary concept is based on a payload with 82 narrow Ka-band spotbeams connected to a network of ten ground stations. This configuration enables frequencies to be reused 20 times and takes total throughput to beyond 70 Gbps.

The ground network uses ViaSat’s SurfBeam® technology, similar to the solution powering broadband connectivity for almost 450,000 satellite homes in North America. The combination of KA-SAT’s exceptional capacity and ViaSat’s SurfBeam® technology makes it possible to deliver Internet connectivity for more than one million homes, at speeds comparable to ADSL.


* Can't be used on the move, due to precise pointing requirements
* Upstream speed is slower, and total capacity is 70 Gbps for the whole satellite, not 100 Gbps for a single link
* Satellites are easy to disable/destroy

I'm guessing you confused link throughput with total network capacity.


Using wavelengths measured in millimeters, EHF frequencies—such as the 60 gigahertz frequency used at the top end of the WiGig standard—are typically only effective for communications at short range and within line of sight. But DARPA believes that by using techniques in the modulation of signals, including quadrature amplitude modulation (QAM), the millimeter wave band can be used over much greater distances, through cloud cover, and to achieve even higher throughput. In a statement on the program, DARPA program manager Dick Ridgeway said the project "plans to demonstrate how high-order modulation and spatial multiplexing can be synergistically combined to achieve 100 Gigabits per second."

This is not a great explanation. QAM is a ubiquitous modulation technique used in everything from broadcast TV to LTE cell phones. Saying that they're going to use "modulation of signals" is like saying they're going to use "antennas" or "radios" to send data through the air electromagnetically. It's technically true but not really informative, because it kind of goes without saying.

The key here isn't "modulation" or even QAM, it's the use of very high frequency RF, where you can have enormous bandwidth, and the use of spatial multiplexing (basically sending parallel beams through space at the same frequency) to further increase bandwidth and make transmissions more robust to interference and even weather effects. Basically they're looking at using extremely large bandwidths to move a lot of data, and to make up for absorption and pathing issues while further boosting data rates, they're looking at spatially multiplexed systems.
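To put rough numbers on it (purely illustrative parameters, not anything from the DARPA announcement): throughput is basically symbol rate times bits per symbol times the number of spatial streams, minus coding overhead.

import math

def aggregate_rate_bps(symbol_rate_hz, qam_order, spatial_streams, code_rate=0.8):
    # Idealized: symbols/s * bits per symbol * parallel spatial streams * FEC code rate
    return symbol_rate_hz * math.log2(qam_order) * spatial_streams * code_rate

# e.g. a 5 GHz-wide mmWave channel, 64-QAM, 4 spatial streams, rate-0.8 coding (my numbers)
print(f"{aggregate_rate_bps(5e9, 64, 4) / 1e9:.0f} Gb/s")   # ~96 Gb/s

That's why the millimeter-wave bands are attractive: you simply can't find 5 GHz of contiguous spectrum anywhere much lower.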

Obviously nobody wants to recreate Skynet. This claim is a bit outrageous and seems only meant to provoke a reaction.

DARPA works on "DARPA hard" problems. Anything less than 100 Gb/s would not be a difficult problem - there are technologies available today that can do this.

60 GHz is an area that does not have much congestion now. However, there is a lot of atmospheric loss. This band is typically used for directed links (i.e., spectrum allocation is not a problem) and for space-based communication (i.e., atmosphere is not a problem). Achieving 100 Gb/s (clearly) enables communication at less than 100 Gb/s; therefore, the data rate can be reduced to increase "energy per bit", increasing the reliability of the communication. This can address some of the atmospheric absorption issues.

Atmospheric absorption and the spectrum required to transmit such bandwidth will prevent this technology from being used for earth-based consumer links.
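The "reduce the data rate to increase energy per bit" point above is just Eb = P/R: at a fixed received power, halving the rate doubles the energy available per bit. A trivial sketch (the 1 nW received power is an arbitrary placeholder, not a real link budget):

# Eb = P / R: lower data rate -> more energy per bit -> more link margin
p_rx_w = 1e-9                      # 1 nW received power, arbitrary placeholder
for rate_bps in (100e9, 50e9, 10e9):
    eb = p_rx_w / rate_bps
    print(f"{rate_bps/1e9:>5.0f} Gb/s -> Eb = {eb:.1e} J/bit")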

The general problem with today's C4ISR platforms is that there is much more data gathered than there is capacity to transmit it back. There are some publicly known government programs that could give you some clue about this, but just consider the fact that you can fly a plane around with hundreds of commercial HD cameras and quickly generate a lot of data for transmission...

There we go, let's spend billions of tax dollars researching technology that likely will not yield the results we desire. Okay, so the military has a need for 100G transmission over RF links. Why they need that much data throughput is a mystery to me, because I have a working knowledge of their networks and the data demand doesn't even come close to that.

Controlling drones in the event that someone takes down their satellite links, basically.

The real problem is when they just assume that such bandwidth, even if it is a 40,000% increase over current tech, will be a reality they can plan around, spending BILLIONS more on systems that can't work when all of the pixie dust wears off.

First, communications links carrying over 20 Gbps on a single link have been demonstrated over large distances. The data for some of those tests is publicly available.

Second, sensors exist today that will swamp a 100 Gbps link. Data is being decimated at the source and information is getting lost. Look up multispectral imagers for just one such type of sensor. Synthetic Aperture Radar (SAR), alluded to in the article, is another type.

The 60-65 GHz band will not be used for this. The atmospheric attenuation will be too great. There is a "hole" in the absorption a little ways above that. Look there.

As said elsewhere, modulation in and of itself is not enough. Systems have been built using as high as 256 and 512 QAM. There are issues that crop up when going that high. Significant work needs to be done not only in the digital domain (think of extremely efficient error correction codes, etc.) but also in the analog domain (extremely few components are "linear" in their behavior at these frequencies and modulation rates, causing distortion).

100 Gbps links over 100 km will be demonstrated within the next five years, possibly sooner -- count on it. They may not be in military operations or even minimally fielded by then, but they will be demonstrated in the open air.
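On the high-order QAM point: even before implementation losses, the Shannon limit puts a floor on the SNR you need for each extra bit per symbol. A quick illustration (ideal channel, no margin):

import math

def shannon_min_snr_db(bits_per_hz):
    # From C/B = log2(1 + SNR): minimum SNR to support a given spectral efficiency
    return 10 * math.log10(2**bits_per_hz - 1)

for bits in (2, 4, 6, 8, 10):      # roughly QPSK, 16-, 64-, 256-, 1024-QAM uncoded
    print(f"{bits} b/s/Hz needs >= {shannon_min_snr_db(bits):.1f} dB SNR")

Real links need several dB of margin on top of that, which is exactly where the analog linearity problems start to bite.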

This is not a great explanation. QAM is a ubiquitous modulation technique used in everything from broadcast TV to LTE cell phones. Saying that they're going to use "modulation of signals" is like saying they're going to use "antennas" or "radios" to send data through the air electromagnetically. It's technically true but not really informative, because it kind of goes without saying.

Agreed.

redleader wrote:

The key here isn't "modulation" or even QAM, it's the use of very high frequency RF, where you can have enormous bandwidth, and the use of spatial multiplexing (basically sending parallel beams through space at the same frequency) to further increase bandwidth and make transmissions more robust to interference and even weather effects. Basically they're looking at using extremely large bandwidths to move a lot of data, and to make up for absorption and pathing issues while further boosting data rates, they're looking at spatially multiplexed systems.

The problem is the diversity (robustness) vs capacity (throughput for a given bandwidth) tradeoff inherent in MIMO communications systems. If you want to maximize throughput by carrying as much data as possible with less redundancy, you become more susceptible to impairments in the channel and noise.

I can't fathom how they expect to achieve such a massive throughput using MIMO on line of sight channels as these don't provide anything close to the isotropic scattering environment where capacity is theoretically maximized.

Another obvious problem is that an enormous amount of power is required to transmit such high-bandwidth signals to a receiver 100 km away with sufficient SNR. By necessity they're going to be dealing with GHz frequencies, which are less favorable for long-range propagation. Directional antennas can improve the SNR situation but will further reduce any spectral efficiency gains through MIMO.
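For a feel of the power problem: free-space path loss alone over 100 km at millimeter-wave frequencies is enormous, before you even add atmospheric absorption. A quick estimate (the 70 GHz carrier is just a placeholder, not a program spec):

import math

def fspl_db(distance_m, freq_hz):
    # Free-space path loss: 20*log10(4*pi*d*f / c)
    c = 3e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

print(f"{fspl_db(100e3, 70e9):.0f} dB")   # ~169 dB, to be made up with antenna gain and TX power

High-gain dishes or phased arrays can claw a lot of that back, but as noted above, the narrower the beams, the less the channel looks like the rich scattering environment that MIMO capacity gains rely on.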

The real problem is when they just assume that such bandwidth, even if it is a 40,000% increase over current tech, will be a reality they can plan around, spending BILLIONS more on systems that can't work when all of the pixie dust wears off.

DARPA's annual budget is $2.8B and they employ 240 people.

Somehow I doubt BILLIONS will be spent on this project. From wiki they appear to have at least 32 projects that they're working on in collaboration with partner organisations/companies.

Furthermore, DARPA is about blue-sky 'out-there' research and development. 40-50 years ago it was pretty out there to think we could hook up a global network of computers with a common framework, but now we have the internet and we can thank DARPA for that.

We need this kind of research. Sure it's a little unfortunate that it's often tied to the military but we have examples of it filtering out into civilian use so it's not all bad.

My first thought was that it was going to be incredibly difficult, but from the DARPA website:

Quote:

The goal is to create a 100 Gb/s data link that achieves a range greater than 200 kilometers between airborne assets and a range greater than 100 kilometers between an airborne asset (at 60,000 feet) and the ground.

At 60,000 feet (18.3 km), the density of the intervening air between planes is far lower than in the lower troposphere, and even in the air-to-ground link, a lot of the 100 km range for the transmission will be in rarefied atmosphere.
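A rough check on how thin the air is up there, using a simple exponential-atmosphere approximation (scale height of about 8.5 km; a simplification, not a standard-atmosphere table):

import math

def density_ratio(altitude_km, scale_height_km=8.5):
    # Approximate air density relative to sea level, assuming an exponential atmosphere
    return math.exp(-altitude_km / scale_height_km)

print(f"{density_ratio(18.3):.0%} of sea-level density at 60,000 ft")   # roughly 12%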

I wonder if there are any ecological ramifications from all the energy of radio waves, microwaves, etc. being transmitted over the air. We hear lots about CO2 contributing to global warming, but it seems the constant stream of these transmissions wouldn't be without effect of some kind. If the effects are relatively inconsequential now, at what point, as we employ higher energies, do those effects become harmful?

Why they need that much data throughput is a mystery to me, because I have a working knowledge of their networks and the data demand doesn't even come close to that.

Reading articles before commenting is so old school, glad you fall for this cliche.

Sounds like they want to put more data on these networks than the current ones allow for. Also, a high-throughput network will allow for much more tactical and logistical flexibility than current technologies offer. Combine that with the potential for use in the civilian sector, and we have something that is not a particularly bad idea.

Other than possibly blocking a wide spectrum of existing radio services, that is.

It's an airborne (vs satellite) network for latency reasons? Or maybe range? Anyone able to provide me with a clue? Thanks.

Latency is part of the issue (a GEO hop is 1/2 second when round trip is considered, 1/4 when one way). Even a 300 km air to air or air to ground link is only about a millisecond. However, many one way links (e.g., streaming data from a sensor) don't care about 1/4 second latency.

Also, on range: from an aircraft at 60,000 feet to another at 60,000 feet, free-space loss gets you before any appreciable atmospheric loss until a range of well over 500 km.

The biggest reason at the moment is power. No matter how many games you play with coding, modulation, interleaving, linearization, etc., etc., it all comes down to energy per bit. To reliably sustain a 100 Gbps link in the near future, it will take a significant fraction of the satellite's power for a single link.

You can get hundreds of concurrent, independent, point to point (air to air or air to ground) links versus a couple (or at most a very few) through a satellite. -- And putting up dozens, let alone hundreds, of satellites of this nature is just not practical.
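The latency comparison above is easy to sanity-check with speed-of-light numbers alone (no processing or queuing included):

c_km_s = 3e5                        # speed of light, km/s
geo_hop_km = 2 * 35_786             # ground -> GEO satellite -> ground, one way
air_link_km = 300                   # long air-to-air / air-to-ground link

print(f"GEO bent-pipe hop: {geo_hop_km / c_km_s * 1e3:.0f} ms one way")   # ~240 ms
print(f"300 km direct link: {air_link_km / c_km_s * 1e3:.1f} ms")         # ~1 ms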

Am I the only one who sees a serious problem heading our way when DARPA starts looking into creating Skynet?

Hell, why not? They've already started creating terminators =) Just look at DARPA's PETMAN and other various models of robots. This 100 Gbps wireless network will be the way to control them in the field.

My first thought was that it was going to be incredibly difficult, but from the DARPA website:

Quote:

The goal is to create a 100 Gb/s data link that achieves a range greater than 200 kilometers between airborne assets and a range greater than 100 kilometers between an airborne asset (at 60,000 feet) and the ground.

At 60,000 feet (18.3 km), the density of the intervening air between planes is far lower than in the lower troposphere, and even in the air-to-ground link, a lot of the 100 km range for the transmission will be in rarefied atmosphere.

That probably helps a lot I'd imagine?

Maybe. What military aircraft cruise at 60,000 feet? I should imagine the SR-71 and U2, and apparently the F-22 and also the non-military Concorde. Of these the U2 and SR-71 and Concorde are out of service. And the F-22 is a fighter, or does it also have a recon role where it gathers tons of data?

Otherwise there does not seem to be anything that can fly at that altitude to take advantage of the thin atmosphere, at least nothing that I'm aware of.

I wonder if there are any ecological ramifications from all the energy of radio waves, microwaves, etc. being transmitted over the air. We hear lots about CO2 contributing to global warming, but it seems the constant stream of these transmissions wouldn't be without effect of some kind.

I don't think the energy would be sufficient to affect atmospheric temperature, except perhaps directly in the beam path. I do wonder about the effects on wildlife, birds specifically. I've read of radar dishes causing cooked birds to drop from the sky (leading to the microwave oven), and I don't know whether these frequencies would cause heating, but at high power levels there may well be other effects.

Quote:

If the effects are relatively inconsequential now, at what point, as we employ higher energies, do those effects become harmful?

I don't believe the military is especially concerned about these things. Apparently military sonar is suspected of having a disastrous effect on some marine mammals (dolphins, whales), for example, but the military does not have to do environmental impact assessments and appears pretty much immune to judicial review.

My first thought was that it was going to be incredibly difficult, but from the DARPA website:

Quote:

The goal is to create a 100 Gb/s data link that achieves a range greater than 200 kilometers between airborne assets and a range greater than 100 kilometers between an airborne asset (at 60,000 feet) and the ground.

At 60,000 feet (18.3 km), the density of the intervening air between planes is far lower than in the lower troposphere, and even in the air-to-ground link, a lot of the 100 km range for the transmission will be in rarefied atmosphere.

That probably helps a lot I'd imagine?

Maybe. What military aircraft cruise at 60,000 feet? I should imagine the SR-71 and U2, and apparently the F-22 and also the non-military Concorde. Of these the U2 and SR-71 and Concorde are out of service. And the F-22 is a fighter, or does it also have a recon role where it gathers tons of data?

Otherwise there does not seem to be anything that can fly at that altitude to take advantage of the thin atmosphere, at least nothing that I'm aware of.

You're forgetting the concept of balloons/blimps/long-duration drones, which have been in concept and a bit of testing for a while, that would remain in a general vicinity to provide long-duration ISR but could easily be nodes for this type of network as well, or instead. Kinda like a giant mesh network in the sky, constantly reconfiguring itself as nodes come in and drop off the network.

Networks do not live on throughput alone. Latency is a major factor in many applications. So I'm rather surprised that DARPA is only specifying the 100 Gbit top-line figure. Or is it the reporting that omits the latency requirement?