Performance Test: 802.11b Takes a Lickin' and Keeps on Tickin'

With all of the talk about the performance impacts of WEP (Wired Equivalent Privacy) and noisy
environments, I thought I'd throw some quick, informal benchmarks together to see if any of this hearsay was true.

To test this, I used dd to generate a 1,441,792-byte file from /dev/urandom
(1,441,792 bytes x 8 bits per byte = 11,534,336 bits, or 11 megabits).
I then tried sending this file to an IBSS (ad hoc, also known as
peer-to-peer) node like this:

$ time dd if=random.bits |ssh -c none 10.0.1.254 dd of=/dev/null

This essentially means "Take these 11 million random bits, copy them over
the network using no encryption, just dump them into the bit bucket on the
other end, and tell me how long it took you." I did this to try to minimize
the impact of disk usage and CPU crunching, and just try to make the bits
fly as fast as possible.
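For reference, the random file itself came from /dev/urandom, as mentioned above. The exact dd invocation isn't given in the original, so the block size here is my guess; any combination producing 1,441,792 bytes will do:

```shell
# Generate the 1,441,792-byte test file from /dev/urandom.
# 1408 blocks x 1024 bytes = 1,441,792 bytes = 11,534,336 bits.
dd if=/dev/urandom of=random.bits bs=1024 count=1408 2>/dev/null

# Sanity check: should report 1441792 bytes.
wc -c < random.bits
```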

I tried sending the file at 1, 2, 5.5, and 11 megabits per second, from a couple of different
positions: about 25 feet away from the node (through three walls and a
non-operational microwave oven), the same with the oven boiling some tea
water, and then from about 20 feet away in the same room as the operating
microwave oven. I did this with WEP encryption off, on, and set to Lucent's
128-bit RC4, all without external antennas.
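For what it's worth, stepping through the four rates can be scripted. Here's a dry-run sketch that just prints the commands rather than running them; the iwconfig rate-setting syntax is the Linux wireless-tools convention, which the original doesn't specify, so treat that line (and the wlan0 interface name) as assumptions:

```shell
# Dry run of the per-rate test loop: print each command instead of executing.
# "iwconfig wlan0 rate ..." is one plausible way to pin the 802.11b data rate
# on Linux; substitute whatever your driver actually provides.
for rate in 1M 2M 5.5M 11M; do
    echo "iwconfig wlan0 rate $rate"
    echo "time dd if=random.bits | ssh -c none 10.0.1.254 dd of=/dev/null"
done
```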

The first interesting thing that I noticed was that, no matter how hard I
tried, I couldn't squeeze out more than about 4 Mbps. This could be due to
a number of factors, the most likely being that I'm using a cheap
PCMCIA-to-ISA bus adapter in my router (a Pentium 233), and the bus probably
just couldn't keep up. I'm really not so worried, as my Internet connection
is only 1.544 Mbps max anyway! ;)

Before we get to the numbers, I'd like to point out that even with the
above fancy command, there was still a small amount of system overhead in
actually getting the packets sent. As the exact amount is difficult to
calculate but non-trivial, I decided to weigh the figures like this:

At 1 Mbps, it should take 11.00 seconds to transmit my 11 Mbit file,
in the best possible case. On average (each test was sampled five times and
averaged), it took 14.91 seconds to complete.

So, we have 3.91 seconds of unaccounted-for overhead (about 35 percent of
the ideal transmission time). For the sake of argument, we'll assume that
the 1 Mbps run is otherwise optimal and deduct 35 percent from all
transmission speeds, chalking it up to system overhead. In other words,
we're grading on a curve.

At any rate, here's the performance at 25 feet, through three walls and a
solid wooden door, no antennas, behind the unplugged microwave, without WEP:

As you can see, WEP appears to have virtually no effect on throughput until
you crank it up to 11 Mbps. The difference in transmission times was just
about 1 percent, which I'd chalk up to natural variance (not to mention user
error!). At 1 Mbps, performance with WEP actually tested slightly better!

There was a bit of an impact at 11 Mbps (about 8 percent or so) when going
to 128-bit encoding. I'm not sure what that means, as the link wasn't
getting anywhere near 11 Mbps without WEP to begin with. Really, the highest
figures should probably just be tossed until I figure out why it wasn't
transmitting efficiently.

As the difference in throughput was negligible, I left 128-bit encryption on
for the remaining tests.

Next came the fun part. 300-watt microwave oven on at full blast, nuking a
cup of water, while trying to upload an 11 Mbit file:

At 1 Mbps, I saw the worst performance when directly behind the running
microwave oven, at a 45 percent performance hit. But at 11 Mbps, in the same
setting, the performance was only down about 23 percent.

Granted, that was a highly contrived test setting. I think the ambient room
test (about 20 feet from the node, with a microwave oven chugging along in
the same room) was more representative of what you can expect in the "real
world":

There we have it: 11 percent loss at 1 Mbps; 8 percent loss at 11 Mbps.

So, in the absolute worst case, inches behind a running microwave and at
1 Mbps, you're still pulling 640 Kbps -- about double the speed of the average
DSL line, and about 12 times the bandwidth of a "very fast" dial-up
connection!
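The Kbps figures throughout are just bits divided by seconds. A small helper (my own addition, not part of the original tests) makes the conversion explicit, shown here against the 1 Mbps baseline run:

```shell
# Effective throughput: bytes transferred and elapsed seconds in, Kbps out
# (decimal kilobits, as network rates are usually quoted).
throughput() {
    awk -v b="$1" -v s="$2" 'BEGIN { printf "%.0f Kbps\n", b * 8 / s / 1000 }'
}

# The 1 Mbps baseline: the whole 1,441,792-byte file in 14.91 seconds.
throughput 1441792 14.91
```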

I can't wait to test the above against Bluetooth equipment running in the
same environment -- assuming it ever makes it to market. ;)