Putting 802.11n routers to the test

Since the final ratification of the Institute of Electrical and Electronics Engineers' 802.11n amendment last year, testing the wireless capabilities of 802.11n devices has become easier, though it’s still a time-consuming job. So we again called upon the help of VeriWave, of Beaverton, Ore. The company’s loan of testing equipment allowed us to go beyond simple file transfer tests.

After we connected the WaveTest90 appliance to a router through a shielded enclosure box, we ran the test application suite. The suite offers more than 20 types of tests, but in the interest of delivering the most crucial information, we concentrated on several variations of throughput testing.

The first throughput test had a single simulated client on the local-area network port sending data to a single client on the wireless end. This is by far the most common use of an access point: a client browsing the Internet or downloading a file from a network server. This simple test also gave us a good baseline against which to compare the other tests. We ran throughput tests using the default 802.3 Ethernet frame sizes (64, 88, 128, 256, 512, 1,024, 1,280, and 1,518 bytes) at 2.4 GHz.

Most draft-N devices claim throughput speeds of 300 megabits/sec, which is the theoretical limit for such devices. In practice, however, a device might approach 300 megabits/sec in bursts, but its average rate will be lower because data is almost always transmitted in packets.

A packet contains certain control data at the beginning and end that helps with error detection and makes it easier for information to be processed on the receiving end. Because the access point must perform checksums and other functions in addition to transmitting the data, the average throughput rate is less than the maximum. Also, the smaller the packet size, the more often devices must perform those tasks, so the average throughput generally becomes lower.
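The effect of framing overhead on throughput can be sketched with simple arithmetic. The figures below are the usual wired-Ethernet constants (header, frame check sequence, preamble and inter-frame gap), not VeriWave measurements, and 802.11 adds its own MAC overhead and acknowledgments on top; the point is only that small frames spend proportionally more wire time on control data.

```python
# Rough sketch: fraction of wire time carrying payload for standard
# 802.3 frame sizes. Constants are generic Ethernet figures, not
# measurements from the tests described in this article.
HEADER = 14        # destination address, source address, EtherType
FCS = 4            # frame check sequence (checksum)
PREAMBLE_IFG = 20  # 8-byte preamble + 12-byte inter-frame gap

def efficiency(frame_size):
    """Payload fraction of total wire time for one frame size."""
    payload = frame_size - HEADER - FCS
    return payload / (frame_size + PREAMBLE_IFG)

for size in (64, 128, 256, 512, 1024, 1280, 1518):
    print(f"{size:5d} bytes: {efficiency(size):.0%} payload")
```

Under these assumptions, a 64-byte frame is only about 55 percent payload, while a 1,518-byte frame is about 98 percent, which is why average throughput falls as frame size shrinks.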

We then ran the same throughput test again, but this time we sent the data from the wireless client to the wired one. Those rates tend to be higher because an access point can typically process the control data coming in through the wireless radio more quickly.

To give us a better idea of how the access point would perform in real-world conditions, we ran these tests a third time while simulating 10 clients on the wireless end. We determined the optimal workgroup size for the router based on how much the average throughputs dropped. We then simulated a mixed group of 802.11n and 802.11g users, a fairly likely possibility in the real world as offices transition to the 802.11n standard. To determine the effect of security on data throughput, we ran a single-user throughput test with the security of the router set to Wi-Fi Protected Access 2-PSK.
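The workgroup-size judgment described above comes down to watching how far per-client throughput falls as simulated clients are added. A minimal sketch of that reasoning, with made-up numbers and an arbitrary 20 megabits/sec per-client floor (both are illustrative assumptions, not figures from our tests):

```python
# Illustrative sketch: pick the largest simulated workgroup whose
# average per-client throughput stays above a chosen floor. Both the
# measurements and the floor are hypothetical.
measured = {1: 90.0, 5: 70.0, 10: 40.0, 15: 18.0}  # clients -> Mbit/s each

FLOOR = 20.0  # minimum acceptable per-client rate, an assumption

optimal = max(n for n, rate in measured.items() if rate >= FLOOR)
print(optimal)
```

With these illustrative numbers, 10 clients would be the largest group that still clears the floor, so it would be reported as the optimal workgroup size.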

Those tests gave us greater insight into the inner workings of wireless routers. However, an office deployment presents challenges that can't be duplicated in a shielded enclosure, so we could not overlook testing file transfers over distance.

We connected each access point to a desktop computer with a 10/100 Ethernet adapter and used a laptop with a USB client adapter to connect to it. After we set all adapters on the same subnet as the router, we measured the time it took to transfer a folder containing 100 MB of file types commonly used in an office. We took several readings at distances of 20, 40, 60, 80 and 100 feet to get a good average. From those, we could calculate the effective throughput in both directions at each distance mark.
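The effective-throughput calculation itself is straightforward: the 100 MB folder expressed in megabits, divided by the average transfer time. The timings below are invented for illustration; only the arithmetic reflects the method described above.

```python
# Sketch of the effective-throughput arithmetic: a 100 MB folder
# transferred in t seconds averages (100 * 8) / t megabits/sec.
FOLDER_MBITS = 100 * 8  # 100 MB expressed in megabits (decimal)

def effective_throughput(seconds):
    """Average transfer rate in megabits/sec for the 100 MB folder."""
    return FOLDER_MBITS / seconds

# Averaging several readings at one distance mark (illustrative timings)
timings_at_20_feet = [14.2, 15.1, 14.7]  # seconds
avg = sum(timings_at_20_feet) / len(timings_at_20_feet)
print(f"{effective_throughput(avg):.1f} Mbit/s")
```

Repeating the calculation for each distance and direction yields the throughput-versus-distance figures used in the grading.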

The VeriWave tests were invaluable in exposing each access point's flaws and quirks. However, the real-world file transfer tests indicated how each device would perform in an office deployment and, as such, were the greatest factor in determining its performance grade.