Sceptically passionate on Enterprise Mobility, AutoID, WLANs, OSes and other technical stuff I happen to work with

Wi-Fi Riddles: Strong signal = bad signal?

While I was working on the next part of the “Unobvious and overlooked Wi-Fi” series (which is about channels), I got an interesting knowledge nugget from our engineering team. We all know that there is a lower limit to receiver sensitivity, and we all know that there must be some upper limit beyond which the Rx signal is so powerful that it simply oversaturates the radio. But what is it? Now I know, even though I did not ask for it explicitly – I merely happened to run into a situation where it matters. Read on…

I was looking into channel overlap and had configured two APs (AP-7131N) blasting at full power on the same channel. I positioned them so close that their antennas were literally touching. When I looked at the RSSI measurements on the AP, I could see an RSSI of -1 to -5dBm (per-packet RSSI as reported by the AP’s radio):

[AP CLI, can be done in GUI, but I’m lazy when it comes to clicking. Output redacted for clarity.]

Losses: only FSPL. So, in order to see -1dBm RSSI, we need to lose 30dB due to FSPL.

An FSPL calculator gives me 31.5cm of spacing for such a loss, and my APs are 0cm apart! I know that FSPL is non-linear at small distances, but I still expected to see positive numbers! Even at 10cm apart (FSPL ~20dB) I should have up to 9dB of positive margin… I checked my setup several times and spent quite some time trying to get positive values – nothing. What is wrong?
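For anyone who wants to redo the arithmetic instead of trusting an online calculator, the standard far-field FSPL formula reproduces the numbers above. This is just a sketch of the textbook formula (the function name is mine), and note it is only a far-field approximation – it breaks down at the near-zero distances in this experiment, which is part of the point:

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss in dB, far-field approximation:
    FSPL(dB) = 20*log10(d_m) + 20*log10(f_MHz) - 27.55
    Not physically meaningful at very small (near-field) distances.
    """
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

# Channel 6 (2437 MHz), the distances from the post:
print(round(fspl_db(0.315, 2437), 1))  # ~30.2 dB at 31.5 cm
print(round(fspl_db(0.10, 2437), 1))   # ~20.2 dB at 10 cm
```

So a 30dB loss does indeed correspond to roughly 31.5cm of spacing at 2.4GHz, matching the calculator result quoted above.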

Well, the only thing that was obviously wrong was my understanding of how Wi-Fi radios work.

After asking our engineering team, I was told that most chipset vendors cap their receivers (by design) at -20 to -30dBm (depending on the product’s designation). This is still conformant to the 802.11 spec.

[Update] Thanks to Chi-Thahn Hoang I got a lead on the standard’s specification of the upper Rx limits. There is no single table; instead, the data is spread across multiple sections, depending on the PHY type. The key phrase to look for is “maximum input level”, which typically resides in the “PMD Receiver Specifications” section. Here’s an example for HR-DSSS (802.11b): “The receiver shall provide a maximum PER of 10% at a PSDU length of 1000 octets, for a maximum input level of –30 dBm measured at the antenna for any baseband modulation.”

Summarizing the data across all the relevant clauses, we’re getting this:

802.11b: -30dBm

802.11a: -30dBm

802.11g: -20dBm (including when operating at 802.11b rates)

802.11n (2.4): -20dBm

802.11n (5): -30dBm

So, -30dBm for 5GHz and -20dBm for 2.4GHz. [/Update]

Any stronger signal simply saturates the radio, causing signal distortion (up to the point of not being able to decode the frames at all – think of being blinded by a light that is too bright), and any RSSI numbers above those thresholds are inaccurate.
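The practical takeaway fits in a few lines of code. Here is a minimal sketch (the function and dictionary names are mine, not from any vendor API) that flags RSSI readings at or above the per-band maximum input levels summarized above, since those readings cannot be trusted:

```python
# Per-band 802.11 "maximum input level" thresholds from the update above.
MAX_INPUT_LEVEL_DBM = {
    "2.4GHz": -20,  # 802.11g / 802.11n (2.4 GHz)
    "5GHz": -30,    # 802.11a / 802.11n (5 GHz)
}

def rssi_is_trustworthy(rssi_dbm, band):
    """Return False if the receiver is likely saturated at this level,
    i.e. the reported RSSI is at or above the band's max input level."""
    return rssi_dbm < MAX_INPUT_LEVEL_DBM[band]

print(rssi_is_trustworthy(-45, "2.4GHz"))  # True: within normal range
print(rssi_is_trustworthy(-5, "2.4GHz"))   # False: receiver saturated
print(rssi_is_trustworthy(-25, "5GHz"))    # False: above -30dBm limit
```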

If you think about it, it makes total sense: in real-world scenarios one would never see such values, since both clients and adjacent APs are typically spaced much farther apart. It makes so much sense that our software, as I’ve been told, cannot even display RSSIs above 0dBm, which perfectly explains why I could not see any positive numbers! 🙂

Summary:

Well, as you can see, when it comes to Wi-Fi signal, the “the more the merrier” principle does not work. Too much signal is no good. Also, do not trust RSSIs >= -20dBm when reported by Wi-Fi devices (specialized tools might be an exception). You never know what you may learn another day… It’s not the only strange thing I ran into while experimenting recently, so expect more articles in my new category “Learning by doing …wrong”.

4 thoughts on “Wi-Fi Riddles: Strong signal = bad signal?”

Arsen, as I know from the CWNA (Certified Wireless Network Administrator) Official Study Guide, RSSI is a relative metric; you cannot compare it to the transmitted level.
The 802.11-2007 standard defines the received signal strength indicator (RSSI) as a relative metric used by 802.11 radios to measure signal strength (amplitude). The 802.11 RSSI measurement parameter can have a value from 0 to 255. The RSSI value is designed to be used by the WLAN hardware manufacturer as a relative measurement of the RF signal strength that is received by an 802.11 radio. RSSI metrics are typically mapped to receive sensitivity thresholds expressed in absolute dBm values…
CWNA Study Guide, Chapter 3, RF Mathematics, pp. 88–89.
Some additional info about RSSI I’ve found with Google: http://lists.shmoo.com/pipermail/hostap/2006-December/014832.html

Maxim, thanks for your comment. This may have been an unfortunate choice of terminology on my side. But when APs do RF management (RRM, ARM, SMART RF, etc.) they do somehow measure the absolute values, don’t they?
I will have to ask our devs, thanks for pointing out 🙂

You will only ever be able to “ballpark” this as a sanity check; these radios are not nearly accurate enough to go by spec’d target powers without expensive equipment.

For example, a 20dBm transmit will really be 17-18dBm (or less, depending on your vendor’s QA – I recently tested a popular vendor that was 5-7dBm down from their spec’d targets).

Target powers are often different for different modulations. For example, 6Mbps might output 24dBm and 54Mbps outputs 19dBm. So unless your receiver reports which rate it measured, you don’t know. You can cap your TX power to ensure all modulations use the lowest transmit power across all rates.

Antenna gain will be slightly less than spec’d, as antenna manufacturers often round up (2.5dBi becomes 3dBi, measured on a cherry-picked antenna sample with good VSWR).
RSSI is not accurate to 1dBm. You will be lucky if it’s within 3dBm; RSSI was off by more than 12dBm on the popular product I mentioned above.

The first 11b devices I tested with Prism chipsets had a relative RSSI. They did not report in dBm, and their values could not be reliably converted to dBm from one unit to the next; they would be different for different PAs. Windows PCs would show a 5-level signal bar or something.

With Atheros chipsets, the RSSI has been fairly accurate in dBm; however, many drivers do not have proper noise floor measurements, and measured RSSI is relative to noise, so if your noise floor is wrong (as it is in MANY cases), the RSSI is off by the same amount. If you have a hardcoded -95 or -96dBm noise floor, you will have this RSSI accuracy problem. Some companies even put calibration offsets in to trick the driver into thinking it is outputting 20dBm when it could be outputting 26dBm, causing a 6dB offset on the RSSI that would need to be subtracted for RSSI measurement in the driver. I’m sure this isn’t done in many cases.
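The noise-floor dependency described here is simple arithmetic. A minimal sketch (all numbers illustrative, function name mine) of how a wrong noise-floor calibration shifts every absolute reading by the same amount:

```python
def signal_dbm(raw_rssi, noise_floor_dbm):
    """Convert a noise-floor-relative RSSI (dB above the noise floor)
    to an absolute signal level in dBm."""
    return noise_floor_dbm + raw_rssi

# Same raw reading of 33 dB above the noise floor:
print(signal_dbm(33, -98))  # -65 dBm with a correct -98 dBm noise floor
print(signal_dbm(33, -95))  # -62 dBm with a hardcoded -95 dBm floor: 3 dB hot
```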

So as a sanity check, you can get an idea of what the signal is at the connector and compare it with what the device reports, but keep in mind there is +/-2dBm on the transmit power, +/-3dBi on the antenna gain (labs specify +/-4dBm in their reports for antenna measurements), and some measurement error on the receiver side. But as you found later, you were operating outside of the expected operating range, so there is even more reason the reported values would be far from the spec’d values.

It’s much easier to check Tx and Rx accuracy using cables and a power meter, and then move to an antenna test, since you then have some known values to work from.