It is generally accepted practice not to set your AP output power greater than that of your lowest-power client device. I often hear that many of the lower-power devices output about 25 mW, or roughly 14 dBm, so you often hear of people capping their APs' max output power at that level.
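As a quick sketch of where the "25 mW is about 14 dBm" figure comes from, here is the standard mW/dBm conversion (nothing here is specific to any vendor, it's just the definition of dBm):

```python
import math

def mw_to_dbm(mw):
    """Convert milliwatts to dBm: dBm = 10 * log10(mW)."""
    return 10 * math.log10(mw)

def dbm_to_mw(dbm):
    """Convert dBm back to milliwatts: mW = 10^(dBm / 10)."""
    return 10 ** (dbm / 10)

print(round(mw_to_dbm(25), 1))  # 25 mW comes out to about 14 dBm
print(round(dbm_to_mw(14), 1))  # and 14 dBm is about 25 mW
```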

When we purchase and deploy APs, the various models have antennas with gain factors of 3-6 dB on many internal-antenna APs, and even higher gain on external-antenna models. On the client side I generally don't see any gain ratings, and I suspect the antenna is often a "compromise" antenna, since it usually has to fit inside a PC or device whose form factor doesn't allow a true radiating element.

The question:

Since the power setting on the AP is pre-antenna, and the antenna only adds gain, should we take the gain of the AP's antenna (effective radiated power) into account when setting the max AP output level in our designs and deployments?

Generally speaking I would say no. However, increasing an antenna's gain also increases its directivity, so in some situations you had better design for its EIRP. The choice depends on your environment.
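For concreteness, EIRP is just the radio's configured power plus antenna gain minus any cable/connector loss. A minimal sketch (the 14 dBm setting and 6 dBi gain are the example figures from the question, not measured values):

```python
def eirp_dbm(tx_power_dbm, antenna_gain_dbi, cable_loss_db=0.0):
    """EIRP (dBm) = configured transmit power + antenna gain - feed losses."""
    return tx_power_dbm + antenna_gain_dbi - cable_loss_db

# A 14 dBm radio setting behind a 6 dBi internal antenna:
print(eirp_dbm(14, 6))       # 20 dBm EIRP, well above the 14 dBm client
# The same radio with 1.5 dB of cable loss to an external antenna:
print(eirp_dbm(14, 6, 1.5))  # 18.5 dBm EIRP
```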

For example, a very high-gain omnidirectional antenna might work fine on a single floor but be terrible on the floors above and below its installation point. That might be exactly what you are looking for in some circumstances.

Likewise, a high-gain antenna mounted high in a warehouse without down-tilt would be bad too.

I test client devices primarily, and many of the antennas I see have very low gain figures; some of them even have negative gain (i.e., losses) of 1 dB or more. However, their radiation patterns are very close to spherical (nearly isotropic). These can work surprisingly well in the right circumstances.

Personally I like magnetic field antennas, like those from Ethertronic. We use them in a diversity setup and they work very well. Better immunity to noise too.

Sorry for going off topic.

P.S. Most client device manufacturers seem very loath to advertise their antenna gain or REAL output power levels.

The antenna gain works both ways: transmitting and receiving. Therefore you can compare the devices' transmit power settings directly; you can think of the receiving and transmitting antenna gains as "cancelling out" each other. Likewise, you don't consider the gain of the antenna in the PC or cell phone.

The antenna gain increases the amount of energy in the air between the antennas, which is good. Your local authorities may limit the maximum radiated power; in that calculation you do need to account for the antenna gain, but there you are checking the high end of the transmission level, not the low end.
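The "cancel out" idea can be shown with a toy link budget. The 6 dBi AP gain, 0 dBi client gain, and 80 dB path loss below are assumed figures, chosen only to show that the received level is the same in both directions when both ends transmit at the same power:

```python
def rx_power(tx_dbm, tx_gain_dbi, path_loss_db, rx_gain_dbi):
    """Link budget: RX level = TX power + TX antenna gain - path loss + RX antenna gain.
    Antenna gain is applied on both the transmit and the receive side."""
    return tx_dbm + tx_gain_dbi - path_loss_db + rx_gain_dbi

AP_GAIN, CLIENT_GAIN, LOSS = 6.0, 0.0, 80.0  # assumed example values

downlink = rx_power(14, AP_GAIN, LOSS, CLIENT_GAIN)  # AP -> client
uplink   = rx_power(14, CLIENT_GAIN, LOSS, AP_GAIN)  # client -> AP
print(downlink, uplink)  # identical: the same gains appear in both sums
```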

(Not a very clear explanation. I'll need to think if I can come up with a better one)

A national park has asked you to provide a fixed, weatherproof tablet on a hilltop for visitor use. You'll use solar cells for power, but the internet connection is a problem. There is a clear line of sight to a nearby ranger station. With a 20 dBd dish mounted on the roof of the station, you can receive the probes sent by the tablet at 13 dBm. What should your transmit power be set to?

If you account for the gain of the dish and set the power to -7 dBm, you'll get 13 dBm right next to the antenna, but not at the far end. If you set the power to 13 dBm, you'll get the same signal strength at the hilltop as you receive from the tablet. That's because you get the gain on both receive and transmit. Therefore you use the same power level at both ends, irrespective of the antenna gains.
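The arithmetic behind this answer can be sketched as follows. The 100 dB path loss is an invented figure purely for illustration; the 20 dB dish gain and 13 dBm tablet power are from the example above:

```python
def rx_power(tx_dbm, tx_gain_dbi, path_loss_db, rx_gain_dbi):
    """Link budget: RX level = TX power + TX gain - path loss + RX gain."""
    return tx_dbm + tx_gain_dbi - path_loss_db + rx_gain_dbi

DISH_GAIN = 20.0    # the station's dish (the example quotes 20 dBd)
TABLET_GAIN = 0.0   # assume a roughly isotropic tablet antenna
PATH_LOSS = 100.0   # assumed hilltop-to-station path loss

# Setting the station radio to -7 dBm gives 13 dBm EIRP at the dish...
eirp_at_dish = -7 + DISH_GAIN
print(eirp_at_dish)  # 13.0 dBm, but only right next to the antenna

# ...whereas matching the tablet's 13 dBm setting makes the link symmetric:
uplink   = rx_power(13, TABLET_GAIN, PATH_LOSS, DISH_GAIN)  # tablet -> station
downlink = rx_power(13, DISH_GAIN, PATH_LOSS, TABLET_GAIN)  # station -> tablet
print(uplink, downlink)  # equal received levels in both directions
```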