-174 dBm/Hz is the minimum you can achieve at the standard noise reference temperature (290 kelvin), because that is the spectral density of the noise a black body emits in the RF region. But every component (antenna, antenna switch, low noise amplifier, downconverters, filters, more amplifiers, ADCs) adds a certain amount of noise that degrades the signal further. This can be discussed as noise factor, noise figure, noise temperature, and so on, but those are all equivalent to an increased noise floor for the signal reaching the antenna. By converting everything to an input-referred noise floor, the minimum detectable signal is often defined as the point where the signal power equals the input-referred noise power.
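The -174 figure falls straight out of kT. A quick sketch in plain Python, using Boltzmann's constant:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K
T = 290.0         # standard noise reference temperature, K

# Thermal noise power spectral density: N0 = k*T (watts per hertz).
# Convert W to mW (factor of 1000), then take 10*log10 for dBm/Hz.
noise_dbm_per_hz = 10 * math.log10(k * T * 1000)
print(round(noise_dbm_per_hz, 1))  # -> -174.0
```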

This will definitely NOT be the same for all phones.

A very good cryogenic low noise amplifier like astronomers use for very sensitive radio telescopes might have a noise temperature of 5 kelvin, corresponding to an addition of about -191.5 dBm/Hz noise power at the input. However, the low noise amplifier in a cell phone probably has a noise temperature around 75 kelvin (1 dB noise figure at room temperature), adding about -179.7 dBm/Hz of noise power. The first amplifier would be able to detect a signal roughly 15 times smaller because of its superior noise performance. In fairness, though, it probably costs about a thousand times more...
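For anyone who wants to check the arithmetic, here is a sketch of the standard noise figure / noise temperature conversions (the last digit differs slightly from the figures above due to rounding):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K
T0 = 290.0        # reference temperature, K

def noise_figure_to_temp(nf_db):
    """Equivalent input noise temperature for a given noise figure in dB."""
    return T0 * (10 ** (nf_db / 10) - 1)

def temp_to_dbm_per_hz(temp_k):
    """Noise power spectral density of a source at temp_k, in dBm/Hz."""
    return 10 * math.log10(k * temp_k * 1000)

print(round(noise_figure_to_temp(1.0)))    # -> 75 (K, the cell phone LNA)
print(round(temp_to_dbm_per_hz(75.1), 1))  # -> -179.8
print(round(temp_to_dbm_per_hz(5.0), 1))   # -> -191.6
```

The ~12 dB gap between -179.8 and -191.6 is a factor of about 15 in power, which is where the "15 times smaller" comes from.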

I think they should have looked at the signal levels where calls begin to drop or data gets garbled. THAT would be more interesting. What if the iPhone 4 is "over-reporting" because it has a more sensitive radio? If I were Apple, or any company, I would show signal bars based on the chance of dropping data, not the raw signal strength. With half the dBm range mapping to 5 bars, it seems like that's what they did.

*Disclaimer: I have a WinMo phone. I really don't give a damn about any of these platforms. None of them suit me.

That would require that they move away from their current setup of 'inflating' your signal and 'inflating' Apple's awesomeness...

I think part of the issue is that the dBm range from roughly -51 to -100 maps to 4-5 bars, while -100 to -113 dBm maps to 0-3 bars. You don't enter the 3-bar range until you're already on a weak signal, and can 'death grip' your phone to death. The article reported a max of ~24 dB signal drop from poor holding. From the looks of it, you don't have to hold it too improperly to suddenly go from 3 bars to disconnected.
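As a sketch of the kind of mapping being described here (the thresholds below are illustrative guesses shaped like the ranges reported in the thread, not Apple's actual values):

```python
# Top bars cover a huge dBm range while the bottom bars are crammed into
# the last ~13 dB before the phone loses the signal entirely.
BAR_THRESHOLDS = [  # (minimum dBm for this level, bars shown)
    (-91, 5),
    (-101, 4),
    (-103, 3),
    (-107, 2),
    (-113, 1),
]

def bars(rssi_dbm):
    for floor, n in BAR_THRESHOLDS:
        if rssi_dbm >= floor:
            return n
    return 0  # below the usable floor

print(bars(-80))        # -> 5
print(bars(-102))       # -> 3, yet only ~11 dB from losing the signal
print(bars(-102 - 24))  # -> 0, after the reported ~24 dB "death grip" drop
```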

This becomes an issue since people check their reception... okay, 2-3 bars, I'm good... Then they go make a call, or hold their phone to their head, and boom, 15 dB difference, bye call.

The idea of "showing more bars to make users more comfortable" (or 'showing more bars to make people who think bars are standardized across phones think ours are better') backfires when your 'bar' range doesn't properly tell people how close they are to disconnecting and 'mysteriously' drops from 3 bars to 0, like some people report.

"That would require that they move away from their current setup of 'inflating' your signal and 'inflating' Apple's awesomeness..."

Ah, but from what I've heard the last few days (and this is also mentioned in TFA) it was AT&T who told Apple "This is how we want you to report signal strength on the iPhone 4/in iOS 4" and while Apple isn't without blame (they were after all the ones who implemented this) it could just as well be AT&T trying to hide flaws in their network that resulted in the iPhone 4 reporting signal strength in a strange way.

I believe the correct engineering term for this reality distortion field is called the Bogon force field. The Bogon flux is measured by a Bogometer, in units of "bars." Apple has the most respected Bogonomists in the industry, but the Bogon is a strange quark that mysteriously vanishes when a detector is used.

Probably 99% of the population has no idea that -80 dBm is extremely good and -100 dBm is awful. Further, the scale is logarithmic, which makes things confusing because most people are only familiar with linear scales.

Switch every phone over to display dBm directly and everyone in the world would understand it within 6 months, though some would bitch about it for years to come.

People don't need to know what the numbers MEAN; they need to know that at -100 it doesn't work, at -96 it just barely works, but -80 is golden, and they'll figure that out fairly quickly.

Of course in reality all people really want is the phone to give them a good reason why they lost their call, can't get calls or have shitty data rates, and that could more accurately be represented with a simple block of text when the users asks and a green or red light in place of the bars.

The whole point of a "signal strength" meter is so that one can determine when one is approaching a "no signal" zone and so that one can determine how well their phone will work at a given location without having to make a call. It is disappointing that traditional signal strength meters (with 3-6 "bars") fail to do this reliably.

You can tell if the phone will work or not should you try to make a call or transmit data by a simple on/off indicator like you said. If the meter just displayed the S/N ratio, it would be the equivalent of having a traditional meter with lots of bars. This would convey more information, probably take up less space on the display, and allow people to generate detailed enough data that they might be able to fix things in places where performance is bad.

The problem of large or mysterious numbers could be remedied by offsetting the value by some fixed amount so that "0" is where the S/N ratio is so bad that the phone can't do anything.
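The offsetting suggestion is trivial to sketch. For illustration this uses received power rather than S/N, and the -113 dBm floor is borrowed from elsewhere in the thread; neither is a standard:

```python
# Display the signal as a single non-negative number, offset so that
# 0 means "the phone can't do anything".
UNUSABLE_DBM = -113  # assumed floor, taken from the range quoted in the thread

def display_value(rssi_dbm):
    return max(0, rssi_dbm - UNUSABLE_DBM)

print(display_value(-113))  # -> 0: unusable
print(display_value(-80))   # -> 33: comfortably usable
```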

That just makes it a great marketing tool, as long as you talk about Android phones "having reception of NEGATIVE 80dB! And they can download porn! Think of the children!". Just saying, if I was paid by the lie (I am, but not for Apple) I would be all over it;)

An alternative could be to have it show a percentage between 0 and 100. As this might be too distracting perhaps just show them in groups of 20% each. To save space, you could leave out the number and just show a block.

That way you can easily show the strength of the reception and make it understandable for everybody.

Of course. Because the average phone user knows what a dB is and would much rather see it than a bar graph. My mother was just telling me the other day that she gets a -10dB attenuation in the kitchen compared to the lounge.

How about phones just print the dB signal loss and be done with it? A number should make signal strength far easier to judge than guessing from 0-5 bars.

Yes, because "-70 dB" would be much clearer to your average cell phone user than "5 out of 5 bars"...

Keep in mind that many of those users also think it's a good idea to send text messages while driving. It would take a $10M advertising campaign just to convince those people that "-70" is better than "-100".

You don't measure received signal strength in dB loss, unless you know exactly how much was transmitted for comparison. You measure it in terms of received power, usually in units of dBm. At the signal levels we're talking about here, you will see a range from -51 dBm all the way down to about -113 dBm. Good luck getting anybody who's not RF tech-savvy to understand how a signal can have a negative level.
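A quick illustration of why the level goes negative: dBm is just dB referenced to one milliwatt, and received powers are tiny fractions of a milliwatt:

```python
# Convert a level in dBm back to an absolute power in milliwatts.
def dbm_to_mw(dbm):
    return 10 ** (dbm / 10)

print(dbm_to_mw(-51))   # strongest level in the range above: ~7.9e-6 mW
print(dbm_to_mw(-113))  # weakest: ~5.0e-12 mW
```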

So, to make it simple on those who don't need to know (or really care about) the engineering behind it...

All mobile phones have tradeoffs in antenna design in order to look pretty, because people don't like visible external aerials. Apple have come up with what should be a very good design but compromised it by not coating the metal in a dielectric layer. Apple have created bad publicity for themselves by coming up with a BP-like response to the complaints, but this won't affect their sales because Apple buyers don't take any notice of negative publicity for Apple products.

"Apple buyers don't take any notice of negative publicity for Apple products."

Actually, some buyers do. Not the hardcore fanboy types, but my gf's parents saw a segment on the local news about the iphone 4 problems and have decided to look at Android phones rather than blindly upgrading their current iThings to the latest model. They may still get an Apple phone, but they would not have even considered alternatives if it weren't for the issues.

Hmmm, I'm guessing that's irrelevant, since it's probably just the same signal attenuation that's present on every cellphone.

The issue on the iPhone 4 is the ability to detune the antennas just by touching both the GSM/3G antenna and the WiFi antenna at the same time with a sweaty finger - something that could have been so easily prevented with a dielectric coating. That's the reason people should be pissed, but a lot of people seem to be confused about what's really the problem.

"this won't affect their sales because Apple buyers don't take any notice of negative publicity for Apple products."

It won't affect sales because in normal use, the iPhone 4 has better reception than previous iPhones. If there was a real problem, that would affect sales, but the average phone buyer doesn't read slashdot and gizmodo, and so doesn't get put off by this sort of hysteria.

It's not because it's visible. It's because bits that stick up tend to break. And the fractal-style antennas that are in modern phones have very similar performance to external aerials. Given the choice between the two, it's a no-brainer.

His graph is erroneously labeled in dB, which is an arbitrary scale, whereas it ought to be labeled in dBm, which is received signal strength.
In case you're wondering, the B is a Bel, which is a factor of 10 in power. A dB is a decibel, which is 1/10 of a Bel. dBm is decibels relative to a milliwatt of signal strength.

To be slightly but meaningfully pedantic, "dBm" should be interpreted effectively the same way "dB" is (except you should add 30 to it, so 0 dB == 30 dBm), because there aren't any units present. The "m" just adds the "milli" prefix to a unit that isn't stated. If you mean dB relative to 1 mW, you want dBmW. If you want dB relative to 1 mV, you want dBmV. If you've ever had an argument with someone about whether a "factor of 2" is 3 dB or 6 dB, this is usually because the 6 dB guy is unaware that he is implicitly quoting a voltage ratio when a power ratio is meant.

That last point that you made for posterity is not correct, because the definition of dB relates to power ratios, and a 2:1 ratio of power is 3dB, whereas a 2:1 ratio of voltage results in a 4:1 ratio of power and 6dB of change.

So, a factor of 2 is only 3dB when measuring power, because for power dB is defined as: dB = 10 * log(P1/P0),

and 10 * log(2) = 3

But when measuring voltage, a factor of 2 is 6dB, because for voltage ratios, dB is defined as: dB = 20 * log(V1/V0), and 20 * log(2) = 6.
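Both conventions side by side, for anyone following along at home:

```python
import math

# A factor of 2 is 3 dB for power ratios and 6 dB for voltage ratios,
# because power goes as voltage squared.
def db_power(p1, p0):
    return 10 * math.log10(p1 / p0)

def db_voltage(v1, v0):
    return 20 * math.log10(v1 / v0)

print(round(db_power(2, 1), 2))    # -> 3.01
print(round(db_voltage(2, 1), 2))  # -> 6.02
# Doubling the voltage quadruples the power, so the two conventions agree:
print(round(db_power(4, 1), 2))    # -> 6.02
```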

I'm quite sure that AT&T and Apple have always been aware that their phones were fudging the signal quality indicator on their product... Reality is hard to sell when your competitors fudge their numbers, too.

heck, my understanding is that carriers can influence how many bars a phone will show via their base station config.

Something about altering the minimum transmit power required for a call. As this requirement is then used as the zero point for the bar scale, one can get multiple bars yet still have a very poor signal, thanks to the relative nature of the bars.

There is also the issue of channel saturation. Full channels will not show up as zero signal on the bars, but you will still be unable to place calls or do anything else.

Holy hell the code for the Android OS StatusBarPolicy in the StatusBarPolicy.java file is a stinking mess. So much for Google having the best programmers in the world. A single public method -- installIcons() at the class level, and a pile of private methods doing all sorts of things. Hundreds of lines of different private variables and worst of all the slew of private anonymous classes.

Ahh spoken by someone who cares more about what some guy in a book calls a style of coding than actually getting the job done.

There's no reason you can't get the job done and do it well at the same time. I'd rather work on well-written code by a "clueless programmer" than a spaghetti mess written by a top notch guru, every time.

Looks like Java written by a C programmer... which is not all that unexpected. The main problem with it is a severe lack of comments. First comment it, then refactor, so you might be able to deal with it longer term.

If the rest of android is anything like this then there's a lot of work to do for anyone trying to maintain it long term.

Good programmers have learned to write maintainable code... If they don't then they are not a guru, they are a hack.

You're right about one thing: You've reminded once again that I made the right choice in quitting the industry after holding a variety of lucrative sysadmin, software development, IT, and technical lead positions from 1983 to 2009. Too many projects where getting it done mattered more than getting it right, ending up in the software equivalent of a Deepwater Horizon rig explosion. I'm so glad to be done with that.

So where are all the people that commented on /. about how Apple was making false claims about the incorrect signal bars? Surely if the responders on Friday had the balls to stand on a pedestal and make grand claims based on no evidence, they can have the balls to come back and admit they were wrong.

I don't recall them complaining about bars, I recall them complaining about trying to fix reception with a software update. All this graph seemed to confirm is that somebody was working awfully hard to eliminate the 3rd bar while keeping the other 4.

No one was claiming that Apple's response was a lie, just that it was misleading. There is still a hardware problem that won't be fixed for the users who have these devices, unless they want to slap on a case.

So what if the calculation is wrong or different between phones! It has nothing to do with the problem the iPhone is having. If you normally have 4 bars with the wrong calculation, and you hold it and get no bars with the same wrong calculation, then there is something wrong with the design of the phone. All Apple is doing is trying to confuse the masses with technical facts, hoping to muddy the issue and save money on all the lawsuits being filed.

Wrong. The range from 1-4 bars is about 13 dB but the range for the 5th bar is about 50 dB. If you have 4 bars on the iPhone it means your signal is crap to begin with so even a low amount of attenuation can drop you down to nothing. Their scale is wrong & misleading. After they "fix" it, I expect there will be many complaints since people will now be showing 1-2 bars where before they were showing 5.

The article is worth reading. Right on the first page it explains what is really going on with the "grip of death".

In other news reports I have seen about iPhone 4, it was explained that the iPhone 4 has a strip of metal wrapped around the body of the phone that serves as the antenna. Not so! There are two strips, of different lengths, serving as two antennas. One antenna is for WiFi and GPS, and the other antenna is for cell phone service. The "grip of death" happens when you make an electrical contact between the two antennas (on the lower-left corner of the phone).

According to the article, bridging the two antennas with your hand causes a drop in cell phone signal to noise ratio of about 24 dB. This can be enough to cause a dropped phone call, if you are already in an area with weak cell signal strength. If you are in an area with good cell strength, you won't drop the call and you might not even see the signal strength bars change.
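For scale, a 24 dB drop is enormous in linear terms:

```python
# Convert a change in dB back to a linear power ratio: 24 dB is roughly
# a factor of 250 reduction in received power.
drop_db = 24
factor = 10 ** (drop_db / 10)
print(round(factor))  # -> 251
```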

And according to the article, as long as you don't bridge the two antennas, this phone really does do a better job of locking on to a weak cell phone signal.

So, if you have an iPhone 4, definitely invest in some sort of case that insulates the two antennas. And the article scolds Apple for not having put some sort of insulation over the antennas; presumably a future iPhone will do so.

Other pages of the article discuss other things. I did like the page where Anand explains why Apple's claims are valid that the screen is sharper than the human eye can resolve.

That's been known for a while. The question is, why on Earth did they not test the phones properly? And by properly, I mean in real-world circumstances. It's not real world to have it covered in a fake mock-up of a previous iPhone. Sure, many, perhaps even most, users will put some sort of protective case on their iPhone, but that's still not appropriate testing conditions. Given that the only thing you can be sure of is that some people will use it bare, that's one of the conditions you have to test under.

"That's been known for a while. The question is, why on Earth did they not test the phones properly? And by properly, I mean in real world circumstances. It's not real world to have it covered in a fake mock up of a previous iPhone."

It does work very well. I have an iPhone 4 (got it on launch day) and I can replicate the "signal loss" by bridging those pieces of metal. I do lose bars (so my signal wasn't fantastic to begin with).

I've been watching this whole thing with interest. I've seen a ton of reports that the 4 is better at keeping calls when in a low signal area, and that seems to match my experience. It's a flaw, but really it's not that big. I've learned to keep my left hand (which I usually hold my phone with) about 1/2 cm higher.

Oh yeah I feel so bad for them. They only made one little mistake, and all that mistake does is catastrophically degrade signal levels unless the user holds the phone in a way that may be uncomfortable to many or even most users.

All they did was design a device where you have to fundamentally change your own personal habits to fit their phone, rather than spending ten minutes of QA to notice and designing the phone to fit the users instead.

Is that the distortion field in action? First you tell me that the phone will drop the call if I hold it in my hand, but second you tell me the iPhone is actually superior to other phones. Third, you tell me it's actually my problem for holding the phone wrong, and finally you tell me that with the next iPhone it will all be better and we should just wait and spend another $500 (or whatever it will cost).

After I got my iPhone 3G the very next software update included a change to the "bar algorithm" that was marketed as "improving user understanding of the signal meter" or somesuch. It was in response to user complaints of low signal strength, and somehow (miraculously) the reception improved... more bars.

For all of the millions of dollars of productivity being lost aimlessly discussing 'bars'...

Can someone please dissect the antenna and then connect it to a calibrated spectrum analyser? This is so mindbogglingly trivial to do it is beginning to hurt my soul. I do similar exercises at work with new, untested antenna designs. I am sure I am not the only one.

For comparison, do the same to other phones and publish actual measurements of received signal drops and the effect of the disturbance caused by closing your hand around the antenna. This is similar to how touching an old rabbit-ears style antenna affects the picture of an analog TV broadcast, if the effect is as I suspect.

Voila! An actual, meaningful assessment of what the phone bars mean in real numbers from a calibrated instrument.

An uncalibrated receiver, such as the iPhone, is not a proper tool to do this.

Personally I'm wondering why part of it (on many phones) seems to involve dropping a couple of bars whenever I press the "call" button, without moving the phone. Presumably doing so invokes the "slightly less BS" mode. The other thing I'm wondering is why the more expensive the phone, the crappier the signal. I picked up a spare PAYG phone for about what lunch cost me that day and it makes a very clear call everywhere.

Why do they use bars at all? Why don't they use numbers? I suspect it has something to do with early phones and a little dedicated LCD space of bars was cheaper than a full numerical display, but we're well beyond that now.

Anonymous Coward wrote: "iPhone offers more bars overall than Android so obviously it is going to report more bars at a given signal strength."

I didn't think that sounded quite right, so I thought about it a bit. It turns out that your claim is partly true: iPhone OS will report a greater absolute number of "bars" about half the time, given a common baseline (both scales measuring from the same zero to the same peak "full" strength, which certainly isn't guaranteed).
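To sanity-check the "about half the time" figure, here is a toy comparison with made-up evenly spaced scales (real phones' thresholds are certainly not evenly spaced, so this is only a shape argument): give one phone 5 evenly spaced bars and the other 4 over the same -113 to -51 dBm range, then count how often the 5-bar phone shows strictly more bars.

```python
FLOOR, CEIL = -113, -51  # dBm range quoted elsewhere in the thread

def bars(rssi, nbars):
    """Evenly spaced bar levels over the FLOOR..CEIL range."""
    span = (CEIL - FLOOR) / nbars
    return min(nbars, max(0, int((rssi - FLOOR) // span) + 1))

more = sum(1 for rssi in range(FLOOR, CEIL + 1)
           if bars(rssi, 5) > bars(rssi, 4))
total = CEIL - FLOOR + 1
print(f"{more}/{total} levels show more bars on the 5-bar scale")
# Works out to roughly half the levels, matching the parent's estimate.
```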