Topic: How to calibrate an iPhone ? (Read 34672 times)

Hi, I've just bought the SoundMeter app for my iPhone 3G. When I switch it on, it reads about 52.0 or so (dB? the main number...), even if it is in a silent room where there are obviously not that many dB. As soon as I speak normally, it jumps to 90 or more. Maybe I have a strong voice, but not that strong!

Quote

When I switch it on, it reads about 52.0 or so (dB? the main number...), even if it is in a silent room where there are obviously not that many dB.

I assume that you are using the built-in microphone on the iPhone 3G. In that case, the default mic sensitivity should get you within 1 or 2 dB of the actual sound level (unless the spectral content of the noise you are measuring has a lot of very low frequency energy, but in that case, the measured level would be biased low, not high.) If you have adjusted the mic sensitivity, you might want to try resetting it to the default value with the button on the Calibration screen.

"Silent" rooms often exhibit higher sound levels than you might expect, so the 52 dB value doesn't seem unrealistic. However, speaking normally should not produce a 90 dB reading unless you have your mouth very close to the microphone.

For both Lp and Leq measurements, the displayed value is given in dB referenced to 20 micropascals, by definition.
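As a concrete check of that definition, here is a minimal Python sketch (not from the thread or from SoundMeter itself) converting an rms pressure to dB referenced to 20 micropascals:

```python
import math

P_REF = 20e-6  # reference pressure: 20 micropascals

def spl_db(p_rms):
    """Sound pressure level in dB re 20 uPa, for an rms pressure in pascals."""
    return 20.0 * math.log10(p_rms / P_REF)

# A 1 Pa rms signal sits at roughly 94 dB SPL (a common calibrator level),
# and the reference pressure itself is 0 dB by definition.
print(round(spl_db(1.0), 1))    # 94.0
print(round(spl_db(P_REF), 1))  # 0.0
```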

Quote

How can I calibrate that app ?

Calibration of the built-in mic will need to be performed relative to a separate calibrated sound level meter. You might find this post helpful.

FaberAST: Out of curiosity, how does SoundMeter adjust the measured sound level with regard to the calibration? I've been looking at some SLM apps for iOS and Android and, roughly speaking, I found two methods: scaling, where the dB value is multiplied by a calibration factor, and trimming, where a fixed number of dB is added or subtracted.

Adding to or subtracting from a dB value is the same as multiplying or dividing the value upon which the dB value is based. Decibels (dB) simply provide a means of expressing values on a logarithmic scale.

In other words, scaling and trimming are different ways to accomplish the same task (although it should be noted that your description of scaling is incorrect--the dB value is not scaled, but its non-dB counterpart is scaled before being converted to a dB value).
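The equivalence described above (trimming a dB value vs. scaling the underlying value) can be demonstrated with a short Python sketch; the names here are illustrative, not from any of the apps discussed:

```python
import math

def to_db(x, ref=1.0):
    # dB is 20*log10 of the ratio between a value and a reference.
    return 20.0 * math.log10(x / ref)

value = 0.5
trimmed = to_db(value) + 20.0 * math.log10(2.0)  # "trim": add about 6.02 dB
scaled = to_db(2.0 * value)                      # "scale": double the value first

# The two results agree to floating-point precision.
print(abs(trimmed - scaled) < 1e-9)  # True
```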

Faber apps scale raw input signals based on measured (calibrated) sensitivity values and then calculate levels in dB. In SignalScope Pro, this allows for consistency between the various tools. For example, you can look at the time waveform in the oscilloscope tool with instantaneous amplitude expressed in pascals (Pa). The level meter tool will show the level in dB SPL (relative to 20 micropascals, rms) that is consistent with the time signal levels you can see in the oscilloscope.
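As a rough sketch of that pipeline in Python (the sensitivity value below is made up for illustration, not an actual device constant): the raw samples are scaled to pascals first, and only then is the level computed in dB.

```python
import math

P_REF = 20e-6       # reference pressure: 20 micropascals
SENSITIVITY = 0.05  # hypothetical: raw full-scale units per pascal

def samples_to_pascals(samples, sensitivity=SENSITIVITY):
    # Scale raw (dimensionless) input samples into pressure values in Pa.
    return [s / sensitivity for s in samples]

def level_db_spl(pressures):
    # rms of the pressure signal, expressed in dB re 20 uPa.
    rms = math.sqrt(sum(p * p for p in pressures) / len(pressures))
    return 20.0 * math.log10(rms / P_REF)

# A raw square wave at +/-0.05 maps to +/-1 Pa, i.e. about 94 dB SPL.
print(round(level_db_spl(samples_to_pascals([0.05, -0.05, 0.05, -0.05])), 1))  # 94.0
```

Because the scaling happens in the signal domain, the same calibrated pressure values can feed both a time-domain display (in Pa) and a level meter (in dB SPL), which is the consistency described above.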

Thanks for clarifying. I knew about decibels being a logarithmic representation of the ratio between a measured pressure (or voltage) value and a reference value. And I also got that adding/subtracting from a dB value (i.e. "trimming") is equivalent to multiplying/dividing (i.e. scaling) the underlying pressure or voltage value.

However, I really did find some Android apps (at least 2) which applied calibration by scaling dB values themselves. When I filled in a calibration value of 2 (the default being 1), the dB readings in those apps exactly doubled. Another Android app I found explicitly mentioned in the UI that calibration was achieved as follows: dB' = (dB x scale) + trim. I am not saying that these are smart ways of doing it, only that it is being done. It is very likely that these apps are made by amateurs who do not know much about acoustics (while you obviously do).

So just so that I understand correctly, when you say your apps "scale raw input signals based on measured (calibrated) sensitivity values and then calculate levels in dB", does this mean that, before the dB value is calculated, the (voltage) signal is multiplied/divided by a constant device/model-specific scale factor? For example, on an original iPhone you multiply the signal by a constant X, on an iPhone 3G by a constant Y, on an iPhone 3GS by a constant Z, etc.? And X, Y and Z were probably determined experimentally by comparing iPhones against reference equipment?

Buyer beware of the various sound level meter apps out there. Some of them show in their screenshots a max sound level that exceeds the peak level. Conventional (even standardized) use of the terms max and peak in sound level measurements will never yield a max sound level that is greater than the peak. So, that is an immediate indication that an app may not be reliable.
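That sanity check (max should never exceed peak) is easy to express in Python; the block-rms "max" below is only a crude stand-in for the time-weighted maximum a real sound level meter reports:

```python
import math

P_REF = 20e-6  # reference pressure: 20 micropascals

def peak_db(pressures):
    # Peak level: the highest instantaneous absolute pressure.
    return 20.0 * math.log10(max(abs(p) for p in pressures) / P_REF)

def max_level_db(pressures, block=8):
    # "Max" level: the highest short-block rms level (a rough stand-in
    # for the time-weighted max level of a real meter).
    levels = []
    for i in range(0, len(pressures) - block + 1, block):
        chunk = pressures[i:i + block]
        rms = math.sqrt(sum(p * p for p in chunk) / block)
        levels.append(20.0 * math.log10(rms / P_REF))
    return max(levels)

# Since rms never exceeds peak amplitude, max can never exceed peak.
tone = [math.sin(2.0 * math.pi * k / 8.0) for k in range(64)]
print(peak_db(tone) >= max_level_db(tone))  # True
```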

Regarding Faber apps, the input signal is multiplied by a constant scale factor, or sensitivity. The default sensitivity values are device-independent (i.e. they're the same for all iOS devices). This is because Apple's devices have been fairly consistent in their treatment of the audio input signals. If a user needs a more accurate sensitivity value, the Calibration screen within the app will let them specify an arbitrary sensitivity or automatically calculate a sensitivity based on a known input level.
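One way such an automatic calculation could work is sketched below in Python; this is an assumption about the general approach (solving for the scale factor from a known calibrator level), not SoundMeter's actual implementation:

```python
import math

P_REF = 20e-6  # reference pressure: 20 micropascals

def sensitivity_from_known_level(raw_rms, known_level_db):
    # Given the rms of the unscaled input and the true SPL of a known
    # source (e.g. a calibrator), solve for the scale factor that maps
    # raw units to pascals: pascals = raw / sensitivity.
    true_p_rms = P_REF * 10.0 ** (known_level_db / 20.0)
    return raw_rms / true_p_rms

# If the raw rms reads 0.05 while a calibrator produces exactly 1 Pa rms
# (about 93.98 dB SPL), the recovered sensitivity is 0.05 raw units per Pa.
sens = sensitivity_from_known_level(0.05, 20.0 * math.log10(1.0 / P_REF))
print(round(sens, 6))  # 0.05
```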