What are the measurement units for power?
Optical power is measured in linear units of milliwatts (mW), microwatts (µW, often written "uW" for the Greek letter mu) and nanowatts (nW), or on a logarithmic scale in decibels (dB).

What is the difference between "dBm" and "dB"?
dB is a ratio of two powers, for example the loss in a fiber optic cable. When power is measured in linear units (mW, uW or nW), dB is calculated on a log scale using this formula:

power (dB) = 10 log (power1/power2)

If we are measuring absolute power levels, the measurement is generally referenced to 1 milliwatt (mW), is expressed as "dBm" and the calculation becomes:

power (dBm) = 10 log (power/1 mW)
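These dB and dBm calculations are easy to sketch in code. A minimal Python illustration (the function names are mine):

```python
import math

def db(power1_mw, power2_mw):
    """Ratio of two powers in dB, e.g. the loss through a fiber optic cable."""
    return 10 * math.log10(power1_mw / power2_mw)

def dbm(power_mw):
    """Absolute power in dBm, referenced to 1 milliwatt."""
    return 10 * math.log10(power_mw / 1.0)

print(dbm(1.0))      # 1 mW is 0 dBm
print(dbm(0.1))      # 0.1 mW is -10 dBm
print(db(1.0, 0.5))  # losing half the power is about 3 dB
```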

What power level should a source have?
It depends on the type of source. When coupled into a good test cable, the source output power will be in these ranges:
LED: -10 to -25 dBm into 62.5/125 fiber
Telecom/LAN laser: 0 to -13 dBm into singlemode fiber, up to +20 dBm with DWDM and fiber amplifier systems
CATV laser: +16 to 0 dBm into singlemode fiber

What power level should a receiver see?
It depends on the network and type of source. When measured at the end of the network cable, the power reaching the receiver will usually be in these ranges:
LAN/LED: -20 to -35 dBm into 62.5/125 fiber
Telecom/LAN laser: -20 to -45 dBm into singlemode fiber
CATV laser: 0 to -10 dBm into singlemode fiber

What is a loss budget?
The loss budget is a calculation of how much attenuation a link should have. You compare that loss to the dynamic range of the networking equipment to see whether the range and link loss are compatible.
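As a sketch of such a calculation, here is a hypothetical 2 km multimode link at 1300 nm using typical worst-case component losses (the specific link parameters are illustrative assumptions; the per-component maxima are the TIA-568 values):

```python
# Hypothetical link: 2 km of multimode fiber at 1300 nm,
# 4 connections and 1 splice, worst-case component losses.
fiber_km = 2.0
fiber_loss_db = 1.0 * fiber_km   # ~1 dB/km at 1300 nm
connector_loss_db = 0.75 * 4     # 0.75 dB per connection (TIA-568 maximum)
splice_loss_db = 0.3 * 1         # 0.3 dB per splice (TIA-568 maximum)

loss_budget_db = fiber_loss_db + connector_loss_db + splice_loss_db
print(f"Estimated link loss: {loss_budget_db:.1f} dB")  # 5.3 dB

# Compare against the equipment's dynamic range (assumed value here)
equipment_range_db = 11.0
print("Compatible" if loss_budget_db <= equipment_range_db else "Too lossy")
```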

How accurate are fiber optic power meters?
All optical power meters calibrated to NIST (the US national standards lab) or any other national standards lab will measure optical power to an uncertainty of about +/-0.2 dB, or 5%. Since every power meter has an uncertainty of +/-0.2 dB, any two meters can differ by 0.4 dB in the worst case (one reading +0.2 dB and the other -0.2 dB) even if both are within their specifications! More information on calibration uncertainty.

Are more complex or higher priced FO power meters more accurate?
The high priced meters offer better dynamic range and more features, but not better absolute measurement uncertainty.

Why is the measurement uncertainty so high?
There are three to four calibration transfers from the NIST absolute optical power standard before the meter is delivered to the customer. The NIST standard itself has an uncertainty of about 1%, and every transfer adds errors of about another 1%.
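The stack-up of calibration transfer errors can be sketched as a worst-case sum, assuming roughly 1% for the NIST standard and 1% per transfer as described above:

```python
import math

nist_uncertainty = 0.01  # NIST absolute standard itself, ~1%
transfers = 4            # three to four transfers; take four as worst case
per_transfer = 0.01      # each transfer adds roughly 1%

worst_case = nist_uncertainty + transfers * per_transfer  # 5%
worst_case_db = 10 * math.log10(1 + worst_case)
print(f"~{worst_case:.0%} or about {worst_case_db:.2f} dB")  # ~5%, ~0.21 dB
```

Linear worst-case addition gives 5%, or about 0.21 dB, which is where the commonly quoted +/-0.2 dB figure comes from.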

Why do most meters only offer calibrations at a few wavelengths?
NIST only offers calibrations at 850, 1300 and 1550 nm, so meters calibrated at other wavelengths have to interpolate or extrapolate from those values, increasing the measurement uncertainty at those wavelengths.

If my source is at a slightly different wavelength from the standard calibration wavelength, doesn't that add to measurement error?
Perhaps, but the wavelength of most sources is not known to the person making the measurement. If everyone uses meters calibrated at only a few specific wavelengths, everyone tests to the same standard and will get more closely correlated measurements on sources of unknown wavelengths.

What happened to OFSTP-14?
As of 2011, OFSTP-14 has been replaced by an international standard that, as of mid-2011, is very controversial. First, it allows the use of either insertion loss testing with a light source and power meter or OTDR testing. This was predicated on comparisons of OTDR and insertion loss tests on 10GbE cable plants with less than 2 dB of loss. For other, more typical multimode links of 5-10 dB, the two methods will generally give divergent results. Since insertion loss testing is designed to test the link the way it will be used, it should be the test used for longer links. The new standard also includes a new metric for measuring the mode power distribution in multimode fiber called Encircled Flux (EF). It is also controversial, but international documents say the older source with a mandrel wrap meets their requirements. Here is a more detailed explanation of all the options in cable testing.

Why do you use a launch cable on the source?
You use a launch cable to set the proper test conditions for testing another cable. The launch cable should match the fiber size and connector type of the cable you want to test, and it should be tested to ensure its connectors are low loss.

Why can't I just attach the cable I'm testing directly to the source?
Sources vary a great deal in how they launch light into the cable, which can cause undesirable variations in loss measurements. Furthermore, the coupled power can vary considerably with each insertion, depending on the alignment of the connector ferrule in the source output connector.

What makes a launch or receive cable "good"?
A good launch or receive cable will have low loss - less than 0.5 dB loss when tested in a single-ended FOTP-171 test.

Do I always need a laser source to test singlemode?
No, you can use an LED source for short SM patchcords or cables up to about 5 km long. Longer SM cables will show higher loss with an LED because the LED's wide spectral width causes higher loss at the upper and lower ends of its spectral output. FOTP-171 actually calls for LED sources to test SM patchcords, to prevent problems with the interference caused by the coherent light of a laser.

What is a receive cable?
The receive cable is used in a double ended test to measure the connector loss on both ends of the cable.

When doing a double-ended loss test, why don't you set the reference with both launch and receive cables connected together?
There are several reasons to use the same one-cable reference method for both single-ended and double-ended tests. Most importantly, if you want to measure the loss of the connectors on both ends of the cable being tested, you must set the reference with the launch cable only.

A two-cable reference removes one connector loss from the measurement, since that connection is included in the reference. You can also set the reference with three cables, then simply replace the middle cable with the cable under test, but that removes two connector losses from the test value, as both are included in the reference. However, all three methods are approved standard methods, and the three-cable reference is sometimes the only way to test cable plants with connectors like the MT-RJ that cannot be connected directly to test instruments.
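The effect of the reference method on which connector losses get counted can be sketched with hypothetical per-connector and fiber losses (all values here are illustrative assumptions):

```python
# Hypothetical losses for a cable under test with a connector on each end
conn_a_db = 0.4  # connection at the launch end
conn_b_db = 0.5  # connection at the receive end
fiber_db = 0.2   # loss of the fiber itself

total_db = conn_a_db + conn_b_db + fiber_db

one_cable_ref = total_db                            # both connections measured
two_cable_ref = total_db - conn_a_db                # one connection in reference
three_cable_ref = total_db - conn_a_db - conn_b_db  # both in reference

print(f"{one_cable_ref:.1f} / {two_cable_ref:.1f} / {three_cable_ref:.1f} dB")
```

The same physical link reads 1.1, 0.7 or 0.2 dB depending on how many connections were already included in the reference, which is why the reference method must always be reported with the results.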

What happens if the launch and/or receive cables have bad connectors with, say, 3 dB loss?

When the zero reference is set, it will include the loss of two bad connectors. When you attach a cable between them to test, you will measure erroneously high loss for one or both connectors on the cable being tested, invalidating the measurement, so it is very important to keep all test connectors clean.

How long a cable can I test?
That depends on the output power of the source and the sensitivity of the meter. For example, one of our LED sources has a maximum output into 62.5/125 fiber of about -15 dBm. Your meter should be used at power levels at least 10 dB above its minimum spec; a meter with a minimum spec of -55 dBm can easily read to -45 dBm, giving us a range of 30 dB (from -15 dBm down to -45 dBm). At 850 nm and a loss of 3 dB/km, that's 10 km of fiber, less connector and splice loss; at 1300 nm and a loss of 1 dB/km, it's 30 km less connector and splice loss, both far longer than any network operating on multimode fiber.

For singlemode testing, lasers can give you 0 to -10 dBm output, giving a range of 35 to 55 dB, corresponding to over 100 km of fiber, even approaching 200 km at 1550 nm!
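The range-to-distance arithmetic above can be sketched as a simple model (the function name is mine; connector and splice losses are folded into one optional parameter):

```python
def max_test_km(source_dbm, meter_floor_dbm, fiber_db_per_km,
                fixed_losses_db=0.0):
    """Longest fiber whose far-end power stays above the meter's usable floor."""
    margin_db = source_dbm - meter_floor_dbm - fixed_losses_db
    return margin_db / fiber_db_per_km

# LED at -15 dBm into a meter usable down to -45 dBm: 30 dB of range
print(max_test_km(-15, -45, 3.0))  # 10.0 km at 850 nm (3 dB/km)
print(max_test_km(-15, -45, 1.0))  # 30.0 km at 1300 nm (1 dB/km)
```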

How can an insertion loss test show "gain", not loss?

When an insertion loss test shows a gain instead of a loss, it is usually a problem with setting the "0 dB" reference. If the reference cables are dirty when the "0 dB" reference is set and are then cleaned before testing (or the dirt falls off), the measurement may show a positive gain instead of a loss. It may also indicate a problem with the instrument, such as a drifting source.
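The arithmetic behind an apparent gain can be sketched with hypothetical meter readings, assuming a reference set through dirty connectors that are then cleaned before the test:

```python
# Hypothetical meter readings, in dBm
reference_dbm = -16.5  # "0 dB" reference set while the connectors were dirty
measured_dbm = -16.0   # test reading after the connectors were cleaned

# Loss is the drop from the reference level to the test reading
loss_db = reference_dbm - measured_dbm
print(f"{loss_db:+.1f} dB")  # -0.5 dB, i.e. an apparent 0.5 dB gain
```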

Can't I use an OTDR to test cable loss?
Well, yes and no. The OTDR will measure the loss in the cable plant, but it uses a technique based on backscattered signals that measures loss indirectly, unlike a source and power meter, which measure loss directly. OTDR measurements do not correlate well with source and meter measurements. Since the source and meter test loss the same way the transmission link operates, all standard cable plant tests specify using a source and meter to measure loss, so you must always use them to test the cable plant loss. Even the outside plant singlemode test standard, OFSTP-7, says you should accept only the source/meter results.

Why do I use an OTDR?
Use the OTDR for troubleshooting. If you have a cable break, especially in the outside plant, the OTDR is the best way to find it. You can also use it to verify splice loss (but test both ways and average to get a reliable measurement) or find problems with back reflection (optical return loss).

How do I see close features with an OTDR?
The blind spot of an OTDR, caused by crosstalk from the test pulse, can be overcome by using a "pulse suppressor," a long length of cable (1 km is typical) that allows the OTDR to settle down after the initial pulse.

Bandwidth Measurements

Do I need to test bandwidth?
Generally, no. Most systems are specified for use with a minimum-bandwidth fiber, and most fiber is much better than the minimum specification. Besides, bandwidth test equipment is neither cost effective nor readily available for field use. Manufacturers of fiber and cable have the expensive lab equipment needed to reliably test bandwidth (or actually dispersion), but there are no good field testers. If you need bandwidth data for an unusual application, ask the fiber or cable manufacturer for assistance, or use one of the simulation programs available from some fiber manufacturers.

One exception is long distance networks which need testing for chromatic dispersion (CD) and polarization mode dispersion (PMD). Here is more on CD and PMD.

What is reflectance and optical return loss?
Reflectance is the light reflected back from a connector or splice. Optical return loss (ORL) is generally used to combine the reflectance from connectors or splices with the backscatter from the fiber, so the term is used primarily for longer cable runs. Reflectance was once called "back reflection," but that term, which is really redundant, has lost favor.

When do I need to test optical return loss?
Reflectance or optical return loss mostly affects very high bitrate digital or analog singlemode systems. None of today's multimode systems is very sensitive to reflectance or ORL, although high amounts can create background noise in short links, adversely affecting BER or data transfer. For laser sources, ORL matters mainly at the first few connectors nearest the laser transmitter. But some short SM cable plants used in premises systems are too short to attenuate the power reflected from connectors, so it undergoes multiple reflections until it builds up background noise that can affect receivers.

How do I test optical return loss?
Use an OTDR on cable plants and an OCWR on patchcords. ORL testing with what people call an ORL tester (or what Telcordia/Bellcore calls an OCWR, or optical continuous wave reflectometer) is only applicable to short patchcords. If you try using one on an installed cable plant, the integrated backscatter from the length of the fiber will overwhelm the connector back reflection. Twenty km of fiber gives about the same backscatter as a connector with 20 dB ORL, and you cannot tell where the backscatter comes from! It's better to use an OTDR to find ORL problems on an installed cable plant.

How accurate are ORL measurements?
The measurement uncertainty of ORL is very high, about +/-1 dB for singlemode and +/-5 dB for multimode, according to round-robin results from standards committees. This is a function of the difficulty of creating a reference for the measurement, the fact that the reflection is very small compared to the test signal (causing noise problems), and the strong dependence of the measurement on the connector of the test apparatus. To minimize uncertainty, keep the connections extremely clean, inspect the connectors regularly with a microscope, and repolish when needed. Quoting measurements to 0.01 dB resolution is ridiculous; remember, this is a +/-1 dB measurement.

Do I need special instruments to test ORL?
No, the special ORL tester is unnecessary. A good laser source and power meter, along with a coupler costing a few hundred dollars, will make a very good tester.
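As a sketch of the underlying calculation (the function name is mine; a real source/meter/coupler tester must also account for the coupler's split ratio when setting its reference):

```python
import math

def orl_db(incident_uw, reflected_uw):
    """Optical return loss: incident power over reflected power, in dB.
    Bigger numbers mean less light reflected back toward the source."""
    return 10 * math.log10(incident_uw / reflected_uw)

# Hypothetical readings: 1000 uW (1 mW) launched, 10 uW reflected back
print(orl_db(1000.0, 10.0))  # ~20 dB ORL
```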

Note: Most instructions for using OCWRs suggest using a mandrel wrap to reduce the reflectance from the connector on the far end of the cable. If the cable has bend-insensitive fiber, as many patchcords do, this method does not work. Instead, dip the end connector in index matching gel or fluid (Vaseline, alcohol or mineral oil works in a pinch).

Cleaning

With fiber optics, our tolerance for dirt is near zero. Airborne particles are about the size of the core of SM fiber and are usually silica based; they may scratch PC connectors if not removed! Test equipment with fiber-bulkhead outputs needs periodic cleaning, since it may see hundreds of insertions of test cables in a short time frame. Here's a summary of what we have learned.

1. Always keep protective "dust caps" on connectors, bulkhead splices, patch panels or anything else that will have a connection made with it. Dust caps themselves may contain dust, so whenever a connector is to be used, clean it.

2. Use any of the commercial cleaning kits to clean connectors and mating adapters. Alternatively, use lint-free pads and isopropyl alcohol. Some solvents MIGHT attack epoxy, so use only pure alcohol. Cotton swabs and cloth leave threads behind. Some optical cleaners leave residues, and residues usually attract dirt and make it stick.

3. All "canned air" has a liquid propellant. Years ago you could buy a can of plain dry nitrogen to blow things out with, but it's long gone. Today's aerosol cleaners use non-CFC propellants and will leave a residue unless you (a) hold the can perfectly level when spraying and (b) spray for 3-5 seconds before use to ensure any liquid propellant is expelled from the nozzle. These cans can be used to blow dust out of bulkheads with a connector in the other side or an active device mount (transmitter/receiver). NEVER use compressed air from a hose (it carries a fine spray of oil from the compressor!) or blow on connectors with your mouth (your breath is full of moisture, not to mention germs!).

4. A better way to clean these bulkheads is to remove both connectors and clean with Alco Pads, then use a swab made of the same material with alcohol on it to clean out the bulkhead.

5. Detectors on FO power meters should also be cleaned with the AlcoPads occasionally to remove dirt. Take the connector adapter off and wipe the surface, then air dry.

6. Ferrules on the connectors/cables used for testing will get dirty by scraping off material from the alignment sleeve in the splice bushing. Some of these sleeves are molded glass-filled thermoplastic sold for multimode applications; they will give you a dirty connector ferrule in 10 insertions! You can see the front edge of the connector ferrule turning black! The alignment sleeve will also build up an internal ledge and create a gap between the mating ferrules: voila, a 1-2 dB attenuator! Use the metal or ceramic alignment sleeve bulkheads only if you are expecting repeated insertions. Cleaning the above requires aggressive scrubbing of the ferrules with an AlcoPad and tossing the bulkhead away.

7. You can buy a cleaning kit for fiber optics. They are good solutions but perhaps not as cost effective as making your own to meet your needs.