[SPONSORED CONTENT] The Energy Conservatory shares the basics of gauge calibration and specifications, and how these factors influenced its DG-1000 design

The Energy Conservatory (TEC) continues to set the standard for specialized air flow and pressure measuring devices used to monitor and analyze the complex interactions that determine building performance.

A pressure and flow reading can only be as accurate as the gauge that gathers it.

Through steady use over time, all pressure and flow gauges – even the most modern digital models – lose accuracy. Relying on an inaccurate gauge puts professionals at risk of flawed readings, quality assurance problems and code compliance failures.

By understanding how digital gauges work and when to have a gauge calibrated, home energy professionals can avoid these risks.

The inner workings of today’s digital pressure gauges

Digital pressure and flow gauges are increasingly popular for air tightness and duct testing because of their immediate feedback and reporting capabilities.

While capabilities vary by manufacturer, these manometers share a core set of features – multiple channels, improved accuracy and built-in flow calculations. They are essential for standard pressure measurements and well suited to converting fan pressure into air flow in cubic feet per minute.

Digital gauges rely on electronic manometers to gather data, and silicon strain gauge pressure sensors (also known as piezoresistive sensors) are the most commonly used to measure pressure. A value is measured by converting the electronic signal from the sensor into a pressure reading as follows:

The pressure exerts a small force on the sensor, which creates an electrical signal.

The electrical signal is converted into a pressure value using several calculations that include:

The calibration parameters from the most recent factory calibration.

The most recent automatic zero correction.

The pressure value is then shown on the manometer display or can be output digitally to other software.
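The steps above can be sketched in code. The linear gain/offset model and all of the coefficient values below are illustrative assumptions for this article, not TEC's actual firmware or calibration math.

```python
# Hypothetical sketch of converting a raw sensor signal into a pressure
# reading. The linear gain/offset model and the example values are
# illustrative assumptions only.

def signal_to_pressure(raw_volts, cal_gain, cal_offset, zero_correction):
    """Convert a sensor voltage to a pressure value in Pascals.

    cal_gain / cal_offset stand in for the most recent factory calibration
    parameters; zero_correction is the most recent automatic zero reading.
    """
    pressure = cal_gain * raw_volts + cal_offset   # apply factory calibration
    pressure -= zero_correction                    # subtract the auto-zero drift
    return round(pressure, 1)                      # display at 0.1 Pa precision

# Example: a 0.251 V signal with an assumed gain of 200 Pa/V,
# offset of -0.3 Pa, and a 0.2 Pa auto-zero correction
print(signal_to_pressure(0.251, 200.0, -0.3, 0.2))  # 49.7
```

The same corrected value could be shown on the display or output digitally to other software.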

This process is nearly instantaneous thanks to today's advanced technology. Digital gauges are also simpler to read than analog versions, which require estimating a dial's position: digital readings show exact values, usually with appropriate decimal precision.

Research and experience have shown that sensors drift from factory calibration over time. In the case of silicon strain gauge pressure sensors, this drift varies from sensor to sensor within statistical limits. Regular recalibration, according to the manufacturer’s specifications, ensures the gauge’s accuracy.

Defining calibration and its role in accurate readings

Calibration is a process that corrects a gauge’s measurement by comparing the device’s measurement values to a known pressure reference.

Each digital gauge manufacturer calculates their gauges’ accuracy differently and recommends unique calibration intervals based on how long this accuracy will hold. Consulting your gauge’s specification sheet should reveal this interval.

Reputable companies arrive at their calibration interval through a series of rigorous factory tests and calculations: a statistical sample of gauges is analyzed against a known pressure reference, with drift over time factored into the interval. When a gauge arrives for recalibration, its measurement value is corrected to more closely match the factory reference.

Manufacturers know that their gauges will require recalibration at some point, but each gauge is slightly different. This makes field calibration tests by home energy professionals critical. To test a gauge, compare its readings against a newly calibrated gauge. If you notice a gauge’s accuracy is drifting before the scheduled calibration interval, send it in to the manufacturer.
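A field check like the one described above can be sketched as a simple comparison. The ±1% of reading / ±0.15 Pa accuracy spec used here is an illustrative assumption; substitute the values from your own gauge's specification sheet.

```python
# Hypothetical field check: compare a working gauge against a freshly
# calibrated reference gauge at several pressures, and flag readings that
# fall outside an assumed spec of +/-1% of reading or +/-0.15 Pa,
# whichever is greater (spec values are illustrative).

def out_of_spec(gauge_pa, reference_pa, pct_of_reading=0.01, floor_pa=0.15):
    """Return True if the gauge disagrees with the reference by more
    than the allowed error for that reading."""
    allowed = max(abs(reference_pa) * pct_of_reading, floor_pa)
    return abs(gauge_pa - reference_pa) > allowed

readings = [(50.3, 50.0), (25.4, 25.0), (5.1, 5.0)]  # (gauge, reference) in Pa
for gauge, ref in readings:
    status = "send in for recalibration" if out_of_spec(gauge, ref) else "within spec"
    print(f"ref {ref} Pa, gauge {gauge} Pa: {status}")
```

Checking at several points across the gauge's range, rather than one, makes it more likely that drift at either end of the scale is caught.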

The impact of range, resolution and accuracy on gauge readings

Any gauge you buy from or recalibrate through a manufacturer should have a calibration certificate, which shows the gauge’s actual readings compared to the factory reference pressure. The instrument should always be within the accuracy limits after a factory calibration. Calibration certificates may include pressure, air flow, air velocity or electrical measurements, and when the device should be returned for recalibration.

Three items on the spec sheet are important to understand in choosing a digital gauge. These are range, resolution and accuracy.

Range specifies the maximum and minimum readings that can be measured by an instrument. For example, a thermometer intended for measuring outdoor air temperature might have a range of -40 to 120 degrees F, a range of 160 degrees F. A thermometer intended for measuring oven temperatures might have a range of 150 to 550 degrees F, a range of 400 degrees F. This idea applies to ranges of other types of instruments as well, such as pressure and air flow. Often, the closer a specific range matches your uses, the more accurate the results will be, but this is not always the case.

While often associated with analog gauges, resolution is also a factor for digital gauges. Resolution explains the smallest interval that can be measured between two readings. Often larger temperature ranges offer lower resolution, while tighter ranges offer higher resolutions, but this is not always true. An expensive laboratory instrument will often have both a larger range and a higher resolution than an inexpensive instrument intended for field use. If you need to understand air temperature measurements that may differ by only 0.1°F, your gauge’s resolution should match that need or be better.
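The effect of resolution can be illustrated with a small quantization sketch; the temperatures and resolutions below are hypothetical examples, not any particular gauge's spec.

```python
# Resolution determines the smallest change a display can show. As an
# illustration (values are hypothetical), a gauge with 0.1 degF display
# resolution can distinguish two temperatures that a 1 degF gauge cannot.

def display(value, resolution):
    """Quantize a measurement to the gauge's display resolution."""
    steps = round(value / resolution)
    return round(steps * resolution, 2)

print(display(72.34, 0.1))   # 72.3 -> the 0.1 degF gauge resolves the difference
print(display(72.38, 0.1))   # 72.4
print(display(72.34, 1.0))   # 72.0 -> the 1 degF gauge shows both as 72
print(display(72.38, 1.0))   # 72.0
```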

Accuracy is a measure of how close an instrument’s reading is to the true value. It is typically stated as a percent of full scale or a percent of reading, which yield fundamentally different results. The table below compares accuracy of ±1% of full scale against ±1% of reading for a pressure measurement device with a range of 0-10 in. wc.

| Reading | ±1% of Full Scale: Expected Error | Error as % of Reading | Expected True Value | ±1% of Reading: Expected Error | Error as % of Reading | Expected True Value |
| --- | --- | --- | --- | --- | --- | --- |
| 0.5 in. wc | 0.1 in. wc | 20% | 0.4 - 0.6 in. wc | 0.005 in. wc | 1.0% | 0.495 - 0.505 in. wc |
| 6.5 in. wc | 0.1 in. wc | 1.5% | 6.4 - 6.6 in. wc | 0.065 in. wc | 1.0% | 6.435 - 6.565 in. wc |

As the table shows, percent of full scale accuracy is far less precise at low readings than at high readings, while percent of reading maintains the same relative error across the range. Some digital gauges blend these styles into a compound accuracy spec that applies either percent of full scale or percent of reading, depending on the reading.
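The error bands in the table can be computed directly from the two accuracy definitions, assuming the 0-10 in. wc range used above:

```python
# Reproduce the table's error bands for a 0-10 in. wc gauge, with accuracy
# stated either as +/-1% of full scale or +/-1% of reading.

FULL_SCALE = 10.0  # in. wc

def error_full_scale(reading, pct=0.01):
    """Expected error when accuracy is a percent of full scale
    (constant regardless of the reading)."""
    return FULL_SCALE * pct

def error_of_reading(reading, pct=0.01):
    """Expected error when accuracy is a percent of the reading itself."""
    return reading * pct

for reading in (0.5, 6.5):
    fs = error_full_scale(reading)
    rd = error_of_reading(reading)
    print(f"{reading} in. wc: +/-{fs:.3f} ({fs / reading:.1%} of reading) "
          f"vs +/-{rd:.3f} ({rd / reading:.1%} of reading)")
```

At a 0.5 in. wc reading, the fixed ±0.1 in. wc full-scale error is 20% of the reading, while percent of reading stays at 1% everywhere.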

TEC’s DG-1000 sets a new standard for accuracy measurements

TEC engineered its DG-1000 digital gauge around range, resolution and accuracy, to help home energy professionals gather more accurate measurements. To do so, the company built on and enhanced its existing gauge technology.

The gauge features a pressure range of -2,500 to +2,500 Pa, or -10 to +10 in. H2O, and an operating temperature range of 42 to 105 degrees F. The display resolution measures to 0.1 Pa for readings 0-999.9 Pa, and 1 Pa for readings at 1,000 Pa and higher.

To determine the gauge’s accuracy, TEC based the DG-1000 accuracy specification on an uncertainty analysis that was done in accordance with JCGM 100:2008, Evaluation of measurement data – Guide to the expression of uncertainty in measurement. This standard is also published as ISO/IEC Guide 98-3.

Through a series of tests and calculations, data showed the DG-1000 offers accuracy specifications of ±0.9% of pressure reading or ±0.12 Pa, whichever is greater. This tighter range helps professionals gather and report more precise data, as seen in the table below.
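The published compound spec – ±0.9% of reading or ±0.12 Pa, whichever is greater – can be sketched as a small helper; the sample readings are arbitrary illustrations.

```python
# Expected error band for the DG-1000's published compound accuracy spec:
# +/-0.9% of the pressure reading or +/-0.12 Pa, whichever is greater.

def dg1000_expected_error(reading_pa):
    """Return the expected error band (Pa) for a given reading."""
    return max(abs(reading_pa) * 0.009, 0.12)

for reading in (5.0, 50.0, 500.0):
    print(f"{reading} Pa reading: +/-{dg1000_expected_error(reading):.3f} Pa")
```

Below about 13 Pa the fixed ±0.12 Pa floor governs; above that, the ±0.9% of reading term takes over.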

For the first time, TEC has also published an additional, detailed specification in a white paper that answers the question: How accurate is a brand-new DG-1000? The paper refers to this as “Laboratory Conditions,” though most office environments meet them. All of TEC’s previous pressure gauges were specified for “Typical Use Conditions,” which include the wider range of operating temperatures encountered in field testing and assume calibration every two years as recommended. The following table illustrates the performance difference between the DG-1000 and the previous pressure gauge.

To ensure the best operation of the gauge, TEC recommends recalibration every two years. A calibration certificate is always included with new and recalibrated gauges.

By understanding digital gauge specifications, home energy professionals can identify when a gauge may be losing accuracy and needs calibration to accurately gather flow and pressure readings. To learn how TEC tested its DG-1000 and calculated accuracy specifications, click here to download the white paper, "An Explanation of the DG-1000 Accuracy Specifications.”

