A mass spectrometer at the Environmental Molecular Sciences Laboratory.

The Science

PNNL researchers developed an analytics-based tool that intelligently combines disparate data quality control metrics into a single assessment, identifying anomalous instrument analyses faster and more accurately.

The Impact

The new tool helps assure the quality of collected data and has the potential to substantially reduce the costs of analyzing large cohort studies. For example, it immediately identifies when an instrument is no longer performing as expected, when sample preparation has changed significantly, or when a sample otherwise behaves anomalously and should be reanalyzed.

Summary

Proteomics studies of large sample cohorts can easily take months—even years—to complete. Normal variations in instrument performance over time and artifacts within the samples derived from collection, storage, and processing make it challenging to acquire consistent, high-quality data in large-scale studies.

Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream data analysis. However, these tools are not designed to evaluate the data in near real-time, which would allow immediate interventions in the experiment and timely assessment of instrument performance.

To address this gap, the research team developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART achieves accuracy similar to that of standard post-hoc analysis methods, with the added benefit of real-time analysis. The team demonstrated the tool's utility in a large clinical proteomics study aimed at discovering new biomarkers for type 1 diabetes.
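To give a rough sense of how real-time flagging of this kind can work, the sketch below checks each newly acquired run's quality control metrics against a baseline of earlier runs and raises a flag when any metric deviates strongly. This is a deliberately simplified illustration using per-metric robust z-scores, not the actual QC-ART algorithm, and the metric names, values, and threshold are hypothetical.

```python
import statistics

def flag_run(baseline, new_run, threshold=3.0):
    """Flag a newly acquired run whose QC metrics deviate strongly
    from the baseline runs, using a robust (median/MAD) z-score.

    baseline: list of dicts mapping metric name -> value (earlier runs)
    new_run:  dict mapping metric name -> value (the run just acquired)
    Returns (flagged, scores) where scores holds each metric's z-score.
    """
    scores = {}
    for metric, value in new_run.items():
        history = [run[metric] for run in baseline]
        med = statistics.median(history)
        # Median absolute deviation; guard against a zero MAD.
        mad = statistics.median(abs(x - med) for x in history) or 1e-9
        # 1.4826 scales MAD to be comparable to a standard deviation.
        scores[metric] = abs(value - med) / (1.4826 * mad)
    return max(scores.values()) > threshold, scores

# Hypothetical QC metrics from five earlier instrument runs.
baseline = [
    {"peptide_ids": 20000, "mass_error_ppm": 1.00},
    {"peptide_ids": 20100, "mass_error_ppm": 1.10},
    {"peptide_ids": 19900, "mass_error_ppm": 0.90},
    {"peptide_ids": 20050, "mass_error_ppm": 1.00},
    {"peptide_ids": 19950, "mass_error_ppm": 1.05},
]

# A run with a sharp drop in identifications is flagged for review.
flagged, scores = flag_run(baseline, {"peptide_ids": 12000,
                                      "mass_error_ppm": 1.00})
```

In practice, a multivariate dissimilarity score over all metrics jointly (as QC-ART computes) catches anomalies that no single metric reveals on its own; the per-metric check above is only the simplest version of the idea.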