"Data is a precious thing and will last longer than the systems themselves.” – Tim Berners-Lee, inventor of the World Wide Web.

Hopefully all of us in the industry would agree that our data is precious and requires the utmost level of integrity. It is safe to say we would also agree that accomplishing this is much easier said than done. So where are you on your data integrity journey? Are you just beginning, somewhere in the middle, or do you perhaps even have the end in sight? Is there really an end? And how do we know when we are there? The concept of data integrity is not new; it has been an integral part of GMPs since their inception. So why do we still have so many questions? With the accelerating evolution of technology, the volume and sources of data have grown exponentially. Are we prepared as an industry to address the challenges that come with it?

For most of us, the data integrity journey starts with a gap assessment and remediation effort. When we start remediation, we realize just how far and wide data integrity reaches and it can be quite overwhelming. As a result, many of us have started, stopped and redirected our path toward a more robust, holistic approach. Ensuring an understanding among our employees of what data integrity actually means is key to building a strong foundation. Still, based on trends in recent inspectional observations, confusion persists. How can we instill this understanding in our industry to ensure the data we generate and use for impactful decisions has integrity? How do we overcome these critical challenges?

As we transition from reactive to proactive, there are three areas we need to discuss: quality culture, risk assessment, and phase-appropriate controls. As we build our foundation, we realize one key element will make or break our data integrity program—our quality culture. Does management understand and speak about data integrity? Do employees feel safe reporting data integrity issues? Is there even an avenue for doing so? These, and many other cultural conditions, are a necessary part of the infrastructure needed to help data integrity programs succeed. A risk-based approach is also necessary to understand the criticality of data in a variety of settings, prioritize any CAPAs, and meet regulatory expectations in as practical a manner as possible. Most companies start by addressing gaps found in the lab, since the data generated there is critical and directly tied to product quality—but then what? Manufacturing? Supply chain and distribution data? Post-market surveillance? How do we "right-size" these concepts for early-stage clinical studies and throughout the product life cycle? What other GxP areas need to be accounted for?

And if all of this is not enough, our ability to generate and capture data from every corner of our quality system, manufacturing process, and clinical program (just to name a few) can make even the most honorable data management efforts feel ill-fated. But knowledge is power. We can do so much more with this data—streamline processing, improve formulations, and even provide safer and more effective products—if only we could confidently get our arms around it all. So how do we leverage this big data boom with the integrity we require?

Fortunately, there is a place to seek some answers and share best practices. Attend the 2019 PDA Data Integrity Workshop this September in Washington, D.C. to hear industry and regulatory speakers. In addition, each session will include a panel Q&A with multiple industry and regulator panelists. Come join us, set a course for your own journey, and help define the future of data integrity!