Guest Column
| March 7, 2014

Strong Data Is Generated By Strong People

In the event you’ve been living under a rock for the last several months, facility inspection information released by FDA has thrust data integrity into the spotlight. As someone who had responsibility for assuring laboratory data quality in the past, I find the recent spate of issues reported in FDA warning letters and 483s to be extremely disconcerting. Each subsequent revelation becomes more unbelievable.

We can have the best facilities, systems and products in the world, yet we rely on people to drive the things we do. This is why these issues are so problematic. The question I ask myself – and what we should be asking some of the involved parties – is: Why did no one realize that these practices were not appropriate in a regulated environment?

Oftentimes, when these practices are found, the company identifies one or two “rogue analysts” responsible, and the announcement of their termination is the end of the story. But truthfully, we need to look deeper. The presence of the practice is the issue – the degree to which it occurs almost doesn’t matter, because the end result is the same: Our data is questionable. The poor practices are what made it so.

Every facility faces examples of poor practices – just maybe not in ways that are this blatant or intentional. And not just by analysts in the labs. Poor practices can occur in any area where data is generated or reviewed. As examples: Someone who is crunched for time quickly reviews a document while trying to clear their workload, and misses an error. The last signatory on a document signs after only a cursory review because others reviewed it before them. A manufacturing operator completes several process steps without appropriate reviews, because a reviewer wasn’t available and they didn’t think they could stop the process to wait. Or a QA reviewer raises issues only to be regularly overruled, and subsequently stops their questioning and begins to let “little” things through.

How do we find ourselves here, where the so-very-wrong is “normal”? In some cases, it has always been this way because no one knows any better. In others, appropriate controls were in place but gradually disintegrated. In some cases, we may never know how we got here.

But we know we need to recover from it. We must identify what is causing or encouraging the behavior to occur. Is it a lack of education – do people not understand why this behavior is a problem, or why it’s wrong? Or is it a behavior that is being encouraged, through a lack of oversight, or lack of communication of expectations? Or worse, is it known to be happening and people are looking the other way – or even encouraging it?

To change the behavior, we must identify and understand the true causes, then address them through whatever means are appropriate, up to and including termination – as far up the chain as necessary – when warranted. Am I advocating termination? Only in the presence of intentional misbehavior. Companies must have a zero-tolerance policy for compliance data fraud in any form, whether generating it, allowing it, or encouraging it. Amazing things happen when we start to root out the perpetrators, and an example says things that words cannot.

Even better, can we avoid these issues altogether? The good news is that steps to ensure data integrity are completely within our control – assuming we will put the time into executing them. If we want to avoid data integrity issues, we must look to strengthen these five areas:

1. How we prepare operators & analysts: We want to train people quickly to support operations. So we teach them the skills to perform tasks – but we may leave out the “big picture” elements of the job or skills to identify when something is out of the ordinary, because they take longer to learn or are harder to communicate. Without this, people are more likely to make inappropriate decisions - because they don’t know any better - while trying to do something positive, like be more efficient in their work. How many of these elements do you include in operational task training?

- The importance of data integrity and how precious data is to our business
- The purpose of the information being generated
- What it will be used for, and what decisions will be based on it
- What can happen to the above if the information is wrong or incomplete
- The impact (and potential consequences) of erroneous data/documentation on the company and the patients
- The expectations for the task – from the standpoints of the company, their management, the regulators, and patients

2. How we prepare managers: Managing a GMP function has unique challenges, and we typically don’t do enough to prepare our managers for them. Many of our managers were promoted for technical competence, and oftentimes we haven’t developed their management skills. Managers must foster the mindset that production and quality are equal, and they achieve success by obtaining results through their people – which means their focus must be on developing and overseeing their people. Do your managers have the people-management skills they need to succeed – for themselves, for their people, and for the function they are responsible for?

3. Our environment: Our environment must encourage people to raise concerns or problems they’ve noticed, and to ask questions to improve understanding. Often, these are penalized or discouraged – even unintentionally. Again, an example here says more than words can. Highlight the individual who raised a concern, what was done about it, the good that came of it, and the problems that were avoided by raising it. Make it as easy as possible for someone to do the right things.

4. Individual workload: How much people have to do also contributes to data integrity issues. Most of us work under “do more with less” edicts, but overburdened people may, in an attempt to be more efficient, allow inappropriate practices to sneak in. Or they fail to raise a concern because they didn’t have the time to follow through on it. Multitasking reduces focus and can allow mistakes to be made and go unnoticed. Overburdening managers is even worse, because they can’t spend sufficient time developing their people or overseeing their area’s operations. Personnel oversight and development fall by the wayside because they take too much time away from other tasks. Managers must be allowed to manage: They should always know what their people are doing and how they’re doing it – and be visible and available to identify poor practices or potential problems, and deal with them appropriately to avoid larger problems later.

5. Our expectations: We must define and communicate quality expectations for work being produced in all areas, including those that ensure proper practices and data integrity. And this message can’t be silenced the minute a production demand arises. It needs to be clearly communicated by top management and cascaded – consistently – through all company personnel. Companies must be profitable, which is based on production and delivery speed, but if the information supporting production and decision-making is flawed, it will cause larger issues longer-term.

Other data integrity controls, such as archiving and validation, have not been discussed here. Failures with these usually stem from process design and the execution of controls – and they, too, would be handled better by addressing the areas above, minimizing those issues as well.

Most people want to do the right thing, and will, given the chance. If their situation doesn’t support it, or sends the message that it isn’t desired, their performance will reflect what is supported – including cutting corners, inappropriate practices, and in some cases outright fraud. Want to improve data integrity? Strengthen people and guide their performance. Your data will thank you for it.
