Archive for December, 2011

Are we on the wrong track looking at diseases as distinct entities? In the hospital are we wrong to classify disease and co-morbidities? Does separating a patient’s issues into distinct diseases make them more susceptible to hospital error? Do medications get at the cause of disease or just mask symptoms? Should disease management be more personalized?


The investigation into how a child in Boston contracted Hepatitis C through cardiac surgery, during which blood vessel tissue was transplanted, revealed a case of human error: the donor's hepatitis status had been misread back in March.

Human error WILL occur, and resilience in catching and responding to these errors is what keeps patients safe. Resilience is the ability of a system to adjust its functioning in the event of a mishap or under a state of continuous stress (Nemeth, Wears, Woods, Hollnagel & Cook, 2008). Even after the error in reading the donor's record occurred, there was opportunity to prevent the error from reaching the child. Another person who had received a kidney from the same donor tested positive for Hepatitis C, but 11 days passed before a communication occurred with the Office of Blood, Organ, and Other Tissue Safety at the CDC (Conaboy, 2011, Boston Globe). The child's surgery was performed 3 days before the official communication but 8 days after the kidney recipient tested positive. As soon as the first kidney recipient tested positive, the human error should have been discovered and further infections could have been prevented. A human error occurred, but system problems and communication impairments made this a larger catastrophe than it should have been. This illustrates that while the sharp-end workers are prone to human errors, the blunt-end administrators can add resilience by building safer processes and systems.

Compounding this error was the fact that organs and tissues are regulated by separate agencies: tissue banks are overseen by the FDA, and organs by the Health Resources and Services Administration (Conaboy, 2011). The two have no protocols for sharing information. This is eerily similar to the situation before the 9/11 attacks, when the FBI and the CIA had no protocols for sharing information.

This is a lesson for all in terms of the open sharing of data. We must break down silos in healthcare where they occur and increase opportunity for feedback to those in the system as to the functioning of the system whenever possible. Putting the patient at the center of all we do is a first step in identifying how and where these silos exist. Human error will occur but monitoring the system and sharing information will create resiliency that will mitigate harm.

From “Thinking, Fast and Slow,” a fantastic book by Nobel laureate Daniel Kahneman. The concepts in this book have implications for how we structure work processes so that our brains function the way we need them to in order to prevent safety transgressions.


It is ironic that in healthcare, “culture” refers both to the test that helps us identify pathogens so we can save lives and to the environment in which we practice, which can also help us save lives.
In a previous post, I presented the link to AHRQ's latest Hospital SOPS (Surveys on Patient Safety Culture). Of particular concern were the safety culture perceptions of nurses in the northeast USA.
This latest article involves nurses in the North-east of England.
“MORE than one in four nurses in the North-east have been discouraged from raising concerns about patient safety, according to a survey.

Nurses from across the North-east and Cumbria voiced their fears about reporting concerns over the quality of patient care as some of the 3,000 members of the Royal College of Nursing (RCN) nationwide who were surveyed. The research, involving private healthcare and NHS workers, highlighted a worrying number who said they had been told by their managers not to report their concerns. A significant number also said that when they reported their concerns over patient safety and quality of care issues, no action was taken by management.”

How powerful is hospital culture? So powerful that it overrides racial and national culture: nurses in the UK and the USA hold the same perceptions of the prevailing culture of safety.


Article by J. St Amand: Rolling Hills Hospital in Franklin, Tenn., recently refused a lesbian woman the right to visit her partner, reported the Tennessean in a Dec. 21 article. Franklin is located about 20 miles south of downtown Nashville.

The psychiatric hospital went against new federal anti-discrimination laws when Val Burke was not allowed to visit her partner, who was in the facility's residential unit. The U.S. Department of Health and Human Services created the rules, which include equal visitation and representation rights, in September.

“It was human error,” said Richard Bangert, chief executive officer of Rolling Hills. “They made a mistake. When I learned of it, I immediately met with my staff on Monday. We immediately made the change in terms of making sure that our policy was very clear.”

Bangert plans to apologize to Burke.

While it is nice to see the hospital endeavoring to comply with federal regulations, this was not a case of HUMAN ERROR. Does labeling it as human error contribute to our understanding of it? This is another instance of blame and train: “We met with the staff and made sure our policy was clear.” Is the spirit of the policy “DON'T discriminate against federally protected groups”?


“Don’t do this...” has never worked for safety in any industry. For example, in manufacturing, a machine called the punch press took many fingers from workers who did not respond to warning notices, red lights, and the like. Safety was not achieved until the punch press was redesigned so that TWO hands were required on the buttons to start the machine's action. Requiring both hands to run the machine prevented a hand from being inadvertently left inside it (Levinson, 2011). This is an example of CAN'T, not DON'T.

Safety is achieved by forcing functions, i.e., designs that leave no way to do a task other than the right way. Worker vigilance is neither reliable nor sustainable. Another example of CAN'T, not DON'T, is the SpikeRight enteral system from Nestlé, which prevents a tube feeding from being connected to IV access.


About this blog: You've heard of Leapfrog; now there's SafetyDog!

This blog will merge ideas from management, nursing, medicine, and psychology (and many others) to offer a different view of patient safety. The author has a Master's in Industrial-Organizational Psychology, a graduate certificate in Error Science and Patient Safety, and a BSN in Nursing, and has worked as an RN since 1985. All comments are welcome... you never know when one of your thoughts might save a life!

Patient Safety

IOM
Institute of Medicine. Their 1999 report “To Err Is Human” started it all.

Leapfrog Group
The Consumer Reports for hospitals. Encouraging transparency and comparison of quality and safety.

ISMP
Institute for Safe Medication Practices. If you are looking for information on safe medication practices (and unsafe ones), they have great newsletters and other resources.

IHI
The Institute for Healthcare Improvement has an entire section on patient safety.

AHRQ
The Agency for Healthcare Research and Quality. Great site from the Department of Health and Human Services. Contains research articles, safety guidelines, and tools. The link is to the Patient Safety Network (PSNet).

Healthcare Quarterly
Best practices and peer reviewed articles. Editor is a PhD from the University of North Carolina.