Wound Preparation

Last week, I worked as a volunteer doctor at San Francisco’s Outside Lands Music Festival. We saw quite a few hand lacerations in what we presumed were “fence jumpers.” For the uninitiated, these are the unticketed patrons who decide to try their luck scaling the barbed fences to get into the show. They presented to our tent with macerated, dirty hand lacerations. Through RockMed, the volunteer org that I work with, we have all the essential equipment at our disposal, and I was able to repair nearly all of these lacerations right there in the medical tent. But the med tent, as well-managed as it is, is nowhere near the clean, commotion-free environment that I’m used to in the ED. It got me thinking quite a bit about wound preparation practices in our field.

Hand laceration seen at the RockMed Tent at Outside Lands.

Traditionally, laceration repair in the ED has involved fastidious sterile preparation & copious cleansing with sterile saline to reduce infection risk. I remember being a med student and being chastised by a hovering attending for “breaking sterile” during simple laceration repairs. But does all this sterility hullabaloo really matter, when we are dealing with something that is by definition traumatic and dirty?

Mounting evidence suggests that many of these dogmatic wound preparation protocols we observe are unnecessary and wasteful.

By way of example, let’s focus this discussion on the practice of using sterile saline for laceration irrigation. The dogma driving the use of normal saline is based on a physiologic argument: plain water, being hypotonic, has the potential to cause cell lysis and thus impair wound healing. However, studies dating back two decades have suggested that tap water is a safe irrigant. There have since been even larger and better studies, including one published in Annals of EM in 2003 comparing tap water vs saline irrigation of wounds in children, and a multicenter trial published in Academic Emergency Medicine in 2007 again showing equivalence of these two irrigation solutions in terms of wound infection rates. These studies were not perfect; they were flawed by a lack of control for irrigation volume and pressure, poorly defined inclusion/exclusion criteria, lack of blinding, and unclear outcome measures. Thus, a Cochrane Library review updated in 2012 still offers only modest support for preferring saline as an irrigation fluid.

I bring all this up only because this year, an excellent study was published in BMJ Open giving the strongest evidence yet on this topic. The study was randomized, controlled, and double blinded; the authors performed a pre-planned sample size calculation to ensure adequate power; and irrigation pressure, technique, & volume were controlled, leaving irrigation fluid as the only variable. Lo and behold, the authors found NO DIFFERENCE in infection rates of wounds irrigated with either tap water or normal saline. Granted, generalizability is a little limited by the fact that this is a first-world study using potable water (it was performed in the ER at Stanford University Hospital, using a single sink monitored for bacterial colonization). There were also multiple exclusion criteria: patients under a year old, those with immunocompromising illness, those already on abx, those with bite wounds, wounds older than 9 hours, and tendon injuries. It’s also worth noting that they irrigated using an 18G angiocath mounted on a 35 ml syringe in order to generate 8 psi (currently thought to be the ideal irrigation pressure), so this is not exactly the same as irrigating directly from a tap. That said, with a tubing system, it would probably be fairly easy to re-create these pressures out of a regular sink. This study, in my opinion, is great evidence that in modern ERs, tap water irrigation is what we should be doing for the majority of our patients.
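For a sense of scale, that 8 psi target is within easy reach of a thumb on a syringe plunger. Here is a rough physics sketch; the barrel diameter is an assumed typical value for a 30-35 ml syringe (not reported in the study), and flow losses through the catheter are ignored:

```python
# Back-of-envelope check: what plunger force generates ~8 psi in a 35 ml syringe?
# The barrel diameter is an ASSUMED typical value, and viscous losses through
# the 18G angiocath are ignored, so treat this as an order-of-magnitude sketch.
import math

PSI_TO_PA = 6894.76            # pascals per psi
barrel_diameter_m = 0.0235     # ~23.5 mm, typical 30-35 ml syringe barrel (assumption)

target_pa = 8 * PSI_TO_PA                            # ~55 kPa
plunger_area_m2 = math.pi * (barrel_diameter_m / 2) ** 2
plunger_force_n = target_pa * plunger_area_m2        # force needed on the plunger

print(f"Plunger force: {plunger_force_n:.0f} N (~{plunger_force_n / 9.81:.1f} kgf)")
```

That works out to roughly 24 N (about 2.4 kg of thumb pressure), which is why the angiocath-on-syringe setup is such a practical way to hit the recommended irrigation pressure.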

If adopted widely, this would have a HUGE cost and environmental impact. Assuming ~8 million laceration repairs in the US annually, we are looking at roughly $5.5 million in estimated savings on saline and irrigation equipment and (for the environmentalist in you) 2.8 million fewer 500 cc plastic bottles or saline bags discarded annually. What’s the big hurdle to this? Ask most doctors what the irrigation protocol is in their ER. The response: “I don’t know, whatever the ED tech does.” We need to take ownership of this! As leaders in our respective EDs, I encourage all docs to speak with your admin to make this official policy and procedure. On a patient-to-patient basis, using tap water here and there won’t make a difference. It’s only if we effect change as a field that it will matter.
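The per-patient numbers behind those estimates are tiny, which is exactly why system-level change matters more than individual practice. A quick sketch, simply back-dividing the figures above (these are not independent cost data):

```python
# Back-dividing the article's aggregate estimates to a per-repair scale.
# All inputs come from the figures cited above; nothing here is new data.
repairs_per_year = 8_000_000
total_savings_usd = 5_500_000      # estimated annual savings on saline & equipment
bottles_saved = 2_800_000          # 500 cc bottles/bags no longer discarded

savings_per_repair = total_savings_usd / repairs_per_year
bottle_fraction = bottles_saved / repairs_per_year

print(f"~${savings_per_repair:.2f} saved per repair")
print(f"~{bottle_fraction:.0%} of repairs spare a bottle or bag")
```

About 69 cents per repair: invisible to any one clinician, but millions of dollars and bottles across the specialty.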

To round out this discussion: this study joins a growing body of literature questioning our overly cautious wound preparation practices. Back in 1989, a modest (in size) study was published by a Canadian family practitioner who described his “surgically clean” rather than sterile technique for laceration repair in the office (hand washing only; no sterile gloves, gown, or hat) and found no difference in infection rates compared with sterile technique. An RCT published in Annals of EM in 2004 showed no difference in infection rates when sterile vs non-sterile gloves were used for uncomplicated laceration repairs in immunocompetent patients.

Bottom line: we need to re-examine our wound care practices. I’ve certainly been re-examining mine. After having this patient irrigate his hand wound with the water dispenser (operated by a foot pump, as you see here),

Irrigation with tap water before a laceration repair.

I proceeded to sew him up in the tent, using what I would consider a clean (but certainly not sterile) technique, as shown here:

Clean, not sterile.

I called this patient back today, 10 days later, to see how he was doing. He had just had the sutures removed. He was feeling fine, with no signs or symptoms of infection. He was pleased with the wound appearance and thankful for the care. I definitely learned some lessons here that I will be taking back to the ED.