Abstract

Although controlled donation after circulatory determination of death (cDCDD) could increase the supply of donor lungs within the United States, the yield of lungs from cDCDD donors remains low compared with donation after neurologic determination of death (DNDD). To explore the reasons for the low lung yield from cDCDD donors, Scientific Registry of Transplant Recipients data were used to assess the impact of donor lung quality on cDCDD lung utilization by fitting a logistic regression model. The relationship between center volume and cDCDD use was assessed, and the distance between center and donor hospital was calculated by cDCDD status. Recipient survival was compared using a multivariable Cox regression model. Lung utilization was 2.1% for cDCDD donors and 21.4% for DNDD donors. cDCDD status was associated with markedly lower odds of lung donation (adjusted odds ratio 0.101, 95% confidence interval [CI] 0.085-0.120). A minority of centers have performed cDCDD transplants, with higher-volume centers generally performing more of them. There was no difference in center-to-donor distance or recipient survival (adjusted hazard ratio 1.03, 95% CI 0.78-1.37) between cDCDD and DNDD transplants. cDCDD lungs are underutilized compared with DNDD lungs even after adjusting for lung quality. Greater transplant center expertise in, and commitment to, cDCDD lung procurement is needed to improve utilization.
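The adjusted odds ratio above comes from a multivariable logistic regression fit to registry data. As a rough illustration of the underlying arithmetic only, the sketch below computes an unadjusted odds ratio with a Woolf 95% CI from a hypothetical 2x2 table whose utilization rates mirror the reported 2.1% and 21.4%; the counts and the function name are invented for illustration, not drawn from the SRTR, and an unadjusted OR will not match the adjusted 0.101.

```python
from math import exp, log, sqrt

def odds_ratio_2x2(a, b, c, d):
    """Unadjusted odds ratio with Woolf 95% CI for a 2x2 table:
    a = exposed events, b = exposed non-events,
    c = unexposed events, d = unexposed non-events."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR), Woolf method
    lo = exp(log(or_) - 1.96 * se)
    hi = exp(log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: 21 of 1000 cDCDD donors yield lungs (~2.1%)
# versus 214 of 1000 DNDD donors (~21.4%), echoing the reported rates.
or_, ci = odds_ratio_2x2(21, 979, 214, 786)
print(f"OR = {or_:.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}")
```

The real analysis adjusts for donor lung quality, which is why the published estimate differs from any raw 2x2 calculation like this one.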

Abstract

Rationale: In 2005, the lung allocation score (LAS) was implemented to prioritize organ allocation so as to minimize waiting-list mortality and maximize one-year survival. It resulted in transplantation of older and sicker patients without changing one-year survival. Its effect on resource utilization is unknown. Objective: To determine changes in resource utilization over time in lung transplant admissions. Methods: Solid organ transplant recipients were identified within the Nationwide Inpatient Sample (NIS) data from 2000 to 2011. Joinpoint regression was performed to identify a time point of change in mean total hospital charges among lung transplant and other solid organ transplant recipients. The two temporal lung transplant recipient cohorts identified by joinpoint regression were compared for baseline characteristics and resource utilization, including total charges for the index hospitalization, charges per day, length of stay, discharge disposition, tracheostomy, and need for extracorporeal membrane oxygenation (ECMO). Measurements and Main Results: A significant point of increased total hospital charges occurred for lung transplant recipients in 2005, corresponding to LAS implementation, that was not seen in other solid organ transplant recipients. Total transplant hospital charges increased by 40% in the post-LAS cohort [$569,942 ($53,229) vs. $407,489 ($28,360)], along with an increased median length of stay, higher daily charges, and more frequent discharge to a destination other than home. Post-LAS recipients also had higher post-transplant utilization of ECMO (OR 2.35, 95% CI 1.56-3.55) and a higher incidence of tracheostomy (OR 1.52, 95% CI 1.22-1.89). Conclusions: LAS implementation is associated with a significant increase in resource utilization during the index hospitalization for lung transplant.
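Joinpoint regression locates the year at which a trend's slope changes. The sketch below is a minimal, hypothetical version of the idea only: grid-search the single break year that minimizes the combined squared error of two straight-line fits (real joinpoint software, such as the NCI's, additionally permutation-tests whether the break is statistically significant). The charge series is synthetic, bent upward at 2005 to echo the abstract's finding; no actual NIS data are used.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b, sse)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def best_joinpoint(years, values):
    """Return the first year of the second segment that minimizes the
    total SSE of two independent line fits (>=2 points per segment)."""
    best_year, best_sse = None, None
    for k in range(2, len(years) - 1):
        _, _, sse1 = fit_line(years[:k], values[:k])
        _, _, sse2 = fit_line(years[k:], values[k:])
        if best_sse is None or sse1 + sse2 < best_sse:
            best_year, best_sse = years[k], sse1 + sse2
    return best_year

# Synthetic mean charges (in $1000s): gentle drift 2000-2004,
# then a steep rise beginning in 2005.
years = list(range(2000, 2012))
charges = [400 + 2 * (y - 2000) if y < 2005 else 420 + 40 * (y - 2005)
           for y in years]
print(best_joinpoint(years, charges))  # 2005 with this synthetic series
```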

Abstract

Chronic rejection, manifested pathologically as airway fibrosis, is the major problem limiting long-term survival in lung transplant recipients. Airway hypoxia and ischemia, resulting from a failure to restore the bronchial artery (BA) circulation at the time of transplantation, may predispose patients to chronic rejection. To address this possibility, clinical information is needed describing the status of lung perfusion and airway oxygenation after transplantation. Relative pulmonary arterial blood flow, airway tissue oxygenation, and BA anatomy in the transplanted lung were compared with the contralateral native lung in lung allograft recipients. Routine perfusion scans were evaluated at 3 and 12 months after transplantation in 15 single-lung transplant recipients. Next, airway tissue oximetry was performed in 12 patients during surveillance bronchoscopies in the first year after transplant and in 4 control subjects. Finally, computed tomography (CT) angiography studies on 11 recipients were reconstructed to evaluate the post-transplant anatomy of the BAs. By 3 months after transplantation, deoxygenated pulmonary arterial blood is shunted away from the native lung to the transplanted lung. In the first year, healthy lung transplant recipients exhibit significant airway hypoxia distal to the graft anastomosis. CT angiography studies demonstrate that BAs in transplant airways are abbreviated, generally stopping at or before the anastomosis. Despite pulmonary artery blood being shunted to transplanted lungs after transplantation, grafts are hypoxic compared with both native (diseased) and control airways. Airway hypoxia may be due to the lack of radiologically demonstrable BAs after lung transplantation.

Abstract

Implemented in 2005, the lung allocation score (LAS) aims to distribute donor organs based on overall survival benefit for all potential recipients, rather than on waiting-list time accrued. While prior work has shown that patients with scores greater than 46 are at increased risk of death, it is not known whether that risk is equivalent among such patients when stratified by LAS and diagnosis. We retrospectively evaluated 5331 adult lung transplant recipients from May 2005 to February 2009 to determine the association of LAS (groups based on scores of ≤46, 47-59, 60-79, and ≥80) with posttransplant survival. When compared with patients with LAS ≤46, only those with LAS ≥60 had an increased risk of death (LAS 60-79: hazard ratio [HR], 1.52; 95% confidence interval [CI], 1.21-1.90; LAS ≥80: HR, 2.03; CI, 1.61-2.55; p < 0.001), despite shorter median waiting-list times. This risk persisted after adjusting for age, diagnosis, transplant center volume, and donor characteristics. By specific diagnosis, an increased hazard was observed in patients with COPD with LAS ≥80, as well as in those with IPF with LAS ≥60.
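The risk groups in this analysis are plain cut-points on the continuous LAS; a trivial sketch of that stratification follows (the boundaries are taken from the abstract, but the handling of fractional scores between bands, and the function name, are our assumptions):

```python
def las_stratum(las: float) -> str:
    """Map a lung allocation score to the strata used in the abstract:
    <=46, 47-59, 60-79, >=80. Fractional scores between bands
    (e.g. 46.5) are folded into the 47-59 band here -- an assumption,
    since the abstract lists only integer boundaries."""
    if las <= 46:
        return "<=46"
    if las < 60:
        return "47-59"
    if las < 80:
        return "60-79"
    return ">=80"

for score in (35.2, 52.0, 64.7, 88.1):
    print(score, las_stratum(score))
```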

Abstract

Heart-lung transplantation is a viable treatment option for patients with many end-stage heart and lung pathologies. However, given the complex nature of the procedure, it is imperative that patients are selected appropriately and that the clinician is aware of the many unique aspects of managing this population. This review describes updated organ selection policies, peri- and postoperative management strategies, monitoring of graft function, and clinical outcomes for patients following combined heart-lung transplantation in the current era.

Abstract

This study determined whether novel right heart echocardiography metrics help to detect pulmonary hypertension (PH) in patients with advanced lung disease (ALD). We reviewed echocardiography and catheterization data of 192 patients from the Stanford ALD registry and echocardiograms of 50 healthy controls. Accuracy of echocardiographic right heart metrics for detecting PH was assessed using logistic regression and area under the ROC curve (AUC) analysis. Patients were divided into a derivation (n = 92) and a validation cohort (n = 100). Experimental validation was assessed in a piglet model of mild PH followed longitudinally. Tricuspid regurgitation (TR) was not interpretable in 52% of patients. In the derivation cohort, right atrial maximal volume index (RAVI), right ventricular end-systolic area index (RVESAI), free-wall longitudinal strain, and tricuspid annular plane systolic excursion (TAPSE) differentiated patients with and without PH; 20% of patients without PH had moderate to severe RV enlargement by RVESAI. On multivariate analysis, RAVI and TAPSE were independently associated with PH (AUC = 0.77, p < 0.001), which was confirmed in the validation cohort (AUC = 0.78, p < 0.001). The presence of right heart metric abnormalities did not improve detection of PH in patients with interpretable TR (p > 0.05) and provided moderate detection value in patients without TR. Only two patients with more severe PH (mean pulmonary pressure 35 and 36 mmHg) were missed. The animal model confirmed that right heart enlargement best discriminated pigs with PH from shams. This study highlights the frequency of right heart enlargement and dysfunction in ALD irrespective of the presence of PH, thereby limiting their use for detection of PH.

Abstract

Non-tuberculous mycobacteria (NTM) are important pathogens in lung transplant recipients. This study describes the spectrum of NTM respiratory tract infections and examines the association of NTM infections with lung transplant complications. Data from 208 recipients transplanted from November 1990 to November 2005 were analyzed. Follow-up data were available to November 2010. Lung infection was defined by bronchoalveolar lavage, sputum, or blood cultures in the appropriate clinical setting. All identified NTM respiratory tract infections were tabulated. The cohort of patients with NTM lung infections (NTM+) was compared to the cohort without infection (NTM-). Univariate and multivariate analyses were performed to determine characteristics associated with NTM infection. Survival analyses for overall survival and development of bronchiolitis obliterans syndrome (BOS) were also performed. In total, 52 isolates of NTM lung infection were identified in 30 patients. The isolates included Mycobacterium abscessus (46%), Mycobacterium avium complex (MAC) (36%), Mycobacterium gordonae (9%), Mycobacterium chelonae (7%), and Mycobacterium fortuitum (2%), with multiple NTM isolates seen on 3 different occasions. The overall incidence was 14%, whereas cumulative incidences at 1, 3, and 5 years after lung transplantation were 11%, 15%, and 20%, respectively. Comparisons between the NTM+ and NTM- cohorts revealed that NTM+ patients were more likely to be African-American and to have cytomegalovirus mismatch. Although no difference was seen in survival, the NTM+ cohort was more likely to develop BOS (80% vs. 58%, P = 0.02). NTM+ infection, however, was not independently associated with development of BOS by multivariate analysis. With nearly 20 years of follow-up, 14% of lung recipients developed NTM respiratory tract infections, with M. abscessus and MAC most commonly identified. M. gordonae was responsible for nearly 10% of NTM infections. Although survival of patients with NTM infections is similar, a striking difference in BOS rates is present between the NTM+ and NTM- groups.

Abstract

Patients with idiopathic pulmonary arterial hypertension (IPAH) have improved survival after heart-lung transplantation (HLT) and double-lung transplantation (DLT). However, the optimal procedure for patients with IPAH undergoing transplantation remains unclear. We hypothesized that critically ill IPAH patients, defined by admission to the intensive care unit (ICU), would demonstrate improved survival with HLT versus DLT. All adult IPAH patients (>18 years) in the Scientific Registry of Transplant Recipients (SRTR) database who underwent either HLT or DLT between 1987 and 2012 were included. Baseline characteristics, survival, and adjusted survival were compared between the HLT and DLT groups. Similar analyses were performed for subgroups defined by the recipients' hospitalization status. In total, 928 IPAH patients (667 DLT, 261 HLT) were included in this analysis. The HLT recipients were younger, more likely to be admitted to the ICU, and more likely to have had their transplant in earlier eras. Overall, adjusted survival after HLT and DLT was similar. Among recipients who were hospitalized in the ICU, however, DLT was associated with worse outcomes (HR 1.827; 95% CI 1.018-3.279). In IPAH patients, overall survival after HLT or DLT is comparable. HLT may provide improved outcomes in critically ill IPAH patients admitted to the ICU at the time of transplantation.

Abstract

Recipients of lung transplantation (LT) and heart-lung transplantation (HLT) are at increased risk of infection, including invasive mold infections (IMIs). The clinical presentation, radiographic correlates, and outcomes of Aspergillus and non-Aspergillus IMIs in this population have not been well documented. LT and HLT recipients diagnosed with IMIs between 1990 and 2012 were identified using the Stanford Translational Research Integrated Database Environment and the Stanford LT and HLT clinical database. Recipient clinical and radiographic characteristics were obtained via retrospective review of medical records and compared between Aspergillus and non-Aspergillus mold recipients. Risk factors for mortality were identified using multivariate logistic regression analysis. During the study period, 87 (14%) transplant recipients were diagnosed with IMIs. Aspergillus species were isolated in 63 (72%) and non-Aspergillus molds in 24 (28%) recipients. No significant difference was seen in presenting symptoms or radiographic findings between Aspergillus and non-Aspergillus mold recipients. Median time to diagnosis was 363 days in the Aspergillus group and 419 days in the non-Aspergillus group, with dissemination occurring only within the non-Aspergillus group (12.5%). Overall 90-day and 1-year mortality following IMI was 24% and 44%, respectively. One-year mortality was increased in the non-Aspergillus group (60.5% vs. 39.5%, P = 0.03). There is significant overlap in risk factors, presentation, and radiographic patterns of IMI in LT and HLT recipients. Non-Aspergillus molds were more likely to present late, with disseminated disease, and portended increased 1-year mortality.

Addressing the Controversy of Estimating Pulmonary Arterial Pressure by Echocardiography. Journal of the American Society of Echocardiography, 2015.

Abstract

There is currently controversy over whether echocardiography provides reliable estimates of pulmonary pressures. The objective of this study was to determine the factors influencing the accuracy and reliability of estimating right ventricular systolic pressure (RVSP) using echocardiography in patients with advanced lung disease or pulmonary arterial hypertension. Between January 2001 and December 2012, 667 patients with advanced lung disease or pulmonary arterial hypertension underwent right heart catheterization and transthoracic echocardiography. Of those, 307 had both studies within 5 days of each other. The correlation and bias in estimating RVSP according to tricuspid regurgitation (TR) signal quality and reader expertise were retrospectively determined. Reasons for under- and overestimation were analyzed. The diagnostic performance of estimated RVSP, relative right ventricular size, eccentricity index, and tricuspid annular plane systolic excursion was compared for classifying patients with pulmonary hypertension (mean pulmonary artery pressure ≥ 25 mm Hg). Invasive mean and systolic pulmonary artery pressures were strongly correlated (R² = 0.95, P < .05). When TR signals were uninterpretable, eccentricity index and right ventricular size were independently associated with pulmonary hypertension (area under the curve, 0.77). Echocardiography reliably estimates RVSP when attention is given to simple quality metrics.
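Echocardiographic RVSP estimation rests on applying the simplified Bernoulli equation to the peak TR jet velocity, RVSP ≈ 4v² + RAP. A one-function sketch of that arithmetic follows; the function name is ours, and right atrial pressure, which in practice is inferred from IVC size and collapsibility, is simply passed in as a number here.

```python
def estimate_rvsp(tr_velocity_m_per_s: float, ra_pressure_mm_hg: float) -> float:
    """Simplified Bernoulli estimate of right ventricular systolic
    pressure (mm Hg): 4 * v**2 + RAP, where v is the peak tricuspid
    regurgitation jet velocity in m/s."""
    return 4 * tr_velocity_m_per_s ** 2 + ra_pressure_mm_hg

# e.g. a 3.0 m/s TR jet with an assumed RAP of 10 mm Hg
print(estimate_rvsp(3.0, 10))  # 46.0 mm Hg
```

The study's point is precisely that this estimate is only as good as the TR signal it is measured from, hence the attention to signal quality and reader expertise.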

Abstract

We describe four solid-organ transplant recipients with donor-derived West Nile virus (WNV) infection (encephalitis in 3, asymptomatic in 1) from a common donor residing in a region of increased WNV activity. All four transplant recipients had molecular evidence of WNV infection in their serum and/or cerebrospinal fluid (CSF) by reverse transcription polymerase chain reaction (RT-PCR) testing. Serum from the organ donor was positive for WNV IgM but negative for WNV RNA, whereas his lymph node and spleen tissues tested positive for WNV by RT-PCR. Combination therapy included intravenous immunoglobulin (4 cases), interferon (3 cases), fresh frozen plasma with WNV IgG (2 cases), and ribavirin (1 case). Two of the four transplant recipients survived. Review of the 20 published cases of organ-derived WNV infection found that this infection is associated with a high incidence of neuroinvasive disease (70%) and severe morbidity and mortality (30%). Median time to onset of symptomatic WNV infection was 13 days after transplantation (range 5-37 days). Initial unexplained fever unresponsive to antibiotic therapy followed by rapid onset of neurologic deficits was the most common clinical presentation. Confirmation of infection was made by testing serum and CSF for both WNV RNA by RT-PCR and WNV IgM by serological assays. Treatment usually included supportive care, reduction of immunosuppression, and, frequently, intravenous immunoglobulin. The often negative results for WNV by current RT-PCR and serological assays and the absence of clinical signs of acute infection in donors contribute to the sporadic occurrence of donor-derived WNV infection. Potential organ donors should be assessed for unexplained fever and neurological symptoms, particularly if they reside in areas of increased WNV activity.

Abstract

We identified West Nile virus (WNV) RNA in skin, fat, muscle, tendon, and bone marrow from a deceased donor associated with WNV transmission through solid organ transplantation. WNV could not be cultured from the RNA-positive tissues. Further studies are needed to determine if WNV can be transmitted from postmortem tissues.

Abstract

Pulmonary hypertension (PH) is a serious condition that affects mainly young and middle-aged women, and its etiology is poorly understood. A prominent pathological feature of PH is accumulation of macrophages near the arterioles of the lung. In both clinical tissue and the SU5416 (SU)/athymic rat model of severe PH, we found that the accumulated macrophages expressed high levels of leukotriene A4 hydrolase (LTA4H), the biosynthetic enzyme for leukotriene B4 (LTB4). Moreover, macrophage-derived LTB4 directly induced apoptosis in pulmonary artery endothelial cells (PAECs). Further, LTB4 induced proliferation and hypertrophy of human pulmonary artery smooth muscle cells. We found that LTB4 acted through its receptor, BLT1, to induce PAEC apoptosis by inhibiting the protective endothelial sphingosine kinase 1 (Sphk1)-endothelial nitric oxide synthase (eNOS) pathway. Blocking LTA4H decreased in vivo LTB4 levels, prevented PAEC apoptosis, restored Sphk1-eNOS signaling, and reversed fulminant PH in the SU/athymic rat model of PH. Antagonizing BLT1 similarly reversed established PH. Inhibition of LTB4 biosynthesis or signal transduction in SU-treated athymic rats with established disease also improved cardiac function and reopened obstructed arterioles; this approach was also effective in the monocrotaline model of severe PH. Human plexiform lesions, one hallmark of PH, showed increased numbers of macrophages, which expressed LTA4H, and patients with connective tissue disease-associated pulmonary arterial hypertension exhibited significantly higher LTB4 concentrations in the systemic circulation than did healthy subjects. These results uncover a possible role for macrophage-derived LTB4 in PH pathogenesis and identify a pathway that may be amenable to therapeutic targeting.

Abstract

A dual circulation, supplied by bronchial and pulmonary artery-derived vessels, normally perfuses the airways from the trachea to the terminal bronchioles. This vascular system has been highly conserved through mammalian evolution and is disrupted at the time of lung transplantation. In most transplant centers, this circulation is not restored. The Papworth Hospital Autopsy study has revealed that an additional attrition of periairway vessels is associated with the development of chronic rejection, otherwise known as the bronchiolitis obliterans syndrome (BOS). Experimental studies subsequently demonstrated that airway vessels are subject to alloimmune injury and that the loss of a functional microvascular system identifies allografts that cannot be rescued with immunosuppressive therapy. Therefore, surgical and medical strategies, which preserve the functionality of the existent vasculature in lung transplant patients, may conceivably limit the incidence of BOS. Given these unique anatomic and physiological considerations, there is an emerging rationale to better understand the perfusion and oxygenation status of airways in transplanted lungs. This article describes novel methodologies, some newly developed by our group, for assessing airway tissue oxygenation and perfusion in experimental and clinical transplantation.

Abstract

Bronchiolitis obliterans syndrome (BOS) is the major limitation to long-term survival following lung transplantation, and strategies to reduce its incidence have remained elusive. Macrolides may stabilize lung function in patients with established BOS; their role in the prevention of BOS, however, remains unexamined. Survival and BOS-free survival of 102 lung allograft recipients (LARs), transplanted at a single center between July 1995 and December 2001, who routinely received clarithromycin were compared with two different control groups. The first control group consisted of 44 LARs from the same center who were transplanted from January 2002 onwards and did not receive clarithromycin. The second control group consisted of a contemporaneous cohort of 5089 recipients, transplanted between 1995 and 2001, reported to the United Network for Organ Sharing database. When compared with the first control group, BOS-free survival was reduced in LARs receiving clarithromycin. Univariate (hazard ratio [HR] 3.13, p = 0.004) and multivariate (HR 3.49, p = 0.04) analyses showed that routine use of clarithromycin was associated with an increased risk of developing BOS. When compared with the second control group, the five-year survival of the clarithromycin group was similar (p = 0.24). Routine use of clarithromycin does not delay development of BOS or improve survival.

Abstract

While microvascular injury is associated with chronic rejection, the cause of tissue ischemia during alloimmune injury has not yet been elucidated. We investigated the contribution of T lymphocytes and complement to microvascular injury-associated ischemia during acute rejection of mouse tracheal transplants. Using novel techniques to assess microvascular integrity and function, we evaluated how lymphocyte subsets and complement specifically affect microvascular perfusion and tissue oxygenation in MHC-mismatched transplants. To characterize T cell effects on microvessel loss and recovery, we transplanted functional airway grafts in the presence and absence of CD4+ and CD8+ T cells. To establish the contribution of complement-mediated injury to the allograft microcirculation, we transplanted C3-deficient and C3-inhibited recipients. We demonstrated that CD4+ T cells and complement are independently sufficient to cause graft ischemia. CD8+ T cells were required for airway neovascularization to occur following CD4-mediated rejection. Activation of antibody-dependent complement pathways mediated tissue ischemia even in the absence of cellular rejection. Complement inhibition by CR2-Crry attenuated graft hypoxia and complement/antibody deposition on vascular endothelium, and promoted vascular perfusion through enhanced angiogenesis. Finally, there was a clear relationship between the burden of tissue hypoxia (ischemia × duration) and the development of subsequent airway remodeling. These studies demonstrated that CD4+ T cells and complement operate independently to cause transplant ischemia during acute rejection, and that sustained ischemia is a precursor to chronic rejection.

Abstract

In 2000, the Agency for Toxic Substances and Disease Registry (ATSDR; Atlanta, GA, USA) investigated lung disease in those exposed to the tremolite-contaminated vermiculite mine in Libby, MT, USA. Previously unreported spirometric results are presented here in relation to exposure and radiographic findings. 4,524 study participants were assigned to one of seven mutually exclusive exposure categories. Associations among radiographic findings, spirometric results, and exposure were investigated, along with the effect of a reduction in exposure potential when production was moved to a wet-process mill in the mid-1970s. Spirometry data for the total population, by smoking status and age, were within the normal range. Prevalence of pleural plaque increased with age, but was lowest in the environmentally exposed group (0.42-12.74%) and greatest in the W.R. Grace & Co. mineworkers (20-45.68%). For males, there was a significant (4.5%) effect of pleural plaques on forced vital capacity. For W.R. Grace & Co. workers and household contacts, a reduction in plaque (0.11 versus 1.64%) and in diffuse pleural thickening or costophrenic angle obliteration (0.13 versus 1.94%) was noted for those exposed after 1976. These analyses do not support a clinically important reduction in spirometry in this cohort. The 1976 reductions in exposure led to a decrease in radiographic changes.

Abstract

Long-term survival after heart-lung transplantation was first achieved in 1981 at Stanford and a total of 217 heart-lung transplantations had been performed by June 2008. This review summarizes Stanford's cumulative experience with heart-lung transplantation, demonstrates the progress that has been made, and discusses past and persistent problems. Diagnostic tools and treatment options for infectious diseases and rejection have changed and patient survival markedly improved over the almost three decades. Eight patients lived longer than 20 years. Further options to treat infections and strategies to control bronchiolitis obliterans syndrome, the main causes of early and long-term mortality, respectively, are required to achieve routine long-term survival.

Abstract

Respiratory syncytial virus (RSV) and parainfluenza virus (PIV) can cause significant morbidity and mortality in lung and heart-lung transplant recipients. We evaluated the utility of a multi-drug protocol for the treatment of RSV- and PIV-related infections. RSV or PIV was identified in 25 patients with a total of 29 infectious episodes between January 2006 and December 2007. The study included 20 women and 5 men, mean age 42 +/- 13 years. Fifteen patients had received bilateral lung transplants, and the remainder either single lung or heart-lung transplants. Mean time from transplant to infection was 1192 days. RSV was identified in 23 cases, PIV in 7 cases. Patients underwent treatment with inhaled ribavirin, methylprednisolone, and intravenous immunoglobulin (IVIG). RSV-positive patients were also treated with palivizumab. We retrospectively evaluated their clinical status and pulmonary function for a 1-year interval before and after the date of infection. Average baseline forced expiratory volume in 1 second (FEV1) before infection was 2.14 +/- 0.68 L. Average decline in FEV1 was 5.7% at the time of infection. Average FEV1 during post-treatment follow-up was not significantly different from baseline (2.16 +/- 0.80 L). Among patients with bronchiolitis obliterans syndrome (BOS) stage 1, 2, or 3 at the time of infection, average FEV1 declined by 14.8% at infection and remained 9.1% below baseline during follow-up, compared with patients with BOS stage 0 or 0p. No complications resulted from treatment. One patient died during follow-up as a result of pre-existing liver failure. This study of lung and heart-lung transplant recipients infected with RSV and PIV shows that a multi-drug regimen including inhaled ribavirin, corticosteroids, and IVIG (with or without palivizumab) is safe and effective. Prompt diagnosis and therapy for patients with RSV or PIV infections are critical for maintaining lung function.

Abstract

Gram-positive (GP) organisms are among the most common causes of infection in early postsurgical and immunocompromised populations. Patients recovering from lung transplantation (LT) are particularly susceptible owing to the physiologic stress imposed by surgery and induction with intense immunosuppression. The sites, types, and timing of GP infections following LT are not well documented. This report describes the clinical spectrum of GP infections and their effects on surgical airway complications (SAC) and bronchiolitis obliterans syndrome (BOS) following LT. Data were collected from 202 patients undergoing 208 LT procedures at a single institution between November 1990 and November 2005, and were retrospectively analyzed according to timing, location, and causative pathogen. In the median follow-up period of 2.7 years (range, 0-13.6 years), 137 GP infections were confirmed in 72 patients. Sites of infection included the respiratory tract (42%), blood (27%), skin, wound, and catheter (21%), and other (10%). GP pathogens identified were Staphylococcus species (77%), Enterococcus species (12%), Streptococcus species (6%), Pneumococcus (4%), and Eubacterium lentum (1%). The likelihood of SAC and BOS was increased in lung allograft recipients with GP pneumonia as compared with those without (hazard ratio 2.1; 95% confidence interval 1.5-3.1). GP organisms were responsible for infections in 40% of lung allograft recipients and were most commonly isolated from the respiratory tract and bloodstream. Staphylococcal species were most frequently identified, 42% of which were methicillin-resistant Staphylococcus aureus (MRSA). Given the strong association of respiratory tract infections with the development of SAC and BOS, empiric antimicrobial strategies after LT should include agents directed against GP organisms, especially MRSA.

Abstract

The availability of suitable lung and heart-lung allografts for transplantation remains poor. Accepting organs from donors with positive serological studies for hepatitis B could potentially expand the donor pool. The aim of this study was to assess the impact of donor hepatitis B core antibody (HBcAb) status on outcomes of lung and heart-lung transplant recipients. Using United Network for Organ Sharing/Organ Procurement and Transplantation Network data, we compared outcomes of 13,233 recipients of HBcAb-negative organs with 333 recipients of HBcAb-positive donor organs. We found that the unadjusted 1-year survival of recipients of HBcAb-positive donor organs was worse, but there was no difference in survival after adjusting for baseline donor and recipient differences. On multivariate analysis, recipient and donor age, procedure type, era of transplant, baseline medical condition, diagnosis, and donor hepatitis C antibody status impacted 1- and 5-year survival. Donor HBcAb status, however, did not impact 1- or 5-year posttransplant survival. Lung and heart-lung allografts from HBcAb-positive donors may be safely used, which would increase the number of transplants performed without compromising recipient outcomes.

Abstract

Among the many potential risk factors influencing the development of bronchiolitis obliterans syndrome (BOS), acute cellular rejection is the most frequently identified. Despite the unique susceptibility of the lung allograft to pathogens, the association with respiratory tract infections remains unclear. In this study we analyze the role respiratory tract infections have in the development of BOS after lung transplantation. Data from a single center were analyzed for 161 lung recipients transplanted from November 1990 to November 2005 who survived >180 days. Univariate and multivariate Cox regression analyses were performed with BOS development as the outcome; results are reported as hazard ratios (HRs) with confidence intervals (CIs). Significant findings by univariate analysis per 100 patient-days prior to BOS onset included acute rejection, cytomegalovirus (CMV) pneumonitis, Gram-negative respiratory tract infections, Gram-positive respiratory tract infections, and fungal pneumonias. Multivariate analysis indicated that acute rejection and Gram-negative, Gram-positive, and fungal pneumonias were associated with BOS, with HRs (CIs) of 84 (23 to 309), 6.6 (1.2 to 37), 6,371 (84 to 485,000), and 314 (53 to 1,856), respectively. Acute rejection, CMV pneumonitis, Gram-positive pneumonia, and fungal pneumonitis in the first 100 days had HRs (CIs) of 1.8 (1.1 to 3.2), 3.1 (1.3 to 6.9), 3.8 (1.5 to 9.4), and 2.1 (1.1 to 4.0), respectively; acute rejection and fungal pneumonitis in the late post-operative period had HRs (CIs) of 2.3 (1.2 to 4.4) and 1.5 (1.1 to 1.9), respectively. In addition to acute rejection, pneumonias with Gram-positive, Gram-negative, and fungal pathogens occurring prior to BOS are independent determinants of chronic allograft dysfunction. Early recognition and treatment of these pathogens in lung transplant recipients may improve long-term outcomes after transplantation.

Abstract

Clostridium difficile colitis (CDC) is the most common nosocomial infection of the gastrointestinal tract in patients with recent antibiotic use or hospitalization. Lung transplant recipients receive aggressive antimicrobial therapy postoperatively for treatment and prophylaxis of respiratory infections. This report describes the epidemiology of CDC in lung recipients from a single center and explores possible associations with bronchiolitis obliterans syndrome (BOS), a surrogate marker of chronic rejection. Patients were divided into those with confirmed disease (CDC+) and those without disease (CDC-) on the basis of a positive C. difficile toxin assay. Because of a bimodal distribution in the time to develop CDC, the early postoperative CDC+ group was analyzed separately from the late postoperative CDC+ cohort with respect to BOS development. Between 1990 and 2005, 202 consecutive patients underwent 208 lung transplantation procedures. Of these, 15 lung recipients developed 23 episodes of CDC over a median follow-up of 2.7 years (range, 0-13.6). All patients with confirmed disease had at least 1 of the following 3 risk factors: recent antibiotic use, recent hospitalization, or augmentation of steroid dosage. Of the early CDC+ patients, 100% developed BOS, whereas only 52% of the late CDC+ patients developed BOS, either preceding or following infection. CDC developed in 7.4% of lung transplant patients with identified risk factors, yielding a cumulative incidence of 14.7%. The statistical association of BOS development in early CDC+ patients suggests a relationship between early infections and future chronic lung rejection.

Abstract

Universal ganciclovir (GCV) prophylaxis is a strategy aimed at reducing cytomegalovirus (CMV) infection and delaying the development of bronchiolitis obliterans syndrome (BOS). However, the optimal duration of GCV prophylaxis remains unclear. We report our experience with GCV prophylaxis administered indefinitely and its effect on CMV pneumonitis, BOS, and survival after lung transplantation (LT). One hundred fifty-one patients surviving >100 days after LT were analyzed. GCV was given to 130 CMV donor- or recipient-seropositive patients. Data from 90 patients who received indefinite GCV prophylaxis (IND) and 40 patients who discontinued their GCV prophylaxis (STOP) were compared. CMV pneumonitis occurred in 16%, 8%, 17%, and 19% of patients in the D+R+, D-R+, D+R-, and D-R- groups, respectively. In the STOP cohort, 15 of 40 patients developed CMV pneumonitis (median time 79 days) after GCV was stopped. Ten of these 15 patients developed BOS (median time 116 days) after discontinuing GCV. The risk of CMV pneumonitis in the STOP cohort was significantly higher when GCV prophylaxis was discontinued within the first year. The cumulative incidence of CMV pneumonitis at 5 years was 2% in the IND group and 57% in the STOP group (p < 0.001). BOS-free survival and overall survival were similar across both groups. Indefinite GCV prophylaxis prevented CMV pneumonitis in 98% of LT recipients. Thirty-eight percent of patients discontinuing prophylaxis developed CMV pneumonitis, 50% of whom progressed to BOS within 1 year. Continuing ganciclovir prophylaxis indefinitely after lung transplantation should be considered.

Abstract

Infections are common after lung transplantation. This report analyzes infections and associated pathogens identified in 202 lung transplant recipients. Infections were tallied according to site of infection and associated pathogen(s). Infection events were also categorized by post-operative days 0 to 100, 101 to 365, and after 365, and normalized to rates per 100 patient-days before and after bronchiolitis obliterans syndrome (BOS). From November 1990 to November 2005, 202 patients received 208 lung transplants. The follow-up was 702.4 patient-years. A total of 178 lung transplant patients developed 859 infections, with 944 pathogens identified. Infections were in the lung in 559 cases (65.1%), mucocutaneous (skin, wound, catheter-related, and oral) in 88 (10.2%), in the blood in 85 (9.8%), and in other sites (urine, bowel, eye, and peritoneum) in 127 (14.8%). Most lung pathogens were bacterial (83.6%), and 57.9% of these were Pseudomonas aeruginosa. Fungi comprised 10.6%, with Aspergillus spp the most common (67.1%) isolate. Cytomegalovirus pneumonitis was seen in 4.3% of respiratory infections. BOS was diagnosed in 87 patients (43.1% of the total). In the BOS population, there were 0.42 infection episodes/100 patient-days before BOS and 0.70 episodes/100 patient-days after BOS (p = 0.5). These data provide an updated infection profile in the ganciclovir era after lung transplantation. Compared with pre-ganciclovir times, the incidence of post-transplant cytomegalovirus infection has notably declined, with filamentous fungi emerging as prevalent pathogens in its place. Such findings are important for refining the management of infections in order to offer more stringent treatment against aggressive pathogens.
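The "episodes per 100 patient-days" normalization used above is a simple rate calculation over the cohort's follow-up time. A sketch, with purely illustrative numbers rather than study data:

```python
def episodes_per_100_patient_days(episodes, patient_days):
    """Normalize an event count to a rate per 100 patient-days of follow-up."""
    if patient_days <= 0:
        raise ValueError("patient_days must be positive")
    return 100.0 * episodes / patient_days
```

For example, 21 episodes accrued over 5,000 patient-days of observation would give a rate of 0.42 episodes/100 patient-days, matching the scale of the pre-BOS rate reported above.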

Abstract

Previous investigations have identified significant interobserver variability in the measurement of central venous pressure and pulmonary artery occlusion pressure in critically ill patients. Large interobserver variability in the measurement of vascular pressures could potentially lead to inappropriate treatment decisions. We postulated that adding an airway pressure signal (Paw) to pressure tracings of central venous pressure and pulmonary artery occlusion pressure would improve interobserver agreement by facilitating identification of end-expiration. To test this hypothesis, six independent experts used a standard protocol to interpret strip-chart recordings of central venous pressure and pulmonary artery occlusion pressure with or without Paw. Two observers were said to agree if their measurements were within 2 mm Hg of each other. A total of 459 strip-chart recordings (303 without Paw and 156 with Paw) were obtained from 121 patients enrolled in the ARDSnet Fluids and Catheters Treatment Trial (FACTT) at 16 different hospitals. Agreement within 2 mm Hg between two measurements was 79% for central venous pressure strips without Paw vs. 86% with Paw. For pulmonary artery occlusion pressure, agreement increased from 71% without Paw to 83% with Paw. The increase in agreement with the addition of Paw was greater for strips demonstrating >8 mm Hg of phasic respiratory variation than for strips demonstrating less phasic respiratory variation. Paw display is a simple, inexpensive method to facilitate the identification of end-expiration that can significantly improve interobserver agreement.
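The agreement criterion above (two observers agree when their readings are within 2 mm Hg) can be sketched as a pairwise comparison across observers of the same strip. This is an illustrative reconstruction, not the study's analysis code:

```python
from itertools import combinations

def pairwise_agreement(readings, tolerance=2.0):
    """Fraction of observer pairs whose readings of the same
    strip-chart recording differ by <= tolerance (mm Hg)."""
    pairs = list(combinations(readings, 2))
    agreeing = sum(1 for a, b in pairs if abs(a - b) <= tolerance)
    return agreeing / len(pairs)
```

With six experts per strip there are 15 pairwise comparisons; the published percentages correspond to pooling such comparisons across all strips.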

Abstract

We hypothesized that an artificial neural network, interconnected computer elements capable of adaptation and learning, could accurately estimate pulmonary artery occlusion pressure from the pulsatile pulmonary artery waveform. The study was performed at a university medical center in nineteen closed-chest dogs. Pulmonary artery waveforms were digitally sampled before conventional measurements of pulmonary artery occlusion pressure under control conditions, during infusions of serotonin or histamine, or during volume loading. Individual beats were parsed, or separated out. Pulmonary artery pressure, its first time derivative, and the beat duration were used as neural inputs. The neural network was trained using 80% of all samples and tested on the remaining 20%. For comparison, the regression between pulmonary artery diastolic pressure and pulmonary artery occlusion pressure was developed and tested using the same data sets. As a final test of generalizability, the neural network was trained on data obtained from 18 dogs and tested on data from the remaining dog in a round-robin fashion. The correlation coefficient between the pulmonary artery diastolic pressure estimate of pulmonary artery occlusion pressure and the measured pulmonary artery occlusion pressure was .75, whereas that for the neural network estimate was .97.
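The round-robin generalizability test described above is a leave-one-subject-out scheme: train on 18 dogs, test on the held-out dog, and rotate. A minimal sketch of the splitting logic, assuming samples are grouped by a dog identifier (names are illustrative, not the paper's code):

```python
def round_robin_splits(subject_ids):
    """Yield (train_ids, held_out_id) pairs, holding out one subject
    per fold, as in leave-one-dog-out validation."""
    for held_out in subject_ids:
        train_ids = [s for s in subject_ids if s != held_out]
        yield train_ids, held_out
```

This design tests whether the network generalizes to an animal it has never seen, rather than merely to held-out beats from animals already in the training set.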

Abstract

No current evidence demonstrates improved survival or a decreased rate of bronchiolitis obliterans syndrome (BOS) despite regularly scheduled fiberoptic bronchoscopy (FOB) with transbronchial biopsy and bronchoalveolar lavage (TBB/BAL) after lung transplantation. Reduced lung function detected with spirometry or oximetry in symptomatic and asymptomatic lung allograft recipients (LARs) may be a more appropriate indication for bronchoscopic sampling. We hypothesized that clinically indicated TBB/BAL without routine invasive surveillance sampling of the transplanted lung does not decrease survival or increase the rate of BOS in LARs. We reviewed 91 consecutive LARs transplanted at Ochsner Clinic between January 1995 and December 1999. Clinical indications for FOB with TBB/BAL included a 10% decline in forced expiratory volume in 1 second below baseline; a 20% decrease in forced expiratory flow rate between 25% and 75% of the forced vital capacity; or unexplained respiratory symptoms, signs, or fever. Along with demographic and clinical data, 1-year and 3-year survival rates for these 91 LARs were compared with those of 5,430 LARs from the International Society for Heart and Lung Transplantation (ISHLT) Registry transplanted during the same 60-month period. Ten of the 91 patients did not survive to hospital discharge after transplantation. We divided the remaining 81 LARs into 2 subsets: Group A patients (n = 43) underwent zero to 1 TBB/BAL, and Group B patients (n = 38) required more than 1 procedure. Demographic data, rejection, infection, and incidence of BOS were compared between groups. The 1-year and 3-year survival rates in the Ochsner LAR cohort were 85% and 73%, respectively, vs 72% and 57% in the ISHLT cohort (p < 0.01). The relative risks of death in the Ochsner group at 1 and 3 years were 0.56 (0.35-0.91) and 0.66 (0.48-0.92), respectively (p < 0.05). The median (range) follow-up was 910 days (60-1,886) for Group A and 961 days (105-1,883) for Group B (p = not significant). We observed twice as many patients with cystic fibrosis and twice as many pneumonia episodes in Group B. The rate of acute rejection in each group was not statistically different. The cumulative incidence of BOS was higher in Group B at 1 year and at 3 years (5% and 56%) than in Group A (3% and 13%) (p < 0.01). Based on the findings from this observational, single-institution study, clinically indicated TBB/BAL without routine surveillance sampling of the lung allograft is unlikely to pose greater risk than does regularly scheduled bronchoscopy after lung transplantation.

Abstract

A number of hematological abnormalities are associated with both human immunodeficiency virus type 1 (HIV-1) infection and alcohol abuse. There is little information on how alcohol abuse might further influence the survival and growth of hematopoietic progenitors in HIV-infected individuals in the presence of immune system abnormalities and anti-HIV drugs. Because there is evidence that the viral transactivator Tat itself can induce hematopoietic suppression, in this study we examined the role of ethanol as a cofactor in transgenic mice that expressed HIV-1 Tat protein. Tat transgenic mice and nontransgenic littermates were given ethanol (20% v/v) and the anti-HIV drug 3'-azido-3'-deoxythymidine (AZT; 1 mg/ml) in drinking water. Immunosuppression in mice was induced by weekly intraperitoneal injections of anti-CD4 antibody. Hematopoiesis was examined by erythroid colony-forming unit (CFU-E) and granulocyte/macrophage colony-forming unit (CFU-GM) assays of the bone marrow progenitor cells. Administration of ethanol for 7 weeks resulted in a 50% decrease in the proliferative capacity of CFU-E- and CFU-GM-derived progenitors from transgenic mice compared with that of ethanol-treated nontransgenic controls. Similar decreases also were observed in transgenic mice treated with AZT or a combination of AZT and ethanol. Furthermore, ethanol and AZT were significantly more toxic to the granulopoietic progenitors (40-50% inhibition) than to the erythropoietic progenitors (10-20% inhibition) in Tat transgenic mice. Although a 10-day exposure of Tat transgenic and nontransgenic mice to a combination of ethanol and AZT had no suppressive effect on the erythropoietic and granulopoietic progenitor cells, there was a marked decrease (40-60%) in CFU-GM in mice made immunodeficient by CD4+ T-lymphocyte depletion. The ethanol-treated Tat transgenic mice, but not the nontransgenic littermates, also showed a significant decrease (25%) in CFU-GM. Our in vivo study strongly suggests that ethanol ingestion in HIV-1-infected individuals, particularly those on antiretroviral drugs, might increase bone marrow toxicity and contribute to HIV-1-associated hematopoietic impairment.