Research & Scholarship

Current Research and Scholarly Interests

Prior to my expanded administrative role, my lab focused on understanding the mechanisms mediating acute and chronic allograft failure, in particular the role of microvascular injury in acute allograft failure and the mechanisms mediating transplant coronary artery disease.

1. Role of microvascular injury in acute allograft failure. We observed that decreased cardiac allograft function in patients was significantly associated with loss of microvascular cell-surface markers, consistent with altered biology or injury of the vascular endothelium, and with up-regulation of cytokines such as IL-6, IL-10, TGF-β, and TNF-α. Decreased cardiac allograft function, in particular diastolic dysfunction, was highly predictive of allograft vascular disease and poor outcome in long-term patients. To further characterize the cellular and molecular mechanisms, we developed quantitative methods to monitor allograft function and correlate it with cytokine expression in a rat heterotopic transplant model. We developed echocardiographic markers of systolic and diastolic function and found that declining systolic and diastolic function correlated strongly with up-regulation of IL-6 expression.

2. Mechanisms mediating allograft dysfunction. We observed that the major risk factors for transplant atherosclerosis in patients are metabolic (hyperglycemia, hypertriglyceridemia, and low HDL). To further study the cellular and molecular mechanisms mediating this process, we recapitulated these metabolic abnormalities in the rat heart transplant model and confirmed the rapid development of transplant atherosclerosis.

My current focus is clinical and translational research, expanding on my earlier laboratory-based research on the mechanisms of allograft failure and vasculopathy. I study the role of subclinical cytomegalovirus infection in the development of cardiac allograft vasculopathy (CAV), and the effect of antiviral agents in preventing CAV, work that has been supported by a program project grant from the NIH. Translating my earlier work on gene expression profiling, I conduct clinical trials on the application of this technology for non-invasive diagnosis of transplant rejection. This work, recently published in the New England Journal of Medicine, showed that peripheral-blood leukocyte gene expression can be used safely to monitor rejection in heart transplant recipients, with a substantial decrease in the number of endomyocardial biopsies performed. Ongoing studies in this area include applying the technology to diagnose CAV, to monitor a patient's level of immunosuppression and guide therapy, and to individualize immunosuppression. My current research, in collaboration with Bio-X faculty, uses the new technology of whole-genome scans to detect the appearance of donor DNA in the recipient's blood as a novel, non-invasive marker of allograft damage due to acute rejection. Ongoing studies include confirmation of the approach's utility, followed by external validation in a randomized controlled trial.

Clinical Trials

Cardiac transplantation is the ultimate treatment option for patients with end-stage heart failure. Cardiac allograft vasculopathy remains a leading cause of morbidity and mortality after transplantation. Angiotensin-converting enzyme inhibitors are used in fewer than half of transplant recipients. Preliminary data suggest that angiotensin-converting enzyme inhibitors retard the atherosclerotic plaque development that is the hallmark of cardiac allograft vasculopathy. Moreover, this class of drug appears to increase circulating endothelial progenitor cell number and has anti-inflammatory properties, both of which improve endothelial dysfunction, the key precursor to the development of cardiac allograft vasculopathy.

The objective of this project is to investigate the role of an angiotensin-converting enzyme inhibitor, ramipril, in preventing the development of cardiac allograft vasculopathy. During the first month after cardiac transplantation, subjects will undergo coronary angiography with intravascular ultrasound measurement of plaque volume in the left anterior descending coronary artery. Using a coronary pressure wire, epicardial artery and microvascular physiology will be assessed. Finally, endothelial function and mediators of endothelial function, including circulating endothelial progenitor cells, will be measured. Subjects will then be randomized in a double-blind fashion to either ramipril or placebo. After 1 year, the above assessment will be repeated. The primary endpoint will be the development of cardiac allograft vasculopathy based on intravascular ultrasound-derived parameters. The secondary aim will be to assess the effect of ramipril on endothelial dysfunction early after transplantation. The final aim is to determine the impact of ramipril on coronary physiology early after transplantation.

Stanford is currently not accepting patients for this trial. For more information, please contact William Fearon, (650) 725-2621.

The purpose of this study is to determine the benefit of using the FDA-approved insulin-sensitizing agent pioglitazone in human heart transplant recipients. The objectives of this project are to (1) determine whether pioglitazone effectively treats insulin resistance in heart transplant recipients, and (2) determine whether pioglitazone therapy after heart transplantation affects the development or progression of cardiac allograft vasculopathy (CAV), a form of chronic rejection after heart transplantation.

Stanford is currently not accepting patients for this trial. For more information, please contact Nicole Constantz, BSc, (650) 724-4740.

Abstract

Medical researchers have called for new forms of translational science that can solve complex medical problems. Mainstream science has made complementary calls for heterogeneous teams of collaborators who conduct transdisciplinary research so as to solve complex social problems. Is transdisciplinary translational science what the medical community needs? What challenges must the medical community overcome to successfully implement this new form of translational science? This article makes several contributions. First, it clarifies the concept of transdisciplinary research and distinguishes it from other forms of collaboration. Second, it presents an example of a complex medical problem and a concrete effort to solve it through transdisciplinary collaboration: the problem of preterm birth and the March of Dimes' effort to form a transdisciplinary research center that synthesizes knowledge about it. The presentation of this example grounds discussion of new medical research models and reveals potential means by which they can be judged and evaluated. Third, this article identifies the challenges to forming transdisciplines and the practices that overcome them. Departments, universities, and disciplines tend to form intellectual silos and adopt reductionist approaches. Forming a more integrated (or 'constructionist'), problem-based science reflective of transdisciplinary research requires the adoption of novel practices to overcome these obstacles.

Abstract

Although malignancy is a major threat to long-term survival of heart transplant (HT) recipients, clear strategies for managing immunosuppression in these patients are lacking. Several lines of evidence support the hypothesis of an anticancer effect of proliferation signal inhibitors (PSIs: mammalian target of rapamycin [mTOR] inhibitors) in HT recipients. This property may arise from PSIs' ability to replace immunosuppressive therapies that promote cancer progression, such as calcineurin inhibitors or azathioprine, and/or through their direct biological actions in preventing tumor development and progression. Given the lack of randomized studies specifically exploring these issues in the transplant setting, a collaborative group reviewed the current literature and personal clinical experience to reach a consensus aimed at providing practical guidance for the clinical management of HT recipients with malignancy, or at high risk of malignancy, with a special focus on the potential role of PSIs.

Abstract

Recent studies suggest that students' feelings of fit with a residency program substantially influence their ranking of the program. As diversity issues become increasingly focal concerns, we investigate how the perceived gender and racial diversity of a program influences students' rankings of it. We focus on students pursuing surgical specialties and ask whether diversity concerns are more prominent among applicants to surgical programs than among applicants to nonsurgical programs.

We invited all interviewees at all residency programs at the Stanford University School of Medicine to participate in our study in the spring of 2009. Nineteen residency programs, amounting to 1,657 residency interviewees, participated. Sixty-eight percent (n = 1,132) responded to the survey.

Women and under-represented minority applicants differ in their assessments from male and non-under-represented minority applicants, in that women applying to surgical programs and under-represented minority students are less likely than others to perceive their prospective programs as diverse. However, perceived program diversity is an important factor that positively influences the program ranking decision for women and minorities pursuing surgical training.

Surgical training programs that promote gender and racial diversity will likely be more successful in attracting women and minority students, because women and minorities are especially sensitive to program diversity in both their perceptions and rankings of programs. Promoting women and minorities within programs and connecting women and minority applicants to outreach programs and mentors is pertinent to the recruitment of these traditionally under-represented groups to surgical programs.

Abstract

The influence of donor-transmitted coronary atherosclerosis (DA) on plaque progression during the first year after cardiac transplantation (Tx) is unknown.

Serial 3-dimensional intravascular ultrasound (IVUS) studies were performed within 8 weeks (baseline; BL) and at 1 year after Tx in 38 recipients. On the basis of maximum intimal thickness (MIT) at BL, recipients were divided into a DA group (DA+; MIT ≥0.5 mm, n=23) or a non-DA group (DA-; MIT <0.5 mm, n=15). Plaque, lumen, and vessel volume indexes were calculated as volume/measured length (mm³/mm) in the left anterior descending artery. Univariate and multivariate regression analyses were performed to identify clinical predictors of change in coronary dimensions.

During the first year after Tx, the plaque volume index increased significantly in the DA+ group but did not change in the DA- group (DA+, 3.0±1.5 to 4.1±1.5 mm³/mm, P<0.0001; DA-, 1.2±0.4 to 1.3±0.5 mm³/mm, P=0.53). In both groups the vessel volume index decreased significantly (DA+, 16.3±3.6 to 14.6±3.3 mm³/mm, P=0.003; DA-, 13.5±4.1 to 12.0±3.3 mm³/mm, P=0.01), as did the lumen volume index (DA+, 13.2±3.1 to 10.5±2.7 mm³/mm, P<0.0001; DA-, 12.2±3.7 to 10.7±3.0 mm³/mm, P=0.004). Univariate and multivariate regression analyses revealed that DA was one of the strongest predictors of plaque progression.

DA was associated with significant plaque progression during the first year after Tx and, in conjunction with negative remodeling, may be an important determinant of cardiac allograft vasculopathy.
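The volume-index arithmetic used in this study (a volume integrated from cross-sections sampled along the artery, then normalized by measured segment length) can be made concrete with a short sketch. The area values below are hypothetical, and the trapezoidal rule is used here as a simple stand-in for the Simpson's-method integration typically applied to serial IVUS cross-sections.

```python
def volume_index(areas_mm2, dz_mm=0.5):
    """Approximate a plaque/lumen/vessel volume index (mm^3/mm).

    areas_mm2: cross-sectional areas (mm^2) measured at fixed axial
    intervals dz_mm along the vessel (hypothetical values here).
    Volume is integrated by the trapezoidal rule (an assumption; the
    study used Simpson's method) and normalized by segment length.
    """
    if len(areas_mm2) < 2:
        raise ValueError("need at least two cross-sections")
    length_mm = dz_mm * (len(areas_mm2) - 1)
    volume_mm3 = sum(
        dz_mm * (a + b) / 2.0
        for a, b in zip(areas_mm2, areas_mm2[1:])
    )
    return volume_mm3 / length_mm

# Hypothetical plaque areas (mm^2) at 0.5-mm intervals:
plaque_areas = [2.8, 3.0, 3.1, 3.3, 3.2, 3.0]
print(round(volume_index(plaque_areas), 2))  # → 3.1
```

Normalizing by measured length is what makes indexes comparable between baseline and 1-year studies, in which slightly different segment lengths may be imaged.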

Abstract

Treatment of acute rejection (AR) in heart transplantation relies on histopathological grading of endomyocardial biopsies according to International Society for Heart and Lung Transplantation (ISHLT) guidelines. Intragraft gene expression profiling may be a way to complement histological evaluation.

Transcriptional profiling was performed on 26 endomyocardial biopsies, and expression patterns were compared with the 1990 ISHLT AR grades. Importantly, transcriptional profiles from settings with an equivalent AR grade appeared the same. In addition, grade 0 profiles could not be distinguished from 1A profiles, and grade 3A profiles could not be distinguished from 3B profiles. Comparing the AR groupings (0+1A, 1B, and 3A+3B), 0+1A showed more striking differences from 1B than from 3A+3B. When these findings were extrapolated to the 2005 revised guidelines, the combination of 1A and 1B into a single category (1R) appears to have brought together endomyocardial biopsies with different underlying processes that are not evident from histological evaluation. Grade 1B was associated with upregulated immune-response genes, as one categorical distinction from grade 1A. Although grade 1B was distinct from the clinically relevant AR grades 3A and 3B, all of these grades shared a small number of overlapping pathways consistent with common physiological underpinnings.

The gene expression similarities and differences identified here in different AR settings have the potential to revise the clinical perspective on acute graft rejection, pending the results of larger studies.

Abstract

It is challenging to monitor the health of transplanted organs, particularly with respect to rejection by the host immune system. Because transplanted organs have genomes that are distinct from the recipient's genome, we used high throughput shotgun sequencing to develop a universal noninvasive approach to monitoring organ health. We analyzed cell-free DNA circulating in the blood of heart transplant recipients and observed significantly increased levels of cell-free DNA from the donor genome at times when an endomyocardial biopsy independently established the presence of acute cellular rejection in these heart transplant recipients. Our results demonstrate that cell-free DNA can be used to detect an organ-specific signature that correlates with rejection, and this measurement can be made on any combination of donor and recipient. This noninvasive test holds promise for replacing the endomyocardial biopsy in heart transplant recipients and may be applicable to other solid organ transplants.
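The core quantity in this approach is the fraction of circulating cell-free DNA attributable to the donor genome, estimated at loci where donor and recipient genotypes differ. The sketch below is a simplified illustration under stated assumptions (the function name and read counts are hypothetical; a real pipeline must additionally model sequencing error and donor/recipient genotype configurations, e.g. donor heterozygosity).

```python
def donor_fraction(snp_counts):
    """Estimate the donor-derived cell-free DNA fraction.

    snp_counts: list of (donor_allele_reads, total_reads) pairs at
    SNP positions where the recipient is homozygous and the donor
    carries a distinguishing allele (hypothetical data). Reads are
    pooled across SNPs for a simple point estimate; this ignores
    sequencing error and donor zygosity, which a real analysis
    would have to correct for.
    """
    donor = sum(d for d, _ in snp_counts)
    total = sum(t for _, t in snp_counts)
    if total == 0:
        raise ValueError("no informative reads")
    return donor / total

# Hypothetical counts at three informative SNPs:
counts = [(12, 950), (9, 1010), (15, 1040)]
print(f"{donor_fraction(counts):.3%}")  # → 1.200%
```

A rise in this fraction over a patient's baseline is what the abstract describes as the organ-specific rejection signature.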

Abstract

Academic couples constitute 36% of the US professoriate. Universities are in the midst of a major transition in hiring practices to support these and other faculty with working partners. However, less is known about academic couples among medical school faculty, and in surgical specialties specifically. This study was designed to address this gap.

In 2006-2007, the Michelle R. Clayman Institute for Gender Research at Stanford University designed and administered the "Managing Academic Careers Survey" to nearly 30,000 full-time faculty across all academic fields at leading research universities nationwide. This study included 2,475 medical school faculty survey respondents at 12 participating institutions. Main outcome measures were academic partner status, number of journal articles/chapters during career, and applications to other academic positions in the last 5 years.

A total of 73.3% of medical school faculty respondents were in dual-career partnerships (where both partners actively pursue employment), and 32.2% had an academic partner. Sixty-nine percent of academic partners were also in medical schools. Women faculty were more likely than men to have an academic partner. Among surgery faculty, 40% of women had an academic partner, as compared with 29.3% of men. In fully adjusted regression models, faculty with academic partners had higher publication counts than other faculty and had higher odds of applying to other academic positions.

Academic couples constitute one-third of all medical school faculty. They represent a productive and potentially mobile component of the medical faculty workforce. Because women had a higher rate of academic partnering, dual-career academic hiring policies are especially important for the recruitment and retention of female faculty in surgical specialties.

Abstract

Serum proteins are routinely used to diagnose diseases, but are hard to find due to low sensitivity in screening the serum proteome. Public repositories of microarray data, such as the Gene Expression Omnibus (GEO), contain RNA expression profiles for more than 16,000 biological conditions, covering more than 30% of United States mortality. We hypothesized that genes coding for serum- and urine-detectable proteins, and showing differential expression of RNA in disease-damaged tissues would make ideal diagnostic protein biomarkers for those diseases. We showed that predicted protein biomarkers are significantly enriched for known diagnostic protein biomarkers in 22 diseases, with enrichment significantly higher in diseases for which at least three datasets are available. We then used this strategy to search for new biomarkers indicating acute rejection (AR) across different types of transplanted solid organs. We integrated three biopsy-based microarray studies of AR from pediatric renal, adult renal and adult cardiac transplantation and identified 45 genes upregulated in all three. From this set, we chose 10 proteins for serum ELISA assays in 39 renal transplant patients, and discovered three that were significantly higher in AR. Interestingly, all three proteins were also significantly higher during AR in the 63 cardiac transplant recipients studied. Our best marker, serum PECAM1, identified renal AR with 89% sensitivity and 75% specificity, and also showed increased expression in AR by immunohistochemistry in renal, hepatic and cardiac transplant biopsies. Our results demonstrate that integrating gene expression microarray measurements from disease samples and even publicly-available data sets can be a powerful, fast, and cost-effective strategy for the discovery of new diagnostic serum protein biomarkers.
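The integration step described here (finding genes upregulated in all three biopsy microarray studies) is, at its core, a set intersection across study-level gene lists. The sketch below uses hypothetical gene symbols (only PECAM1 is named in the abstract) rather than the actual 45-gene list.

```python
def shared_upregulated(*study_gene_sets):
    """Return genes called upregulated in every study.

    Each argument is the set of upregulated genes from one biopsy
    microarray study of acute rejection; the result is their
    intersection. Gene symbols below are illustrative stand-ins.
    """
    if not study_gene_sets:
        return set()
    common = set(study_gene_sets[0])
    for genes in study_gene_sets[1:]:
        common &= set(genes)
    return common

# Hypothetical per-study upregulated gene lists:
pediatric_renal = {"PECAM1", "CD3E", "GZMB", "IL2RB"}
adult_renal     = {"PECAM1", "CD3E", "CCL5", "GZMB"}
adult_cardiac   = {"PECAM1", "GZMB", "CXCL9", "CD3E"}
print(sorted(shared_upregulated(pediatric_renal, adult_renal, adult_cardiac)))
# → ['CD3E', 'GZMB', 'PECAM1']
```

In the study's workflow, this intersection was the candidate pool from which proteins detectable in serum or urine were then selected for ELISA validation.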

Abstract

Endomyocardial biopsy is the standard method of monitoring for rejection in recipients of a cardiac transplant. However, this procedure is uncomfortable, and there are risks associated with it. Gene-expression profiling of peripheral-blood specimens has been shown to correlate with the results of an endomyocardial biopsy.

We randomly assigned 602 patients who had undergone cardiac transplantation 6 months to 5 years previously to be monitored for rejection with the use of gene-expression profiling or with the use of routine endomyocardial biopsies, in addition to clinical and echocardiographic assessment of graft function. We performed a noninferiority comparison of the two approaches with respect to the composite primary outcome of rejection with hemodynamic compromise, graft dysfunction due to other causes, death, or retransplantation.

During a median follow-up period of 19 months, patients who were monitored with gene-expression profiling and those who underwent routine biopsies had similar 2-year cumulative rates of the composite primary outcome (14.5% and 15.3%, respectively; hazard ratio with gene-expression profiling, 1.04; 95% confidence interval, 0.67 to 1.68). The 2-year rates of death from any cause were also similar in the two groups (6.3% and 5.5%, respectively; P=0.82). Patients who were monitored with the use of gene-expression profiling underwent fewer biopsies per person-year of follow-up than did patients who were monitored with the use of endomyocardial biopsies (0.5 vs. 3.0, P<0.001).

Among selected patients who had received a cardiac transplant more than 6 months previously and who were at a low risk for rejection, a strategy of monitoring for rejection that involved gene-expression profiling, as compared with routine biopsies, was not associated with an increased risk of serious adverse outcomes and resulted in the performance of significantly fewer biopsies. (ClinicalTrials.gov number, NCT00351559.)

Abstract

During the past 25 years, advances in immunosuppression and the use of selective antimicrobial prophylaxis have progressively reduced the risk of infection after heart transplantation. This study presents a historical perspective on the changing trends of infectious disease after heart transplantation.

Infectious complications in 4 representative eras of immunosuppression and antimicrobial prophylaxis were analyzed: (1) 38 patients in the pre-cyclosporine era (1978-1980); (2) 72 in the early cyclosporine era (1982-1984), when maintenance immunosuppression included high-dose cyclosporine and corticosteroid therapy; (3) 395 in the cyclosporine era (1988-1997), when maintenance immunosuppression included cyclosporine, azathioprine, and lower corticosteroid doses; and (4) 167 in the more recent era (2002-2005), when maintenance immunosuppression included cyclosporine and mycophenolate mofetil.

The overall incidence of infections decreased across the 4 cohorts from 3.35 episodes/patient to 2.03, 1.35, and 0.60 in the more recent cohorts (p < 0.001). Gram-positive bacteria are emerging as the predominant cause of bacterial infections (28.6%, 31.4%, 51.0%, 67.6%; p = 0.001). Cytomegalovirus infections have significantly decreased in incidence and occur later after transplantation (88 +/- 77 days, pre-cyclosporine era; 304 +/- 238 days, recent cohort; p < 0.001). Fungal infections also decreased, from an incidence of 0.29/patient in the pre-cyclosporine era to 0.08 in the most recent era. A major decrease in Pneumocystis jiroveci and Nocardia infections has also occurred.

The overall incidence of and mortality associated with infections continue to decrease in heart transplantation, coinciding with advances in immunosuppression, the use of selective antimicrobial prophylaxis, and more effective treatment regimens.

Abstract

Hemodynamically compromising rejection (HCR) is a major cause of mortality and morbidity after heart transplantation. Right ventricular (RV) function is a strong predictor of outcome in patients with heart failure and myocarditis. The objective of the current study was to determine whether RV dysfunction predicts event-free survival in patients with HCR.

Medical records of 548 heart transplant patients followed at Stanford University between January 1998 and January 2007 were reviewed. HCR was defined as a rejection episode requiring hospitalization for heart failure. Univariate and multivariate analyses were performed to identify risk factors for death or retransplantation at 1 year.

HCR occurred in 71 patients (12.9%). Death or retransplantation at 1 year occurred in 28 patients (39%). Univariate analysis identified non-cellular rejection (odds ratio [OR] = 3.20, p = 0.021), the need for inotropic support (OR = 4.80, p = 0.007), RV dysfunction (OR = 4.63, p = 0.006), left ventricular ejection fraction (OR = 0.941, p = 0.031), and acute renal failure (OR = 3.82, p = 0.010) as predictors of death or retransplantation at 1 year. Multivariate analysis identified RV dysfunction (OR = 4.80, p = 0.007) and the need for inotropic support (OR = 5.00, p = 0.009) as predictors of death or retransplantation at 1 year.

In the modern era of immunosuppression, HCR remains a major complication after heart transplantation. RV dysfunction was identified as a novel risk factor for death or retransplantation following HCR.

Abstract

Heart transplantation is a well-established therapeutic option for many patients with end-stage heart disease. A major challenge in heart transplantation today is providing effective immunosuppression to prevent graft rejection while minimizing the many adverse effects of currently available therapies.

To systematically review current immunosuppressive treatment strategies after heart transplantation and to review emerging drugs in various stages of development.

A comprehensive literature review was performed using the online PubMed and Pharmaprojects databases.

This article gives an overview of the immunosuppressive agents in current use, with a detailed review of emerging drugs with novel therapeutic targets.

Abstract

Human heart transplantation started 40 years ago. Medical records of all cardiac transplants performed at Stanford were reviewed. A total of 1446 heart transplantations were performed between January 1968 and December 2007, with an increase in 1-year survival from 43.1% to 90.2%. Sixty patients transplanted between 1968 and 1987 were identified who survived at least 20 years. Twenty-year survivors had a mean age at transplant of 29.4 +/- 13.6 years. Rejection-free and infection-free 1-year survivals were 14.3% and 18.8%, respectively. At their last follow-up, 86.7% of long-term survivors were treated for hypertension, 28.3% showed chronic renal dysfunction, 6.7% required hemodialysis, 10% were status post kidney transplantation, 13.3% were treated for diabetes mellitus, 36.7% had a history of malignancy, and 43.3% had evidence of allograft vasculopathy. The half-life conditional on survival to 20 years was 28.1 years. Eleven patients received a second heart transplant after 11.9 +/- 8.0 years. The most common causes of death were allograft vasculopathy (56.3%) and non-lymphoid malignancy (25.0%). Twenty-year survival was achieved in 12.5% of patients transplanted before 1988. Although still associated with considerable morbidity, long-term survival is expected to occur at much higher rates in the future because of major advances in the field over the past decade.

Abstract

Rapamycin has been shown to reduce anatomical evidence of cardiac allograft vasculopathy, but its effect on coronary artery physiology is unknown.

Twenty-seven patients without angiographic evidence of coronary artery disease underwent measurement of fractional flow reserve (FFR), coronary flow reserve (CFR), and the index of microcirculatory resistance (IMR) within 8 weeks and then 1 year after transplantation using a pressure sensor/thermistor-tipped guidewire. Measurements were compared between consecutive patients who were on rapamycin for at least 3 months during the first year after transplantation (rapamycin group, n = 9) and a comparable group on mycophenolate mofetil (MMF) instead (MMF group, n = 18).

At baseline, there was no significant difference in FFR, CFR, or IMR between the 2 groups. At 1 year, FFR declined significantly in the MMF group (0.87 +/- 0.06 to 0.82 +/- 0.06, P = .009) but did not change in the rapamycin group (0.91 +/- 0.05 to 0.89 +/- 0.04, P = .33). Coronary flow reserve and IMR did not change significantly in the MMF group (3.1 +/- 1.7 to 3.2 +/- 1.0, P = .76; and 27.5 +/- 18.1 to 19.1 +/- 7.6, P = .10, respectively) but improved significantly in the rapamycin group (2.3 +/- 0.8 to 3.8 +/- 1.4, P < .03; and 27.0 +/- 11.5 to 17.6 +/- 7.5, P < .03, respectively). Multivariate regression analysis revealed that rapamycin therapy was an independent predictor of CFR and FFR at 1 year after transplantation.

Early after cardiac transplantation, rapamycin therapy is associated with improved coronary artery physiology involving both the epicardial vessel and the microvasculature.

Abstract

Cardiac allograft vasculopathy (CAV) is a major cause of death after heart transplantation (HT). Reduced bioavailability of endothelium-derived nitric oxide may play a role in endothelial vasodilator dysfunction and thus in the structural changes characterizing CAV. A potential contributor to endothelial pathobiology is asymmetric dimethylarginine (ADMA), an endogenous nitric oxide synthase inhibitor. It was hypothesized that ADMA concentrations may influence CAV progression during the first postoperative year.

Thirty-two consecutive HT recipients underwent intravascular ultrasound evaluation at month 1 and year 1 after HT. Immunosuppression included mycophenolate mofetil (MMF, n=16) and sirolimus (n=16). Change in intimal volume greater than the median and vascular remodeling were the major outcome measures.

Plasma ADMA levels were associated with subsequent development of intimal hyperplasia (risk ratio [95% confidence interval] = 2.72 [1.06-6.94]; P=0.038), and plasma ADMA levels greater than 0.70 micromol/L most accurately identified patients who would develop intimal hyperplasia. However, ADMA levels did not correlate with negative coronary remodeling. Treatment with sirolimus, as compared with MMF, was associated with significantly lower ADMA levels (0.65+/-0.12 vs. 0.77+/-0.10 micromol/L; P<0.01) and less intimal hyperplasia (risk ratio [95% confidence interval] = 0.08 [0.01-0.56]; P=0.01).

Elevated plasma ADMA is associated with coronary intimal hyperplasia, supporting the importance of nitric oxide synthase inhibition in CAV pathogenesis. Treatment with sirolimus (rather than MMF) is associated with lower ADMA levels and a reduced risk of accelerated CAV.

Abstract

We previously reported that negative remodeling, not plaque progression, correlated with lumen loss during the first year after cardiac transplantation, and that cytomegalovirus antibody seropositivity correlated with increased negative remodeling and greater lumen loss. Whether these findings persist between years 1 and 2 after transplantation is unknown.

Serial 3-dimensional intravascular ultrasound analysis of the left anterior descending coronary artery was performed in 30 cardiac transplant recipients at years 1 and 2 after transplantation. Vessel, lumen, and plaque area were determined at 0.5-mm axial intervals in the first 50 mm of the left anterior descending coronary artery, and volumes were computed using Simpson's method. Univariate and multivariate regression analyses were performed to identify clinical predictors of change in coronary dimensions.

Although the mean vessel volume index did not change (13.6+/-3.4 to 13.4+/-3.3 mm³/mm, P=0.45), the mean plaque volume index increased (3.4+/-2.3 to 3.8+/-2.2 mm³/mm, P=0.012), resulting in significant mean lumen loss (10.3+/-2.5 to 9.6+/-2.3 mm³/mm, P=0.016). However, the degree of luminal change strongly correlated with the degree of change in vessel size (R=0.81, P<0.0001) but not with change in plaque amount (R=-0.19, P=0.32). In fact, in 57% of the patients who demonstrated lumen loss, negative remodeling contributed more to lumen loss than did plaque progression. Diabetes at 2 years was the only significant independent clinical predictor of plaque progression and lumen loss.

Despite significant plaque progression, negative remodeling correlated with coronary lumen loss between years 1 and 2 after cardiac transplantation.

Abstract

The ongoing shortage of donors for cardiac transplantation has led to a trend toward acceptance of donor hearts with some structural abnormalities, including left ventricular hypertrophy. To evaluate the outcome in recipients of donor hearts with increased left ventricular wall thickness (LVWT), we retrospectively analyzed data for 157 cardiac donors and their respective recipients from January 2001 to December 2004. The 47 recipients of a donor heart with increased LVWT ≥1.2 cm constituted the study group, and the 110 recipients of a donor heart with normal LVWT <1.2 cm formed the control group. At 3 +/- 1.5 years, recipient survival was lower (50% vs. 82%, p = 0.0053) and the incidence of allograft vasculopathy higher (50% vs. 22%, p = 0.05) in recipients of a donor heart with LVWT >1.4 cm as compared with LVWT ≤1.4 cm. Donor LVWT >1.4 cm (p = 0.003), recipient preoperative ventricular assist device (VAD) support (p = 0.04), and bypass time >150 min (p = 0.05) were predictors of reduced survival. Our results suggest careful consideration of donor hearts with echocardiographic evidence of increased LVWT in the absence of hypovolemia, because they may be associated with poorer outcomes; such hearts should potentially be reserved only for the most desperately ill recipients.

Abstract

Acute rejection continues to occur beyond the first year after cardiac transplantation, but the optimal strategy for detecting rejection during this late period is still controversial. Gene expression profiling (GEP), with its high negative predictive value for acute cellular rejection (ACR), appears to be well suited to identifying low-risk patients who can be safely managed without routine invasive endomyocardial biopsy (EMB).

The Invasive Monitoring Attenuation Through Gene Expression (IMAGE) study is a prospective, multicenter, non-blinded, randomized clinical trial designed to test the hypothesis that a primarily non-invasive rejection surveillance strategy utilizing GEP testing is not inferior to an invasive EMB-based strategy with respect to cardiac allograft dysfunction, rejection with hemodynamic compromise (HDC), and all-cause mortality.

A total of 199 heart transplant recipients in their second through fifth post-transplant years have been enrolled in the IMAGE study since January 13, 2005. The study is expected to continue through 2008.

The IMAGE study is the first randomized, controlled comparison of two rejection surveillance strategies measuring outcomes in heart transplant recipients who are beyond their first year post-transplant. The move away from routine histologic evaluation for allograft rejection represents an important paradigm shift in cardiac transplantation, and the results of this study have important implications for the future management of heart transplant patients.

Abstract

Modern antiviral strategies are effective in controlling the clinical syndromes associated with acute cytomegalovirus infection in heart transplant recipients. Despite this effectiveness, subclinical cytomegalovirus infection is a common finding in these patients and its impact on long-term graft outcome is currently underestimated. Recent studies provide evidence implicating subclinical cytomegalovirus infection in the pathogenesis of allograft rejection and cardiac allograft vasculopathy. In this process, cytomegalovirus interacts with local inflammatory pathways and systemic immune-regulation mechanisms, which may lead to graft damage, even in the absence of cytomegalovirus replication within the graft. Consequently, in addition to pharmacologic strategies that inhibit viral replication, immune-based therapies that abrogate the host immune response may provide an effective tool to prevent the indirect impact of cytomegalovirus on graft function. Current evidence suggests that subclinical cytomegalovirus infection plays an important role in the pathogenesis of long-term graft dysfunction in heart transplant recipients and in other solid organ transplant recipients. Pending the availability of definitive data from randomized trials, we propose that the use of pharmacologic and immune-based approaches, directed at complete suppression of cytomegalovirus infection, represents the best strategy for prevention of cytomegalovirus-induced rejection, cardiac allograft vasculopathy and chronic allograft damage.

Abstract

Cardiac allograft vasculopathy (CAV) is a progressive process involving the epicardial and microvascular coronary systems. The timing of the development of abnormalities in these 2 compartments and the correlation between changes in physiology and anatomy are undefined. The invasive evaluation of coronary artery anatomy and physiology with intravascular ultrasound, fractional flow reserve, coronary flow reserve, and the index of microcirculatory resistance (IMR) was performed in the left anterior descending coronary artery during 151 angiographic evaluations of asymptomatic heart transplant recipients from 0 to >5 years after heart transplantation (HT). There was no angiographic evidence of significant CAV, but during the first year after HT, fractional flow reserve decreased significantly (0.89 ± 0.06 vs 0.85 ± 0.07, p = 0.001), and percentage plaque volume derived by intravascular ultrasound increased significantly (15.6 ± 7.7% to 22.5 ± 12.3%, p = 0.0002), resulting in a significant inverse correlation between epicardial physiology and anatomy (r = -0.58, p < 0.0001). The IMR was lower in these patients compared with those ≥2 years after HT (24.1 ± 14.3 vs 29.4 ± 18.8 units, p = 0.05), suggesting later spread of CAV to the microvasculature. As the IMR increased, fractional flow reserve increased (0.86 ± 0.06 to 0.90 ± 0.06, p = 0.0035 comparing recipients with IMRs ≤20 to those with IMRs ≥40), despite no difference in percentage plaque volume (21.0 ± 11.2% vs 20.5 ± 10.5%, p = NS). In conclusion, early after HT, anatomic and physiologic evidence of epicardial CAV was found. Later after HT, the physiologic effect of epicardial CAV may be less, because of increased microvascular dysfunction.

Abstract

Despite antiviral prophylaxis, a high percentage (over 90%) of heart transplant patients experience active cytomegalovirus (CMV) infection, diagnosed by detection of viral DNA in peripheral blood polymorphonuclear leukocytes within the first few months posttransplantation. Viral DNA was detected in mononuclear cells prior to detection in granulocytes from CMV-seropositive recipients (R+) receiving a heart from a CMV-seropositive donor (D+). Based on assessment of systemic infection in leukocyte populations, both R+ subgroups (R+/D- and R+/D+) experienced a greater infection burden than the R-/D+ subgroup, which was aggressively treated because of a higher risk of acute CMV disease. Despite widespread systemic infection in all at-risk patient subgroups, CMV DNA was rarely (<3% of patients) detected in transplanted heart biopsy specimens. The R+ patients more frequently exceeded the 75th percentile of the CMV DNA copy number distribution in leukocytes (110 copies/10⁵ polymorphonuclear leukocytes) than the R-/D+ subgroup. Therefore, active systemic CMV infection involving leukocytes is common in heart transplant recipients receiving prophylaxis to reduce acute disease. Infection of the transplanted organ is rare, suggesting that chronic vascular disease attributed to CMV may be driven by the consequences of systemic infection.

Is there a role for proliferation signal/mTOR inhibitors in the prevention and treatment of de novo malignancies after heart transplantation? Lessons learned from renal transplantation and oncology. JOURNAL OF HEART AND LUNG TRANSPLANTATION. Valantine, H. 2007; 26 (6): 557-564

Abstract

With the development of new immunosuppressive agents, the majority of transplant recipients are surviving for over a decade, and malignancy has become a major burden on long-term survival. Reducing the incidence of post-transplant malignancies is especially important in heart transplantation where the risk of malignancies is higher than in other organ transplants. Everolimus and sirolimus, the proliferation signal inhibitors (PSIs) or mammalian target-of-rapamycin (mTOR) inhibitors, now provide new strategies for immunosuppression because of their proven efficacy that translates to a reduction in doses of calcineurin inhibitors needed to prevent acute rejection. In addition, the anti-proliferative effects of this class of drugs raise the possibility that they may be effective for reducing the risk of malignancies after solid-organ transplantation. Despite the paucity of direct clinical evidence for this effect in heart transplant patients, observations from renal transplant recipients suggest that the anti-proliferative actions of PSIs/mTOR inhibitors may also protect against malignancies in heart transplant recipients. This potential for an anti-cancer effect is further supported by the emerging data on the use of PSIs/mTOR inhibitors in non-transplant oncology patients. Reviewed in this article are the incidence rates of malignancies after solid-organ transplantation, and the evidence for anti-cancer effects of PSIs/mTOR inhibitors in renal transplant recipients and in non-transplant patients. Also discussed are the implications of these observational data for future studies on the reduction of malignancies after heart transplantation.

Abstract

Metabolic and immuno-inflammatory risk factors contribute to cardiac allograft vasculopathy (CAV) pathogenesis. Although systemic inflammation, as detected by C-reactive protein (CRP), predicts CAV development, the relationship between CRP and markers of metabolic abnormalities remains unexplored. CRP and the entire metabolic panel were evaluated in 98 consecutive heart transplant recipients at the time of annual coronary angiography, 5.8 years after transplant (range, 1-12 years). A ratio of triglycerides (TG) to high-density lipoproteins (HDL) of 3.0 or more was considered a marker of insulin resistance. CAV prevalence was defined by angiography, and subsequent prognosis was evaluated as incidence of major cardiac adverse events. CRP was higher in the 34 patients with angiographic CAV than in those without CAV (1.10 ± 0.20 vs 0.50 ± 0.05 mg/dl, p < 0.001). Patients with insulin resistance had higher CRP concentrations (p = 0.023) and higher CAV prevalence (p = 0.005). High CRP and a TG/HDL ratio of 3.0 or more were independently associated with an increased likelihood of CAV (odds ratio ≥ 3.9; p = 0.02) and predicted an increased risk of major cardiac adverse events. The combination of high CRP and a TG/HDL ratio of 3.0 or more identified a subgroup of patients having a 4-fold increased risk for CAV and a 3-fold increased risk for major cardiac adverse events compared with patients with low CRP and normal values for metabolic indicators. Both CRP and insulin resistance, as estimated by TG/HDL, appear to be strong, synergistic risk factors for CAV and for major cardiac adverse events. These findings support the hypothesis that in heart transplant recipients, systemic inflammation may be an important mediator of graft vascular injury associated with metabolic syndrome.

Abstract

Significant changes in coronary artery structure, including intimal thickening and vessel remodeling, occur early after cardiac transplantation. The degree to which these changes compromise coronary lumen dimensions, and the clinical factors that affect these changes, remain controversial. Thirty-eight adult cardiac transplant recipients underwent coronary angiography and volumetric intravascular ultrasound (IVUS) evaluation of the left anterior descending artery within 8 weeks of transplantation and at 1 year. Clinical parameters including donor and recipient characteristics, rejection episodes, and serology were prospectively recorded. Two-dimensional IVUS measurements and vessel, lumen and plaque volume were calculated at both time points and compared. Multivariate regression analysis was performed to reveal clinical predictors of change in coronary dimensions. During the first year after transplantation, significant decreases in vessel size (negative remodeling) and lumen size were observed, with significant increases in plaque burden based on IVUS analyses. Loss of lumen volume correlated significantly with the degree of negative remodeling (R=0.82, P<0.0001), but not with changes in plaque burden (R=0.08, P=0.64). Patients with the greatest increase in plaque volume had significantly less negative remodeling (R=0.53, P=0.0006). Transplant recipient cytomegalovirus (CMV) antibody seropositivity and lack of aggressive prophylaxis against CMV infection/reactivation were significant independent predictors of greater negative remodeling (P<0.01 and P=0.03, respectively) and greater lumen loss (P=0.02 and P=0.03, respectively). Negative remodeling is primarily responsible for coronary artery lumen loss during the first year after cardiac transplantation. CMV seropositivity and lack of aggressive CMV prophylaxis correlate with increased negative remodeling, resulting in greater lumen loss.

Abstract

Pulmonary infection with Nocardia is an uncommon but serious infection found in immunocompromised patients. We describe a case of rapidly progressive pulmonary nocardiosis in a heart transplant patient. We then review the common clinical features of Nocardia infection in transplant recipients, outlining the challenges in its diagnosis and management. We also review the differences between Pneumocystis jiroveci prophylaxis regimens with respect to concomitant prophylaxis of Nocardia and other opportunistic infections.

Abstract

Cytomegalovirus (CMV)-associated leukopenia in heart transplant patients is poorly characterized. We conducted a retrospective analysis of the timing, degree, and type of leukopenia in four groups of patients: cases (n=20); controls (n=20); subclinical early infection (n=21); and subclinical late infection (n=22). In the cases, white blood cell (WBC) count at diagnosis was compared to prediagnosis values, and cases were compared to controls. Subclinical cases (early and late) were identified by measurement of CMV DNA in peripheral blood mononuclear cells, and their WBC counts were compared to those of the cases and controls. First, in human heart transplant recipients the total leukocyte count decreased prior to the time of diagnosis of CMV disease: cases, 5.4 ± 2.1 × 10³/μL vs. 3.7 ± 2.1 × 10³/μL (P<0.01); subclinical early, 8.1 ± 4.1 × 10³/μL vs. 6.9 ± 1.6 × 10³/μL (P<0.01). Second, the leukocyte populations most reduced during CMV disease were the neutrophils, 4.4 × 10³/μL (78%) to 2.5 × 10³/μL (69%) (P<0.05), and monocytes, 0.6 × 10³/μL (11%) to 0.3 × 10³/μL (7.5%) (P<0.05). Third, the reduction in leukocyte count that occurs during CMV disease appears to be independent of immunosuppressive therapy (using cyclosporine A, mycophenolate mofetil, or azathioprine and prednisone). Finally, subclinical CMV infection in stable long-term heart transplant patients without disease is not associated with a reduction in the leukocyte count. Aside from implications for early diagnosis, the CMV-associated decrease in monocytes is important because viral infections like Epstein-Barr virus cause monocytosis. The absence of leukopenia in subclinical late infections is an important new finding.

Abstract

Cardiac allograft vasculopathy is a leading cause of death during long-term follow-up of heart transplant recipients. We report 2 cases of cardiac allograft vasculopathy associated with giant coronary aneurysms. To our knowledge, these are the first reported cases of spontaneous giant coronary aneurysms in heart transplant recipients.

Abstract

Asymptomatic cytomegalovirus (CMV) replication is frequent after cardiac transplantation in recipients with pretransplantation CMV infection. How subclinical viral replication influences cardiac allograft disease remains poorly understood, as does the importance of T-cell immunity in controlling such replication. Thirty-nine cardiac recipients who were pretransplantation CMV antibody positive were longitudinally studied for circulating CMV-specific CD4 and CD8 T-cell responses, CMV viral load in blood neutrophils, and allograft rejection during the first posttransplantation year. Nineteen of these recipients were also analyzed for changes of coronary artery intimal, lumen, and whole-vessel area. All recipients received early prophylactic therapy with ganciclovir. No recipients developed overt CMV disease. Those with detectable levels of CMV-specific CD4 T cells in the first month after transplantation were significantly protected from high mean and peak posttransplantation viral load (P<0.05), acute rejection (P<0.005), and loss of allograft coronary artery lumen (P<0.05) and of whole-vessel area (P<0.05) compared with those who lacked this immune response. The losses of lumen and vessel area were both significantly correlated with the time after transplantation at which a CD4 T-cell response was first detected (P<0.05) and with the cumulative graft rejection score (P<0.05). The early control of subclinical CMV replication after transplantation by T-cell immunity may limit cardiac allograft rejection and vascular disease. Interventions to increase T-cell immunity might be clinically useful in limiting these adverse viral effects.

Abstract

Sirolimus was introduced in de novo immunosuppression at Stanford University in view of its favorable effects in reducing rejection and cardiac allograft vasculopathy. After an apparent increase in the incidence of post-surgical wound complications as well as symptomatic pleural and pericardial effusions, we reverted to a mycophenolate mofetil (MMF)-based regimen. This retrospective study compared the outcome in heart transplant recipients on sirolimus (48 patients) with those on MMF (46 patients) in a de novo immunosuppressive regimen. The incidence of any post-surgical wound complication (52% vs. 28%, p=0.019) and deep surgical wound complication (35% vs. 13%, p=0.012) was significantly higher in patients on sirolimus than on MMF. More patients on sirolimus also had symptomatic pleural (p=0.035) and large pericardial effusions (p=0.033) requiring intervention. Logistic regression analysis identified sirolimus (p=0.027) and longer cardiac bypass time (OR=1.011; p=0.048) as risk factors for any wound complication. Sirolimus in de novo immunosuppression after cardiac transplantation was associated with a significant increase in the incidence of post-surgical wound-healing complications as well as symptomatic pleural and pericardial effusions.

Abstract

Certain clinical risk factors are associated with significant coronary artery disease in kidney transplant candidates with diabetes mellitus. We sought to validate the use of a clinical algorithm in predicting post-transplantation mortality in patients with type 1 diabetes. We also examined the prevalence of significant coronary lesions in high-risk transplant candidates. All patients with type 1 diabetes evaluated between 1991 and 2001 for kidney with/without pancreas transplantation were classified as high risk based on the presence of any of the following risk factors: age ≥45 yr, smoking history ≥5 pack-years, diabetes duration ≥25 yr, or any ST-T segment abnormalities on electrocardiogram. Remaining patients were considered low risk. All high-risk candidates were advised to undergo coronary angiography. The primary outcome of interest was all-cause mortality post-transplantation. Eighty-four high-risk and 42 low-risk patients were identified. Significant coronary artery stenosis was detected in 31 high-risk candidates. Mean arterial pressure was a significant predictor of coronary stenosis (odds ratio 1.68; 95% confidence interval 1.14-2.46), adjusted for age, sex and duration of diabetes. In 75 candidates who underwent transplantation with a median follow-up of 47 months, the use of clinical risk factors predicted all eight deaths. No deaths occurred in low-risk patients. A significant mortality difference was noted between the two risk groups (p = 0.03). This clinical algorithm can identify patients with type 1 diabetes at risk for mortality after kidney with/without pancreas transplant. Patients without clinical risk factors can safely undergo transplantation without further cardiac evaluation.

Abstract

We report the case of a 36-year-old woman with a diagnosis of idiopathic dilated cardiomyopathy who underwent cardiac transplantation. The results of her initial iron studies were normal, but hemochromatosis was suspected after microscopy of the explanted heart revealed iron deposition. By 6 months post-transplantation, iron deposition was detected in her surveillance endomyocardial biopsy specimens and studies then confirmed the existence of non-HFE hemochromatosis. The patient has been stable on treatment with regular phlebotomies and a low vitamin C diet.

Abstract

Orthotopic heart transplantation is considered an effective treatment for patients with refractory heart failure. The long-term survival of orthotopic heart transplantation recipients has increased over the last several decades, but many long-term survivors of orthotopic heart transplantation develop graft atherosclerosis and associated left ventricular dysfunction. The risk of sudden cardiac death in long-term survivors of orthotopic heart transplantation with these complications is believed to be high. There are no data on the usefulness of implantable cardioverter-defibrillators (ICDs) in this population; therefore, we report our early experience with ICD placement in such patients. The purpose of this study was to examine the use of ICDs in adults who are long-term survivors of heart transplantation. We retrospectively reviewed all adult patients who underwent orthotopic heart transplantation at Stanford University Hospital (Stanford, CA, USA) from 1980 to 2004. All patients who received an ICD after transplant were included in this study. We reviewed demographic data, medical history, ejection fraction, presence of graft atherosclerosis, indication for ICD placement, and any device therapy delivered. Of the 925 patients who had orthotopic heart transplantation during this time period, 493 patients were alive at the beginning of the year 2000. Of these patients, 10 (~2%) had subsequent placement of an ICD. All 10 patients were male. The average age at orthotopic heart transplantation was 37.8 years. The average age at ICD placement was 50.5 years. The average time from orthotopic heart transplantation to ICD placement was 14.6 years. The average ejection fraction at the time of implant was 46.5%. Five of the 10 patients had a low ejection fraction (within this subgroup, the average ejection fraction was 31%, range 15%-45%) and graft atherosclerosis.
ICDs were placed because of symptomatic episodes of ventricular tachycardia (3 patients), low ejection fraction and severe graft atherosclerosis without symptoms (3 patients), and after thorough evaluation for otherwise unexplained syncope (4 patients). The average follow-up after device implantation was 13 months. Complications related to ICD placement were an infected ICD system requiring explant in one patient and a lead fracture in another patient. Three patients had subsequent appropriate shocks for ventricular arrhythmias, and one patient underwent a second orthotopic heart transplantation. One patient died of malignancy. Use of the ICD in long-term survivors of orthotopic heart transplantation should be considered in appropriately selected patients. Further data are needed regarding ICD use in this population.

Abstract

Induction therapy can reduce morbidity and early mortality in pediatric and adult heart transplant recipients. Monoclonal and polyclonal agents are most widely used; they nonspecifically deplete the T-cell pool and are thus associated with drug-induced side effects. The cytokine release syndrome is one of the most problematic events associated with induction. Daclizumab, a highly humanized, specific interleukin-2 receptor blocker, may be as efficacious as the monoclonal agent OKT3, and given its specific action and properties, its safety profile may be superior to that of OKT3. Forty subjects received daclizumab and their clinical outcomes were compared against a historical group of 40 subjects who received OKT3. Three- and six-month outcome measures included survival, rejection history, steroid burden, and complications. Mortality was low and 6-month survival was equivalent between the groups. No differences in rejection profile or time to the first significant rejection event were detected; no subject had severe acute rejection within the first 180 days. Steroid requirement for maintenance immunosuppression and treatment of rejection was also similar between the groups. The 6-month prevalence of complications differed significantly: 55% of OKT3-treated subjects had at least one event, compared to 33% of daclizumab-treated subjects (P=0.04). Most complications occurred within the first month after transplantation. Daclizumab induction therapy is as efficacious as OKT3 in the prevention of early acute rejection after heart transplantation among pediatric and adult subjects. Complications related to the induction agent are significantly less frequent with the humanized product.

Abstract

Cardiac allograft vasculopathy (CAV) is the primary cause of late morbidity and mortality in heart transplant patients and remains a major challenge to further improvements in long-term graft survival in this population. Clearly, there is a need for immunosuppressive regimens that reduce the risk of CAV. Certican (everolimus) is a proliferation signal inhibitor developed for the prevention of acute and chronic rejection after solid-organ transplantation. Pre-clinical studies suggest that everolimus prevents vascular remodeling and neointimal proliferation, which are key components of CAV. In a pivotal trial in heart transplantation, everolimus at 1.5 or 3.0 mg plus standard-dose cyclosporine (CsA; Neoral) and corticosteroids demonstrated superior efficacy to azathioprine (AZA) by decreasing the incidence of biopsy-proven acute rejection (BPAR) and the composite end-point, efficacy failure. Importantly, in this trial, everolimus was also associated with a significant reduction in both the incidence and severity of CAV in recipients of heart transplants. Furthermore, cytomegalovirus (CMV) infection rates were significantly lower with everolimus than with AZA. The study suggests that everolimus has the ability to target the primary causes of chronic allograft dysfunction by reducing acute rejection and CMV infection, and preventing CAV. Moreover, these findings indicate that use of everolimus as part of the primary immunosuppression regimen could provide a major benefit for heart transplant patients, offering a real hope of alleviating CAV in the long term. Few large-scale trials have been conducted in heart transplant patients, so their value must be maximized by effectively translating the findings into clinical practice.

Abstract

A large Phase III clinical trial with the novel proliferation signal inhibitor, Certican (everolimus), has shown this agent to be associated with lower rates of acute rejection, cardiac allograft vasculopathy (CAV) and cytomegalovirus (CMV) infection when compared with azathioprine (AZA) at 12 months post-transplant. Given that CAV is the main risk factor for mortality after the first year post-transplant, and that acute rejection and CMV infection play a key role in the development of this disease, the findings suggest that everolimus has an important role as part of the primary immunosuppression of this population. Consideration of the presentation and outcome of patients from Stanford University who were enrolled in the pivotal trial with everolimus in heart transplantation has highlighted the efficacy of everolimus in this setting. Analysis of the angiographic outcome data of these patients demonstrates that the efficacy of everolimus observed in the large, multicenter trial involving heart transplant patients was replicated in the findings of a single center. This finding is important for the interpretation of clinical trial data, offering reassurance that data from large trials are applicable to an individual center. The results from Stanford also reveal an important difference between everolimus and AZA with regard to intimal thickening and the incidence of abnormal left ventricular ejection fraction (LVEF), suggesting that everolimus may improve left ventricular function. Because abnormal LVEF has been associated with greater risk of vascular rejection and allograft vascular disease, use of everolimus could well improve long-term outcomes in heart transplant recipients.

Abstract

Cardiovascular disease post-transplant, particularly ischemic heart disease, is a significant problem for all transplant recipients. The major risk factors (smoking, obesity, diabetes, dyslipidemia and hypertension) are often more prevalent in heart transplant populations than in the general population. One of the main risk factors influencing graft loss and patient survival is cardiac allograft vasculopathy (CAV). Because CAV affects between 30% and 60% of cardiac transplant recipients within 5 years of surgery, prevention is a key focus for cardiac transplant teams today. CAV is caused by both immunologic mechanisms (e.g., acute rejection and anti-HLA antibodies) and non-immunologic mechanisms relating to the transplant itself or the recipient (e.g., donor age, hypertension, hyperlipidemia and pre-existing diabetes) or to the side effects often associated with immunosuppression with calcineurin inhibitors or corticosteroids (e.g., cytomegalovirus infection, nephrotoxicity and new-onset diabetes after transplantation). The calcineurin inhibitors, cyclosporine and tacrolimus, effectively prevent acute rejection, but do not prevent the development of CAV. CAV prevention will require a combined approach of new adjunct immunosuppressant agents (e.g., the proliferation signal inhibitors) and reduction in cardiovascular risk. Hypertension, hyperlipidemia and diabetes are also associated with the immunosuppression required to prevent organ rejection. Some studies have shown that hypertension is present more frequently in cyclosporine-treated patients than in tacrolimus-treated patients and that tacrolimus may be associated with a more favorable lipid profile. On the other hand, tacrolimus may be more diabetogenic than cyclosporine, with current data suggesting a trend but no statistically significant supporting evidence.
New-onset diabetes after transplantation is at times difficult to manage and, along with hypertension and hyperlipidemia, may be an important determinant of ischemic heart disease, cerebrovascular disease and peripheral vascular disease. The choice of calcineurin inhibitor for an immunosuppressive regimen in heart transplantation should consider the associated relative cardiovascular risks.

Abstract

The possible effect of plasma hemoglobin A1c (HbA1c) on the development of transplant coronary artery disease (TxCAD) was investigated. Glucose intolerance is implicated as a risk factor for TxCAD. However, a relationship between HbA1c and TxCAD has not been demonstrated. Plasma HbA1c was measured in 151 adult patients undergoing routine annual coronary angiography at a mean period of 4.1 years after heart transplantation. Intracoronary ultrasound (ICUS) was also performed in 42 patients. Transplant CAD was graded by angiography as none, mild (stenosis in any vessel ≤30%), moderate (31% to 69%), or severe (≥70%) and was defined by ICUS as a mean intimal thickness (MIT) ≥0.3 mm in any coronary artery segment. The association between TxCAD and established risk factors was examined. Plasma HbA1c increased with the angiographic grade of TxCAD (5.6%, 5.8%, 6.4%, and 6.2% for none, mild, moderate, and severe disease, respectively; p < 0.05 for none vs. moderate or severe) and correlated with disease severity (r = 0.24, p < 0.05). The HbA1c level was higher in patients with MIT ≥0.3 mm than in those with MIT <0.3 mm (6.4% vs. 5.7%, p < 0.05). Multivariate logistic regression analysis identified HbA1c as an independent predictor of TxCAD, as detected by angiography or ICUS (odds ratios 1.9 and 2.4, 95% confidence intervals 1.5 to 6.3 [p = 0.010] and 1.3 to 4.2 [p < 0.005], respectively). Persistent glucose intolerance, as reflected by plasma HbA1c, is associated with the occurrence of TxCAD and may play an important role in its pathogenesis.

The role of viruses in cardiac allograft vasculopathy. AMERICAN JOURNAL OF TRANSPLANTATION. Valantine, H. A. 2004; 4 (2): 169-177

Abstract

Considerable evidence suggests a role for viruses in transplant arteriosclerosis (TA), including observational data, experimental models and therapeutic trials implicating human cytomegalovirus (HCMV) in the progression to TA. In pediatric heart transplant patients, adenoviral genome in endomyocardial biopsies (EMB) is an important predictor of TA and graft loss. During CMV viremia, EMBs from adult patients demonstrate endothelialitis and vascular smooth muscle cell proliferation. These changes are predictors of subsequent diffuse TA. HCMV immediate early proteins (IE-1 and IE-2) increase the constitutive expression of intercellular adhesion molecule-1 (ICAM-1) independent of other intracellular cytokines. Likewise, viral chemokines such as US28 have been implicated in vascular disease because of their ability to induce smooth muscle cell migration. Recent data suggest that CMV might accelerate TA through its ability to abrogate the vascular protective effects of the endothelium-derived nitric oxide system (eNOS). Confirmation of causality requires clinical trials demonstrating that antiviral agents such as ganciclovir inhibit TA. Such studies in patients, though limited to retrospective analyses, suggest that ganciclovir prophylaxis early after heart transplantation reduces the risk of TA. These observations emphasize the need for randomized controlled clinical trials to confirm a causal role for CMV (and other viruses) in TA.

Abstract

Truly long-term survival after heart transplantation has become increasingly frequent over the past two decades. We analyzed multiple clinical outcomes in the cohort of 140 patients in the Stanford database who underwent heart transplantation after the introduction of cyclosporine-based immunosuppression in 1980 and survived >10 years after transplantation. We found generally excellent functional status in these patients, but a high incidence of hypertension, renal dysfunction, and graft CAD, as well as malignancy. With continued improvement in post-transplant survival rates, providing complex care for such long-term recipients will assume increasing clinical importance in the everyday practice of transplant medicine, and these data highlight the problems to be anticipated.

Abstract

Long-term survival after heart transplantation is common in the cyclosporine era. However, there are few data documenting pre-transplant/peri-operative factors predictive of truly long-term survival (>10 years). The purpose of this study was to identify factors associated with 10-year survival after heart transplantation. Our study population included 197 adults who survived >6 months and died <10 years after heart transplant (medium-term group) and 140 adults who survived >10 years after heart transplant (long-term group) between December 1980 and May 2001. A comparison was made between the two groups, and multivariate analysis was used to identify which factors predicted 10-year survival. The long-term group had younger recipient and donor age, lower recipient body mass index at transplant, shorter waiting time, and lower percentages of ischemic etiology/male recipient/non-white recipient. Kaplan-Meier plots of freedom from graft coronary artery disease and malignancy showed later onset patterns in the long-term group compared with the medium-term group. Multivariate analysis showed that white recipient, younger recipient age, and lower recipient body mass index at heart transplant were factors significantly associated with 10-year survival. Several pre-transplant/peri-operative factors were associated with survival beyond 10 years after heart transplantation. Stratified/tailored strategies based on these factors may help attain longer-term survival in higher-risk recipients.

Abstract

Endothelial injury plays a central role in the pathophysiologic mechanisms underlying cardiac allograft vasculopathy (CAV). Although the accelerated course of CAV and its localization to the allograft support an important role for the alloimmune response, there is considerable evidence implicating lipoprotein abnormalities, metabolic disturbances, viral infections, and systemic inflammation in the process. This multifactorial basis for CAV may be put into a pathophysiologic context in which endothelial cell injury is the triggering event that initiates and drives the proliferative and fibrotic processes characteristic of CAV. In the transplant setting, endothelial cell injury is induced by multiple factors, including brain death, ischemia-reperfusion, alloimmune responses, and viral infections. Once initiated, propagation of the proliferative processes that ultimately lead to vascular occlusion is enhanced by the abnormal metabolic environment of elevated lipoproteins and insulin resistance encountered in most patients. This review examines the evidence for the role of potential triggers of endothelial injury in the pathophysiology of CAV and discusses the central role of the nitric oxide pathway in the disease process.

Abstract

Tacrolimus is a potent calcineurin inhibitor that was introduced to heart transplantation in the early 1990s. The side-effect profile of tacrolimus is more favorable than that of cyclosporine, and some reports have suggested an advantage of tacrolimus in the treatment of rejection. The present study was undertaken to determine whether a late conversion to tacrolimus affords these benefits to heart transplant recipients. Charts from 109 patients who underwent conversion from cyclosporine to tacrolimus for recurrent rejection or adverse effects were retrospectively reviewed. During the year after conversion to tacrolimus, there was a significant decrease in treated rejection episodes. Conversion to tacrolimus rapidly resulted in an improved lipid profile. Two years after conversion, blood pressure was significantly reduced. Apart from rejection, these benefits were found mainly among individuals converted to tacrolimus within 1 year of heart transplantation. Conversion from cyclosporine to tacrolimus is safe and results in a more favorable risk factor profile. However, most of the benefits are seen in individuals converted within 1 year of transplantation.

Abstract

Using previously described models of diabetes-induced transplant coronary artery atherosclerosis (TxCAD), we quantitatively assessed TxCAD using computer-assisted morphometric measurements. More than 95% of the evaluated vessels were intramyocardial. The first and last tertiles of the vessel size distribution were evaluated for the presence of TxCAD. Severe TxCAD, defined as a luminal occlusion > or = 75%, was more prevalent in the larger vessels. We observed differential involvement based on vessel size in diabetes-induced TxCAD.
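The tertile-based grouping described above can be sketched as follows; the vessel areas and sample size here are simulated, purely for illustration, and are not the study's measurements.

```python
# Illustrative sketch of the tertile analysis described above; the data are
# hypothetical (simulated lumen areas), not the study's morphometric data.
import numpy as np

rng = np.random.default_rng(0)
vessel_areas = rng.lognormal(mean=4.0, sigma=0.6, size=300)  # simulated vessel areas

# Tertile boundaries split the size distribution into three equal-sized groups.
t1, t2 = np.quantile(vessel_areas, [1 / 3, 2 / 3])
small_vessels = vessel_areas[vessel_areas <= t1]  # first tertile (smallest)
large_vessels = vessel_areas[vessel_areas >= t2]  # last tertile (largest)

# Each extreme tertile holds a third of the vessels; the presence of TxCAD
# would then be scored separately within each group.
print(len(small_vessels), len(large_vessels))
```

Comparing only the extreme tertiles, as in the abstract, sharpens any size-dependent contrast by leaving out the middle third of the distribution.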

Abstract

When used in conjunction with steroids and cyclosporin, mycophenolate mofetil (MMF) has been shown to significantly reduce mortality and the incidence of rejection in the first year after heart transplantation. It also appears that in this early post-transplantation period, monitoring of immunosuppressive therapies may be warranted. The current study was undertaken to determine whether such monitoring is still useful more than 1 yr after heart transplantation. Twenty-six patients who had survived the first year after orthotopic heart transplantation and had been on MMF therapy for more than 3 months were prospectively followed. At the time of their routine endomyocardial biopsy, blood samples were taken to monitor immunosuppressive therapy. Most patients had two samples taken, on average 109 d apart. There were 22 episodes of asymptomatic rejection documented on a total of 48 biopsies. Of these, only two were of ISHLT (International Society for Heart and Lung Transplantation) grade 3A, the remainder being of ISHLT grades 1 or 2. There was no relation between immunosuppressive regimen (tacrolimus and MMF or cyclosporin and MMF) and rejection, and no relation between monitored immunosuppressive levels and rejection. Patients on the combination of MMF and tacrolimus had significantly higher plasma mycophenolic acid levels despite a significantly lower daily MMF dose. There does not appear to be a benefit in continued monitoring of plasma mycophenolic acid levels beyond the first year after heart transplantation. There were significant differences in plasma mycophenolic acid levels depending on the type of calcineurin inhibitor concomitantly used.

Abstract

Induction of diabetes in rat heterotopic heart transplantation models leads to an accelerated form of severe transplant coronary artery disease (TxCAD). We undertook this study to determine whether treatment of diabetes with metformin would favorably affect TxCAD. Heterotopic abdominal heart transplantation was performed in rat isograft and allograft models. After transplantation, diabetes was induced with streptozotocin. Fifty percent of the animals received metformin at 500 mg/kg twice daily. We quantitatively assessed TxCAD using histologic sections of harvested hearts at 30 and 60 days with computer-assisted morphometry. We compared vessels in the first tertile of the area distribution with vessels in the last tertile. Fasting glucose levels in metformin-treated animals were 161 +/- 45 mg/dl compared with 400 +/- 120 mg/dl (p < 0.05) in untreated rats. Treatment with metformin led to decreased diabetes-induced TxCAD in the larger vessels. This effect was sustained during the study course in the isografts but not in the allografts. Treatment with metformin did not prevent progression of TxCAD in the smaller vessels at 60 days. Metformin reduced luminal occlusion and severe TxCAD in the larger vessels but did not alter the course of TxCAD in the smaller vessels. These results may have therapeutic implications for patients.

Abstract

Cytomegalovirus (CMV) disease was previously shown to be unaltered by a 28-day course of ganciclovir compared with placebo in seronegative recipients of hearts from seropositive donors (D+/R-). This study tests the hypothesis that a combination of ganciclovir plus CMV hyperimmune globulin (CMVIG) is more effective than ganciclovir alone for preventing acute CMV illness and its long-term sequelae. The study population receiving CMVIG (n=80) included 27 heart transplant recipients (D+/R-) and 53 heart-lung and lung transplant recipients (R+ and/or D+). Each group was matched with historical controls who underwent transplantation within the preceding 2-3 years. Outcome measures compared were as follows: 3-year incidence of CMV disease; fungal infection; acute rejection; survival; rates and severity of transplant coronary artery disease (in heart patients) defined by intimal thickness (ultrasound) and coronary artery stenosis (angiographic); and incidence of and death from obliterative bronchiolitis defined by pathological criteria on endobronchial biopsy specimens (in heart-lung/lung patients). Patients treated with CMVIG were more likely to remain free of CMV disease and had a lower incidence of rejection and a higher survival rate compared with patients treated with ganciclovir alone. Coronary artery intimal thickness and the prevalence of intimal thickening were lower in the patients receiving CMVIG. Heart-lung and lung transplant patients treated with CMVIG had lower incidences of obliterative bronchiolitis and death from obliterative bronchiolitis and longer survival compared with patients treated with ganciclovir alone. CMVIG plus ganciclovir appears to be more effective than ganciclovir alone for preventing the sequelae of CMV infection. A prospective randomized study is required to confirm these observations.

Abstract

Allograft coronary atherosclerosis (TxCAD) is the leading cause of death beyond the first year after transplantation. TxCAD is believed to be a form of chronic rejection of the cardiac allograft. This study was undertaken to determine whether TxCAD could develop in the absence of a cellular alloimmune response. Inbred lean Zucker rats (>26 generations) served as donors and recipients of the cardiac grafts. Donor hearts were explanted at 60 or 90 days and processed for coronary artery histological analysis. Cytokine expression was determined by reverse transcription-polymerase chain reaction, and the presence of T cells within the explanted hearts was evaluated by immunohistochemistry. Forty-six transplantations were performed, and TxCAD developed in all but one of the transplanted hearts. Overall, one third of the vessels examined were affected by TxCAD, and in roughly half of these vessels the disease was severe. Native hearts were free of atherosclerosis. Interleukin-2 was absent from the transplanted hearts, and T cells were present in minimal numbers (<1 per low-power field). TxCAD developed in the absence of a cellular alloimmune response in these genetically similar donors and recipients. The observed TxCAD was significant and comparable to that found in rat allografting models.

Abstract

Patients with end-stage liver disease and coronary artery disease (CAD) being considered for orthotopic liver transplantation (OLT) present a difficult dilemma. The availability of multiple screening tests and newer treatment options for CAD prompted this review. Recent data suggest that the prevalence of CAD in patients with cirrhosis is much greater than previously believed and likely mirrors or exceeds the prevalence rate in the healthy population. The morbidity and mortality of patients with CAD who undergo OLT without treatment are unacceptably high, making identification of patients with CAD before OLT an important consideration. Patients with documented CAD or major clinical predictors of CAD should undergo cardiac catheterization before OLT. Those with advanced CAD not amenable to interventional therapy or with poor cardiac function are not candidates for OLT. Dobutamine stress echocardiogram appears to be an excellent means of screening patients with intermediate or minor clinical predictors of CAD before OLT. Patients found to have mild or moderate CAD should be aggressively treated medically and, if necessary and feasible based on hepatic reserve, by percutaneous or, less likely, surgical intervention pre-OLT to correct obstructive coronary lesions. Prospective studies regarding optimal screening strategies for the presence of CAD and the indications, timing, and outcomes of interventional therapy in patients with advanced cirrhosis are lacking and much needed.

Abstract

Hyperhomocysteinemia is an independent risk factor for coronary disease, and elevated plasma homocysteine levels have been documented in heart transplant recipients. The aim of this study was to test the hypothesis that homocysteine levels are associated with the presence or absence of transplant coronary artery disease. Forty-three non-smoking adults were recruited, all of whom had received a heart transplant between 2 and 7 yr previously. All 43 had blood drawn for fasting homocysteine level on the day of presentation. All patients had undergone diagnostic coronary angiography within the past 6 months. For all patients, the average fasting plasma homocysteine level was 17.0+/-SD 6.6 micromol/L with a range from 6.0 to 36.9 micromol/L. Twenty-six patients (60%) had fasting plasma homocysteine levels above 15.0 micromol/L. On the basis of arteriography, patients were categorized as those with angiographically normal (n=22) or abnormal (n=21) coronary arteries. There was no difference in the mean plasma homocysteine level between patients with angiographically normal (17.2+/-SD 7.0 micromol/L) and abnormal (16.8+/-SD 6.2 micromol/L) coronary arteries. Plasma homocysteine levels increased with increasing plasma creatinine levels (r=0.63, p<0.0001) and with decreasing vitamin B6 levels (r=-0.56, p<0.0001). Mild hyperhomocysteinemia is a consistent finding among heart transplant recipients. This finding was not associated with transplant coronary artery disease in our patients. The combination of renal dysfunction and vitamin B6 deficiency may explain the unusual prevalence of hyperhomocysteinemia in heart transplant recipients.

Abstract

This study examines the hypothesis that metabolic abnormalities of the dysmetabolic syndrome are risk factors for transplant coronary artery disease (TxCAD). Sixty-six patients without overt diabetes, 2 to 4 years after surgery, underwent intracoronary ultrasound (ICUS), measurement of plasma glucose and insulin after oral glucose (75 g), and fasting lipid and lipoprotein measurements. TxCAD incidence by angiography or autopsy was prospectively determined during subsequent follow-up (8 years). Coronary artery intimal thickness (IT) and subsequent outcomes were compared in patients stratified as having "high" versus "low" plasma glucose (>8.9 mmol/L) and insulin (>760 pmol/L) 2 hours after glucose challenge, and "abnormal" versus "normal" fasting lipid and lipoprotein concentrations as defined by the National Cholesterol Education Program. Patients with high glucose or insulin concentrations had greater IT: 0.38+/-0.05 versus 0.22+/-0.02 mm, P=0.05, and 0.39+/-0.05 versus 0.20+/-0.02 mm, P=0.01, respectively. Freedom from TxCAD was 56+/-11% versus 81+/-6% (P<0.01) in patients with high versus low glucose and 57+/-10% versus 82+/-7% (P<0.05) in patients with high versus low insulin. Actuarial survival was 60+/-12% versus 92+/-5% (P<0.005) in patients with high versus low glucose and 72+/-9% versus 88+/-6% (P<0.05) in patients with high versus low insulin. Triglycerides and VLDL were higher and HDL was lower in patients with IT >0.3 mm than in patients with IT < or = 0.3 mm. TxCAD incidence was higher in patients with high plasma triglycerides and VLDL and low HDL. These data suggest that insulin resistance plays a role in TxCAD.

Impact of type 1 diabetes on microvascular vs. macrovascular involvement with TXCAD: is there a difference in models of the disease? The Journal of Heart and Lung Transplantation: the official publication of the International Society for Heart Transplantation. Valantine, H., Zhu, D., Wen, P., Panchal, S. N., Dai, X. 2001; 20 (2): 186-187.

Abstract

Coronary artery disease occurs in an accelerated fashion in the donor heart after heart transplantation (TxCAD), but the cause is poorly understood. The risk of developing TxCAD is increased by cytomegalovirus (CMV) infection and decreased by use of calcium blockers. Our group observed that prophylactic administration of ganciclovir early after heart transplantation inhibited CMV illness, and we now propose to determine whether this therapy also prevents TxCAD. One hundred forty-nine consecutive patients (131 men and 18 women aged 48+/-13 years) were randomized to receive either ganciclovir or placebo during the initial 28 days after heart transplantation. Immunosuppression consisted of muromonab-CD3 (OKT-3) prophylaxis and maintenance with cyclosporine, prednisone, and azathioprine. Mean follow-up time was 4.7+/-1.3 years. In a post hoc analysis of this trial designed to assess efficacy of ganciclovir for prevention of CMV disease, we compared the actuarial incidence of TxCAD, defined by annual angiography as the presence of any stenosis. Because calcium blockers have been shown to prevent TxCAD, we analyzed the results by stratifying patients according to use of calcium blockers. TxCAD could not be evaluated in 28 patients because of early death or limited follow-up. Among the evaluable patients, actuarial incidence of TxCAD at follow-up (mean, 4.7 years) in ganciclovir-treated patients (n=62) compared with placebo (n=59) was 43+/-8% versus 60+/-10% (P<0.1). By Cox multivariate analysis, independent predictors of TxCAD were donor age >40 years (relative risk, 2.7; CI, 1.3 to 5.5; P<0.01) and no ganciclovir (relative risk, 2.1; CI, 1.1 to 5.3; P=0.04). Stratification on the basis of calcium blocker use revealed differences in TxCAD incidence when ganciclovir and placebo were compared: no calcium blockers (n=53), 32+/-11% (n=28) for ganciclovir versus 62+/-16% (n=25) for placebo (P<0.03); calcium blockers (n=68), 50+/-14% (n=33) for ganciclovir versus 45+/-12% (n=35) for placebo (P=NS). TxCAD incidence appears to be lower in patients treated with ganciclovir who are not treated with calcium blockers. Given the limitations imposed by post hoc analysis, a randomized clinical trial is required to address this issue.

Abstract

The development of arteriosclerosis after heart transplantation plays a major role in decreased graft and patient survival. Although several pathogenic mechanisms have been proposed for the development of post-transplant coronary artery disease (CAD), accumulating evidence suggests that cytomegalovirus (CMV) is a critical factor in this disease. Because post-transplant CAD is often a silent disease and early detection escapes conventional angiography, physicians must maintain close surveillance of heart transplant patients and institute prophylaxis in high-risk groups. In addition to targeting conventional risk factors, prophylaxis against CMV disease should decrease the incidence of the disease.

Abstract

Clinical observations suggest that transplant coronary artery disease (TxCAD) is immunologically mediated but may be accelerated by metabolic derangements. We developed a rat model of heterotopic heart transplantation in the presence of diabetes and dyslipidemia to further study their role in TxCAD development. Major histocompatibility complex-mismatched strains of inbred rats underwent heterotopic heart transplantation (ACI-to-Lewis allografts). Diabetes (DM) was induced by streptozotocin injection (80 mg/kg) after transplantation; dyslipidemia was worsened by feeding of a 60% high-fructose diet (+F). Allograft transplants were divided into four groups: (1) +DM/+F; (2) +DM/-F; (3) -DM/+F; and (4) -DM/-F. Isograft transplants (Lewis to Lewis, +DM/+/-F) were controls. All animals received daily cyclosporine (5 mg/kg). Grafts surviving > 30 days were evaluated for TxCAD on histological sections and graded 0 to 5 for intimal thickness. All streptozotocin-treated animals were diabetic within 2 weeks, with fourfold increases in plasma glucose concentrations versus nondiabetics. Severe TxCAD was observed in diabetic allografts only. The mean grade of TxCAD in diabetic allografts was 3.2 +/- 0.5 versus 1.1 +/- 0.4 in diabetic isografts (P < 0.03) and zero TxCAD in nondiabetic allografts (P < or = 0.0001). Fructose feeding resulted in a 1.5-fold higher triglyceride and a 1.3-fold higher cholesterol level versus the regular diet (-F) but showed no independent contribution to the development of TxCAD. These findings suggest that metabolic derangements associated with diabetes play an important role in TxCAD development in heterotopic ACI-to-Lewis rat heart transplantation. In this model of TxCAD in major histocompatibility complex-mismatched, diabetic, and dyslipidemic rats, immunologic and metabolic mechanisms that contribute to TxCAD can be further delineated and approaches to its prevention assessed.

Abstract

We report a case of fatal central nervous system infection with Scedosporium apiospermum (Pseudallescheria boydii) in a heart transplant recipient. This ubiquitous fungus is known to cause mycetoma and localized infections in otherwise healthy patients. Disseminated infections occur rarely and are seen primarily in patients who are receiving immunosuppressive medications or who have neutropenia. Disseminated infection involving the central nervous system is often life-threatening, and the diagnosis is difficult to make rapidly because S. apiospermum (P. boydii) mimics Aspergillus spp. and Fusarium spp., both clinically and histopathologically. Imidazoles such as miconazole, but not amphotericin B, are considered the therapeutic compounds of choice. Improved diagnostic and treatment options are needed to optimize management of infections with S. apiospermum (P. boydii).

Abstract

Studies in animals and humans have demonstrated that an increased heart rate is a predictor for the development of coronary atherosclerosis and overall cardiovascular mortality. In contrast, we have previously reported that the need for pacemaker implantation because of bradycardia in heart transplant recipients is associated with an increased prevalence of transplant coronary artery disease (TxCAD). Hence, the relevance of changes in heart rate to the development of TxCAD remains unclear. Intra-coronary ultrasound examinations (ICUS) were therefore analyzed in 130 heart transplant recipients (age 50 +/- 11 yr) studied at annual evaluations (3.7 +/- 3.0 yr after transplantation). Quantitative ultrasound measurements were obtained by calculating mean coronary artery intimal thickness (MIT) from examination of the left anterior descending artery. The presence of TxCAD was defined as MIT > 0.3 mm. Resting heart rates (HR) were recorded with the patients in the supine position during routine echocardiography. Based on HR recordings, two groups were defined: group 1, HR below the median; and group 2, HR above the median. TxCAD was detected in 40% of the ICUS studies overall. The prevalence of TxCAD was higher in group 1 (49%) compared with group 2 (33%), p < 0.05. There was no significant difference in donor ischemic time or donor gender, recipient age, gender, body weight, CMV status, creatinine, total cholesterol, use of lipid lowering drugs or diltiazem. Donor age and use of beta-blockers were higher in group 1 compared with group 2 (29 +/- 10 vs. 25 +/- 9 yr, and 15% vs. 5%, for donor age and beta-blocker use, respectively). By multivariate regression analysis only donor age and years after transplantation were independently correlated with TxCAD. After excluding patients taking beta-blockers and diltiazem, the prevalence of TxCAD was still higher in group 1 (50%) vs. group 2 (34%).
In conclusion, transplant coronary artery disease is more prevalent in patients with lower, rather than higher, heart rates. The reason for this is unclear, but may reflect impaired blood flow to the sinoatrial node.

Abstract

Methotrexate and total lymphoid irradiation (TLI) have been used successfully for treatment of recurrent and persistent rejection in orthotopic heart transplant recipients; however, there has been no comparison of these two modalities. We retrospectively compared the efficacy of methotrexate (n = 29) versus TLI (n = 28) in heart transplant recipients with recurrent or persistent rejection. All patients received induction therapy (rabbit anti-thymocyte globulin or OKT3) and standard triple immunosuppressive therapy. Methotrexate (7.5 mg to 22.5 mg per wk) or TLI (80 cGy x 10 fractions) was used for the treatment of recurrent or persistent rejection on the basis of clinical indications. Average biopsy scores (International Society for Heart and Lung Transplantation biopsy score/total number of biopsies performed) calculated over 3-month periods, daily maintenance prednisone dose before and after methotrexate or TLI treatment, and actuarial survival and freedom from angiographic coronary artery disease and infection were compared. To control for the general decrease in prednisone with increased time from transplantation, a control group matched for time from transplantation was selected. Recipient sex and age at transplant, donor age, and donor ischemic time were similar in both groups. Days after transplantation to start of therapy was longer in patients receiving methotrexate; however, this did not reach statistical significance. Patients receiving TLI had received more cumulative corticosteroids and OKT3 before the start of TLI therapy (p < 0.001). There were no differences in actuarial freedom from infection or coronary artery disease between the two groups or between the treatment groups and the control group. Actuarial survival was reduced in patients receiving TLI 3 years after transplantation (p < 0.05).
Maintenance prednisone doses from 3 months before until 9 months after therapy (mg/kg) were not different between patients receiving TLI and methotrexate and were significantly greater than the prednisone doses in the control group. Four months after treatment initiation, the prednisone dose was significantly reduced in both treatment groups compared with the pretherapy dose (methotrexate 0.28 +/- 0.16 to 0.22 +/- 0.13, p = 0.05; TLI 0.36 +/- 0.16 to 0.22 +/- 0.07, p < 0.001). The average biopsy score was significantly reduced by both methotrexate and TLI therapy (methotrexate 1.8 +/- 0.7 to 0.83 +/- 0.9, p = 0.0001; TLI 2.1 +/- 0.8 to 1.0 +/- 0.9, p = 0.0001). Methotrexate and TLI are both effective for the treatment of recurrent or persistent rejection after heart transplantation, reducing average biopsy scores and daily maintenance prednisone doses. There was a reduction in actuarial survival rates in patients treated with TLI, possibly reflecting the greater rejection therapy received before TLI initiation. Because both agents are effective, the choice of methotrexate or TLI may be based on clinical indications, as well as other issues such as cost, compliance, and availability.

Abstract

Pulmonary hypertension, defined as mean pulmonary artery pressure (mPAP) greater than or equal to 25 mmHg, is a recognized complication of hepatic dysfunction with portal hypertension and is considered a relative contraindication to liver transplantation. To characterize pulmonary hemodynamic responses in candidates for orthotopic liver transplantation (OLT) without pre-existing primary pulmonary hypertension, 22 consecutive patients referred for OLT at the Stanford University Hospital underwent prospective right heart catheterization with pressure determinations at baseline and following infusion of 1 L of crystalloid over 10 min. In addition, EKG, chest X-ray and transthoracic echocardiograms were performed as a part of the routine evaluation. Eleven non-cirrhotic patients served as controls. At baseline, 1/22 (4.5%) OLT patients had pulmonary hypertension, while 9/22 (41%) developed pulmonary hypertension following volume infusion (p < 0.0001). In contrast, 0/11 controls manifested elevated pulmonary pressures at baseline or following volume challenge. OLT candidates were found to have significant increases in mean pulmonary pressure and pulmonary capillary wedge pressure (PCWP) compared to controls, suggesting intravascular volume overload or left ventricular dysfunction as potential causes. OLT candidates who manifested volume-dependent pulmonary hypertension (a) had a 2-fold higher baseline PCWP, (b) were more likely to be current smokers, and (c) were more likely to have previously undergone portosystemic shunts. Aggregate analysis of EKG, echo and CXR for determination of volume-mediated pulmonary hypertension revealed a sensitivity of 25%, specificity of 75% and a positive predictive value of 40%. Preoperative identification of patients with a predisposition to elevated pulmonary pressures in the context of rapid volume infusion offers the potential for improved risk stratification and optimized clinical management.

Abstract

Doppler echocardiographic (DE) diastolic dysfunction has been correlated with rejection after orthotopic cardiac transplantation (Tx). However, the relationship of early diastolic dysfunction to late outcome is unknown. The purpose of this study was to assess the correlation between early DE diastolic dysfunction and outcome after heart Tx. Of 133 patients undergoing heart Tx between October 1990 and April 1994, 83 were identified with > or = 4 routine DE performed during the first 6 months. Assessment of diastolic function included measurement of isovolumic relaxation time (IVRT), pressure half-time (PHT), and peak early mitral inflow velocity (M1). Diastolic dysfunction was defined as a decrease of 15% from baseline (IVRT and PHT) or an increase of 20% (M1). A mean dysfunction score (MDS) was calculated for each patient (number of episodes of dysfunction by Doppler / total number of echocardiograms performed). The population mean MDS was determined and two groups established (group 1, MDS < mean; group 2, MDS > mean). Actuarial survival, rejection, and transplant coronary artery disease (TxCAD) were compared between groups. Actuarial survival was significantly reduced in patients with greater early diastolic dysfunction (P < .05). There were 17 deaths overall: 5 in group 1 (mean, 786 days) and 12 in group 2 (mean, 384 days). There were no significant differences in treated rejection episodes, actuarial freedom from rejection or TxCAD, immunosuppression, sex, donor age, donor ischemic time, or cytomegalovirus status between the two groups. Diastolic dysfunction within 6 months of transplant was associated with increased late mortality.
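As a sketch of the scoring arithmetic described above: the mean dysfunction score divides each patient's count of Doppler dysfunction episodes by the number of echocardiograms performed, and patients are then grouped relative to the population mean. The patient values below are invented for illustration only.

```python
# Hypothetical illustration of the mean dysfunction score (MDS) grouping;
# the episode and echocardiogram counts here are invented, not study data.
patients = {
    "A": (1, 6),  # (episodes of Doppler dysfunction, echocardiograms performed)
    "B": (3, 5),
    "C": (0, 4),
}

# MDS = dysfunction episodes / echocardiograms performed, per patient.
mds = {pid: episodes / n_echo for pid, (episodes, n_echo) in patients.items()}
population_mean = sum(mds.values()) / len(mds)

# Group 1: dysfunction less frequent than the population mean; group 2: more.
group1 = sorted(pid for pid, score in mds.items() if score < population_mean)
group2 = sorted(pid for pid, score in mds.items() if score >= population_mean)
print(group1, group2)
```

Normalizing by the number of echocardiograms keeps the score comparable across patients followed with different numbers of studies.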

Abstract

The mechanisms underlying cardiac contractile dysfunction after transplantation remain poorly defined. Previous work has revealed that inducible nitric oxide synthase (iNOS) is expressed in the rat heterotopic cardiac allograft during rejection; resultant overproduction of nitric oxide (NO) might cause cardiac contractile dysfunction via the negative inotropic and cytotoxic actions of NO. In this investigation, we tested the hypothesis that induction of iNOS may occur and be associated with cardiac allograft contractile dysfunction in humans. We prospectively studied 16 patients in the first year after cardiac transplantation at the time of serial surveillance endomyocardial biopsy. Clinical data, the results of biopsy histology, and echocardiographic and Doppler evaluation of left ventricular systolic and diastolic function were recorded. Total RNA was extracted from biopsy specimens, and mRNA for beta-actin, detected by reverse transcription-polymerase chain reaction (RT-PCR) using human-specific primers, was used as a constitutive gene control; iNOS mRNA was similarly detected by RT-PCR using human-specific primers. iNOS protein was detected in biopsy frozen sections by immunofluorescence. Myocardial cGMP was measured by radioimmunoassay, and serum nitrogen oxide levels (NOx = NO2 + NO3) were measured by chemiluminescence. iNOS mRNA was detected in allograft myocardium at some point in each patient and in 59 of 123 biopsies (48%) overall. In individual patients, iNOS mRNA expression was episodic and time dependent; the frequency of expression was highest during the first 180 days after transplant (P = .0006). iNOS protein associated with iNOS mRNA was detected by immunofluorescence in cardiac myocytes. iNOS mRNA expression was not related to the ISHLT histological grade of rejection or to serum levels of NOx but was associated with increased levels of myocardial cGMP (P = .01) and with both systolic (P = .024) and diastolic (P = .006) left ventricular contractile dysfunction measured by echocardiography and Doppler. These data support a relation between iNOS mRNA expression and contractile dysfunction in the human cardiac allograft.

Abstract

The introduction of cyclosporine into widespread clinical use has resulted in improved patient survival following cardiac transplantation. As a result of increased numbers of cardiac transplants, the inherent nephrotoxicity of cyclosporine, and prolonged patient survival, cardiac transplant recipients commonly present with renal dysfunction. In the subgroup who ultimately develop end-stage renal disease (ESRD), therapeutic options include renal transplantation. However, the clinical course associated with this treatment modality is unknown. From 1980 to 1993, 430 cardiac transplants were performed with cyclosporine-based immunosuppression at the Stanford University Medical Center. Fourteen (3.3%) patients developed ESRD, requiring chronic dialysis or renal transplantation. The cause of ESRD was cyclosporine nephropathy (13/14; 93%) and glomerulonephritis (1/14; 7%). The average time interval to the development of ESRD was 82 +/- 42 months. Nine patients underwent renal transplantation. During the period of follow-up (38 +/- 27 months; range 6-89 months) after renal transplantation, cardiac function remained stable. There were no episodes of primary nonfunction of the renal allograft. Patient and renal allograft survival was 89% at both 1 and 3 years after renal transplant. Average serum creatinine was 1.3 +/- 0.6 mg/dl at 1 year and 1.6 +/- 0.8 mg/dl at 3 years post-transplant. The incidence of infectious complications was not statistically different when compared to that of the heart transplant controls and that of a group of cadaveric renal transplant controls (n = 20). Surprisingly, the incidence of renal allograft rejection in the heart transplant patients was 10-fold less than that of the renal transplant controls (0.006 +/- 0.02/patient-year vs. 0.062 +/- 0.05/patient-year; p < 0.01).

Abstract

Cardiac allograft vascular disease is characterized by accelerated and diffuse intimal proliferation involving both the microvasculature and epicardial vessels. Because in vivo documentation of this complication is now possible with intracoronary ultrasound imaging, we can examine the relationship of intimal proliferation to markers of immunity and endothelial activation. We hypothesize that alterations of microvascular cell surface markers likely mirror changes in the epicardial vessels that may be important in the pathophysiology of intimal proliferation. Forty-three heart transplant patients were examined by intracoronary ultrasound more than 1 year after transplantation, and these images were analyzed to obtain mean intimal thickness and intimal thickness class (I through IV), calculated from the mean thickness and circumferential involvement. Right ventricular endomyocardial biopsies obtained at the time of intracoronary ultrasound were examined by immunohistochemistry to detect microvascular expression of histocompatibility leukocyte antigen (HLA) classes I and II (HLA ABC, DR, DP, and DQ); endothelial-specific antigen detected by the monoclonal antibody E1.5; intercellular adhesion molecule-1 (ICAM-1); CD4+ and CD8+ lymphocytes; and macrophages (CD14+). Microvascular antigen expression was graded 1 through 5 on the basis of the diffuseness of positive staining. The number of each inflammatory cell phenotype present per high-power field was counted. By ANOVA, scores for HLA DR, HLA DQ, and E1.5 expression were lower in intimal thickness classes II, III, and IV compared with class I. This inverse relationship was significant by linear regression analysis of mean intimal thickness. Inflammatory cells were not significantly correlated with intimal thickening.
Rejection incidence was higher, and time since transplantation longer, in intimal thickness classes II, III, and IV compared with class I. Transplant coronary artery intimal proliferation is associated with alteration of microvascular endothelial cell surface markers. These changes in cell surface antigen expression could provide the substrate for coronary artery intimal proliferation and narrowing.

Abstract

The longitudinal distribution and circumferential pattern of coronary intimal proliferation were studied with intravascular ultrasonography in 135 patients after heart transplantation. Eighty-seven (64%) of 135 patients had significant intimal thickening, with most lesions (63%) concentric and free of fibrosis or calcification. Both diffuse and nonuniform longitudinal patterns of intimal thickening were found.

Abstract

The purpose of this study was to quantify the severity of transplant coronary artery disease and to assess lesion characteristics early and up to 15 years after heart transplantation by using intracoronary ultrasound. Intravascular ultrasound has the ability to measure the components of the arterial wall and has been shown to be a sensitive method for detection of transplant coronary artery disease. A total of 304 intracoronary ultrasound studies were performed in 174 heart transplant recipients at baseline and up to 15 (mean 3.3 +/- 0.2) years after transplantation. Mean intimal thickness and an intimal index were calculated, and lesion characteristics (eccentricity, calcification) were assessed for all coronary sites imaged (mean 3.0 +/- 0.1 sites/study). The Stanford classification was used to grade lesion severity. Compared with findings in patients studied at baseline (< 2 months after transplantation, n = 50), mean intimal thickness (0.09 +/- 0.02 vs. 0.16 +/- 0.02 mm, p < 0.01), intimal index (0.07 +/- 0.01 vs. 0.14 +/- 0.02, p < 0.01) and mean severity class (1.5 +/- 0.2 vs. 2.3 +/- 0.2, p < 0.01) were significantly higher at year 1 (n = 52) after transplantation. Thereafter, all three variables further increased over time and reached highest values between years 5 and 15. Calcification of lesions was detected in 2% to 12% of studies up to 5 years after transplantation, with a significant increase to 24% at years 6 to 10 (p < 0.05). Severity of transplant coronary artery disease appeared to progress with time after transplantation in this cross-sectional study. This characteristic was most prominent during the first 2 years after transplantation, whereas calcification of plaques occurred to a significant extent only later in the process. These data may serve as a reference for comparison of intravascular ultrasound findings in other studies of patients with transplant coronary artery disease.

Abstract

Serial quantitative coronary angiography is used to assess progression of coronary disease; however, pathology studies have demonstrated angiographic insensitivity for determining atheroma. Intracoronary ultrasound (ICUS) can define and measure the components of the arterial wall and offers the potential for precise quantitative assessment of disease progression on serial examinations. The present study was done to test the feasibility of serially assessing intimal proliferation at the same coronary site with ICUS imaging in cardiac transplant recipients. ICUS imaging was done with a 30-MHz, 5F or 4.3F ultrasound imaging catheter at the time of angiography in 70 cardiac allografts (3.8 sites per patient) initially and 1 year later. Mean intimal thickness (IT), luminal area (LA), and total area (TA) of lumen plus intima were measured at each site, and an index of intimal thickness (II = (TA - LA)/TA) was calculated. Additionally, vessels were graded using a scale incorporating criteria of intimal thickness and circumferential involvement. Side-by-side comparisons of paired angiograms were performed both to verify the similarity of ICUS imaging site and to detect new angiographic abnormalities. At least one site could be assessed serially by ICUS in 100% of patients, but only 189 of the original 263 coronary sites (72%) (2.7 sites per patient) could be matched satisfactorily on the second study. Thirty-nine patients (56%) had mild IT and 31 patients (44%) had moderate or severe IT on the initial study. Both groups showed the same IT progression the following year (delta = 0.05 +/- 0.13 versus 0.07 +/- 0.15 mm; P = NS). Twenty-seven of the 70 patients (39%) showed progression by ICUS.
The 23 patients with ICUS progression and angiographically normal vessels had the same progression in intimal thickening as the 4 patients with ICUS progression but showing angiographic disease (delta = 0.17 +/- 0.13 versus 0.22 +/- 0.10 mm; P = NS). Replication of the intracoronary imaging site by judgment of two observers at an initial study and at a second study 1 year later was possible in at least one vessel site in 100% of the 70 patients and in 72% (189 of 263) of the original imaging sites (2.7 sites per patient). Serial ICUS demonstrates progression of intimal thickening at specific sites in only some cardiac transplant patients. Progression of intimal proliferation can occur in individuals in the presence or absence of initially increased intimal thickening or of angiographic disease at the time of the initial studies. Angiography is insensitive for recognizing early intimal thickening of the coronary vessels.

Abstract

The long-term success of heart transplantation for end-stage heart disease has been hindered by the problems associated with acute and chronic graft rejection, opportunistic infections and potentially fatal complications of intensive immunosuppression. A more complete understanding of the biology of transplant rejection should provide the basis for the development of improved methods for controlling and monitoring rejection. Cytokines, the soluble factors which regulate the immune response, are central to the rejection process. The objective of this study was to analyse cytokine mRNA transcripts in 99 biopsy samples and 89 blood samples from 65 and 35 Stanford Medical Center cardiac transplant recipients, respectively, gathered between January 1990 and January 1992. Following RNA extraction and conversion to cDNA, samples were amplified with cytokine-specific primers for interleukins (IL) 1 to 8, TNF-beta (tumour necrosis factor-beta) and IFN-gamma (interferon-gamma) and were analysed by gel electrophoresis and Southern blot hybridization. Our results demonstrate that despite chronic immunosuppressive therapy, the peripheral blood of transplant recipients expressed a higher combined percentage of different cytokine transcripts than did peripheral blood obtained from normal volunteers. In transplant patients, detection of cytokine transcripts for IL-1 alpha, IL-1 beta and IL-2 increased with time after transplantation. Intragraft IL-7 gene expression was significantly increased in biopsies diagnosed with mild (grade 1) rejection when compared to those with no evidence of rejection or with moderate to severe rejection. Implications of these results in light of possible mechanisms of rejection and of new approaches to immunotherapy are discussed.

Abstract

Transplant coronary artery disease is the leading cause of allograft failure in heart transplant recipients surviving beyond 1 year. Coronary angiography still remains the major technique for surveillance of these patients, with recent use of intracoronary ultrasonography to detect the early stages of intimal thickening. We evaluated exercise echocardiography to screen for the presence or absence of angiographic evidence of transplant coronary artery disease in any vessel, defined as follows: absent; stenosis 39% or less = mild; stenosis 40% to 69% = moderate; or stenosis 70% or more = severe. Fifty-one consecutive heart transplant recipients undergoing routine annual evaluation were included in the study. Of thirty-seven patients with no coronary artery disease, thirty-two had a normal and five had an abnormal exercise echocardiogram. Fourteen patients (27%) had transplant coronary artery disease by angiographic criteria; six had mild, six had moderate, and two had severe stenosis. One patient with mild and the two patients with severe transplant coronary artery disease had abnormal exercise echocardiograms. None of the patients with moderate disease had an abnormal exercise echocardiogram (false negative). Of forty-three patients with no or mild stenosis, 19 patients had moderate to severe intimal proliferation as seen with intracoronary ultrasonography. Of eight patients with moderate or severe stenosis, four were tested with intracoronary ultrasonography and all had moderate to severe intimal proliferation. Six patients had a "false positive" exercise echocardiogram, and of four who were tested with intracoronary ultrasonography, two had mild and two had moderate to severe intimal thickening. In summary, exercise echocardiography correctly excluded the presence of transplant coronary artery disease in 86% of patients but was associated with a high false negative rate for detection of moderate coronary stenosis.
A false positive exercise echocardiogram was associated with intimal proliferation by intracoronary ultrasonography in several patients and suggests that coronary angiography may underestimate significant coronary artery disease.

Abstract

Development of coronary artery disease (CAD) in the cardiac allograft limits long-term survival after heart transplantation. Previous studies, focusing on lipoprotein metabolism, have paid little attention to changes in glucose and insulin metabolism that increase the risk of CAD in these patients. To address this issue, plasma glucose and insulin responses to an oral glucose load and lipid and lipoprotein concentrations were measured in male normal volunteers (n = 40) and cardiac transplant recipients with pretransplant diagnoses of either idiopathic cardiomyopathy (n = 24) or ischemic heart disease (n = 28), matched for age and body mass index. Patients with a pretransplant diagnosis of ischemic heart disease had higher plasma glucose and insulin concentrations in response to oral glucose as well as higher fasting plasma triglyceride, cholesterol, and low-density lipoprotein cholesterol concentrations than did the control group (p < 0.005 to p < 0.001). In addition, high-density lipoprotein cholesterol concentrations were lower and the ratio of cholesterol to high-density lipoprotein cholesterol higher than control values in those with a pretransplant diagnosis of ischemic heart disease (p < 0.001). Values for almost all variables were intermediate in patients with a pretransplant diagnosis of idiopathic cardiomyopathy and in most instances were significantly different from both. Thus, male cardiac transplant recipients are dyslipidemic, relatively glucose intolerant, and hyperinsulinemic compared to normal volunteers. These changes, observed in patients with a pretransplant diagnosis of either ischemic heart disease or idiopathic cardiomyopathy, emphasize the important role of immunosuppression in the development of metabolic risk factors for CAD in these individuals.

Abstract

Single-lung transplantation has been successfully performed in patients with pulmonary fibrosis and emphysema. In contrast, patients with end-stage pulmonary hypertension (either primary or secondary to Eisenmenger's syndrome) have conventionally been offered heart-lung transplantation. The rationale underlying this approach is that chronic pulmonary hypertension results in irreversible right ventricular dilatation and failure. Recovery of the right ventricle has previously been reported after thromboendarterectomy for chronic large-vessel pulmonary embolism, correction of atrial septal defect or mitral valve replacement. The evolution of right ventricular morphology and function after lung transplantation has not been previously described. This study examines the reversibility of right ventricle dysfunction following normalization of pulmonary artery pressure after single-lung transplantation in 4 patients with pulmonary hypertension. Cardiac function was assessed using electrocardiography, echocardiography and radionuclide angiography. Pulmonary hemodynamic measurements, including pulmonary artery pressure and pulmonary vascular resistance, decreased in all patients after single-lung transplantation. Electrocardiographic changes observed were leftward shift in the QRS axis, and a decrease in P-wave amplitude and in right ventricular force. Echocardiographic examination revealed decreased right atrial, right ventricular and tricuspid valve annular dimensions, normalization of septal motion, and decreased tricuspid regurgitation. Thus, improved pulmonary hemodynamics after single-lung transplantation for pulmonary vascular disease results in reversal of right heart dilatation and dysfunction, and improved myocardial performance. The extent of right ventricular dysfunction beyond which recovery is unlikely to occur has yet to be determined.

Abstract

Conflicting data exist on the role of graft rejection as a risk factor for later development of accelerated graft coronary artery disease. We analyzed 126 consecutive heart transplant recipients treated with cyclosporine-based immunosuppressive regimens and devised an arbitrary method to incorporate the number, duration, and severity of myocardial rejection episodes during the first postoperative year, resulting in a rejection score for each patient. We then correlated the later incidence (mean follow-up, 4 years) of angiographic accelerated graft coronary artery disease with this rejection score and with its components: number, duration, and severity of rejection; number and duration of untreated rejection; and incidence and duration of delayed rejection therapy. Accelerated graft coronary artery disease developed in 60 patients (48%). The rejection score was 96.7 for patients in the "no accelerated graft coronary artery disease" group and 110.4 for those in the "accelerated graft coronary artery disease" group (p = NS). No significant difference was noted between patients with and without disease in any of the other seven rejection parameters analyzed, and no significant difference in time to occurrence of disease was noted between groups divided at the median rejection score. Donor age was older and fasting triglyceride blood level was higher in patients with accelerated graft coronary artery disease than in those without disease. All other clinical characteristics, including HLA mismatches, ischemic time, blood pressure, lipid profile, and drug therapy, did not differ between the two groups.

Abstract

Accelerated coronary atherosclerosis is a major factor limiting allograft longevity in cardiac transplant recipients. Histopathology studies have demonstrated the insensitivity of coronary angiography for detecting early atheromatous disease in this patient population. Intracoronary ultrasound is a new imaging technique that provides characterization of vessel wall morphology. The purpose of this study was to compare in vivo intracoronary ultrasound with angiography in cardiac transplant recipients. The left anterior descending coronary artery was studied with intracoronary ultrasound in 80 cardiac transplant recipients at the time of routine screening coronary angiography 2 weeks to 13 years after transplantation. A mean and index of intimal thickening were obtained at four coronary sites. Intimal proliferation was classified as minimal, mild, moderate, or severe according to thickness and degree of vessel circumference involved. Twenty patients were studied within 1 month of transplantation and had no angiographic evidence of coronary disease. An intimal layer was visualized by ultrasound in only 13 of these 20 presumably normal hearts. The 60 patients studied 1 year or more after transplantation all had at least minimal intimal thickening. Twenty-one patients (35%) showed minimal or mild, 17 (28%) moderate, and 21 (35%) severe thickening. Forty-two of these 60 patients had angiographically normal coronary arteries, 21 (50%) of whom had either moderate or severe thickening.
All 18 patients with angiographic evidence of coronary disease had moderate or severe intimal thickening, but there was no statistically significant difference in intimal thickness or index when compared with the patients with moderate or severe proliferation and normal angiograms (thickness, 0.53 +/- 0.35 mm versus 0.64 +/- 0.30 mm, p = NS; index, 0.28 +/- 0.10 versus 0.34 +/- 0.10, p = NS). The majority of patients 1 or more years after cardiac transplantation have ultrasound evidence of intimal thickening not apparent by angiography. Intracoronary ultrasound offers early detection and quantitation of transplant coronary disease and provides characterization of vessel wall morphology, which may prove to be a prognostic marker of disease.

Abstract

Coadministration of diltiazem with cyclosporine (CsA) has been reported to alter the metabolism of CsA, resulting in increased blood concentration with potential nephrotoxicity if dosage is not adjusted. This report analyzes the cost saving resulting from use of diltiazem and CsA together and examines the impact on renal function. Sixty-nine heart transplant recipients (59 men, 10 women) were randomized to diltiazem (n = 32) or to no calcium blocker (n = 37). Age range was 18 to 58 years. All patients received CsA (titrated to a 12-hour trough serum level of 100 to 200 ng/ml), azathioprine, and prednisone. Diltiazem was begun at 30 mg three times daily increasing to 60 mg three times daily at 1 month, as tolerated. Renal function was assessed by serial measurements of serum creatinine. Parameters before and after starting diltiazem were compared by paired t-tests, and differences between group means by analysis of variance. CsA doses and levels were comparable at baseline in both groups. At 12 months, CsA dose requirement was 2.5 +/- 1.0 versus 5.9 +/- 3.2 mg/kg/day (diltiazem group versus no calcium blocker group; p less than or equal to 0.001) to achieve similar serum levels (96 +/- 51 versus 123 +/- 96 ng/ml; p = NS). This represents a 48% reduction in dose cost of CsA. The average cost of CsA for 2 to 4 months of therapy in a patient weighing 70 kg was reduced from $12,122 in the no calcium blocker group to $6,356 in the diltiazem group.

Abstract

Coronary artery vasomotion is altered after cardiac transplantation. The impact of accelerated transplant coronary atherosclerosis and myocardial rejection on vasomotion is not well understood. Intravascular ultrasound is a new imaging method with the ability to study real-time changes in coronary artery dimensions. Epicardial coronary artery response to nitroglycerin was studied in 32 cardiac transplant recipients (age, 47 +/- 11 years) 3 weeks to 10 years after transplantation with intracoronary ultrasound. Cross-sectional luminal area and diameter were measured at a fixed position in the left anterior descending artery immediately before and every 30 seconds for 5 minutes after 0.4 mg of sublingual nitroglycerin. Cross-sectional area increased from a baseline of 13.1 +/- 3.9 mm2 to 15.8 +/- 3.9 mm2 at maximal vasodilation; luminal diameter increased from 4.0 +/- 0.6 mm to 4.5 +/- 0.6 mm. This increase reached statistical significance (p less than 0.001) at 1.5 minutes after administration of nitroglycerin; mean maximum increase occurred at 4.5 minutes (24% for cross-sectional area and 11% for luminal diameter). Patients with biopsy-proven mild or moderate concurrent rejection had a significantly blunted vasodilatory response versus the nonrejection group (9% versus 27% for cross-sectional area, p less than 0.04), although a vasodilatory effect was still present. Nitroglycerin response was well preserved in patients up to 10 years after transplantation; however, there was a trend toward a decreased response in patients studied immediately after transplantation (21% versus 29%, p = 0.37). Coronary intimal thickness, as measured by ultrasound, had no impact on the vasodilatory response (R = 0.23, p = 0.34). Vasodilatory response to nitroglycerin in cardiac transplant recipients is attenuated during episodes of cardiac rejection. This response is preserved in long-term survivors and is independent of the degree of intimal thickening.
Intravascular ultrasound provides a new method to document real-time epicardial coronary vasomotion.

Abstract

Orthotopic cardiac transplantation is occasionally complicated by unexplained bradyarrhythmias. Sinus node injury as a consequence of operation or acute rejection has anecdotally been linked to the development of bradycardia early after transplantation. These arrhythmias are empirically managed by pacemaker implantation, the indications for which remain poorly defined. This retrospective study examined the 20-year experience of our institution with bradyarrhythmias after transplantation to determine the predisposing factors and indications for pacemaker implantation. Forty-one of 556 patients in our cardiac transplant program (7.4%) received permanent pacemakers between 1969 and 1989. The predominant rhythm disturbances were junctional rhythm (46%), sinus arrest (27%) and sinus bradycardia (17%). Most patients were asymptomatic (61%), and presented in the early post-transplant period (73%). Four possible predisposing factors were evaluated: (1) graft ischemic time, (2) rejection history, (3) use of bradycardia-inducing drugs, and (4) anatomy of blood supply to the sinoatrial (SA) node. No significant differences existed between patients with and without pacemakers with regard to the first 3 variables. However, after transplantation, angiograms showed that the prevalence of abnormal SA nodal arteries was greater in patients with than without pacemakers (p less than 0.02). Pacemaker follow-up at 3, 6 and 12 months showed persistent bradycardia (60 to 90 beats/min) in 88, 75 and 50% of patients, respectively. The most common pacemaker complication (15%) was lead displacement at time of biopsy. These results suggest that disruption of the SA nodal blood supply may be an important predisposing factor in the development of bradycardias.

Abstract

In acute cardiac rejection, left ventricular diastolic function is altered, and a restrictive ventricular filling pattern occurs. Doppler echocardiographic indexes of mitral inflow have been proposed as sensitive markers of the rejection process. As rejection progresses, the restrictive ventricular filling pattern is reflected by a shortening of isovolumic relaxation time and mitral valve pressure half-time and by an increase in early transmitral filling velocity. Diastolic function is also compromised in the nonrejecting cardiac transplant recipient during the early postoperative period. This study examined the progression in Doppler-derived mitral filling indexes in 25 recent cardiac transplant recipients who demonstrated no histological evidence of transplant rejection. Isovolumic relaxation time, mitral valve pressure half-time, and early transmitral filling velocity were measured at postoperative weeks 1, 2, 4, and 6 on the day that surveillance right ventricular endomyocardial biopsies were performed. The initial indexes were comparable to previously described restrictive parameters and over the 6-week study period evolved into a nonrestrictive filling pattern. This evolution reflects a progressive improvement in postoperative diastolic function and a decrease in left heart filling pressures. None of the evaluated clinical characteristics, including preoperative pulmonary pressures, total ischemic time of the transplanted heart, cardiopulmonary bypass time, and age of the donor heart, correlated with this process. Given the increasing use of Doppler echocardiography as a means of screening for transplant rejection, it is important to have a thorough understanding of normal postoperative changes in left ventricular diastolic function.

Abstract

This study examines the reproducibility and variability of pulsed wave Doppler versus continuous wave Doppler ultrasound indexes of left ventricular filling in cardiac allograft recipients and in normal subjects. The following indexes were studied: isovolumic relaxation time, pressure half-time, peak early mitral flow velocity, and peak mitral flow velocity after atrial systole. Intraobserver and interobserver variability were assessed by regression analysis. Individual components of variance (subject, reader, beat, day, and tracing) were estimated in a subset of five patients and five normal subjects, and estimated total variance defined for each group. Temporal (day-to-day) variability for 95% confidence was estimated for these patients and for normal subjects. Temporal variability in the group from which the subsets were drawn was measured from absolute and percent change in values on two occasions. Estimated and observed 95% confidence limits were compared. Intersubject variability was the largest component of variance in both transplant recipients and in normal subjects. For all indexes in transplant recipients (in the absence of rejection) and normal subjects, observed absolute mean differences (+/- 2 standard deviations) between values from recordings taken on two different days were larger than the 95% confidence limits estimated from the components of variance analysis. The observed 95% limits for transplant recipients versus normal subjects were as follows: isovolumic relaxation time, 20 msec versus 6 msec; pressure half-time, 16 msec versus 9 msec; peak early mitral flow velocity, 32 cm per second versus 17 cm per second; and peak mitral flow velocity after atrial systole, 28 cm per second versus 10 cm per second.

Abstract

Cyclic variation of integrated ultrasonic backscatter (IB) was noninvasively measured in the septum and left ventricular posterior wall using a quantitative IB imaging system to assess the alterations in the acoustic properties of myocardium associated with acute cardiac allograft rejection. The study population consisted of 23 cardiac allograft recipients and 18 normal subjects. In each cardiac allograft recipient, one to eight (mean, four) IB studies were performed, each within 24 hours of right ventricular endomyocardial biopsy performed for rejection surveillance. The magnitude of the cyclic variation of IB in the posterior wall was 5.9 +/- 0.9 dB in normal subjects and 6.2 +/- 1.3 dB in the cardiac allograft recipients without previous or current histological evidence of acute rejection (n = 17, p = NS vs. normal subjects). The magnitude of cyclic variation of IB in the septum was 4.8 +/- 1.1 dB in normal subjects and 3.8 +/- 2.0 dB in the cardiac allograft recipients (n = 15, p = NS vs. normal subjects). A significant decrease in the septal IB measure was observed in cardiac allograft recipients with left ventricular hypertrophy (wall thickness of at least 13 mm) (2.6 +/- 1.7 dB, n = 8, p less than 0.05 vs. normal subjects). IB studies were done before and during moderate acute rejection in 11 recipients (14 episodes). During moderate acute cardiac rejection, the magnitude of the cyclic variation in IB decreased from 6.7 +/- 1.3 to 5.1 +/- 1.4 dB in the posterior wall (n = 14, p less than 0.05) and from 4.2 +/- 2.1 dB to 2.9 +/- 1.8 dB in the septum (n = 12, p less than 0.05). These data suggest 1) the magnitude of the cyclic variation in IB of the septum is different in cardiac allografts with cardiac hypertrophy and normal subjects, possibly reflecting regionally depressed myocardial contractile performance and 2) acute cardiac rejection in humans is accompanied by an alteration in the acoustic properties of the myocardium. 
This change is detectable by serial measurement of the magnitude of the cyclic variation in IB, both in the septum and in the posterior wall.

Abstract

Cardiac transplantation has traditionally been reserved for individuals with end-stage congestive heart failure (CHF) in whom there is no history of other life-threatening systemic disorders. In most transplant centers, patients with a history of malignancy and severe heart failure have not been considered acceptable candidates for cardiac transplantation. In the last 4 years at Stanford University Medical Center, 8 cardiac transplants have been performed in 7 patients with a history of neoplastic disease. Six of these patients had already received treatment for lymphoproliferative disorders, and one patient underwent a transplant after treatment for adenocarcinoma of the colon. Six of the 7 patients were discharged from the hospital, and in that group the 1-year posttransplant survival rate was 71%. This was comparable to the overall 1-year survival rate of 80% for patients undergoing cardiac transplantation at our center during the same period. At follow-up averaging over 2 years, there has been 1 case of recurrent neoplasia. One patient with evidence of radiation-induced pulmonary damage died of respiratory failure 2 days after transplantation. One patient required retransplantation because of intractable rejection and subsequently died of infectious complications. Immunosuppressive therapy in these patients has not been associated with an increased risk of neoplastic recurrence or of the development of posttransplant lymphoproliferative disorders. The current study demonstrates that in a carefully selected group, previously treated neoplastic disease should not represent a contraindication to cardiac transplantation.

Abstract

This article reviews the evolution of therapeutic strategies for maintenance immunosuppression, and presents current approaches to prevention, treatment, and surveillance of acute rejection. Other major complications influencing mortality and morbidity are discussed; these include infection, cyclosporine nephrotoxicity, malignancy, and bone disease.

Abstract

Rapid development of diffuse, occlusive coronary artery disease in the cardiac allograft has emerged as a major limiting factor for long-term survival after transplantation. Prior multivariate analyses have failed to identify any strong predictors of this disease. We retrospectively reviewed serial annual coronary angiograms to assess the prevalence of transplant coronary artery disease. A total of 103 patients treated initially with azathioprine-based therapy were compared with a later cohort of 78 patients for whom cyclosporine was the basis of immunosuppressive therapy. The percent of patients free of angiographically visible transplant coronary artery disease at 1 year was 89% for the azathioprine group versus 86% for the cyclosporine group. At 3 years, 74% of the azathioprine group versus 63% of the cyclosporine group were free of visible disease (p = NS). By the fifth postoperative year, 58% of azathioprine-treated and 50% of cyclosporine-treated patients were free of transplant coronary artery disease (p = NS). The mean number of rejection episodes in the first year after transplantation was 2.0 for cyclosporine patients versus 2.5 for azathioprine patients. The azathioprine and cyclosporine patients were compared with respect to a variety of baseline and clinical follow-up measurements that might influence the development of coronary artery disease. Patients in the cyclosporine group had higher blood pressure (135/94 versus 123/85 mm Hg, p less than 0.001) and were receiving lower maintenance prednisone doses. This study demonstrates that improved cyclosporine immunosuppression does not decrease the time-related prevalence of transplant coronary artery disease.

Abstract

A familial etiology was identified on the basis of family history in 16 (8.75%) of 184 patients undergoing cardiac transplantation at Stanford for end-stage dilated cardiomyopathy (DC). These 16 patients, from 11 families, included 5 sibling pairs. To help determine optimal management of such patients, their case histories and posttransplant courses were reviewed. Mean age of patients at presentation was 23 +/- 15 years. In sibling pairs, duration of symptoms from onset to diagnosis was 14 +/- 5 weeks for the first sibling, but only 4 +/- 2 weeks for the second. Progressive cardiac enlargement was documented radiographically in siblings of transplant recipients in 2 families before the onset of symptoms. The posttransplant course with regard to rejection incidence, infectious complications, coronary artery disease and malignancy was similar to that of the 168 patients with nonfamilial DC. Actuarial survival at 5 years after transplantation was 80%. Thirteen patients (including all sibling pairs) are alive 1 to 11 years after transplantation. Sepsis was the cause of death in 3 patients, occurring during the early postoperative period in 2 and following retransplantation for graft atherosclerosis 7 years after the initial transplant in the third patient. Thus, diagnosis of DC in childhood or adolescence mandates evaluation and surveillance of family members, because this disease can progress rapidly. The favorable results of cardiac transplantation for familial DC suggest that it should be promptly considered for such patients with end-stage disease.

Abstract

Although pericardial effusion after cardiac surgery is frequent and usually benign, its etiology and prognosis after cardiac transplantation are unknown. During 1 year (1985-1986), 12 of our current transplant population (total, 189) developed moderate or large pericardial effusions confirmed by two-dimensional echocardiography. These effusions occurred within 1 month of transplantation in 10 patients and at 3 months and 4.5 years in the other two. Pericardiocentesis was performed because of clinical evidence of increasing effusions in eight patients, with demonstrable hemodynamic compromise secondary to tamponade in five. Pericardial fluid was sterile in all but one. Endomyocardial biopsy at the time of increasing effusion revealed moderate acute rejection in five patients, mild rejection in three, and no rejection in four. All three patients with mild rejection had moderate acute rejection on subsequent biopsy performed within 7 days. In two of the four with no rejection, repeat biopsy within 5 days showed moderate acute rejection; in a third, moderate rejection was present on biopsy performed 14 days later. Legionella dumoffii was isolated from the pericardial fluid of the fourth patient, whose subsequent biopsies never showed rejection. Three of the 12 patients developed progressive ventricular dysfunction sufficiently severe to require retransplantation. One patient died suddenly 12 months after transplantation, and autopsy examination revealed severe coronary artery disease. Two died of sepsis within 3 months of transplantation. Intense inflammatory infiltrates and thickening of the pericardium and epicardium were characteristically present in explanted and autopsy hearts.(ABSTRACT TRUNCATED AT 250 WORDS)

Abstract

Severe heart failure is associated with a reduction in myocardial beta-adrenergic receptor density and an impaired contractile response to catecholamine stimulation. Metoprolol was administered during a 6-month period to 14 patients with dilated cardiomyopathy to examine its effects on these abnormalities. The mean daily dose of metoprolol for the group was 105 mg (range, 75-150 mg). Myocardial beta-receptor density, resting hemodynamic output, and peak left ventricular dP/dt response to dobutamine infusions were compared in 9, 14, and 7 patients, respectively, before and after 6 months of metoprolol therapy while the patients were on therapy. The second hemodynamic study was performed 1-2 hours after the morning dose of metoprolol had been given. Myocardial beta-receptor density increased from 39 +/- 7 to 80 +/- 12 fmol/mg (p less than 0.05). Resting hemodynamic output showed a rise in stroke work index from 27 +/- 4 to 43 +/- 3 g/m/m2, p less than 0.05, and ejection fraction rose from 0.26 +/- 0.03 to 0.39 +/- 0.03 after 6 months of metoprolol therapy, p less than 0.05. Before metoprolol therapy, dobutamine caused a 21 +/- 4% increase in peak positive left ventricular dP/dt; during metoprolol therapy, the same dobutamine infusion rate increased peak positive dP/dt by 74 +/- 18% (p less than 0.05). Thus, long-term metoprolol therapy is associated with an increase in myocardial beta-receptor density, significant improvement in resting hemodynamic output, and improved contractile response to catecholamine stimulation. These changes indicate a restoration of beta-adrenergic sensitivity associated with metoprolol therapy, possibly related to the observed up-regulation of beta-adrenergic receptors.

Abstract

Conventional hemodynamic measurements and Doppler echocardiography were used to assess ventricular physiology of the human cardiac allograft and to examine the influence of pertinent clinical factors on chronic myocardial performance. Sixty-four patients (18-55 years old; mean, 39 years) undergoing routine annual hemodynamic assessment were studied. Blood-flow velocity properties across the mitral, tricuspid, and aortic valves were analyzed from Doppler ultrasound recordings. Ten of these patients had elevated diastolic pressures associated with a sharp early diastolic dip followed by an exaggerated and abrupt rise in pressure, consistent with restrictive-constrictive ventricular physiology. Left ventricular dP/dt and stroke volume were lower in these patients compared with the other 54 patients. Doppler echocardiographic indexes of left ventricular filling and ejection in these 10 patients differed significantly. Isovolumic relaxation time and pressure half-time were shorter, peak early mitral and tricuspid flow velocities were higher, and mean aortic flow velocity and acceleration were lower. A higher rejection incidence was the only demonstrable clinical factor associated with impaired ventricular function. Doppler echocardiography may, therefore, noninvasively identify patients with hemodynamic evidence of restrictive-constrictive physiology. This abnormality occurs in approximately 15% of allograft recipients, is associated with impaired systolic performance, and may be related to rejection incidence.

Abstract

Cardiac transplantation is now an accepted therapeutic option for patients with end-stage myocardial failure. Provided donor and recipient are appropriately selected and adequately matched, expected survival rates at one and five years are 85% and 65%, respectively. Two major challenges are encountered in clinical heart transplantation. The first is monitoring immunosuppression for adequate prevention of acute rejection and surveillance for side effects. The endomyocardial biopsy remains the gold standard for rejection surveillance, but since it is an invasive procedure which can only be performed at arbitrary time intervals, the search for non-invasive methods continues. The approach to immunosuppression currently practised by most centers is that of combination drug therapy, which allows low doses with decreased potential for side effects. At Stanford, immunosuppression is usually initiated with OKT3, corticosteroids, and cyclosporine, and maintained with a combination of steroids, cyclosporine, and azathioprine. The most frequently encountered complications include bacterial and opportunistic infections, cyclosporine nephrotoxicity, and malignancy. The second challenge is accelerated coronary disease, which has emerged as the major factor limiting long-term survival. It is usually clinically silent and often presents with sudden death, acute myocardial infarction, or progressive unexplained graft failure. Coronary arteriography is currently the only method for premorbid diagnosis, and retransplantation the only effective therapy.

Abstract

Changes in left ventricular filling and ejection as potential markers of cardiac allograft rejection were evaluated by serial Doppler echocardiography performed in 23 normal volunteers and within 24 hr of endomyocardial biopsy in 22 patients aged 14 to 53 years (mean 37). Peak aortic velocity, left ventricular ejection time index (ETI), isovolumic relaxation time (IVRT), mitral valve pressure half-time (PHT), peak early mitral flow velocity (M1), and velocity following donor atrial systole (M2) were measured without prior knowledge of endomyocardial biopsy findings. Biopsy specimens were graded histologically as: no rejection, mild rejection (cellular infiltrate), and moderate rejection (myocyte necrosis). A total of 120 biopsy-correlated Doppler echocardiographic studies were performed during 16 weeks after cardiac transplantation. Heart rate and mean arterial pressure were significantly higher in transplant recipients than in normal subjects. IVRT and PHT were significantly longer, while M1 and M2 were similar. Peak aortic velocity was higher in normal subjects than in transplant recipients, while ejection time was similar. Rejection of increasing severity was associated with a progressive shortening of IVRT and PHT and with an increase in M1 (p less than .0005 for all comparisons). Peak aortic velocity and ejection time index did not change significantly with rejection. These data indicate that acute cardiac rejection is accompanied by alteration in left ventricular filling dynamics detectable by Doppler echocardiography, without measurable changes in systolic function. These changes may provide noninvasive markers for surveillance of rejection.

Abstract

From 1976 to 1986, six cases of cardiac sarcoidosis were documented: by myocardial biopsy in three of five instances, on examination of the explanted heart after transplantation in two, and at autopsy in one patient. Right ventricular end-diastolic pressure was elevated in all four patients with right ventricular involvement with sarcoidosis. Of three patients treated with steroids, improvement in ventricular function and decrease in arrhythmia occurred in two, whereas failure to respond led to transplantation in the other patient. Two further patients have undergone heart transplantation, one for resistant ventricular arrhythmia and the other for congestive heart failure. No recurrence of sarcoidosis has occurred in the grafts. Because two of five patients had sarcoidosis diagnosed on gross examination, a negative endomyocardial biopsy does not exclude the diagnosis of myocardial sarcoidosis, which should therefore be pursued in the setting of unexplained heart failure, conduction abnormalities, and ventricular arrhythmia, particularly when right ventricular end-diastolic pressure is raised. Steroids may result in improvement in some patients even in the presence of severe morphological damage. Heart transplantation may be performed without increased risk of recurrence of sarcoidosis.

Abstract

Migration of monocytes into the arterial wall is an early finding of atherosclerosis. Monocytes are attracted to sites of vascular endothelial cell injury, the initiating event in the development of atheromatous disease, by a chemokine known as monocyte chemoattractant protein-1 (MCP-1). Injured vascular endothelial and smooth muscle cells selectively secrete MCP-1. This study was performed to determine if radiolabeled MCP-1 would co-localize at sites of monocyte/macrophage concentration in an experimental model of transplant-induced vasculopathy in diabetic animals. Hearts from 3-month-old male Zucker rats, heterozygote (Lean) or homozygote (Fat) for the diabetes-associated gene fa, were transplanted into the abdomens of genetically matched recipients. Lean and Fat animals were then fed normal or high-fat diets for 90 days. At 90 days, significant increases (P < 0.013) of MCP-1 graft uptake were seen at imaging and confirmed on scintillation gamma well counting studies in Lean (n = 5) and Fat (n = 12) animals, regardless of diet, 400% and 40% above control values, respectively. MCP-1 uptake of native and grafted hearts correlated with increased numbers of perivascular macrophages (P < 0.02), as seen by immunostaining with an antibody specific for macrophages (ED 2). Radiolabeled MCP-1 can detect abnormally increased numbers of perivascular mononuclear cells in native and grafted hearts in prediabetic rats. MCP-1 may be useful in the screening of diabetic children for early atherosclerotic disease.

Abstract

Tricuspid regurgitation (TR) is common after heart transplantation. However, the incidence of severe TR and the incidence of symptoms after echocardiographic diagnosis of severe TR have not been documented. The purpose of this study is to determine the incidence of severe TR and its clinical significance in the heart transplant population. We reviewed echocardiograms (echo) of all heart transplant patients coming for regular echocardiographic follow-up between 1990 and 1995. We reviewed the charts of all patients who had an echo diagnosis of severe TR. A total of 336 patients had echo follow-up during this time period. The number of months post-heart transplant to last echo was 54 +/- 50 (range, 1 to 265 months). Ninety patients had moderate TR and 23 patients had severe TR. Mean time from heart transplantation to diagnosis of severe TR was 43 +/- 38 months (range, 1 to 132). Using Cutler-Ederer analysis, at 5 years, 92.2% of surviving patients were free from severe TR. At 10 years, 85.8% of surviving patients were free from severe TR. Of the 23 patients with severe TR, 17 had charts available for review. The mean number of prior endomyocardial biopsies was 28 +/- 21 (range, 3 to 88). These patients were followed for 35 +/- 18 months after diagnosis. During this period, they developed significant heart failure and peripheral edema. Six patients eventually underwent tricuspid valve replacement. Moderate to severe TR commonly occurs following heart transplantation. Severe TR is associated with significant morbidity.
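The freedom-from-severe-TR figures above come from a Cutler-Ederer (actuarial life-table) analysis. A minimal sketch of that estimator, assuming the standard convention that patients censored within an interval are counted as at risk for half the interval (function name and inputs are illustrative, not the study's actual software):

```python
def cutler_ederer(n_start, intervals):
    """Actuarial (Cutler-Ederer) estimate of event-free survival.
    n_start: number of patients entering the first interval.
    intervals: list of (events, censored) counts per time interval.
    Returns the cumulative event-free proportion at the end of each interval."""
    surv = 1.0
    at_risk = n_start
    curve = []
    for events, censored in intervals:
        # censored patients contribute half an interval of exposure
        effective = at_risk - censored / 2.0
        surv *= 1.0 - events / effective
        curve.append(surv)
        at_risk -= events + censored
    return curve
```

Multiplying interval-wise event-free probabilities in this way is what yields cumulative figures such as "92.2% free from severe TR at 5 years."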

Abstract

CyA is the core immunosuppressant of choice for the majority of transplant patients. The introduction of Neoral, a new microemulsion formulation of CyA, and more recently a range of adjunctive immunosuppressants have further enhanced the efficacy and tolerability of CyA-based immunosuppression. In the first year following transplantation the major causes of morbidity and death are graft failure, acute rejection, and systemic infection. Patients with deteriorated pulmonary circulation before transplantation are at increased risk of early postoperative death. Risk factors for early acute rejection include female donor sex, young donor age, and multiple HLA-DR mismatches. The principal cause of death in the long term is graft vasculopathy, which accounted for 40% of all deaths. Risk factors that have been hypothesized to play a role in the pathogenesis of graft vasculopathy include hyperlipidemia, recipient age and gender, donor age, the number of HLA AB and DR mismatches, and CMV infection. Strategies proposed to reduce the risk of graft vasculopathy include aggressive use of lipid-lowering agents, avoidance of low CyA doses, and the use of adjunctive rapamycin or RAD therapy. Rejection surveillance relies on routine serial endomyocardial biopsy. Recent research suggests that a more accurate assessment of the state of the graft can be obtained by considering the results across a number of biopsy samples obtained from different parts of the heart, rather than basing clinical judgment on the worst single result obtained. New molecular markers such as granzyme A mRNA are likely to improve the power of histology to diagnose and predict rejection. Neoral pharmacokinetics give greater bioavailability and less intrapatient variability than Sandimmune.
In the keynote OLN 351 study comparing Neoral with Sandimmune in de novo heart transplant recipients, fewer Neoral patients needed antilymphocyte therapy to treat rejection, fewer female patients had rejection episodes in the Neoral group, the tolerability of the two formulations was equivalent, and there was a lower incidence of infections in the Neoral group. The clinical impact of Neoral in comparison with Sandimmune in de novo heart transplant patients has been investigated in a number of additional trials, including long-term studies, which have confirmed that Neoral is associated with lower CyA doses than Sandimmune, equal or greater antirejection efficacy, and comparable tolerability. During the administration of intravenous CyA as an induction therapy in the days immediately following transplantation, there is evidence to suggest that a 6-hour infusion given twice daily, which mimics the pharmacokinetic profile of oral dosing, may be clinically more effective than a continuous 24-hour infusion. Milligram-for-milligram dose conversion from Sandimmune to Neoral is feasible. Following conversion, a reduction in the CyA dose may be required in the majority of patients to maintain target levels. In pediatric patients, the rate of elimination of CyA is greater, and bioavailability increases with increasing age. Younger patients (less than 8 years of age) may be managed more effectively with a 3-times-daily, rather than a twice-daily, dosing schedule. A number of studies have compared the clinical effects of Sandimmune and Neoral in maintenance therapy for cardiac transplant patients. As with de novo patients, these studies have found the new formulation of CyA to be associated with lower rates of acute rejection, lower therapeutic doses, and comparable tolerability.
Milligram-for-milligram conversion from the old to the new CyA formulation is generally well tolerated, although in a minority of patients there is a significant increase in CyA levels. This may be associated with a transient increase in side effects, which resolve on dose reduction. There is a dose-sparing effect with Neoral. Routine monitoring of both CyA and serum creatinine levels is advisable.

Abstract

As a consequence of recent advances in heart transplantation, upper age limits for the procedure have been liberalized in many centres. It was the purpose of this study to compare post-transplant mortality, morbidity and quality of life in a consecutive series of 72 patients > 54 years (mean age, 57.6 +/- 2.7 years) with a control group of 72 adult patients < or = 54 years (mean age, 42.4 +/- 9.5 years) transplanted at one centre between 1985 and 1991. Patients were followed for 41 +/- 27 months post-transplant. Actuarial 1-, 5- and 7-year survival rates were 78 +/- 5% vs 81 +/- 5%, 52 +/- 7% vs 66 +/- 6%, and 46 +/- 8% vs 63 +/- 6% in patients > 54 years and < or = 54 years, respectively (P = ns). Causes of death were not significantly different between the groups. Patients > 54 years experienced significantly fewer rejection episodes after the 6th month post-transplant (0.5 +/- 0.9 vs 0.9 +/- 1.0, P < 0.04), and incidence and treatment of rejection episodes as well as incidence of infection were comparable between the groups. Non-lymphoid malignancies, mainly skin cancer, occurred more often in the older age group (27% vs 13%, P < 0.05). Quality of life, as assessed by the Nottingham Health Profile, was better in 5/6 dimensions of social functioning in older patients, and the difference reached statistical significance for the dimensions of emotional reactions (P = 0.005) and sleep (P = 0.0005). In conclusion, carefully selected patients > 54 years can undergo heart transplantation with mortality and morbidity comparable to younger patients; quality of life post-transplant even appears to be slightly better in the older age group.

Abstract

Cytomegalovirus (CMV) infection is associated with an increased incidence of other opportunistic infections in organ transplant recipients. Whether this is related to immunomodulating effects of CMV or independent of CMV but associated with a host risk factor common to both infections is unclear. The purpose of this study was to determine whether the reduction in CMV infections seen with prophylactic ganciclovir treatment after heart transplantation is associated with a reduced incidence of other opportunistic infections. Of 149 patients prospectively enrolled in a multicenter, randomized, double-blind, placebo-controlled trial of ganciclovir to prevent CMV disease, 74 patients enrolled at this center (33 control and 41 ganciclovir-treated) were retrospectively identified. All received prophylactic OKT-3 and standard 3 drug maintenance immunosuppressive therapy. Actuarial survival and rejection rates and incidence of opportunistic infections (bacterial, fungal, and protozoal) for the 2 treatment groups were determined and compared using Cox-Mantel analysis. CMV disease occurred 2.5 times more frequently in the control group. There were no significant differences in survival or rejection rates nor in bacterial or protozoal infection incidence between the 2 groups. Bacterial infections occurred in 54% of control and 39% of ganciclovir-treated patients (P = 0.18). There were significantly fewer fungal infections in the ganciclovir-treated group (7% vs. 27%, P = 0.0071). CMV and fungal infections were both significantly reduced in patients who received ganciclovir prophylaxis. This suggests that active CMV disease may be causally associated with the development of opportunistic fungal infections.
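The group comparisons above rely on Cox-Mantel analysis. As an illustrative sketch only (not the study's statistical software), a basic two-group log-rank (Cox-Mantel) chi-square statistic can be computed from event and censoring times; all names are hypothetical:

```python
def log_rank(times_a, events_a, times_b, events_b):
    """Two-group log-rank (Cox-Mantel) chi-square statistic.
    times_*: event or censoring times; events_*: 1 if event, 0 if censored.
    Compares observed vs expected events in group A across all event times."""
    all_times = sorted({t for t, e in zip(times_a + times_b, events_a + events_b) if e})
    o_a = e_a = v = 0.0
    for t in all_times:
        n_a = sum(1 for x in times_a if x >= t)   # at risk in group A
        n_b = sum(1 for x in times_b if x >= t)   # at risk in group B
        d_a = sum(1 for x, e in zip(times_a, events_a) if x == t and e)
        d_b = sum(1 for x, e in zip(times_b, events_b) if x == t and e)
        n, d = n_a + n_b, d_a + d_b
        o_a += d_a
        e_a += d * n_a / n
        if n > 1:
            # hypergeometric variance contribution at this event time
            v += d * (n_a / n) * (n_b / n) * (n - d) / (n - 1)
    return (o_a - e_a) ** 2 / v
```

The resulting statistic is referred to a chi-square distribution with one degree of freedom, which is how differences such as the fungal-infection rates (7% vs. 27%) would be tested for significance.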

Abstract

The mechanisms underlying contractile dysfunction following heart transplantation are poorly defined. To investigate the role of cytotoxic T cells (CTL) in cardiac transplant rejection, and during episodic contractile dysfunction, we performed a prospective study analyzing the expression of granzyme A and perforin, two functional markers of activated CTL. Sixteen consecutive patients were analyzed during the first year posttransplantation. All patients received induction therapy with OKT-3 and received standard three-drug immunosuppression therapy. Rejection status was monitored using routine surveillance endomyocardial biopsy and graded according to the ISHLT scale. Granzyme A and perforin mRNA were detected by reverse transcription PCR at the time of each routine biopsy. A total of 64/123 biopsies were positive for granzyme expression, while 38/123 samples were positive for perforin expression. LV function was monitored using M-mode derived fractional shortening and Doppler assessment of diastolic function (isovolumic relaxation time [IVRT] and pressure half-time [P1/2]). As expected, the presence of granzyme A message was associated with rejection score (ANOVA, P = 0.001). In addition, granzyme A expression was correlated with a decrease in diastolic function (chi 2 = 6.4, P < 0.02), but was not associated with systolic function. The presence of perforin message was not correlated with functional changes or with rejection grade, but was associated with granzyme expression (chi 2 = 9.11, P = 0.0025). These studies suggest that the presence of granzyme A message may be an important predictor of graft function.

Abstract

Pulmonary hypertension is a source of perioperative mortality after orthotopic liver transplantation (OLT). The purpose of this study is to (1) characterize the pulmonary hemodynamic response in OLT candidates, and (2) determine whether portal flow index (PFI), a magnetic resonance imaging (MRI)-derived parameter, is a useful predictor of the pulmonary hemodynamic response. Twenty-five consecutive OLT candidates underwent right heart catheterization with pressure measurements at baseline and after infusion of 1 L of crystalloid. MRI, chest roentgenography, electrocardiography, and echocardiography were also performed as routine screening techniques. Sixteen patients in the intensive care unit with normal liver function served as controls. After volume infusion, pulmonary hypertension (mean pulmonary artery pressure greater than 25 mm Hg) developed in 9 of 25 OLT candidates, with elevations in both pulmonary capillary wedge and mean pulmonary pressures. In contrast, 0 of 16 controls experienced pulmonary hypertension (p < 0.01). Although routine modalities did not predict this hemodynamic response, PFI had a 94% specificity and 78% sensitivity. OLT candidates exhibit volume-induced pulmonary hypertension with responses suggestive of left ventricular dysfunction. The significance of this observation is unknown, but the MRI-derived parameter, PFI, may serve as a screening technique to limit catheterization to a select group of OLT candidates.

Abstract

Although acute diastolic dysfunction is an early sequela of the rejecting heart, reported sensitivities and specificities have varied widely when Doppler echocardiography is used for rejection surveillance. This study examines the temporal relationship between changes in Doppler echocardiographic indexes of diastolic function and sequential endomyocardial biopsies to identify possible factors accounting for false-positive and false-negative results. A total of 114 Doppler echocardiographic studies and biopsies were performed weekly in 39 patients aged 14 to 59 years during the initial 3 months after heart transplantation. All Doppler examinations were within 24 hours of biopsy and were analyzed in a blinded fashion. Onset of restrictive physiology, defined as a 15% decrease in either isovolumic relaxation time or pressure half-time, was determined by analysis of the Doppler mitral flow velocity curve.
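The onset criterion defined above (a 15% decrease in either IVRT or pressure half-time) is a simple threshold rule. A minimal sketch, with hypothetical function and parameter names, assuming the decrease is measured relative to each patient's baseline value:

```python
def restrictive_onset(baseline_ivrt, current_ivrt,
                      baseline_pht, current_pht, threshold=0.15):
    """Flag onset of restrictive physiology per the study's definition:
    a decrease of at least 15% in either isovolumic relaxation time (IVRT)
    or mitral pressure half-time (PHT) relative to baseline."""
    ivrt_drop = (baseline_ivrt - current_ivrt) / baseline_ivrt
    pht_drop = (baseline_pht - current_pht) / baseline_pht
    return ivrt_drop >= threshold or pht_drop >= threshold
```

For example, an IVRT falling from 100 to 80 msec (a 20% decrease) would trigger the flag even if pressure half-time were unchanged, reflecting the "either index" wording of the definition.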