Professor of Medicine (Cardiovascular) and, by courtesy, of Health Research and Policy at the Palo Alto Veterans Affairs Health Care System

Medicine - Cardiovascular Medicine

Bio

Dr. Paul Heidenreich is Professor of Medicine and of Health Research and Policy at the Stanford University School of Medicine. He also serves as a Vice-Chair for Clinical, Quality and Analytics in the Department of Medicine. He is a practicing non-invasive cardiologist and an active clinical researcher, with an extensive background in outcomes and health services research in the areas of technology assessment (including the use of screening with diagnostic tests), quality improvement, and economic analyses. He has served as Chair of the American College of Cardiology/American Heart Association (AHA) Task Force on Performance Measurement, the AHA Council on Quality of Care and Outcomes Research, and the AHA Get With The Guidelines Steering Committee.

Clinical Trials

This proposal examines the use of a clinical reminder to the primary provider of patients with a high B-type natriuretic peptide but no prior imaging.
Electronic Medical Record-based Intervention to Determine whether Clinical Reminders Improve Heart Failure Management in Patients with High BNP Values and Unknown LVEF.

The scope of the USE-BNP Trial is to investigate whether knowledge of BNP measurements, in conjunction with clinical assessment in the outpatient setting, can guide the management of therapy in patients with heart failure.

Implantable Cardioverter-Defibrillator Use in the VA System (Not Recruiting)

Despite being a proven life-saving intervention in appropriately selected individuals,
multiple studies continue to demonstrate low implantation of defibrillators in potential
candidates. Prior research suggests that a major contributor to this low utilization is
low referral of potential candidates by healthcare providers. In this study, via a brief
clinical reminder placed in the electronic medical record, we ask healthcare providers
who have not referred potential candidates for a defibrillator the reasons for this
decision and provide them with the tools for referral if appropriate.

Stanford is currently not accepting patients for this trial. For more information, please contact Parisa Gholami, (650) 493-5000.

Abstract

This study aimed to evaluate the cost-effectiveness of the CardioMEMS (CardioMEMS Heart Failure System, St. Jude Medical Inc., Atlanta, Georgia) device in patients with chronic heart failure. The CardioMEMS device, an implantable pulmonary artery pressure monitor, was shown to reduce hospitalizations for heart failure and improve quality of life in the CHAMPION (CardioMEMS Heart Sensor Allows Monitoring of Pressure to Improve Outcomes in NYHA Class III Heart Failure Patients) trial. We developed a Markov model to determine the hospitalization, survival, quality of life, cost, and incremental cost-effectiveness ratio of CardioMEMS implantation compared with usual care among a CHAMPION trial cohort of patients with heart failure. We obtained event rates and utilities from published trial data; we used costs from literature estimates and Medicare reimbursement data. We performed subgroup analyses of preserved and reduced ejection fraction and an exploratory analysis in a lower-risk cohort on the basis of the CHARM (Candesartan in Heart failure: Reduction in Mortality and Morbidity) trials. CardioMEMS reduced lifetime hospitalizations (2.18 vs. 3.12), increased quality-adjusted life-years (QALYs) (2.74 vs. 2.46), and increased costs ($176,648 vs. $156,569), thus yielding a cost of $71,462 per QALY gained and $48,054 per life-year gained. The cost per QALY gained was $82,301 in patients with reduced ejection fraction and $47,768 in those with preserved ejection fraction. In the lower-risk CHARM cohort, the device would need to reduce hospitalizations for heart failure by 41% to cost
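The incremental cost-effectiveness ratio reported above is simply the cost difference between strategies divided by the difference in effect. A minimal sketch of that arithmetic, using the rounded figures from the abstract (the published $71,462 comes from unrounded model outputs, so recomputing from the rounded values lands slightly higher):

```python
def icer(cost_new: float, cost_ref: float, effect_new: float, effect_ref: float) -> float:
    """Incremental cost-effectiveness ratio: extra dollars per extra unit of effect."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Rounded lifetime figures from the abstract: CardioMEMS vs. usual care
cost_per_qaly = icer(176_648, 156_569, 2.74, 2.46)  # roughly $71,700 per QALY gained
```

The same function with life-years in place of QALYs yields the cost per life-year gained.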

Abstract

We sought to determine survival for patients with heart failure after implantation of an implantable cardioverter-defibrillator (ICD) for primary prevention in the United States and to develop a simple model to predict mortality risk. Clinical trials have found that patients with heart failure with a 1-year mortality risk near 20% may not benefit from an ICD. We identified patients from the National Cardiovascular Data Registry's ICD Registry who underwent ICD implantation for primary prevention from 2007 to 2009. Two risk scores for mortality were developed in 2 cohorts: one limited to those with a B-type natriuretic peptide (BNP) value and a second for all patients. The scores were obtained from derivation datasets and tested in validation sets using logistic regression models and classification and regression trees. In a primary prevention population with BNP available (n = 18,725), the 6 variables most predictive of 1-year mortality were age ≥75 years, BNP ≥700 pg/mL, chronic lung disease, dialysis, blood urea nitrogen ≥30 mg/dL, and systolic blood pressure <120 mmHg. Patients with zero risk factors had a 3.3% one-year mortality, compared with 66.7% for those with all 6 risk factors. Those with ≥3 risk factors (24.0% of the population) had a 25.8% one-year mortality. A second score using a larger cohort that did not consider BNP identified similar risk factors. A simple validated risk score can identify patients at high and low risk for death within a year after ICD placement. A large fraction of those currently implanted with an ICD in the United States have a high 1-year mortality and may not benefit from ICD therapy.
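The score described is essentially a count of dichotomized risk factors. A hypothetical helper illustrating that counting (thresholds taken from the abstract; this is an illustration, not the validated scoring instrument):

```python
def count_risk_factors(age, bnp_pg_ml, chronic_lung_disease, dialysis, bun_mg_dl, sbp_mmhg):
    """Count how many of the 6 abstract-reported predictors of 1-year mortality
    after primary-prevention ICD implantation are present for one patient."""
    return sum([
        age >= 75,                   # years
        bnp_pg_ml >= 700,            # B-type natriuretic peptide
        bool(chronic_lung_disease),
        bool(dialysis),
        bun_mg_dl >= 30,             # blood urea nitrogen
        sbp_mmhg < 120,              # systolic blood pressure
    ])
```

In the abstract's cohort, a count of 0 corresponded to 3.3% one-year mortality and a count of 6 to 66.7%.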

Abstract

This study sought to describe the use of CRT-D and its association with survival in older patients. Many patients who receive cardiac resynchronization therapy with defibrillator (CRT-D) in practice are older than those included in clinical trials. We identified patients undergoing ICD implantation in the National Cardiovascular Data Registry (NCDR) ICD Registry from 2006 to 2009 who also met clinical trial criteria for CRT, including left ventricular ejection fraction (LVEF) ≤35%, QRS ≥120 ms, and New York Heart Association (NYHA) functional class III or IV. NCDR registry data were linked to the Social Security Death Index to determine the primary outcome of time to death from any cause. We identified 70,854 patients from 1,187 facilities who met prior trial criteria for CRT-D. The mean age of the 58,147 patients receiving CRT-D was 69.4 years, with 6.4% of patients age 85 or older. CRT use was 80% or higher among candidates in all age groups. Follow-up was available for 42,285 patients age ≥65 years at 12 months. Receipt of CRT-D was associated with better survival at 1 year (82.1% vs. 77.1%) and 4 years (54.0% vs. 46.2%) than receipt of an ICD alone (p < 0.001). The association of CRT with improved survival did not differ across age groups (p = 0.86 for interaction). More than 80% of older patients undergoing ICD implantation who were candidates for CRT-D received the combined device. Mortality in older patients undergoing ICD implantation was high but was lower for those receiving CRT-D.

Abstract

Coronary artery disease (CAD) outcomes consistently improve when they are routinely measured and provided back to physicians and hospitals. However, few centers around the world systematically track outcomes, and no global standards exist. Furthermore, patient-centered outcomes and longitudinal outcomes are under-represented in current assessments. The nonprofit International Consortium for Health Outcomes Measurement (ICHOM) convened an international Working Group to define a consensus standard set of outcome measures and risk factors for tracking, comparing, and improving the outcomes of CAD care. Members were drawn from 4 continents and 6 countries. Using a modified Delphi method, the ICHOM Working Group defined who should be tracked, what should be measured, and when such measurements should be performed. The ICHOM CAD consensus measures were designed to be relevant for all patients diagnosed with CAD, including those with acute myocardial infarction, angina, and asymptomatic CAD. Thirteen specific outcomes were chosen, including acute complications occurring within 30 days of acute myocardial infarction, coronary artery bypass grafting surgery, or percutaneous coronary intervention; and longitudinal outcomes for up to 5 years for patient-reported health status (Seattle Angina Questionnaire [SAQ-7], elements of the Rose Dyspnea Score, and Patient Health Questionnaire [PHQ-2]), cardiovascular hospital admissions, cardiovascular procedures, renal failure, and mortality. Baseline demographic, cardiovascular disease, and comorbidity information is included to improve the interpretability of comparisons. ICHOM recommends that this set of outcomes and other patient information be measured for all patients with CAD.

Abstract

Background- Many patients who are candidates for implantable cardioverter-defibrillators (ICDs) are not referred for potential implantation. We sought to determine whether a simple provider reminder would increase referrals. Methods and Results- We identified consecutive patients from January 2007 through July 2010 in the VA Palo Alto Health Care System with a left ventricular ejection fraction ≤35% on echocardiography. Patients were excluded using available administrative data only (no chart review) if they were known to have an ICD, if they were ≥80 years old, or if they did not have a current primary care or cardiology provider within the system. We randomized patients to no intervention or a clinical note to the provider in the medical record. The outcomes were referral for consideration of defibrillator implantation (primary) and documented discussion (secondary). Of 330 patients with left ventricular ejection fraction ≤35%, 128 were known to have an ICD, 85 were no longer followed in the healthcare system, and 28 were ≥80 years old, leaving 89 patients to be randomized. Forty-six patients were randomized to intervention and 43 to control. Eleven of 46 (24%) intervention patients were referred for consideration of ICD implantation during the following 6 months versus 1 of 43 (2%) control patients (P=0.004). Overall, 31 of 46 (67%) intervention patients versus 19 of 43 (44%) control patients had documentation discussing potential candidacy for defibrillators (P=0.05). Conclusions- In patients with low left ventricular ejection fraction, a simple electronic medical record-based intervention directed to their providers improved the rate of referral for ICD implantation. Clinical Trial Registration- URL: http://www.clinicaltrials.gov. Unique identifier: NCT01217827.
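With counts this small (11/46 referred vs. 1/43), the quoted P value is the kind produced by an exact test on the 2×2 referral table. A self-contained sketch of a two-sided Fisher exact test, under the common convention of summing the probabilities of all tables no more probable than the observed one (the abstract does not name its test, so this is an assumption for illustration):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]].

    Sums hypergeometric probabilities of all tables (with the same margins)
    whose probability does not exceed that of the observed table.
    """
    n = a + b + c + d
    row1 = a + b          # e.g., intervention arm size
    col1 = a + c          # e.g., total patients referred
    def prob(x):
        # P(x referrals in the intervention arm | fixed margins)
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = prob(a)
    lo = max(0, row1 + col1 - n)
    hi = min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs * (1 + 1e-9))

# Referred vs. not referred: intervention 11/46, control 1/43
p = fisher_exact_two_sided(11, 35, 1, 42)
```

Run on the trial's counts, this yields a p value well below 0.05, consistent with the reported P=0.004.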

Abstract

Contrast left ventriculography is a method of measuring left ventricular function usually performed at the discretion of the invasive cardiologist during cardiac catheterization. We sought to determine variation in the use of left ventriculography in the Veterans Affairs (VA) Health Care System. We identified adult patients who underwent cardiac catheterization including coronary angiography between 2000 and 2009 in the VA Health Care System. We determined patient and hospital predictors of the use of left ventriculography as well as the variation in use across VA facilities. Results were validated using data from the VA's Clinical Assessment, Reporting, and Tracking (CART) program. Of 457 170 cardiac catheterization procedures among 336 853 patients, left ventriculography was performed in 263 695 (58%). Use of left ventriculography decreased over time (64% in 2000 to 50% in 2009) and varied markedly across facilities (<1% to >95% of cardiac catheterizations). Patient factors explained little of the large variation in use between facilities. When the cohort was restricted to those with an echocardiogram in the prior 30 days and no intervening event, left ventriculography was still performed in 50% of cases. There is large variation in the use of left ventriculography across VA facilities that is not explained by patient characteristics.

Abstract

Background- Heart failure (HF) is an important contributor to both the burden and cost of national healthcare expenditures, with more older Americans hospitalized for HF than for any other medical condition. With the aging of the population, the impact of HF is expected to increase substantially. Methods and Results- We estimated future costs of HF by adapting a methodology developed by the American Heart Association to project the epidemiology and future costs of HF from 2012 to 2030 without double counting the costs attributed to comorbid conditions. The model assumes that HF prevalence will remain constant by age, sex, and race/ethnicity and that rising costs and technological innovation will continue at the same rate. By 2030, >8 million people in the United States (1 in every 33) will have HF. Between 2012 and 2030, real (2010$) total direct medical costs of HF are projected to increase from $21 billion to $53 billion. Total costs, including indirect costs for HF, are estimated to increase from $31 billion in 2012 to $70 billion in 2030. If one assumes all costs of cardiac care for HF patients are attributable to HF (no cost attribution to comorbid conditions), the 2030 projected cost estimates of treating patients with HF will be 3-fold higher ($160 billion in direct costs). Conclusions- The estimated prevalence and cost of care for HF will increase markedly because of aging of the population. Strategies to prevent HF and improve the efficiency of care are needed.

Abstract

This study sought to quantify the incremental cost-effectiveness ratios (ICERs) of angiotensin-converting enzyme inhibitor (ACEI), beta-blocker (BB), and aldosterone antagonist (AldA) therapies for patients with heart failure with reduced ejection fraction (HFrEF). There are evidence-based, guideline-directed medical therapies for patients with HFrEF, but the incremental cost-effectiveness of these therapies has not been well studied using contemporary data. A Markov model with a lifetime horizon and two states, alive or dead, was created. We compared HFrEF patients treated with diuretic agents alone to three treatment arms: 1) ACEI therapy alone; 2) ACEI+BB; and 3) ACEI+BB+AldA. Sequential therapy was also analyzed. HF hospitalizations and mortality rates were based on representative studies. Costs of medications and inpatient and outpatient care were accounted for. Treatment with ACEI and ACEI+BB strictly dominated treatment with diuretics only (cost-saving). The greatest gains in quality-adjusted life-years occurred when all 3 guideline-directed medications were provided. The ICER of ACEI+BB+AldA versus ACEI+BB and of ACEI+BB versus ACEI was
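A two-state (alive/dead) Markov cohort model of the kind described can be sketched in a few lines. All parameter values below are placeholders for illustration, not the study's inputs; annual cycles and a 3% discount rate are assumed as common convention:

```python
def two_state_markov(p_death, utility, annual_cost, discount=0.03, horizon_years=60):
    """Accumulate discounted QALYs and costs for a cohort that starts alive and
    transitions to an absorbing dead state with annual probability p_death."""
    alive = 1.0   # fraction of the cohort still alive
    qalys = 0.0
    costs = 0.0
    for year in range(horizon_years):
        weight = 1.0 / (1.0 + discount) ** year
        qalys += alive * utility * weight
        costs += alive * annual_cost * weight
        alive *= 1.0 - p_death  # move to the absorbing dead state
    return qalys, costs

# Hypothetical comparison: a therapy that lowers annual mortality but adds drug cost
q_base, c_base = two_state_markov(p_death=0.15, utility=0.70, annual_cost=12_000)
q_rx, c_rx = two_state_markov(p_death=0.12, utility=0.70, annual_cost=13_500)
icer = (c_rx - c_base) / (q_rx - q_base)  # dollars per QALY gained
```

A therapy whose costs come out lower while its QALYs come out higher would "strictly dominate" the comparator, as the abstract reports for ACEI and ACEI+BB versus diuretics alone.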

Measuring the Quality of Echocardiography Using the Predictive Value of the Left Ventricular Ejection Fraction. JOURNAL OF THE AMERICAN SOCIETY OF ECHOCARDIOGRAPHY. Heidenreich, P. A., Maddox, T. M., Nath, J. 2013; 26 (3): 237-242

Abstract

One of the main challenges for imaging laboratories is demonstrating the quality of their studies. The aim of this study was to determine if echocardiographic training and experience are associated with the accuracy of left ventricular ejection fraction (LVEF) reporting, using all-cause mortality as the gold standard. Survival was determined for consecutive patients undergoing echocardiography at one of four academic facilities. The relationship between LVEF and survival was determined for different groups of physician readers and sonographers on the basis of board certification and experience. Studies of physicians reading <200 studies were excluded. Data from 63,108 patients and 40 physicians were included. There was moderate variation across physicians in the relationship between LVEF and 1-year mortality (area under the receiver operating characteristic curve interquartile range, 0.56-0.64). The relationship between LVEF and 1-year mortality was stronger for physicians board certified in echocardiography (area under the receiver operating characteristic curve, 0.60; 95% confidence interval, 0.59-0.61) compared with those not certified (area under the receiver operating characteristic curve, 0.56; 95% confidence interval, 0.55-0.57; P < .0001). Physician experience, years since training, and sonographer experience and certification were not clearly associated with the predictive value of LVEF. After adjustment for patient characteristics, the LVEF-mortality association of board-certified physicians remained stronger than that of physicians not certified. LVEF determined by physicians board certified in echocardiography had a stronger relationship with mortality than LVEF determined by those not certified. The LVEF-mortality relationship may be useful as one measure of the quality of imaging.

Abstract

Randomized clinical trials have shown that implantable cardioverter-defibrillator (ICD) therapy saves lives. Whether the survival of patients who received an ICD in primary prevention clinical trials differs from that of trial-eligible patients receiving a primary prevention ICD in clinical practice is unknown. To determine whether trial-eligible patients who received a primary prevention ICD as documented in a large national registry have a survival rate that differs from the survival rate of similar patients who received an ICD in the 2 largest primary prevention clinical trials, MADIT-II (n = 742) and SCD-HeFT (n = 829). Retrospective analysis of data for patients enrolled in the National Cardiovascular Data Registry ICD Registry between January 1, 2006, and December 31, 2007, meeting the MADIT-II criteria (2464 propensity score-matched patients) or the SCD-HeFT criteria (3352 propensity score-matched patients). Mortality data for the registry patients were collected through December 31, 2009. Cox proportional hazards models were used to compare mortality from any cause. The median follow-up time in MADIT-II, SCD-HeFT, and the ICD Registry was 19.5, 46.1, and 35.2 months, respectively. Compared with patients enrolled in the clinical trials, patients in the ICD Registry were significantly older and had a higher burden of comorbidities. In the matched cohorts, there was no significant difference in survival between MADIT-II-like patients in the registry and MADIT-II patients randomized to receive an ICD (2-year mortality rates: 13.9% and 15.6%, respectively; adjusted ICD Registry vs trial hazard ratio, 1.06; 95% CI, 0.85-1.31; P = .62). Likewise, survival among SCD-HeFT-like patients in the registry was not significantly different from survival among patients randomized to receive ICD therapy in SCD-HeFT (3-year mortality rates: 17.3% and 17.4%, respectively; adjusted registry vs trial hazard ratio, 1.16; 95% CI, 0.97-1.38; P = .11). There was no significant difference in survival between clinical trial patients randomized to receive an ICD and a similar group of clinical registry patients who received a primary prevention ICD. Our findings support the continued use of primary prevention ICDs in similar patients seen in clinical practice. ClinicalTrials.gov identifier: NCT00000609.

Abstract

In this issue of Circulation, two studies examine the value (cost-effectiveness) of two rapidly changing technologies: ventricular assist devices (VADs) as a bridge to transplant for patients with heart failure, and left atrial appendage (LAA) occlusion as an alternative to anticoagulation for atrial fibrillation. Both heart failure and atrial fibrillation impose an important economic and health burden on western societies that will only worsen as their populations age. In addition, the high cost of treating these conditions in the United States (US) is increasingly paid by Medicare, resulting in greater taxes and premiums for all. Heart failure is already the most common reason for hospitalization in the US Medicare program, and its prevalence in the US is estimated to grow by 43% to 8 million people by 2030(1). The cost of this care, due solely to the aging of the US population, is expected to increase from $30 billion to $70 billion during the next 20 years. As the number of patients with heart failure grows, so will the number of those with end-stage heart failure. Given that the rate of cardiac transplantation has not increased(2), many patients, providers, and payers will consider the use of VADs as a potential therapy for those not responding to other therapies. Older-generation VADs were shown to improve survival in patients with severe heart failure (REMATCH)(3). More recently, continuous-flow devices have been found to provide even better outcomes and have been used routinely for several years(4). Unfortunately, the devices are expensive, with an acquisition cost near $150,000(5). The use of VADs for both destination therapy and as a bridge to transplant has increased, with estimated VAD costs in the US climbing from $143 million to $479 million in 2009(6).

Abstract

Left ventriculography provided the first imaging of left ventricular function and was historically performed as part of coronary angiography despite a small but significant risk of complications. Because modern noninvasive imaging techniques are more accurate and carry smaller risks, the routine use of left ventriculography is of questionable utility. We sought to analyze the frequency with which left ventriculography was performed during coronary angiography in patients with and without a recent alternative assessment of left ventricular function. We performed a retrospective analysis of insurance claims data from the Aetna health care benefits database including all adults who underwent coronary angiography in 2007. The primary outcome was the concomitant use of left ventriculography during coronary angiography. Of 96,235 patients who underwent coronary angiography, left ventriculography was performed in 78,705 (81.8%). Use of left ventriculography was high in all subgroups, with greatest use in younger patients, those with a diagnosis of coronary disease, and those in the Southern United States. In the population who had undergone a very recent ejection fraction assessment by another modality (within 30 days) and who had no intervening diagnosis of new heart failure, myocardial infarction, hypotension, or shock (37,149 patients), left ventriculography was performed in 32,798 patients (88%), a rate higher than in the overall cohort. Left ventriculography was performed in most coronary angiography cases and often when an alternative imaging modality had been recently completed. New clinical practice guidelines should be considered to decrease the overuse of this invasive test.

Abstract

Hospitals enrolled in the American Heart Association's Get With The Guidelines Program for heart failure (GWTG-HF) have improved their process of care. However, it is unclear whether process of care and outcomes are better in GWTG-HF hospitals than in hospitals not enrolled. We compared hospitals enrolled in GWTG-HF from 2006 to 2007 with other hospitals using data on 4 heart failure process of care measures, 5 noncardiac process measures, risk-adjusted 30-day mortality, and 30-day all-cause readmission after a heart failure hospitalization, as reported by the Centers for Medicare and Medicaid Services (CMS). Among the 4460 hospitals reporting data to CMS, 215 (5%) were enrolled in GWTG-HF. Of the 4 CMS heart failure performance measures, GWTG-HF hospitals had significantly higher documentation of the left ventricular ejection fraction (93.4% versus 88.8%), use of an angiotensin-converting enzyme inhibitor or angiotensin receptor antagonist (88.3% versus 86.6%), and discharge instructions (74.9% versus 70.5%) (P<0.005 for all). Smoking cessation counseling rates were similar (94.1% versus 94.0%; P=0.51). There was no significant difference in compliance with noncardiac processes of care. After heart failure discharge, all-cause readmission at 30 days was 24.5%, and mortality at 30 days after admission was 11.1%. After adjustment for hospital characteristics, 30-day mortality rates were no different (P=0.45). However, 30-day readmission was lower for GWTG-HF hospitals (-0.33%; 95% CI, -0.53% to -0.12%; P=0.002). Although hospitals enrolled in the GWTG-HF program demonstrated better processes of care than other hospitals, there were few clinically important differences in outcomes. Further identification of opportunities to improve outcomes, and inclusion of these metrics in GWTG-HF, may further support the value of GWTG-HF in improving care for patients with HF.

Abstract

Older patients often receive less guideline-concordant care for heart failure than younger patients. To determine whether age differences in heart failure care are explained by patient, provider, and health system characteristics and/or by chart-documented reasons for non-adherence to guidelines. Retrospective cohort study of 2,772 ambulatory veterans with heart failure and left ventricular ejection fraction <40% from a 2004 nationwide medical record review program (the VA External Peer Review Program). Ambulatory use of ACE inhibitors, angiotensin receptor blockers (ARBs), and beta blockers. Among 2,772 patients, mean age was 73 ± 10 years, 87% received an ACE inhibitor or ARB, and 82% received a beta blocker. When patients with explicit chart-documented reasons for not receiving these drugs were excluded, 95% received an ACE inhibitor or ARB and 89% received a beta blocker. In multivariable analyses controlling for a variety of patient and health system characteristics, the adjusted odds ratio for ACE inhibitor and ARB use was 0.43 (95% CI 0.24-0.78) for patients age 80 and over versus those age 50-64 years, and the adjusted odds ratio for beta blocker use was 0.66 (95% CI 0.48-0.93) between the two age groups. The magnitude of these associations was similar but not statistically significant after excluding patients with chart-documented reasons for not prescribing ACE inhibitors or ARBs and beta blockers. A high proportion of veterans receive guideline-recommended medications for heart failure. Older veterans are consistently less likely to receive these drugs, although these differences were no longer significant when accounting for patients with chart-documented reasons for not prescribing these drugs. Closely evaluating reasons for non-prescribing in older adults is essential to assessing whether non-treatment represents good clinical judgment or missed opportunities to improve care.

Abstract

BACKGROUND- The majority of current implantable cardioverter-defibrillator (ICD) recipients are significantly older than those in the ICD trials. Data on periprocedural complications among the elderly are insufficient. We evaluated the influence of age on perioperative complications among primary prevention ICD recipients in the United States. METHODS AND RESULTS- Using the National Cardiovascular Data Registry's ICD Registry, we identified 150 264 primary prevention patients who received ICDs from January 2006 to December 2008. The primary end point was any adverse event or in-hospital mortality. Secondary end points included major adverse events, minor adverse events, and length of stay. Of 150 264 patients, 61% (n=91 863) were 65 years and older. A higher proportion of patients ≥65 years had diabetes, congestive heart failure, atrial fibrillation, renal disease, and coronary artery disease. Approximately 3.4% of the entire cohort had any complication, including death, after ICD implantation. Any adverse event or death occurred in 2.8% of patients under 65 years old; 3.1% of 65- to 69-year-olds; 3.5% of 70- to 74-year-olds; 3.9% of 75- to 79-year-olds; 4.5% of 80- to 84-year-olds; and 4.5% of patients 85 years and older. After adjustment for clinical covariates, multivariate analysis found increased odds of any adverse event or death among 75- to 79-year-olds (odds ratio, 1.14 [95% confidence interval, 1.03 to 1.25]), 80- to 84-year-olds (1.22 [95% confidence interval, 1.10 to 1.36]), and patients 85 years and older (1.15 [95% confidence interval, 1.01 to 1.32]), compared with patients under 65 years old. CONCLUSIONS- Older patients had a modestly increased, but acceptably safe, risk of periprocedural complications and in-hospital mortality, driven mostly by increased comorbidity.

Abstract

To estimate the potentially inappropriate use of implantable cardioverter-defibrillators (ICDs) in older U.S. adults. Retrospective study. The National Cardiovascular Data Registry's ICD Registry. Forty-four thousand eight hundred five individuals who had received ICDs for primary prevention from January 2006 to December 2008. Individuals with a prior myocardial infarction and ejection fraction less than 30% were included. Mortality risk was categorized using the Multicenter Automatic Defibrillator Implantation Trial (MADIT) II risk-stratification system. Low-risk and very-high-risk individuals were considered potentially inappropriate recipients. Of 44,805 individuals, 67% (n = 29,893) were aged 65 and older, of whom 51% were aged 75 and older. A significant proportion of ICD recipients had a low risk of death (16%, n = 6,969) or a very high risk of nonarrhythmic death (8%, n = 3,693). Potentially inappropriate ICD use was 10% in those aged 75 and older, much less than in younger groups (40%, <65; 21%, 65-74; P < .001). Although age was associated with a high risk of nonarrhythmic death, its influence was markedly attenuated after adjusting for comorbidities and timing of ICD implantation (odds ratio = 1.02, 95% confidence interval = 1.02-1.03, P < .001). Potentially inappropriate ICD use appears significantly less frequent, and occurs at modest rates, in older Americans compared with younger age groups. Overall, almost one-quarter of individuals may have received ICDs inappropriately based on their risk of death. Physicians appear to be conservatively referring older adults and wisely deferring those with a high comorbid burden.

Abstract

Very elderly patients (age 80 years and older) with heart failure (HF) are a growing population that is rarely included in clinical trials. The aim of this investigation was to describe the characteristics and outcomes of very elderly patients after a first HF hospitalization. We identified very elderly patients (age 80 years and older) discharged with HF from the Veterans Affairs National Patient Care Database from 1999 to 2008. Outcomes of interest were death during index admission, 30-day and 1-year mortality, and 30-day all-cause and HF readmissions. We used generalized estimating equations to evaluate outcome differences between age groups within the very elderly cohort (ages 80 to 84, 85 to 89, and 90 and older), adjusting for comorbidities, demographics, and clustering by treatment facility. We identified 21 397 very elderly veterans with a first HF hospitalization during the study period. Thirty-day mortality decreased from 14% to 7% and 1-year mortality decreased from 49% to 27% (both P<0.001). Although these improvements were most notable for patients age 90 and older (1-year mortality improved by 25.9%), the adjusted odds of death within 1 year were highest for the oldest veterans (odds ratio, 1.85; 95% confidence interval, 1.64 to 2.09, using the 80- to 84-year age group as reference). For all patients, 30-day all-cause readmissions remained largely unchanged and did not differ between age groups. Mortality for very elderly HF patients has improved over time, but 30-day readmissions remain frequent. Future studies should identify interventions to reduce cardiac and noncardiac rehospitalization of very elderly HF patients.

Abstract

Cardiovascular disease (CVD) is the leading cause of death in the United States and is responsible for 17% of national health expenditures. As the population ages, these costs are expected to increase substantially. To prepare for future cardiovascular care needs, the American Heart Association developed a methodology to project future costs of care for hypertension, coronary heart disease, heart failure, stroke, and all other CVD from 2010 to 2030. This methodology avoided double counting of costs for patients with multiple cardiovascular conditions. By 2030, 40.5% of the US population is projected to have some form of CVD. Between 2010 and 2030, real (2008$) total direct medical costs of CVD are projected to triple, from $273 billion to $818 billion. Real indirect costs (due to lost productivity) for all CVD are estimated to increase from $172 billion in 2010 to $276 billion in 2030, an increase of 61%. These findings indicate that CVD prevalence and costs are projected to increase substantially. Effective prevention strategies are needed if we are to limit the growing burden of CVD.
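The headline growth figures can be checked directly from the rounded numbers in the abstract (because the published percentages come from unrounded projections, the recomputed values differ by up to a point):

```python
# Figures as reported in the abstract, in billions of 2008 dollars
direct_2010, direct_2030 = 273, 818        # total direct medical costs of CVD
indirect_2010, indirect_2030 = 172, 276    # indirect (lost-productivity) costs

direct_growth = direct_2030 / direct_2010  # close to 3.0, i.e., "projected to triple"
indirect_pct = 100 * (indirect_2030 - indirect_2010) / indirect_2010  # ~60%, reported as 61%
```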

Abstract

Outcome data for patients receiving implantable cardioverter-defibrillator (ICD) and cardiac resynchronization therapy-defibrillator (CRT-D) devices treated outside of clinical trials are lacking. No clinical trial has evaluated mortality after device implantation or after shock therapy in large numbers of patients with implanted devices that regularly transmit device data over a network. Survival status was assessed in patients implanted with ICD and CRT devices across the United States from a single manufacturer. Outcomes were compared between patients followed in device clinic settings and those whose devices regularly transmitted remote data over the network, collected an average of 4 times monthly. Shock delivery and electrogram analysis could be ascertained from patients followed on the network, enabling survival after ICD shock to be evaluated. One- and 5-year survival rates in 185,778 patients after ICD implantation were 92% and 68%, and were 88% and 54% for CRT-D device recipients. In 8228 patients implanted with CRT-only devices, survival was 82% and 48% at 1 and 5 years, respectively. For the 69,556 ICD and CRT-D patients receiving remote follow-up on the network, 1- and 5-year survival rates were higher compared with those in the 116,222 patients who received device follow-up in device clinics only (50% reduction; P<0.0001). There were no differences between patients followed on or off the remote network in age, gender, implanted device year or type, or economic or educational status. Shock therapy was associated with subsequent mortality risk for both ICD and CRT-D recipients. Survival after ICD and CRT-D implantation in patients treated in naturalistic practice compares favorably with survival rates observed in clinical trials. Remote follow-up of device data is associated with excellent survival, but arrhythmias that result in device therapy in this population are associated with a higher mortality risk compared with patients who do not require shock therapy.

Divergent Trends in Survival and Readmission Following a Hospitalization for Heart Failure in the Veterans Affairs Health Care System 2002 to 2006. JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY. Heidenreich, P. A., Sahay, A., Kapoor, J. R., Pham, M. X., Massie, B. 2010; 56 (5): 362-368

Abstract

This study sought to determine recent trends over time in heart failure hospitalization, patient characteristics, treatment, rehospitalization, and mortality within the Veterans Affairs health care system. Use of recommended therapies for heart failure has increased in the U.S. However, it is unclear to what extent hospitalization rates and the associated mortality have improved. We compared rates of hospitalization for heart failure, 30-day rehospitalization for heart failure, and 30-day mortality following discharge from 2002 to 2006 in the Veterans Affairs Health Care System. Odds ratios for outcome were adjusted for patient diagnoses within the past year, laboratory data, and for clustering of patients within hospitals. We identified 50,125 patients with a first hospitalization for heart failure from 2002 to 2006. Mean age did not change (70 years), but increases were noted for most comorbidities (mean Charlson score increased from 1.72 to 1.89, p < 0.0001). Heart failure admission rates remained constant at about 5 per 1,000 veterans. Mortality at 30 days decreased (7.1% to 5.0%, p < 0.0001), whereas rehospitalization for heart failure at 30 days increased (5.6% to 6.1%, p = 0.11). After adjustment for patient characteristics, the odds ratio in 2006 (vs. 2002) was 0.54 (95% confidence interval [CI]: 0.47 to 0.61) for mortality, but 1.21 (95% CI: 1.04 to 1.41) for heart failure rehospitalization at 30 days. Recent mortality and rehospitalization rates in the Veterans Affairs Health Care System have trended in opposite directions. These results have implications for using rehospitalization as a measure of quality of care.

Abstract

Readmission after hospitalization for heart failure is common. Early outpatient follow-up after hospitalization has been proposed as a means of reducing readmission rates. However, there are limited data describing patterns of follow-up after heart failure hospitalization and its association with readmission rates. We examined associations between outpatient follow-up within 7 days after discharge from a heart failure hospitalization and readmission within 30 days, in an observational analysis of patients 65 years or older with heart failure who were discharged to home from hospitals participating in the Organized Program to Initiate Lifesaving Treatment in Hospitalized Patients With Heart Failure and the Get With the Guidelines-Heart Failure quality improvement program from January 1, 2003, through December 31, 2006. The outcome was all-cause readmission within 30 days after discharge. The study population included 30,136 patients from 225 hospitals. Median length of stay was 4 days (interquartile range, 2-6), and 21.3% of patients were readmitted within 30 days. At the hospital level, the median percentage of patients who had early follow-up after discharge from the index hospitalization was 38.3% (interquartile range, 32.4%-44.5%). Compared with patients whose index admission was in a hospital in the lowest quartile of early follow-up (30-day readmission rate, 23.3%), the rates of 30-day readmission were 20.5% among patients in the second quartile (risk-adjusted hazard ratio [HR], 0.85; 95% confidence interval [CI], 0.78-0.93), 20.5% among patients in the third quartile (risk-adjusted HR, 0.87; 95% CI, 0.78-0.96), and 20.9% among patients in the fourth quartile (risk-adjusted HR, 0.91; 95% CI, 0.83-1.00). Among patients who are hospitalized for heart failure, substantial variation exists in hospital-level rates of early outpatient follow-up after discharge. Patients who are discharged from hospitals that have higher early follow-up rates have a lower risk of 30-day readmission. clinicaltrials.gov Identifier: NCT00344513.

Abstract

Inclusion of 12-lead electrocardiography (ECG) in preparticipation screening of young athletes is controversial because of concerns about cost-effectiveness. Objective: To evaluate the cost-effectiveness of ECG plus cardiovascular-focused history and physical examination compared with cardiovascular-focused history and physical examination alone for preparticipation screening. Design: Decision-analysis, cost-effectiveness model. Data Sources: Published epidemiologic and preparticipation screening data, vital statistics, and other publicly available data. Target Population: Competitive athletes in high school and college aged 14 to 22 years. Time Horizon: Lifetime. Perspective: Societal. Intervention: Nonparticipation in competitive athletic activity and disease-specific treatment for identified athletes with heart disease. Outcome Measure: Incremental health care cost per life-year gained. Results: Addition of ECG to preparticipation screening saves 2.06 life-years per 1000 athletes at an incremental total cost of $89 per athlete and yields a cost-effectiveness ratio of $42,900 per life-year saved (95% CI, $21,200 to $71,300 per life-year saved) compared with cardiovascular-focused history and physical examination alone. Compared with no screening, ECG plus cardiovascular-focused history and physical examination saves 2.6 life-years per 1000 athletes screened and costs $199 per athlete, yielding a cost-effectiveness ratio of $76,100 per life-year saved ($62,400 to $130,000). Results of Sensitivity Analysis: Results are sensitive to the relative risk reduction associated with nonparticipation and the cost of initial screening. Limitations: Effectiveness data are derived from 1 major European study. Patterns of causes of sudden death may vary among countries. Conclusion: Screening young athletes with 12-lead ECG plus cardiovascular-focused history and physical examination may be cost-effective. Primary Funding Source: Stanford Cardiovascular Institute and the Breetwor Foundation.
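The cost-effectiveness ratios above follow directly from the per-athlete incremental cost and the life-years saved per 1,000 athletes. A minimal sketch of that arithmetic, using the rounded figures quoted in the abstract (so the results differ slightly from the published ratios, which were presumably computed from unrounded model outputs):

```python
# Incremental cost-effectiveness ratio (ICER) for a screening strategy:
# total incremental cost for 1,000 athletes / life-years gained.
def icer(cost_per_athlete, life_years_per_1000):
    return cost_per_athlete * 1000 / life_years_per_1000

# ECG + history/physical vs. history/physical alone:
# $89 per athlete, 2.06 life-years saved per 1,000 athletes.
print(round(icer(89, 2.06)))   # 43204 (published: $42,900 per life-year)

# ECG + history/physical vs. no screening:
# $199 per athlete, 2.6 life-years saved per 1,000 athletes.
print(round(icer(199, 2.6)))   # 76538 (published: $76,100 per life-year)
```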

Abstract

Many hospitals enrolled in the American Heart Association's Get With The Guidelines (GWTG) Program achieve high levels of recommended care for heart failure, acute myocardial infarction (MI), and stroke. However, it is unclear if outcomes are better in those hospitals recognized by the GWTG program for their processes of care. We compared hospitals enrolled in GWTG and receiving achievement awards for high levels of recommended processes of care with other hospitals, using data on risk-adjusted 30-day survival for heart failure and acute MI reported by the Center for Medicare and Medicaid Services. Among the 3,909 hospitals with 30-day data reported by the Center for Medicare and Medicaid Services, 355 (9%) received GWTG achievement awards. Risk-adjusted mortality for hospitals receiving awards was lower for both heart failure (11.0% vs 11.2%, P = .0005) and acute MI (16.1% vs 16.5%, P < .0001) compared with those not receiving awards. After additional adjustment for hospital characteristics and noncardiac performance measures, the reduction in mortality remained significant for GWTG award hospitals for acute myocardial infarction (-0.19%, 95% CI -0.33 to -0.05), but not for heart failure (-0.11%, 95% CI -0.25 to 0.02). Additional adjustment for cardiac processes of care reduced the benefit of award hospitals by 28% for heart failure mortality and 43% for acute MI mortality. Hospitals receiving achievement awards from the GWTG program have modestly lower risk-adjusted mortality for acute MI and, to a lesser extent, heart failure, explained in part by better processes of care.

Abstract

Allowing nonelectrophysiologists to perform implantable cardioverter-defibrillator (ICD) procedures is controversial. However, it is not known whether outcomes of ICD implantation vary by physician specialty. To determine the association of implanting physician certification with outcomes following ICD implantation, we conducted a retrospective cohort study using cases submitted to the ICD Registry performed between January 2006 and June 2007. Patients were grouped by the certification status of the implanting physician into mutually exclusive categories: electrophysiologists, nonelectrophysiologist cardiologists, thoracic surgeons, and other specialists. Hierarchical logistic regression models were developed to determine the independent association of physician certification with outcomes. The outcomes were in-hospital procedural complication rates and the proportion of patients meeting criteria for a defibrillator with cardiac resynchronization therapy (CRT-D) who received that device. Of 111,293 ICD implantations included in the analysis, 78,857 (70.9%) were performed by electrophysiologists, 24,399 (21.9%) by nonelectrophysiologist cardiologists, 1862 (1.7%) by thoracic surgeons, and 6175 (5.5%) by other specialists. Compared with patients whose ICD was implanted by electrophysiologists, patients whose ICD was implanted by either nonelectrophysiologist cardiologists or thoracic surgeons were at increased risk of complications in both unadjusted (electrophysiologists, 3.5% [2743/78,857]; nonelectrophysiologist cardiologists, 4.0% [970/24,399]; thoracic surgeons, 5.8% [108/1862]; P < .001) and adjusted analyses (relative risk [RR] for nonelectrophysiologist cardiologists, 1.11 [95% confidence interval {CI}, 1.01-1.21]; RR for thoracic surgeons, 1.44 [95% CI, 1.15-1.79]). Among 35,841 patients who met criteria for CRT-D, those whose ICD was implanted by physicians other than electrophysiologists were significantly less likely to receive a CRT-D device in both unadjusted (electrophysiologists, 83.1% [21,303/25,635]; nonelectrophysiologist cardiologists, 75.8% [5950/7849]; thoracic surgeons, 57.8% [269/465]; other specialists, 74.8% [1416/1892]; P < .001) and adjusted analyses (RR for nonelectrophysiologist cardiologists, 0.93 [95% CI, 0.91-0.95]; RR for thoracic surgeons, 0.81 [95% CI, 0.74-0.88]; RR for other specialists, 0.97 [95% CI, 0.94-0.99]). In this registry, nonelectrophysiologists implanted 29% of ICDs. Overall, implantation by a nonelectrophysiologist was associated with a higher risk of procedural complications and a lower likelihood of receiving a CRT-D device when indicated, compared with implantation by an electrophysiologist.

Abstract

To evaluate the cost-effectiveness of first-line treatments for hypertension. The Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) found that first-line treatment with lisinopril or amlodipine was not significantly superior to chlorthalidone in terms of the primary endpoint, so differences in costs may be critical for optimizing decision-making. Cost-effectiveness analysis was performed using bootstrap resampling to evaluate uncertainty. Over a patient's lifetime, chlorthalidone was always least expensive (mean $4,802 less than amlodipine, $3,700 less than lisinopril). Amlodipine provided more life-years (LYs) than chlorthalidone in 84% of bootstrap samples (mean 37 days) at an incremental cost-effectiveness ratio of $48,400 per LY gained. Lisinopril provided fewer LYs than chlorthalidone in 55% of bootstrap samples (mean 7-day loss) despite a higher cost. At a threshold of $50,000 per LY gained, amlodipine was preferred in 50%, chlorthalidone in 40%, and lisinopril in 10% of bootstrap samples, but these findings were highly sensitive to the cost of amlodipine and the cost-effectiveness threshold chosen. Incorporating quality of life did not appreciably alter the results. Overall, no reasonable combination of assumptions led to 1 treatment being preferred in over 90% of bootstrap samples. Initial treatment with chlorthalidone is less expensive than lisinopril or amlodipine, but amlodipine provided a nonsignificantly greater survival benefit and may be a cost-effective alternative. A randomized trial with power to exclude "clinically important" differences in survival will often have inadequate power to determine the most cost-effective treatment.
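The bootstrap approach used in this analysis can be sketched as follows. The per-patient costs and life-years below are invented for illustration (they are not ALLHAT data), and the decision rule is the standard incremental-net-benefit test at a willingness-to-pay threshold:

```python
import random

random.seed(0)

# Hypothetical per-patient (cost in $, life-years) pairs for two arms.
arm_a = [(4800 + random.gauss(0, 500), 10.0 + random.gauss(0, 0.5))
         for _ in range(200)]
arm_b = [(9600 + random.gauss(0, 500), 10.1 + random.gauss(0, 0.5))
         for _ in range(200)]

def mean(xs):
    return sum(xs) / len(xs)

def preferred_fraction(a, b, threshold=50_000, n_boot=1000):
    """Fraction of bootstrap resamples in which arm b has positive
    incremental net benefit over arm a: threshold * dLY - dCost > 0."""
    wins = 0
    for _ in range(n_boot):
        ra = [random.choice(a) for _ in a]  # resample with replacement
        rb = [random.choice(b) for _ in b]
        d_cost = mean([c for c, _ in rb]) - mean([c for c, _ in ra])
        d_ly = mean([ly for _, ly in rb]) - mean([ly for _, ly in ra])
        if threshold * d_ly - d_cost > 0:
            wins += 1
    return wins / n_boot

frac = preferred_fraction(arm_a, arm_b)
print(f"arm b preferred in {frac:.0%} of bootstrap samples")
```

As in the abstract, the preferred arm can flip across resamples; reporting the fraction of samples in which each strategy wins conveys decision uncertainty better than a single point estimate.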

Abstract

Aggressive lipid management has recently become the standard of care for patients with coronary heart disease. The safety and effectiveness of statin usage for patients with extremely low low-density lipoprotein (LDL) levels are less clear, however. The aim of this study was to investigate the safety and clinical outcomes of statin treatment in patients with LDL cholesterol levels below 60 mg/dL. A total of 6107 consecutive patients with LDL levels less than 60 mg/dL were identified from a tertiary care medical center or affiliated community clinic. Statin therapy was defined as a prescription during the 150 days after the low LDL value was obtained. The propensity to be treated with a statin was used to adjust the association of statin therapy and survival. A total of 4295 patients (70%) had at least 1 prescription for any medication during the 150-day observation period after the low LDL value. Their mean age was 65 years, 43% had prior ischemic heart disease, and 47% had diabetes mellitus. Statins were prescribed in 2564 patients (60%) after the low LDL value was observed. During a mean follow-up of 2.0±1.4 years after the observation period, there were 510 deaths. After controlling for the propensity to receive a statin, statin therapy was associated with improved survival (hazard ratio [HR], 0.65; 95% CI, 0.53 to 0.80). This lower mortality was also observed for subgroups of patients already taking statins at baseline (HR, 0.58; 95% CI, 0.38 to 0.88), those with extremely low LDL levels (<40 mg/dL, n=623; HR, 0.51; 95% CI, 0.33 to 0.79), and those without a history of ischemic heart disease (n=2438; HR, 0.58; 95% CI, 0.42 to 0.80). Statin use was not associated with an increase in malignancy, transaminase elevation, or rhabdomyolysis. Statin therapy in the setting of a very low LDL level appears to be safe and is associated with improved survival.

Abstract

Although beta-blockers are known to prolong survival for patients with reduced left ventricular ejection fraction, they are often underused. We hypothesized that a reminder attached to the echocardiography report would increase the use of beta-blockers for patients with reduced left ventricular ejection fraction. We randomized 1546 consecutive patients with a left ventricular ejection fraction <45% found on echocardiography at 1 of 3 laboratories to a reminder for use of beta-blockers or no reminder. Patients were excluded from analysis if they died within 30 days of randomization (n=89), did not receive medications through the Veterans Affairs system after 30 days (n=180), or underwent echocardiography at >1 laboratory (n=6). The primary outcome was a prescription for an oral beta-blocker between 1 and 9 months after randomization. The mean age of the 1271 included patients was 69 years; 60% had a history of heart failure, and 51% were receiving treatment with beta-blockers at the time of echocardiography. More patients randomized to the reminder had a subsequent beta-blocker prescription (74%, 458 of 621) compared with those randomized to no reminder (66%, 428 of 650; P=0.002). The effect of the reminder was not significantly different for subgroups based on patient location (inpatient versus outpatient) or prior use of beta-blockers. A reminder attached to the echocardiography report increased the use of beta-blockers in patients with depressed left ventricular systolic function.

Abstract

We tested the hypothesis that one health status measure, the Kansas City Cardiomyopathy Questionnaire (KCCQ), provides prognostic information independent of other clinical data in outpatients with heart failure (HF). Health status measures are used to describe a patient's clinical condition and have been shown to predict mortality in some populations. Their prognostic value may be particularly useful among patients with HF for identifying candidates for disease management in whom increased care may reduce hospitalizations and prevent death. We evaluated 505 HF patients from 13 outpatient clinics who had an ejection fraction <40% using the KCCQ summary score. Proportional hazards regression was used to evaluate the association between the KCCQ summary score (range, 0 to 100; higher scores indicate better health status) and the primary outcome of death or HF admission, adjusting for baseline patient characteristics, 6-min walk distance, and B-type natriuretic peptide (BNP). The mean age was 61 years, 76% of patients were male, 51% had an ischemic HF etiology, and 5% were New York Heart Association functional class IV. At 12 months, among the 9% of patients with a KCCQ score <25, 37% had been admitted for HF and 20% had died, compared with 7% (HF admissions) and 5% (death) of those with a KCCQ score ≥75 (33% of patients, p < 0.0001 for both comparisons). In sequential multivariable models adjusting for clinical variables, 6-min walk, and BNP levels, the KCCQ score remained significantly associated with survival free of HF hospitalization. A low KCCQ score is an independent predictor of poor prognosis in outpatients with HF.

Abstract

Although clopidogrel plus aspirin is more effective than aspirin alone in preventing subsequent vascular events in patients with unstable angina, the cost-effectiveness of this combination has yet to be examined in this high-risk population. Objective: To determine the cost-effectiveness of clopidogrel plus aspirin compared with aspirin alone. Design: Cost-utility analysis. Data Sources: Published literature. Target Population: Patients with unstable angina and electrocardiographic changes or non-Q-wave myocardial infarction. Time Horizon: Lifetime. Perspective: Societal. Intervention: Combination therapy with clopidogrel, 75 mg/d, plus aspirin, 325 mg/d, for 1 year, followed by aspirin monotherapy, was compared with lifelong aspirin therapy, 325 mg/d. Outcome Measures: Lifetime costs, life expectancy in quality-adjusted life-years (QALYs), and the incremental cost-effectiveness ratio. Results of Base-Case Analysis: Patients treated with aspirin alone lived 9.51 QALYs after their initial event and incurred expenses of $127,700; the addition of clopidogrel increased life expectancy to 9.61 QALYs and costs to $129,300. The incremental cost-effectiveness ratio for clopidogrel plus aspirin compared with aspirin alone was $15,400 per QALY. Results of Sensitivity Analyses: The analysis of 1 year of therapy was robust to all sensitivity analyses. In the probabilistic sensitivity analysis, fewer than 3% of simulations resulted in cost-effectiveness ratios over $50,000 per QALY. The cost-effectiveness of longer combination therapy depends critically on the balance of thrombotic event rates, durable efficacy, and the increased bleeding rate in patients taking clopidogrel. Limitations: This analysis may not apply to patients with severe heart failure, those undergoing long-term anticoagulant therapy, those recently managed with revascularization, or those undergoing short-term treatment with glycoprotein IIb/IIIa inhibitors. Conclusion: In patients with high-risk acute coronary syndromes, 1 year of therapy with clopidogrel plus aspirin results in greater life expectancy than aspirin alone, at a cost within the traditional limits of cost-effectiveness. The durable efficacy of clopidogrel relative to the risk for hemorrhage should be further explored before more protracted therapy can be recommended.

Abstract

This study was designed to evaluate the cost-effectiveness of screening patients with a B-type natriuretic peptide (BNP) blood test to identify those with depressed left ventricular systolic function. Asymptomatic patients with depressed ejection fraction (EF) may have less progression to heart failure if they can be identified and treated. We used a decision model to estimate economic and health outcomes for different screening strategies using BNP and echocardiography to detect left ventricular EF <40% for men and women age 60 years. We used published data from community cohorts (gender-specific BNP test characteristics, prevalence of depressed EF) and randomized trials (benefit from treatment). Screening 1,000 asymptomatic patients with BNP followed by echocardiography in those with an abnormal test increased the lifetime cost of care ($176,000 for men, $101,000 for women) and improved outcome (7.9 quality-adjusted life years [QALYs] for men, 1.3 QALYs for women), resulting in a cost per QALY of $22,300 for men and $77,700 for women. For populations with a prevalence of depressed EF of at least 1%, screening with BNP followed by echocardiography increased outcome at a cost <$50,000 per QALY gained. Screening would not be attractive if a diagnosis of left ventricular dysfunction led to significant decreases in quality of life or income. Screening populations with a 1% prevalence of reduced EF (men at age 60 years) with BNP followed by echocardiography should provide a health benefit at a cost that is comparable to or less than that of other accepted health interventions.
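The cost-per-QALY figures above are simply the added lifetime cost divided by the added QALYs per 1,000 patients screened. A one-line check of that arithmetic (rounding accounts for the small difference from the quoted figures):

```python
# Cost per QALY gained = incremental cost / incremental QALYs,
# using the per-1,000-patient figures quoted in the abstract.
def cost_per_qaly(added_cost, added_qalys):
    return added_cost / added_qalys

print(round(cost_per_qaly(176_000, 7.9)))   # 22278 (~$22,300 for men)
print(round(cost_per_qaly(101_000, 1.3)))   # 77692 (~$77,700 for women)
```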

Abstract

The dissemination of clinical practice guidelines often has not been accompanied by desired improvements in guideline adherence. This study evaluated interventions for implementing a new practice guideline advocating the use of beta-blockers for heart failure patients. This was a randomized controlled trial involving heart failure patients (n=169) with an ejection fraction ≤45% and no contraindications to beta-blockers. Patients' primary providers were randomized in a stratified design to 1 of 3 interventions: (1) control: provider education; (2) provider and patient notification: computerized provider reminders and patient letters advocating beta-blockers; and (3) nurse facilitator: a supervised nurse to initiate and titrate beta-blockers. The primary outcome, the proportion of patients who were initiated or uptitrated and maintained on beta-blockers, analyzed by intention to treat, was achieved in 67% (36 of 54) of patients in the nurse facilitator group compared with 16% (10 of 64) in the provider/patient notification and 27% (14 of 51) in the control groups (P<0.001 for the comparisons between the nurse facilitator group and both other groups). The proportion of patients on target beta-blocker doses at the study end (median follow-up, 12 months) was also highest in the nurse facilitator group (43%) compared with the control (10%) and provider/patient notification (2%) groups (P<0.001). There were no differences in adverse events among groups. The use of a nurse facilitator was a successful approach for implementing a beta-blocker guideline in heart failure patients. The use of provider education, clinical reminders, and patient education was of limited value in this setting.

Abstract

To determine if greater managed care market share is associated with greater use of recommended therapies for fee-for-service patients with acute myocardial infarction, we examined the care of 112,900 fee-for-service Medicare beneficiaries aged ≥65 years who resided in one of 320 metropolitan statistical areas and who were admitted with an acute myocardial infarction from February 1994 through July 1995. Use of recommended medical treatments and 30-day survival were determined for areas with low (<10%), medium (10% to 30%), and high (>30%) managed care market share. After adjustment for severity of illness, teaching status of the admission hospital, and area characteristics, areas with high levels of managed care had greater use of beta-blockers (relative risk [RR] for greater use = 1.18; 95% confidence interval [CI]: 1.06 to 1.29) and aspirin at discharge (RR = 1.05; 95% CI: 1.02 to 1.07), but less appropriate coronary angiography (RR = 0.93; 95% CI: 0.86 to 1.01) and reperfusion (RR = 0.95; 95% CI: 0.85 to 1.03) when compared with areas with low levels of managed care. Medicare beneficiaries with fee-for-service insurance who resided in areas with high managed care activity were more likely to have received appropriate treatment with beta-blockers and aspirin, and less likely to have undergone coronary angiography following admission for myocardial infarction. Thus, the effects of managed care may not be limited to managed care enrollees.

Abstract

To review the trends in treatment and survival for patients with acute myocardial infarction over the last 20 years, studies were identified through MEDLINE searches and review of study bibliographies. Additional data were obtained from the Health Care Financing Administration, including data from Medicare claims files (part A). Thirty-day mortality rates were calculated using Medicare data and case fatality rates from the National Hospital Discharge Survey. Published meta-analyses were used to determine treatment effects. Published studies were included if they reported the use of therapies for acute myocardial infarction at a population level. Trends in the demographic characteristics of the patients as well as infarct characteristics, medication use, and revascularization were recorded. The use of acute treatments that are known to improve survival among patients with myocardial infarction has increased markedly during the last 20 years, leading to an estimated 9.6-percentage-point reduction (from 27.0% to 17.4%) in 30-day mortality. After adjusting for potential interactions between therapies, the increase in use of aspirin, beta-blockers, angiotensin-converting enzyme (ACE) inhibitors, and reperfusion can explain 71% of the decrease in the 30-day age- and sex-adjusted mortality rate from 1975 to 1995. The greatest effect of a given therapy was that of aspirin, which accounted for 34% of the decrease in 30-day mortality, followed by thrombolysis (17%), primary angioplasty (10%), beta-blockers (7%), and ACE inhibitors (3%). If other treatments (such as heparin or nonprimary angioplasty), whose effects on mortality are less certain, are included, up to 90% of the decrease in 30-day mortality can be explained by changes in treatment. The primary reason for the decrease in early mortality from myocardial infarction during the last 20 years appears to be increased use of effective treatments.
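The per-therapy contributions quoted above are additive, which is easy to verify:

```python
# Share of the 30-day mortality decline attributed to each therapy
# in the abstract; the sum reproduces the 71% figure.
contributions = {
    "aspirin": 34,
    "thrombolysis": 17,
    "primary angioplasty": 10,
    "beta-blockers": 7,
    "ACE inhibitors": 3,
}
print(sum(contributions.values()))   # 71 (% of the decline explained)

# The quoted mortality drop is an absolute difference in percentage points.
print(round(27.0 - 17.4, 1))         # 9.6
```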

Abstract

Which drug is most effective as a first-line treatment for stable angina is not known. To compare the relative efficacy and tolerability of treatment with beta-blockers, calcium antagonists, and long-acting nitrates for patients who have stable angina, we identified English-language studies published between 1966 and 1997 by searching the MEDLINE and EMBASE databases and reviewing the bibliographies of identified articles to locate additional relevant studies. Randomized or crossover studies comparing antianginal drugs from 2 or 3 different classes (beta-blockers, calcium antagonists, and long-acting nitrates) and lasting at least 1 week were reviewed. Studies were selected if they reported at least 1 of the following outcomes: cardiac death, myocardial infarction, study withdrawal due to adverse events, angina frequency, nitroglycerin use, or exercise duration. Ninety (63%) of 143 identified studies met the inclusion criteria. Two independent reviewers extracted data from selected articles, settling any differences by consensus. Outcome data were extracted a third time by 1 of the investigators. We combined results using odds ratios (ORs) for discrete data and mean differences for continuous data. Studies of calcium antagonists were grouped by duration and type of drug (nifedipine vs nonnifedipine). Rates of cardiac death and myocardial infarction were not significantly different for treatment with beta-blockers vs calcium antagonists (OR, 0.97; 95% confidence interval [CI], 0.67-1.38; P = .79). There were 0.31 (95% CI, 0.00-0.62; P = .05) fewer episodes of angina per week with beta-blockers than with calcium antagonists. Beta-blockers were discontinued because of adverse events less often than were calcium antagonists (OR, 0.72; 95% CI, 0.60-0.86; P

Abstract

The objective of this study was to determine how often providers did not obtain a recommended measure of left ventricular ejection fraction (LVEF) following a high B-type natriuretic peptide (BNP) value when the LVEF was not known to be low (<40%). Such patients may benefit from life-prolonging treatment. We identified consecutive patients (inpatient or outpatient) with a BNP value of at least 200 pg/mL within a single VA health care system (3 inpatient facilities and 8 community clinics) during a 10-month period (September 2008-June 2009). We performed chart review to determine results of any imaging study performed (inside or outside the health system) prior to or after the high BNP value. Of the 296 patients with a high BNP, 212 were not known to have a low LVEF. Of these, 99 (47%) did not have the guideline-recommended follow-up LVEF study. Among those who survived at least 6 months after BNP testing and for whom a follow-up echocardiogram was indicated (no prior LVEF, or prior LVEF >40%), mortality was 20% if an echocardiogram was performed within 6 months of the BNP test and 27% if it was not (P = 0.21). Approximately half of patients with a high BNP and an LVEF not known to be low did not have a follow-up guideline-recommended LVEF study and may have unrecognized heart failure. Our findings suggest that a trial is warranted of a clinical pathway in which patients with a high BNP and without appropriate follow-up are randomized to have their physician receive notification of the high BNP value.

Abstract

Real-world use of traditional heart failure (HF) medications for patients with left ventricular assist devices (LVADs) is not well known. We conducted a retrospective, observational analysis of 1,887 advanced HF patients with and without LVADs from 32 LVAD hospitals participating in the Get With The Guidelines-Heart Failure registry from January 2009 to March 2015. We examined HF medication prescription at discharge, temporal trends, and predictors of prescription among patients with an in-hospital (n = 258) or prior (n = 171) LVAD implant, and those with advanced HF but no LVAD, as defined by a left ventricular ejection fraction ≤25% and in-hospital receipt of intravenous inotropes or vasopressin receptor antagonists (n = 1,458). For β-blockers and angiotensin-converting enzyme inhibitors/angiotensin II receptor blockers (ACEI/ARB), discharge prescription rates were 58.9% and 53.5% for new LVAD patients, 53.8% and 42.9% for prior LVAD patients, and 73.4% and 63.2% for patients without LVAD support, respectively (both P < .0001). Aldosterone antagonist prescription quadrupled among LVAD patients during the study period (P < .0001), whereas ACEI/ARB use decreased nearly 20 percentage points (60.0% to 41.4%, P = .0003). In the multivariable analysis of LVAD patients, patient age was inversely associated with β-blocker, ACEI/ARB, and aldosterone antagonist prescription. Traditional HF therapies were moderately prescribed at discharge to patients with LVADs and were more frequently prescribed to patients with advanced HF without LVAD support. Moderate prescription rates suggest clinical uncertainty in the use of antiadrenergic medication in this population. Further research is needed on the optimal medical regimen for patients with LVADs.

Abstract

The purpose of this study was to determine the temporal trends in adherence to heart failure (HF)-related process of care measures and clinical outcomes among patients with acute decompensated HF with reduced ejection fraction (HFrEF) and end-stage renal disease (ESRD). Previous studies have demonstrated significant underuse of evidence-based HF therapies among patients with coexisting ESRD and HFrEF. However, it is unclear whether the proportional use of evidence-based medical therapies and the associated clinical outcomes among these patients have changed over time. Get With The Guidelines-HF study participants who were admitted for acute HFrEF between January 2005 and June 2014 were stratified into 3 groups on the basis of their admission renal function: normal renal function, renal insufficiency without dialysis, and dialysis. Temporal change in proportional adherence to the HF-related process of care measures and in the incidence of clinical outcomes (1-year mortality, HF hospitalization, and all-cause hospitalization) during the study period was evaluated across the 3 renal function groups. The study included 111,846 patients with HFrEF from 390 participating centers, of whom 19% had renal insufficiency not requiring dialysis and 3% were on dialysis. There was a significant temporal increase in adherence to evidence-based medical therapies (angiotensin-converting enzyme inhibitor/angiotensin receptor blocker: p trend <0.0001; β-blockers: p trend = 0.0089; post-discharge follow-up referral: p trend <0.0001) and to defect-free composite care (p trend <0.0001) among dialysis patients. An improvement in adherence to these measures was also observed among patients with normal renal function and patients with renal insufficiency not requiring dialysis. There was no significant change in the cumulative incidence of clinical outcomes over time among HF patients on dialysis. In a large contemporary cohort of HFrEF patients with ESRD, adherence to the HF process of care measures has improved significantly over the past 10 years. Unlike in patients with normal renal function, there was no significant change in 1-year clinical outcomes over time among HF patients on dialysis.

Abstract

This study sought to determine the variation in annual health care costs among patients with heart failure in the Veterans Affairs (VA) system. Heart failure is associated with considerable use of health care resources, but little is known about patterns in patient characteristics related to higher costs. We obtained VA utilization and cost records for all patients with a diagnosis of heart failure in fiscal year 2010. We compared total VA costs by patient demographic factors, comorbid conditions, and facility where patients were treated in bivariate analyses. We regressed total costs on patient factors alone, on VA facility alone, and on all factors combined to determine the relative contributions of patient factors and facility to cost differences. There were 117,870 patients with heart failure, and their mean annual VA costs were $30,719 (SD $49,180), with more than one-half of costs from inpatient care. Patients at younger ages, of Hispanic or black race/ethnicity, diagnosed with comorbid drug use disorders, or who died during the year had the highest costs (all p < 0.01). There was variation in costs by facility, as mean adjusted costs ranged from approximately $15,000 to $48,000. In adjusted analyses, patient factors alone explained more of the variation in health care costs (R² = 0.116) than the facility where the patient was treated (R² = 0.018). A large variation in costs of heart failure patients was observed across facilities, although this was explained largely by patient factors. Improving the efficiency of VA resource utilization may require increased scrutiny of high-cost patients to determine whether adequate value is being delivered to those patients.

Abstract

Hospital to Home (H2H) is a national quality improvement initiative sponsored by the Institute for Healthcare Improvement and the American College of Cardiology, with the goal of reducing readmission for patients hospitalized with heart disease. We sought to determine the impact of H2H within the Veterans Affairs (VA) health care system. Using a controlled interrupted time series, we determined the association of VA hospital enrollment in H2H with the primary outcome of 30-day all-cause readmission following a heart failure hospitalization. VA heart failure providers were surveyed to determine quality improvement projects initiated in response to H2H. Secondary outcomes included initiation of recommended H2H projects, follow-up within 7 days, and total hospital days at 30 days and 1 year. Sixty-five of 104 VA hospitals (66%) enrolled in the national H2H initiative. Hospital characteristics associated with H2H enrollment included provision of tertiary care, academic affiliation, and greater use of home monitoring. There was no significant difference in mean 30-day readmission rates (20.0% ± 5.0% for H2H vs 19.3% ± 5.9% for non-H2H hospitals; P = .48). The mean fraction of patients with a cardiology visit within 7 days was slightly higher for H2H hospitals (3.0% ± 2.4% for H2H vs 2.0% ± 1.9% for non-H2H hospitals; P = .05). Patients discharged from H2H hospitals had fewer mean hospital days during the following year (7.6 ± 2.6 for H2H vs 9.2 ± 3.0 for non-H2H; P = .01) early after the launch of H2H, but the effect did not persist. VA hospitals enrolling in H2H had slightly more early follow-up in cardiology clinic but no difference in 30-day readmission rates compared with hospitals not enrolling in H2H.

Abstract

This study assessed the comparative frequency of precipitating clinical factors leading to hospitalization among heart failure (HF) patients with reduced, borderline, and preserved ejection fraction (EF). There are few data assessing the comparative frequency of clinical factors leading to HF hospitalization among patients with reduced, borderline, and preserved EF. We analyzed the factors potentially contributing to HF hospitalization among 99,825 HF admissions from 305 hospitals in the Get With The Guidelines-HF (GWTG-HF) database between January 2005 and September 2013 and assessed their association with length of stay and in-hospital mortality. Mean patient age was 72.6 ± 14.2 years, 49% were female, and mean EF was 39.3 ± 17.2%. Common factors included pneumonia/respiratory process (28.2%), arrhythmia (21.7%), medication noncompliance (15.8%), worsening renal failure (14.7%), and uncontrolled hypertension (14.5%). In patients with borderline EF (EF 40% to 49%), pneumonia was associated with longer hospital stay, whereas dietary and medication noncompliance were associated with reduced length of stay. In patients with preserved EF (EF ≥50% or qualitative assessment of normal or mild dysfunction), pneumonia, weight gain, and worsening renal function were independently associated with longer length of stay. Worsening renal function and pneumonia were independently associated with higher in-hospital mortality in all HF groups, and acute pulmonary edema was associated with higher mortality in reduced EF. Dietary noncompliance (14.7%) was associated with reduced mortality in all groups but reached statistical significance in the subgroups of reduced (odds ratio [OR]: 0.65; 95% confidence interval [CI]: 0.46 to 0.91) and preserved systolic function (OR: 0.52; 95% CI: 0.33 to 0.83). Patients presenting with ischemia had a higher mortality rate (OR: 1.31; 95% CI: 1.02 to 1.69; and OR: 1.72; 95% CI: 1.27 to 2.33, respectively, in the 2 groups). Potential precipitating factors among patients hospitalized with HF vary by EF group and are independently associated with clinical outcomes.

Abstract

Randomized trials of left atrial appendage (LAA) closure with the Watchman device have shown varying results, and its cost effectiveness compared with anticoagulation has not been evaluated using all available contemporary trial data. We used a Markov decision model to estimate lifetime quality-adjusted survival, costs, and cost effectiveness of LAA closure with Watchman, compared directly with warfarin and indirectly with dabigatran, using data from the long-term (mean 3.8-year) follow-up of the Percutaneous Closure of the Left Atrial Appendage Versus Warfarin Therapy for Prevention of Stroke in Patients With Atrial Fibrillation (PROTECT AF) and Prospective Randomized Evaluation of the Watchman LAA Closure Device in Patients With Atrial Fibrillation (PREVAIL) randomized trials. Using data from PROTECT AF, the incremental cost-effectiveness ratios compared with warfarin and dabigatran were $20,486 and $23,422 per quality-adjusted life year, respectively. Using data from PREVAIL, LAA closure was dominated by warfarin and dabigatran, meaning that it was less effective (8.44, 8.54, and 8.59 quality-adjusted life years, respectively) and more costly. At a willingness-to-pay threshold of $50,000 per quality-adjusted life year, LAA closure was cost effective 90% and 9% of the time under PROTECT AF and PREVAIL assumptions, respectively. These results were sensitive to the rates of ischemic stroke and intracranial hemorrhage for LAA closure and medical anticoagulation. Using data from the PROTECT AF trial, LAA closure with the Watchman device was cost effective; using PREVAIL trial data, Watchman was more costly and less effective than warfarin and dabigatran. PROTECT AF enrolled more patients and has substantially longer follow-up time, allowing greater statistical certainty in the cost-effectiveness results. However, longer-term trial results and postmarketing surveillance of major adverse events will be vital to determining the value of the Watchman in clinical practice.
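The cost-effectiveness comparison above rests on incremental cost-effectiveness ratio (ICER) arithmetic, including the "dominated" case reported under PREVAIL assumptions. Here is a minimal sketch of that logic; the numbers in the example are illustrative assumptions, not the trial's actual inputs.

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Return the ICER in $/QALY, or a dominance label.

    A strategy is 'dominated' when it is more costly and no more effective
    than the reference (the PREVAIL-based result for LAA closure), and
    'dominant' when it is cheaper and more effective.
    """
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_qaly <= 0 and d_cost >= 0:
        return "dominated"
    if d_qaly > 0 and d_cost <= 0:
        return "dominant"
    return d_cost / d_qaly  # dollars per quality-adjusted life year gained

# Illustrative: $4,000 extra lifetime cost for a 0.2-QALY gain is
# $20,000/QALY, under a $50,000/QALY willingness-to-pay threshold.
print(round(icer(34_000, 8.6, 30_000, 8.4)))
# More costly AND fewer QALYs than the reference -> dominated.
print(icer(35_000, 8.44, 30_000, 8.59))
```

A willingness-to-pay threshold then turns the ratio into a decision: strategies with an ICER below the threshold (or that are dominant) are considered cost effective.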

Abstract

Reducing hospital readmissions for patients with heart failure is a national priority, and quality improvement campaigns are targeting reductions of ≥20%. However, there are limited data on whether such targets have been met. We analyzed data from the American Heart Association's Get With The Guidelines-Heart Failure registry linked to Medicare claims between 2009 and 2012 to describe trends in and relative reductions of 30-day all-cause readmission rates among patients with heart failure. A total of 21,264 patients with heart failure were included from 70 US sites from January 2009 to October 2012. Overall hospital-level, risk-adjusted, 30-day all-cause readmission rates declined slightly, from 20.0% (SD, 1.3%) in 2009 to 19.0% (SD, 1.2%) in 2012 (P=0.001). Only 1 in 70 (1.4%) hospitals achieved a 20% relative reduction in 30-day risk-adjusted readmission rates. A multivariable linear regression model was used to determine hospital-level factors associated with relative improvements in 30-day risk-adjusted readmissions between 2009 and 2012. Teaching hospitals had higher relative readmission rates compared with their peers, and hospitals that used postdischarge heart failure disease management programs had lower relative readmission rates. Although there has been slight improvement in 30-day all-cause readmission rates during the past 4 years in patients with heart failure, few hospitals have seen large success.

Abstract

The Acute Coronary Treatment and Intervention Outcomes Network Registry-Get With The Guidelines (ACTION Registry-GWTG) was designed to measure and improve the treatment and outcomes of patients with acute myocardial infarction (AMI), yet it is unknown whether performance on Medicare Hospital Compare metrics and outcomes differ between hospitals participating and those not participating in the registry. Using 2007 to 2010 Hospital Compare data, we matched participating to nonparticipating hospitals based on teaching status, size, percutaneous coronary intervention capability, and baseline (2007) Hospital Compare AMI process measure performance. We used linear mixed modeling to compare 2010 Hospital Compare process measure adherence, 30-day risk-adjusted mortality, and readmission rates. We repeated these analyses after stratification according to baseline performance level. Compared with nonparticipating hospitals, participating hospitals were larger (median 288 vs 139 beds, P < .0001), more often teaching hospitals (18.8% vs 6.3%, P < .0001), and more likely to have interventional catheterization lab capabilities (85.7% vs 34.0%, P < .0001). Among 502 matched pairs of participating and nonparticipating hospitals, we found high levels of process measure adherence in both 2007 and 2010, with minimal differences between the groups. Rates of 30-day mortality and readmission in 2010 were also similar between the groups. Results were consistent across strata of baseline performance level. In this observational analysis, there were no significant differences in performance on Hospital Compare process measures or outcomes between hospitals in ACTION Registry-GWTG and hospitals not in the registry. However, baseline performance on the Hospital Compare process measures was very high in both groups, suggesting the need for new quality improvement foci to further improve patient outcomes.

Abstract

Do-not-resuscitate (DNR) orders reflect an important means of respecting patient autonomy while minimizing the risk of nonbeneficial interventions. We sought to clarify trends and differences in rates of DNR orders for patients hospitalized with heart failure. We used statewide data from California's Healthcare Cost and Utilization dataset (2007-2010) to determine trends in DNR orders within 24 hours of admission for patients with a primary discharge diagnosis of heart failure. Among 347,541 hospitalizations for heart failure, the rate of DNR orders within 24 hours increased from 10.4% in 2007 to 11.3% in 2010 (P < .0001). After adjustment, DNR status correlated with older age, female gender, white race, frequent comorbidities (Charlson score), and residence in a higher income area (P

Abstract

The Institute of Medicine recommends that people with serious advanced illness have access to skilled palliative care. However, the predominant delivery model of nonhospice palliative care is inpatient, consultative care focused on the end of life, delivered by a small specialist palliative care workforce. The study objective was to understand organizational factors that could influence the adoption and scale-up of outpatient palliative care in chronic advanced illness, using the example of heart failure. This was a cross-sectional qualitative study. Participants were 17 health care providers and local, regional, and national health system leaders from the Veterans Health Administration (VHA) who were considering whether and how to adopt and sustain outpatient palliative care. Individual interviews using semistructured questions assessed domains of the Consolidated Framework for Implementation Science. Most providers and leaders perceived outpatient palliative care as a high priority in the VHA given its patient-centeredness and potential to decrease the health care use and costs associated with conditions like heart failure. They also supported a collaborative care team model of outpatient palliative care delivery in which a palliative care specialist collaborates with medical nurses and social workers. They reported a lack of performance measures and incentives for patient-centered care processes and outcomes as a potential barrier to implementation. Features of outpatient palliative care viewed as important for successful adoption and scale-up included coordination and communication with other providers, ease of integration into existing programs, and evidence of improving quality of care while not substantially increasing overall health care costs. Incentives such as performance measures and collaboration with local VHA providers and leaders could improve adoption and scale-up of outpatient palliative care.

Abstract

In contrast to chronic heart failure (CHF), performance on measures of quality of care for chronic obstructive pulmonary disease (COPD) is poor. Our objective was to examine differences in the organizational structure available to support quality of care for patients with CHF and COPD. We performed 2 nationwide surveys exploring organizational structure for the management of CHF and COPD, surveying the chief of medicine and the chiefs of cardiology and pulmonary medicine at 120 Veterans Affairs facilities in the United States. Analogous questions about organizational structure that enhanced adherence to guideline-based care were compared between the CHF and COPD surveys. We found large and notable differences in the organizational structure for disease management, with systematically less attention given to COPD than to CHF. These differences were evident in multiple processes of care. Key differences included fewer facilities having COPD clinics than CHF clinics (12.7% vs 50.8%; P < .01), sharing performance measures with COPD providers than with CHF providers (17.1% vs 70%; P < .01), and having home monitoring programs for COPD than for CHF (50.5% vs 87.4%; P < .01). Despite the growing burden of COPD, less organizational structure existed for COPD than for CHF. The lack of organizational structure for COPD likely impedes an organization's ability to encourage high-quality care and avoid recently implemented hospital readmission penalties. Our results suggest the need to develop a systematic approach for health care systems to provide essential organizational structure based on the burden of disease in the population.

Abstract

In clinical trials, hydralazine-isosorbide dinitrate (H-ISDN) for heart failure with reduced ejection fraction reduced morbidity and mortality among black patients and patients with intolerance to angiotensin-converting enzyme inhibitors or angiotensin II receptor blockers. The effectiveness of H-ISDN in clinical practice is unknown. Using data from a clinical registry linked with Medicare claims, we examined the use and outcomes of H-ISDN between 2005 and 2011 among older patients hospitalized with heart failure and reduced ejection fraction. We adjusted for demographic and clinical characteristics using Cox proportional hazards models and inverse probability weighting. Among 4663 eligible patients, 22.7% of black patients and 18.2% of patients not on an angiotensin-converting enzyme inhibitor or angiotensin II receptor blocker were newly prescribed H-ISDN therapy at discharge. By 3 years, the cumulative incidence rates of mortality and readmission were similar between treated and untreated patients. After multivariable adjustment, 3-year outcomes remained similar for mortality (black patients: hazard ratio [HR], 0.92; 95% confidence interval [CI], 0.75-1.13; other patients: HR, 0.93; 95% CI, 0.79-1.09), all-cause readmission (black patients: HR, 0.98; 95% CI, 0.84-1.13; other patients: HR, 1.02; 95% CI, 0.90-1.17), and cardiovascular readmission (black patients: HR, 0.99; 95% CI, 0.82-1.19; other patients: HR, 0.94; 95% CI, 0.81-1.09). A post hoc analysis of Medicare Part D data revealed low postdischarge adherence to therapy. Guideline-recommended initiation of H-ISDN therapy at hospital discharge was uncommon, and adherence was low. For both black patients and patients of other races, there were no differences in outcomes between those treated and untreated at discharge.

Abstract

Obesity remains a significant public health concern. One of the primary messages from providers and health-care organizations is to eat healthier foods with lower fat. Many in the lay press, however, have suggested that lower-fat versions of foods contain more sugar. To our knowledge, a systematic comparison of the sugar content of foods with their lower-fat alternatives has not been performed. In this study, we compared fat-free, low-fat, and regular versions of the same foods using data collected from the USDA National Nutrient Database. We found that the amount of sugar is higher in the low-fat (that is, reduced-calorie, light, low-fat) and non-fat versions than in the 'regular' versions of tested items (Friedman P=0.00001; Wilcoxon P=0.0002 for low-fat vs regular food and P=0.0003 for non-fat vs regular food). Our data support the general belief that food that is lower in fat may contain more sugar.

Abstract

The influence of race on quality of anticoagulation control is not well described. We examined the association between race, international normalized ratio (INR) monitoring intensity, and INR control in warfarin-treated patients with atrial fibrillation (AF). Using data from the Veterans Health Administration (VHA), we performed a retrospective cohort study of 184,161 patients with a new diagnosis of AF/flutter from 2004 to 2012 who received any VHA prescription within 90 days of diagnosis. The primary predictor was race, ascertained from multiple VHA and linked Medicare demographic files. The primary outcome was first-year and long-term time in therapeutic range (TTR) of INR 2.0 to 3.0. Secondary outcomes were INR monitoring intensity and warfarin persistence. Of the 116,021 patients who received warfarin in the cohort, INR monitoring intensity was similar across racial groups. However, TTR was lowest in blacks and highest in whites (first year 0.49 ± 0.23 vs 0.57 ± 0.21, p <0.001; long term 0.52 ± 0.20 vs 0.59 ± 0.18, p <0.001); 64% of whites and 49% of blacks had long-term TTR >55% (p <0.001). After adjusting for site and patient-level covariates, black race was associated with lower first-year and long-term TTRs (4.2% and 4.1% below the conditional mean, relative to whites; p <0.0001 for both). One-year warfarin persistence was slightly lower in blacks compared to whites (58% vs 60%, p <0.0001). In conclusion, in patients with AF anticoagulated with warfarin, differences in INR control are most evident among blacks, underscoring the need to determine if other types of intensive management or warfarin alternatives may be necessary to improve anticoagulation among vulnerable AF populations.
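The time in therapeutic range (TTR) outcome above is conventionally computed with the Rosendaal linear-interpolation method, which assigns each day between INR tests an interpolated INR value. The study does not publish its code, so the following is a hedged sketch under that assumption, with a made-up INR series.

```python
def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Fraction of inter-test days whose linearly interpolated INR
    falls within [low, high] (Rosendaal method, simplified sketch).

    `days` are test dates as day numbers; `inrs` are the measured values.
    """
    in_range = total = 0
    for (d0, inr0), (d1, inr1) in zip(zip(days, inrs),
                                      zip(days[1:], inrs[1:])):
        span = d1 - d0
        for step in range(span):
            # INR assumed to drift linearly between consecutive tests.
            inr = inr0 + (inr1 - inr0) * step / span
            if low <= inr <= high:
                in_range += 1
        total += span
    return in_range / total

# Example: INR rises from 1.5 to 2.5 over 10 days, then holds at 2.5.
# Days 0-4 are subtherapeutic; days 5-19 are in range.
print(rosendaal_ttr([0, 10, 20], [1.5, 2.5, 2.5]))  # → 0.75
```

Real implementations also handle gaps between tests that are too long to interpolate across; that detail is omitted here.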

Abstract

Heart failure (HF) is associated with frequent exacerbations and shortened lifespan. Informal caregivers such as significant others often support self-management in patients with HF. However, existing programs that aim to enhance self-management seldom engage informal caregivers or provide tools that can help alleviate caregiver burden or improve collaboration between patients and their informal caregivers. Our objective was to develop and pilot test a program targeting the self-management support needs of HF patients as well as their significant others. We developed the Dyadic Health Behavior Change model and conducted semi-structured interviews to determine barriers to self-management from various perspectives. Participants' feedback was used to develop a family-centered self-management program called "SUCCEED: Self-management Using Couples' Coping EnhancEment in Diseases." The goals of this program are to improve HF self-management, quality of life, communication within couples, relationship quality, and stress and caregiver burden. We conducted a pilot study with 17 Veterans with HF and their significant others to determine acceptability of the program. We administered psychosocial surveys at baseline and after program completion to evaluate change in depressive symptoms, caregiver burden, self-management of HF, communication, quality of relationship, relationship mutuality, and quality of life. Of the 17 couples, 14 completed at least 1 SUCCEED session. Results showed high acceptability for each of SUCCEED's sessions. At baseline, patients reported poor quality of life, clinically significant depressive symptoms, and inadequate self-management of HF. After participating in SUCCEED, patients showed improvements in self-management of HF, communication, and relationship quality, while caregivers reported improvements in depressive symptoms and caregiver burden. Quality of life of both patients and significant others declined over time. In this small pilot study, we observed positive trends from involving significant others in self-management. SUCCEED has the potential to address the growing public health problem of HF among patients who receive care from a significant other.

Abstract

Clinical decision support (CDS) systems with complex logic are being developed. Ensuring the quality of CDS is imperative, but there is no consensus on testing standards. We tested the ATHENA-HTN CDS after encoding updated hypertension guidelines into the system. A logic flow analysis and a complexity analysis of the encoding were performed to guide testing. One hundred test cases were selected to test the major pathways in the CDS logic flow, and the effectiveness of the testing was analyzed. The encoding contained 26 decision points and 3120 possible output combinations. The 100 cases selected tested all of the major pathways in the logic, but only 1% of the possible output combinations. Test case selection is one of the most challenging aspects of CDS testing and has a major impact on testing coverage. A test selection strategy should take into account the complexity of the system, identification of major logic pathways, and available resources.
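The coverage gap described above (all major pathways exercised, yet only about 1% of output combinations) can be illustrated with a toy stand-in for the CDS logic. The `evaluate` function and its three binary decision points below are hypothetical, not ATHENA-HTN's actual rules; the point is only how combination coverage is measured.

```python
from itertools import product

def evaluate(case):
    """Toy stand-in for CDS logic: 3 binary decision points -> one
    output combination. A real encoding (26 decision points) yields a
    far larger combination space."""
    return (case["bp_high"], case["on_ace_inhibitor"], case["renal_ok"])

# Every output combination the toy logic could produce (2^3 = 8).
all_combos = set(product([True, False], repeat=3))

# A small test suite: each case exercises one pathway.
test_cases = [
    {"bp_high": True, "on_ace_inhibitor": False, "renal_ok": True},
    {"bp_high": False, "on_ace_inhibitor": False, "renal_ok": True},
]

# Coverage = distinct output combinations actually produced by the suite.
covered = {evaluate(c) for c in test_cases}
print(f"{len(covered)}/{len(all_combos)} output combinations covered")
```

Because the combination space grows exponentially with the number of decision points, a suite can cover every major pathway while still touching only a small fraction of output combinations, which is the situation the study reports.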

Abstract

To characterize warfarin eligibility and receipt among Veterans Health Administration (VHA) patients with and without mental health conditions (MHCs). Retrospective cohort study. This observational study identified VHA atrial fibrillation (AF) patients with and without MHCs in 2004. We examined unadjusted MHC-related differences in warfarin eligibility and warfarin receipt among warfarin-eligible patients, using logistic regression for any MHC and for specific MHCs (adjusting for sociodemographic and clinical characteristics). Of 125,670 patients with AF, most (96.8%) were warfarin-eligible based on a CHADS2 stroke risk score. High stroke risk and contraindications to anticoagulation were both more common in patients with MHCs. Warfarin-eligible patients with MHCs were less likely to receive warfarin than those without MHCs (adjusted odds ratio [AOR], 0.90; 95% CI, 0.87-0.94). The association between MHC and warfarin receipt among warfarin-eligible patients varied by specific MHC. Patients with anxiety disorders (AOR, 0.86; 95% CI, 0.80-0.93), psychotic disorders (AOR, 0.77; 95% CI, 0.65-0.90), and alcohol use disorders (AOR, 0.62; 95% CI, 0.54-0.72) were less likely to receive warfarin than patients without these conditions, whereas patients with depressive disorders and posttraumatic stress disorder were no less likely to receive warfarin than patients without these conditions. Compared with AF patients without MHCs, those with MHCs were less likely to be eligible for warfarin and, among those eligible, less likely to receive it. Although AF patients with MHCs need careful assessment of bleeding risk, this finding suggests potential missed opportunities for more intensive therapy among some individuals with MHCs.

Abstract

There are limited data on mortality outcomes associated with use of amiodarone in atrial fibrillation and flutter (AF). We evaluated the association of amiodarone use with mortality in patients with newly diagnosed AF using complete data from the Department of Veterans Affairs national health care system. We included patients seen in an outpatient setting within 90 days of a new diagnosis of nonvalvular AF between Veterans Affairs fiscal years 2004 and 2008. Multivariate analysis and propensity-matched Cox proportional hazards regression were used to evaluate the association of amiodarone use with death. Of 122,465 patients (353,168 person-years of follow-up; age 72.1 ± 10.3 years; 98.4% male), amiodarone was prescribed in 11,655 (9.5%). Cumulative, unadjusted mortality rates were higher for amiodarone recipients than for nonrecipients (87 vs 73 per 1,000 person-years, P < .001). However, in multivariate and propensity-matched survival analyses, there was no significant difference in mortality (multivariate hazard ratio 1.01, 95% CI 0.97-1.05, P = .51; propensity-matched hazard ratio 1.02, 95% CI 0.97-1.07, P = .45). The hazard of death was not modified by age, sex, heart failure, kidney function, β-blocker use, or warfarin use, but there was evidence of effect modification among patients diagnosed with AF as an inpatient versus as an outpatient. In a national health care system population of newly diagnosed AF, overall use of amiodarone as an early treatment strategy was not associated with mortality.

Abstract

Mineralocorticoid receptor antagonists (MRAs) have been shown to reduce morbidity and mortality in patients with heart failure (HF) with reduced ejection fraction but are associated with hyperkalemia. We sought to evaluate the frequency of, variation in, and predictors associated with serum potassium monitoring in patients with HF initiated on an MRA among facilities in the Veterans Affairs (VA) Health Care System. We performed a retrospective cohort analysis of patients with HF across 133 VA facilities from 2003 to 2013 who were given a new prescription for an MRA. The primary outcome was the mean percentage of patients per facility with serum potassium monitoring within 14 days of MRA dispensing. Univariate and covariate analyses were performed to determine factors associated with monitoring. There were 142,880 patients identified with HF initiated on an MRA who met the study inclusion and exclusion criteria. The mean percentage of patients per facility with serum potassium monitoring within 14 days was 41.6% (SD 8.0%; range 18.9%-56.7%). A higher frequency of monitoring was associated with membership in the Council on Teaching Hospitals (n = 70, P < .0001), academic affiliation (n = 100, P < .0001), and a higher annual volume of patients with HF (≥200 patients, P < .0001). In a large multicenter national sample of patients with HF receiving a new MRA prescription, the frequency of serum potassium monitoring was below recommended guidelines. Academic facilities and those with a higher volume of patients with HF had an increased frequency of monitoring.

Abstract

Implantable cardioverter-defibrillator (ICD) therapy is associated with improved outcomes in patients with heart failure (HF), but whether this association holds among older patients with multiple comorbid illnesses and worse HF burden remains unclear. Using the National Cardiovascular Data Registry's ICD Registry and the Get With The Guidelines-Heart Failure (GWTG-HF) registry linked with Medicare claims, we examined outcomes associated with primary-prevention ICD versus no ICD among HF patients aged ≥65 years in clinical practice. We included patients with an ejection fraction ≤35% who received (ICD Registry) and who did not receive (GWTG-HF) an ICD. Compared with patients with an ICD, patients in the non-ICD group were older and more likely to be female and white. In matched cohorts, the 3-year adjusted mortality rate was lower in the ICD group than in the non-ICD group (46.7% versus 55.8%; adjusted hazard ratio [HR] 0.76; 95% CI 0.69 to 0.83). There was no associated difference in all-cause readmission (HR 0.99; 95% CI 0.92 to 1.08) but a lower risk of HF readmission (HR 0.88; 95% CI 0.80 to 0.97). Compared with no ICD, ICDs were also associated with better survival in patients with ≤3 comorbidities (HR 0.77; 95% CI 0.69 to 0.87) and >3 comorbidities (HR 0.77; 95% CI 0.64 to 0.93), and in patients with no prior hospitalization for HF (HR 0.75; 95% CI 0.65 to 0.86) and at least 1 prior HF hospitalization (HR 0.69; 95% CI 0.58 to 0.82). In subgroup analyses, there were no interactions between ICD and mortality risk for comorbidity burden (P=0.95) or for prior HF hospitalization (P=0.46). Among older HF patients, ICDs for primary prevention were associated with a lower risk of mortality even among those with a high comorbid illness burden and prior HF hospitalization.

Abstract

There is substantial opportunity to reduce health care costs through prevention of heart failure. Team-based management of medical homes and large populations will be important for the success of any prevention interventions. Clinical trials are needed to show that treatment reduces the incidence of heart failure. A team-based approach to treatment of asymptomatic left ventricular systolic dysfunction (LVSD) can work well with the availability of electronic medical records and a population approach to health. Attention should be given to optimizing risk factor reduction and preventive treatment with angiotensin-converting enzyme inhibitors/angiotensin receptor blockers and β-blockers if LVSD is present.

Abstract

There is significant variation in the delivery of evidence-based care for patients with heart failure (HF), but there is limited evidence defining the best methods to improve the quality of care. We performed a cluster-randomized trial of personalized site performance feedback at 147 hospitals participating in the Get With The Guidelines-Heart Failure quality improvement program from October 2009 to March 2011. The intervention provided sites with specific data on their heart failure achievement and quality measures in addition to the usual Get With The Guidelines-Heart Failure tools. The primary outcome for our trial was improvement in the site composite quality of care score. Overall, 73 hospitals (n=33 886 patients) received the intervention, whereas 74 hospitals (n=37 943 patients) did not. One year after the intervention, both the intervention and control arms had a similar mean change in percentage points in their composite quality score (absolute change, +0.31 [SE, 1.51] versus +3.18 [SE, 1.68] in control; P=0.21). Similarly, none of the individual achievement measures or quality measures improved more at intervention versus control hospitals. Our site-based intervention, which included personalized site feedback on adherence to quality metrics, did not elicit quality improvement beyond that already associated with participation in the Get With The Guidelines-Heart Failure program. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00979264.

Abstract

Million Hearts is a national initiative to prevent 1 million heart attacks and strokes over 5 years by improving cardiovascular prevention. An important tool in the success of programs like Million Hearts is public ranking of practices by quality, yet different measures may produce different rankings, making the true quality of practices difficult to discern. We evaluated the quality of ambulatory cardiology care using performance measure metrics. We compared rankings of practices participating in the National Cardiovascular Data Registry's Practice Innovation and Clinical Excellence Registry using measures from (1) the physician quality reporting system and (2) the American College of Cardiology/American Heart Association/Physician Consortium for Performance Improvement. We compared achievement rates for measures between the 2 frameworks and determined correlations in rankings using Spearman correlation coefficients. From January 1, 2008 to December 31, 2012, there were 1,711,326 patients enrolled from 111 US practices. Among eligible patients, the physician quality reporting system and American College of Cardiology/American Heart Association/Physician Consortium for Performance Improvement measures were achieved in 76.1% versus 77.4% for antiplatelet prescription (P < .001), 68.3% versus 90.8% for blood pressure control (P < .001), 26.9% versus 43.4% for cholesterol control (P < .001), and 37.4% versus 40.6% for smoking cessation (P = .383). Practice rankings were strongly correlated for antiplatelet prescription (correlation coefficient 0.98) and cholesterol control (0.92) but poorly correlated for blood pressure control (0.39) and smoking cessation (0.22). Evaluation of preventive care and individual practice rankings vary significantly depending on how measures are defined. Publicly reported measures need to be validly associated with outcomes to avoid incorrectly evaluating practice performance and failing to achieve public health goals.
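The practice-ranking comparison above hinges on Spearman rank correlation, which is simply the Pearson correlation of the ranks. A minimal sketch, with invented achievement rates for five hypothetical practices (none of these numbers come from the registry):

```python
def rank(values):
    """Assign average ranks to values (ties share the mean rank)."""
    sorted_vals = sorted(values)
    return [sum(i + 1 for i, v in enumerate(sorted_vals) if v == x) /
            sorted_vals.count(x) for x in values]

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented achievement rates (%) for five practices under two measure sets
pqrs   = [76.1, 68.3, 26.9, 37.4, 55.0]
accaha = [77.4, 90.8, 43.4, 40.6, 60.0]
print(round(spearman(pqrs, accaha), 2))  # → 0.8
```

A coefficient near 1 means the two measure sets order the practices almost identically, as seen for antiplatelet prescription (0.98); a low coefficient, as for smoking cessation (0.22), means a practice's rank depends heavily on which measure set is used.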

Abstract

Electronic health records (EHRs) may be key tools for improving the quality of health care, particularly for conditions for which guidelines are rapidly evolving and timely care is critical, such as ischemic stroke. The goal of this study was to determine whether hospitals with EHRs differed on quality or outcome measures for ischemic stroke from those without EHRs. We studied 626,473 patients from 1,236 U.S. hospitals in Get With the Guidelines-Stroke (GWTG-Stroke) from 2007 through 2010, linked with the American Hospital Association annual survey to determine the presence of EHRs. We conducted patient-level logistic regression analyses for each of the outcomes of interest. A total of 511 hospitals had EHRs by the end of the study period. Hospitals with EHRs were larger and were more often teaching hospitals and stroke centers. After controlling for patient and hospital characteristics, patients admitted to hospitals with EHRs had similar odds of receiving "all-or-none" care (odds ratio [OR]: 1.03; 95% CI: 0.99 to 1.06; p=0.12), of discharge home (OR: 1.02; 95% CI: 0.99 to 1.04; p=0.15), and of in-hospital mortality (OR: 1.01; 95% CI: 0.96 to 1.05; p=0.82). The odds of having a length of stay >4 days were slightly lower at hospitals with EHRs (OR: 0.97; 95% CI: 0.95 to 0.99; p=0.01). In our sample of GWTG-Stroke hospitals, EHRs were not associated with higher-quality care or better clinical outcomes for stroke care. Although EHRs may be necessary for an increasingly high-tech, transparent healthcare system, as currently implemented, they do not appear to be sufficient to improve outcomes for this important disease.

Abstract

Identification of silent atrial fibrillation (AF) could prevent stroke and other sequelae. Screening using continuous ambulatory electrocardiographic (ECG) monitoring can detect silent AF in asymptomatic patients with known risk factors. We performed a single-center prospective screening study using a wearable patch-based device that provides up to 2 weeks of continuous ambulatory ECG monitoring (iRhythm Technologies, Inc.). Inclusion criteria were age ≥55 years and ≥2 of the following risk factors: coronary disease, heart failure, hypertension, diabetes, and sleep apnea. We excluded patients with prior AF, stroke, transient ischemic attack, implantable pacemaker or defibrillator, or palpitations or syncope in the prior year. Of 75 subjects (all male; age 69 ± 8.0 years; ejection fraction 57% ± 8.7%), AF was detected in 4 (5.3%; AF burden 28% ± 48%). Atrial tachycardia (AT) was present in 67% (≥4 beats), 44% (≥8 beats), and 6.7% (≥60 seconds) of subjects. The combined diagnostic yield of sustained AT/AF was 11%. Of subjects without sustained AT/AF, 11 (16%) had ≥30 supraventricular ectopic complexes per hour. Outpatient extended ECG screening for asymptomatic AF is feasible, with AF identified in 1 in 20 subjects and sustained AT/AF in 1 in 9 subjects. We also found a high prevalence of asymptomatic AT and frequent supraventricular ectopic complexes, which may be relevant to development of AF or stroke. If confirmed in a larger study, primary screening for AF could have a significant impact on public health.

Abstract

Heart failure (HF) has a major effect on patients' health status, including their symptom burden, functional status, and health-related quality of life. We sought to determine the effectiveness of a collaborative care patient-centered disease management (PCDM) intervention to improve the health status of patients with HF. The PCDM trial was a multisite randomized clinical trial comparing a collaborative care PCDM intervention with usual care in patients with HF. A population-based sample of 392 patients with an HF diagnosis from 4 Veterans Affairs centers who had a Kansas City Cardiomyopathy Questionnaire (KCCQ) overall summary score of less than 60 (heavy symptom burden and impaired functional status and quality of life) was enrolled between May 2009 and June 2011. The PCDM intervention included collaborative care by a multidisciplinary care team consisting of a nurse coordinator, cardiologist, psychiatrist, and primary care physician; home telemonitoring and patient self-management support; and screening and treatment for comorbid depression. The primary outcome was change in the KCCQ overall summary score at 1 year (a 5-point change is clinically significant). Mortality, hospitalization, and depressive symptoms (Patient Health Questionnaire 9) were secondary outcomes. There were no significant differences in baseline characteristics between patients randomized to the PCDM intervention (n=187) vs usual care (n=197); baseline mean KCCQ overall summary scores were 37.9 vs 36.9 (P=.48). There was significant improvement in the KCCQ overall summary scores in both groups after 1 year (mean change, 13.5 points in each group), with no significant difference between groups (P=.97). The intervention was not associated with greater improvement in the KCCQ overall summary scores when the effect over time was estimated using 3-month, 6-month, and 12-month data (P=.74). Among secondary outcomes, there were significantly fewer deaths at 1 year in the intervention arm (8 of 187 [4.3%]) than in the usual care arm (19 of 197 [9.6%]) (P=.04). Among those who screened positive for depression, there was a greater improvement in Patient Health Questionnaire 9 scores after 1 year in the intervention arm than in the usual care arm (2.1 points lower, P=.01). There was no significant difference in 1-year hospitalization rates between the intervention arm and the usual care arm (29.4% vs 29.9%, P=.87). This multisite randomized trial of a multifaceted HF PCDM intervention did not demonstrate improved patient health status compared with usual care. ClinicalTrials.gov identifier: NCT00461513.

Abstract

Liver function test (LFT) abnormalities are often observed in patients with heart failure (HF). However, the relation of LFTs to outcomes has not been well described. Patients of the VA Palo Alto Health Care System (3 inpatient facilities and 7 community clinics) with a complete set of LFTs in the 60 days before a first HF diagnosis were included in the analysis from January 2005 to April 2013. A total of 2,096 patients met inclusion criteria. Patients were a mean of 71 ± 12 years old, 97% were men, 57% had a previous diagnosis of ischemic heart disease, and the mean left ventricular ejection fraction was 51 ± 12%. The median (25th, 75th percentile) values were albumin 3.6 g/dl (3.3, 3.9), alanine transaminase 21 IU/L (16, 30), aspartate transaminase 24 IU/L (20, 31), alkaline phosphatase (AP) 70 IU/L (57, 87), and total bilirubin 0.8 mg/dl (0.6, 1.0). There were 851 deaths (41%) over a mean duration of 41 ± 27 months. Mortality significantly increased with lower values of albumin and alanine transaminase and higher levels of aspartate transaminase and AP. The association with total bilirubin was not significant. In conclusion, many LFT values in the "normal" range are independently associated with decreased survival beyond traditional risk factors for mortality in HF.

Abstract

Whether heart rate upon discharge following hospitalization for heart failure is associated with long-term adverse outcomes, and whether this association differs between patients in sinus rhythm (SR) and atrial fibrillation (AF), has not been well studied. We conducted a retrospective cohort study from clinical registry data linked to Medicare claims for 46 217 patients participating in Get With The Guidelines®-Heart Failure. Cox proportional-hazards models were used to estimate the association between discharge heart rate and all-cause mortality, all-cause readmission, and the composite outcome of mortality/readmission through 1 year. For SR and AF patients with heart rate ≥75, the association between heart rate and mortality (expressed as hazard ratio [HR] per 10-beats-per-minute increment) was significant at 0 to 30 days (SR: HR 1.30, 95% CI 1.22 to 1.39; AF: HR 1.23, 95% CI 1.16 to 1.29) and 31 to 365 days (SR: HR 1.15, 95% CI 1.12 to 1.20; AF: HR 1.05, 95% CI 1.01 to 1.08). Similar associations between heart rate and all-cause readmission and the composite outcome were observed for SR and AF patients from 0 to 30 days, but over the longer term only for the composite outcome in SR patients. The HR from 0 to 30 days exceeded that from 31 to 365 days for both SR and AF patients. At heart rates <75, only the association with mortality was significant, for both SR and AF patients. Among older patients hospitalized with heart failure, higher discharge heart rate was associated with increased risks of death and rehospitalization, with higher risk in the first 30 days and for SR compared with AF.
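Because the Cox model is log-linear in its covariates, a hazard ratio quoted per 10-beats-per-minute increment can be rescaled to any other heart-rate difference by exponentiation. A small illustrative sketch (the function name and the 20-bpm example are assumptions for illustration, not from the paper):

```python
import math

def scale_hazard_ratio(hr_per_unit, difference, unit=10.0):
    """Rescale a hazard ratio quoted per `unit` covariate increment to an
    arbitrary `difference`, using Cox log-linearity:
    HR(d) = exp(ln(HR_unit) * d / unit)."""
    return math.exp(math.log(hr_per_unit) * difference / unit)

# SR, 0-30 day mortality: HR 1.30 per 10 bpm (from the abstract).
# Implied HR for a 20-bpm higher discharge heart rate:
print(round(scale_hazard_ratio(1.30, 20), 2))  # → 1.69
```

That is, under the model's proportional-hazards assumption, a 20-bpm higher discharge heart rate implies roughly a 69% higher early mortality hazard, not 60% (the per-unit ratios multiply rather than add).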

Abstract

The purpose of this study was to assess the benefit of primary prevention implantable cardioverter defibrillators (ICDs) in women. Clinical trials of primary prevention ICDs enrolled a limited number of women. Using a propensity score method, we matched 490 women ≥65 years of age who received an ICD during a hospitalization for heart failure in the National Cardiovascular Data Registry ICD Registry from January 1, 2006, through December 31, 2007, to 490 ICD-eligible women without an ICD hospitalized for heart failure in the Get With The Guidelines for Heart Failure database from January 1, 2006, through December 31, 2009. The primary endpoint was all-cause mortality obtained from the Medicare Claims Database. An identical analysis was conducted in men. Median follow-up for patients with an ICD was 4.6 years versus 3.2 years for patients with no ICD. Compared with women with no ICD, those with an ICD were younger and less frequently white. In the matched cohorts, the survival of women with an ICD was significantly longer than that of women without an ICD (adjusted hazard ratio: 0.79, 95% confidence interval: 0.66 to 0.95; p = 0.013). Similarly, men with an ICD had longer survival than men without an ICD (adjusted hazard ratio: 0.73, 95% confidence interval: 0.65 to 0.83; p < 0.0001). There was no interaction between sex and the presence of an ICD with respect to survival (p = 0.44). Among older women with left ventricular dysfunction, a primary prevention ICD was associated with a significant survival benefit that was nearly identical to that seen in men. These findings support the use of primary prevention ICDs in eligible patients regardless of sex.
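Propensity score matching of the kind used above pairs each treated patient with an untreated patient whose estimated probability of treatment is closest. A toy greedy nearest-neighbor matcher within a caliper, with invented propensity scores (the study's actual matching algorithm is not specified in the abstract):

```python
def greedy_match(treated, controls, caliper=0.05):
    """Match each treated propensity score to the nearest unused control
    score within `caliper`; returns a list of (treated_idx, control_idx)."""
    pairs = []
    used = set()
    for i, t in enumerate(treated):
        best, best_dist = None, caliper
        for j, c in enumerate(controls):
            if j in used:
                continue
            d = abs(t - c)
            if d <= best_dist:
                best, best_dist = j, d
        if best is not None:       # treated units with no close control go unmatched
            pairs.append((i, best))
            used.add(best)
    return pairs

# Invented propensity scores for ICD recipients and ICD-eligible controls
icd = [0.31, 0.62, 0.80]
no_icd = [0.30, 0.59, 0.95, 0.61]
print(greedy_match(icd, no_icd))  # → [(0, 0), (1, 3)]
```

Matching trades sample size for balance: the third treated score (0.80) has no control within the caliper and is dropped, which is why matched cohorts (here, 490 pairs per sex) are smaller than the source registries.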

Abstract

Hospital to Home (H2H) is a national quality improvement (QI) initiative composed of three recommended hospital interventions to improve the transition of care for hospitalized patients with heart disease. A study was conducted to determine if enrollment of Department of Veterans Affairs (VA) hospitals in H2H and adoption of the recommended interventions would both increase following facilitation of an existing heart failure (HF) provider-based community of practice (COP) within the VA health care system. The VA HF COP includes more than 800 VA providers and other VA staff from VA inpatient medical centers. In 2010, 122 VA hospitals were randomized to facilitation using the VA HF COP (intervention) or no facilitation (control). COP members from intervention hospitals were invited to periodic teleconferences promoting H2H and received multiple e-mails asking members to report interest and then progress in H2H implementation. Among the 61 hospitals randomized to HF COP facilitation, 33 (54%) enrolled in H2H, compared with 6 (10%) of 61 control hospitals (p < .001) at five months after randomization. Of 38 intervention hospitals responding to the follow-up survey, 13 stated they had initiated 22 QI projects as a result of the H2H campaign. Another 7 hospitals had planned H2H projects. Of 20 control hospitals that responded, 5 had initiated 9 projects as a result of H2H, and no additional hospitals had plans to do so. Facilitation using the VA HF COP was successful in increasing enrollment in the H2H initiative and providing implementation support for recommended QI projects. Multihospital provider groups are a potentially valuable tool for implementation of national QI campaigns.

Socioeconomic inequalities in quality of care and outcomes among patients with acute coronary syndrome in the modern era of drug-eluting stents. Journal of the American Heart Association. Yong, C. M., Abnousi, F., Asch, S. M., Heidenreich, P. A. 2014; 3 (6).

Abstract

The rapidly changing landscape of percutaneous coronary intervention (PCI) provides a unique model for examining disparities over time. Previous studies have not examined socioeconomic inequalities in the current era of drug-eluting stents (DES). We analyzed 835 070 hospitalizations for acute coronary syndrome (ACS) from the Healthcare Cost and Utilization Project across all insurance types from 2008 to 2011, examining whether quality of care and outcomes for patients with ACS differed by income (based on zip code of residence), with adjustment for patient characteristics and clustering by hospital. We found that lower-income patients were less likely to receive an angiogram within 24 hours of an ST-elevation myocardial infarction (STEMI) (69.5% for IQ1 versus 73.7% for IQ4, P<0.0001, OR 0.79 [0.68 to 0.91]) or within 48 hours of a non-STEMI (47.6% for IQ1 versus 51.8% for IQ4, P<0.0001, OR 0.86 [0.75 to 0.99]). Lower income was associated with less use of a DES (64.7% for IQ1 versus 71.2% for IQ4, P<0.0001, OR 0.83 [0.74 to 0.93]). However, no differences were found for coronary artery bypass surgery. Among STEMI patients, lower-income patients also had slightly higher adjusted mortality rates (10.8% for IQ1 versus 9.4% for IQ4, P<0.0001, OR 1.17 [1.11 to 1.25]). After further adjusting for time to reperfusion among STEMI patients, mortality differences across income groups decreased. For the most well-accepted procedural treatments for ACS, income inequalities have faded. However, such inequalities have persisted for DES use, a relatively expensive and, until recently, controversial revascularization procedure. Differences in mortality are significantly associated with differences in time to primary PCI, suggesting an important target for understanding why these inequalities persist.

Abstract

Cardiac resynchronization therapy with defibrillator (CRT-D) reduces morbidity and mortality among selected patients with heart failure in clinical trials. The effectiveness of this therapy in clinical practice has not been well studied. We compared a cohort of 4471 patients from the National Cardiovascular Data Registry's Implantable Cardioverter-Defibrillator (ICD) Registry hospitalized primarily for heart failure and who received CRT-D between April 1, 2006, and December 31, 2009, to a historical control cohort of 4888 patients with heart failure without CRT-D from the Acute Decompensated Heart Failure National Registry (ADHERE) hospitalized between January 1, 2002, and March 31, 2006. Both registries were linked with Medicare claims to evaluate longitudinal outcomes. We included patients from the ICD Registry with left ventricular ejection fraction ≤35% and QRS duration ≥120 ms who were admitted for heart failure. We used Cox proportional hazards models to compare outcomes with and without CRT-D after adjustment for important covariates. After multivariable adjustment, CRT-D was associated with lower 3-year risks of death (hazard ratio, 0.52; 95% confidence interval, 0.48-0.56; P<0.001), all-cause readmission (hazard ratio, 0.69; 95% confidence interval, 0.65-0.73; P<0.001), and cardiovascular readmission (hazard ratio, 0.60; 95% confidence interval, 0.56-0.64; P<0.001). The association of CRT-D with mortality did not vary significantly among subgroups defined by age, sex, race, QRS duration, and optimal medical therapy. CRT-D was associated with lower risks of mortality, all-cause readmission, and cardiovascular readmission than medical therapy alone among patients with heart failure in community practice.

Abstract

Atrial fibrillation (AF) is one of the most common cardiac conditions treated in primary care and specialty cardiology settings, and is associated with considerable morbidity, mortality and cost. Catheter ablation, typically by electrically isolating the pulmonary veins and surrounding tissue, is more effective at maintaining sinus rhythm than conventional antiarrhythmic drug therapy and is now recommended as first-line therapy. From a value standpoint, the cost-effectiveness of ablation must weigh the upfront procedural costs and risks against the benefits of longer term improvements in quality of life (QOL) and healthcare utilisation. Here, we present a primer on cost-effectiveness analysis (CEA), review the data on cost-effectiveness of AF ablation and outline key areas for further investigation.

Abstract

Studies on outcomes among patients with heart failure (HF) with preserved left ventricular ejection fraction (HFpEF), borderline left ventricular ejection fraction (HFbEF), and reduced left ventricular ejection fraction (HFrEF) remain limited. We sought to characterize mortality and readmission in patients with HF in the contemporary era. Get With The Guidelines-HF was linked to Medicare data for longitudinal follow-up. Patients were grouped into HFpEF (left ventricular ejection fraction [EF] ≥ 50%), HFbEF (40% ≤ EF < 50%), and HFrEF (EF < 40%). Multivariable models were constructed to examine the relationship between EF and outcomes at 30 days and 1 year and to study trends over time. A total of 40,239 patients from 220 hospitals between 2005 and 2011 were included in the study: 18,897 (47%) had HFpEF, 5,626 (14%) had HFbEF, and 15,716 (39%) had HFrEF. In crude survival analysis, patients with HFrEF had slightly increased mortality compared with HFbEF and HFpEF. After risk adjustment, mortality at 1 year was not significantly different for HFrEF, HFbEF, and HFpEF (HFrEF vs HFpEF, hazard ratio [HR] 1.040 [95% CI 0.998-1.084], and HFbEF vs HFpEF, HR 0.967 [95% CI 0.917-1.020]). Patients with HFpEF had increased risk of all-cause readmission compared with HFrEF. Conversely, risks of cardiovascular and HF readmission were higher in HFrEF and HFbEF compared with HFpEF. Among patients hospitalized with HF, patients with HFpEF and HFbEF had slightly lower mortality and higher all-cause readmission risk than patients with HFrEF, although the mortality differences did not persist after risk adjustment. Irrespective of EF, these patients experience substantial mortality and readmission, highlighting the need for new therapeutic strategies.

Abstract

In 2009, the Get With The Guidelines-Heart Failure program enhanced its standard recognition of hospitals by offering additional recognition to hospitals that performed well on certain quality measures. We sought to determine whether initiation of this enhanced recognition opportunity led to acceleration in quality of care for all hospitals participating in the program. We examined hospital-level performance on 9 quality-of-care (process) measures that were added to an existing recognition program (based on existing published performance measures). The rate of increase in use from 6 months to 2 years after the start of the program was compared with the rate of increase during the 18-month period before the start of the program. Use increased for all 9 new quality measures from 2008 to 2011. Among the 4 measures with baseline use near or below 50%, a statistically significant greater increase in use during the program was seen for implantable cardioverter defibrillator use (program versus preprogram use: odds ratio 1.14, 95% CI 1.06 to 1.23). Among the 5 measures for which baseline use was 50% or higher, the increase in influenza vaccination rates actually slowed. There was no evidence of adverse impact on the 4 established quality measures, a composite of which actually increased faster during the expanded program (adjusted odds ratio 1.08, 95% CI 1.01 to 1.15). A program providing expanded hospital recognition for heart failure had mixed results in accelerating the use of 9 quality measures.

Abstract

The degree to which outcomes following hospitalization for acute heart failure (HF) vary by racial and ethnic group is poorly characterized. We sought to compare 30-day and 1-year rehospitalization and mortality rates for HF among 4 race/ethnic groups. Using the Get With The Guidelines-HF registry linked with Medicare data, we compared 30-day and 1-year outcomes between racial/ethnic groups by using a multivariable Cox proportional hazards model adjusting for clinical, hospital, and socioeconomic status characteristics. We analyzed 47 149 Medicare patients aged ≥65 years who had been discharged for HF between 2005 and 2011: 39 213 whites (83.2%), 4946 blacks (10.5%), 2347 Hispanics (5.0%), and 643 Asians/Pacific Islanders (1.4%). Relative to whites, blacks and Hispanics had higher 30-day and 1-year unadjusted readmission rates but lower 30-day and 1-year mortality; Asians had similar 30-day readmission rates but lower 1-year mortality. After risk adjustment, blacks had higher 30-day and 1-year cardiovascular readmission rates than whites but modestly lower short- and long-term mortality; Hispanics had higher 30-day and 1-year readmission rates and similar 1-year mortality compared with whites, while Asians had outcomes similar to those of whites. When socioeconomic status data were added to the model, the majority of associations persisted, but the difference in 30-day and 1-year readmission rates between white and Hispanic patients became nonsignificant. Among Medicare patients hospitalized with HF, short- and long-term readmission rates and mortality differed among the 4 major racial/ethnic populations, and the differences persisted even after controlling for clinical, hospital, and socioeconomic status variables.

Abstract

Contemporary patterns of use and outcomes of implantable cardioverter-defibrillators (ICDs) in community practice settings are not well characterized. We assessed temporal trends in patient characteristics and outcomes among older patients undergoing primary prevention ICD therapy in US hospitals between 2006 and 2010. Using the National Cardiovascular Data Registry's ICD Registry, we identified Medicare fee-for-service beneficiaries aged ≥65 years with left ventricular ejection fraction ≤35% who underwent primary prevention ICD implantation between 2006 and 2010, including those receiving concomitant cardiac resynchronization therapy, and who could be matched to Medicare claims. Outcomes were mortality and hospitalization (all-cause and heart failure) at 180 days, and device-related complications. We used multivariable hierarchical logistic regression to assess temporal trends in outcomes, accounting for changes in patient, physician, and hospital characteristics. The cohort included 117 100 patients. Between 2006 and 2010, only modest changes in patient characteristics were noted. Fewer single-lead devices and more cardiac resynchronization therapy devices were used over time. Between 2006 and 2010, there were significant improvements in all outcomes, including 6-month all-cause mortality (7.1% in 2006, 6.5% in 2010; adjusted odds ratio, 0.88; 95% confidence interval, 0.82-0.95), 6-month rehospitalization (36.3% in 2006, 33.7% in 2010; adjusted odds ratio, 0.87; 95% confidence interval, 0.83-0.91), and device-related complications (5.8% in 2006, 4.8% in 2010; adjusted odds ratio, 0.80; 95% confidence interval, 0.74-0.88). The clinical characteristics of this national population of Medicare patients undergoing primary prevention ICD implantation were stable between 2006 and 2010. Simultaneous improvements in outcomes suggest meaningful advances in the care of this patient population.

Abstract

Patients with heart failure and atrial fibrillation are at higher risk of thromboembolic events than patients with heart failure alone. Yet the use of anticoagulation therapy varies in clinical practice, especially among older patients, for whom its effectiveness is poorly understood. Using clinical registry data linked to Medicare claims from 2005 to 2011, we examined outcomes of older patients hospitalized with heart failure and atrial fibrillation who newly initiated anticoagulation therapy at discharge. We used Cox proportional hazards models and inverse probability-weighted treatment estimates to adjust for selection bias. Main outcomes were mortality and readmission at 1 and 3 years. Among 5105 patients in 195 hospitals, 1623 (31.8%) started anticoagulation therapy at discharge. Treated patients had lower unadjusted rates of all-cause mortality (26.4% versus 42.8%; P<0.001) and all-cause readmission (58.4% versus 63.7%; P<0.001) at 1 year. After inverse weighting for the probability of treatment and adjustment for other discharge medications, anticoagulation therapy was associated with significantly lower 1-year mortality (hazard ratio, 0.70; 99% confidence interval, 0.59-0.82), but there was no statistically significant difference in the risk of all-cause readmission (hazard ratio, 0.89; 99% confidence interval, 0.78-1.01) or other readmission outcomes. Results were similar at 3 years. Initiation of anticoagulation therapy at hospital discharge was associated with improved mortality at 1 and 3 years but was not associated with improved cardiovascular readmission among older patients with heart failure and atrial fibrillation.

Abstract

Despite endorsement of digoxin in clinical practice guidelines, there are limited data on its safety in atrial fibrillation/flutter (AF). The goal of this study was to evaluate the association of digoxin with mortality in AF. Using complete data of the TREAT-AF (The Retrospective Evaluation and Assessment of Therapies in AF) study from the U.S. Department of Veterans Affairs (VA) healthcare system, we identified patients with newly diagnosed, nonvalvular AF seen within 90 days in an outpatient setting between VA fiscal years 2004 and 2008. We used multivariate and propensity-matched Cox proportional hazards models to evaluate the association of digoxin use with death. Residual confounding was assessed by sensitivity analysis. Of 122,465 patients with 353,168 person-years of follow-up (age 72.1 ± 10.3 years, 98.4% male), 28,679 (23.4%) received digoxin. Cumulative mortality rates were higher for digoxin-treated patients than for untreated patients (95 vs. 67 per 1,000 person-years; p < 0.001). Digoxin use was independently associated with mortality after multivariate adjustment (hazard ratio [HR]: 1.26, 95% confidence interval [CI]: 1.23 to 1.29, p < 0.001) and propensity matching (HR: 1.21, 95% CI: 1.17 to 1.25, p < 0.001), even after adjustment for drug adherence. The risk of death was not modified by age, sex, heart failure, kidney function, or concomitant use of beta-blockers, amiodarone, or warfarin. Digoxin was associated with increased risk of death in patients with newly diagnosed AF, independent of drug adherence, kidney function, cardiovascular comorbidities, and concomitant therapies. These findings challenge current cardiovascular society recommendations on use of digoxin in AF.

Abstract

The impact of depression on outcomes in implantable cardioverter defibrillator (ICD) recipients has not been fully characterized. We assessed the prevalence of depression and its association with heart failure (HF) outcomes among veterans with ICDs. Patients enrolled between January 2005 and January 2010 in the Outcomes among Veterans with Implantable Defibrillators Registry were studied. We examined the cross-sectional association of depression with severity of HF functional class, as well as the association of depression with the composite outcome of mortality or HF hospitalization over a mean follow-up of 2.7 years. There were 3,862 patients enrolled. Patients with depression (1,162, 43%) were younger (63.1 ± 9.4 years vs 66.6 ± 9.9 years, P < 0.001) and more likely to have a history of tobacco or alcohol abuse (P < 0.0001) or atrial fibrillation (P = 0.05), while having a higher ejection fraction (28.3% vs 27.4%, P = 0.03). Depression was associated with advanced HF class at the time of implant; odds ratio (OR; vs class I) for class III: 1.65 (95% confidence interval [CI] 1.17-2.33); for class IV: 1.73 (95% CI 1.08-2.76). Death or HF hospitalization was more likely to occur in patients with depression (35.2% vs 32.0%, HR: 1.15 [95% CI 0.99-1.33]). The association was stronger after multivariable adjustment (HR: 1.25 [95% CI 1.05-1.49]). Depression was prevalent among veterans with ICDs and was associated with severity of HF; its predictive value remained significant after multivariable adjustment.

Abstract

Leadership by health care professionals is likely to vary because of differences in the social contexts within which they are situated, socialization processes and societal expectations, education and training, and the way their professions define and operationalize key concepts such as teamwork, collaboration, and partnership. This research examines the effect of nurse and physician leaders on interdependence and encounter preparedness in chronic disease management practice groups. The aim of this study was to examine the effect of complementary leadership by nurses and physicians involved in jointly producing a health care service on care team functioning. The design is a retrospective observational study based on survey data. The unit of analysis is heart failure care groups in U.S. Veterans Health Administration medical centers. Survey and administrative data were collected in 2009 from 68 Veterans Health Administration medical centers. Key variables include nurse and physician leadership, interdependence, psychological safety, coordination, and encounter preparedness. Reliability and validity of survey measures were assessed with exploratory factor analysis and Cronbach alphas. Multivariate analyses tested hypotheses. Professional leadership by nurses and physicians is related to encounter preparedness by different paths. Nurse leadership is associated with greater team interdependence, and interdependence is positively associated with respect. Physician leadership is positively associated with greater psychological safety, respect, and shared goals but is not associated with interdependence. Respect is associated with involvement in learning activities, and shared goals are associated with coordination. Coordination and involvement in learning activities are positively associated with encounter preparedness. By focusing on increasing interdependence and a constructive climate, nurse and physician leaders have the opportunity to increase care coordination and involvement in learning activities.

Abstract

Clinical trials of prophylactic implantable cardioverter-defibrillators (ICDs) have included a minority of patients with a left ventricular ejection fraction (LVEF) between 30% and 35%. Because a large number of ICDs in the United States are implanted in such patients, it is important to study survival associated with this therapy. To characterize patients with LVEF between 30% and 35% and compare the survival of those with and without ICDs. Retrospective cohort study of Medicare beneficiaries in the National Cardiovascular Data Registry ICD Registry (January 1, 2006, through December 31, 2007) with an LVEF between 30% and 35% who received an ICD during a heart failure hospitalization and similar patients in the Get With The Guidelines-Heart Failure (GWTG-HF) database (January 1, 2005, through December 31, 2009) with no ICD. The analysis was repeated in patients with an LVEF less than 30%. There were 3120 patients with an LVEF between 30% and 35% (816 in matched cohorts) and 4578 with an LVEF less than 30% (2176 in matched cohorts). Propensity score matching and Cox models were applied. The primary outcome was all-cause mortality; data were obtained from Medicare claims through December 31, 2011. There were no significant differences in the baseline characteristics of the matched groups (n = 408 for both groups). Among patients with an LVEF between 30% and 35%, there were 248 deaths in the ICD Registry group over a median follow-up of 4.4 years (interquartile range, 2.7-4.9) and 249 deaths in the GWTG-HF group over a median follow-up of 2.9 years (interquartile range, 2.1-4.4). The risk of all-cause mortality in patients with an LVEF between 30% and 35% and an ICD was significantly lower than that in matched patients without an ICD (3-year mortality rates: 51.4% vs 55.0%; hazard ratio, 0.83 [95% CI, 0.69-0.99]; P = .04). Presence of an ICD also was associated with better survival in patients with an LVEF less than 30% (3-year mortality rates: 45.0% vs 57.6%; 634 and 660 total deaths; hazard ratio, 0.72 [95% CI, 0.65-0.81]; P

Abstract

Prior claims analyses suggest that the use of intravenous inotropic therapy for patients hospitalized with heart failure varies substantially by hospital. Whether differences in the clinical characteristics of the patients explain observed differences in the use of inotropic therapy is not known. We sought to characterize institutional variation in inotrope use among patients hospitalized with heart failure before and after accounting for clinical factors of patients. Hierarchical generalized linear regression models estimated risk-standardized hospital-level rates of inotrope use within 209 hospitals participating in the Get With The Guidelines-Heart Failure (GWTG-HF) registry between 2005 and 2011. The association between risk-standardized rates of inotrope use and clinical outcomes was determined. Overall, an inotropic agent was administered in 7691 of 126 564 (6.1%) heart failure hospitalizations: dobutamine 43%, dopamine 24%, milrinone 17%, or a combination 16%. Patterns of inotrope use were stable during the 7-year study period. Use of inotropes varied significantly between hospitals even after accounting for patient and hospital characteristics (median risk-standardized hospital rate, 5.9%; interquartile range, 3.7%-8.6%; range, 1.3%-32.9%). After adjusting for case-mix and hospital structural differences, model intraclass correlation indicated that 21% of the observed variation in inotrope use was potentially attributable to random hospital effects (ie, institutional preferences). Hospitals with higher risk-standardized inotrope use had modestly longer risk-standardized length of stay (P=0.005) but had no difference in risk-standardized inpatient mortality (P=0.12). Use of intravenous inotropic agents during hospitalization for heart failure varies significantly among US hospitals even after accounting for patient and hospital factors.

Abstract

People with chronic heart failure (HF) suffer from numerous symptoms that worsen quality of life. The CASA (Collaborative Care to Alleviate Symptoms and Adjust to Illness) intervention was designed to improve symptoms and quality of life by integrating palliative and psychosocial care into chronic care. Our aim was to determine the feasibility and acceptability of CASA and identify necessary improvements. We conducted a prospective mixed-methods pilot trial. The CASA intervention included (1) nurse phone visits involving structured symptom assessments and guidelines to alleviate breathlessness, fatigue, pain, or depression; (2) structured phone counseling targeting adjustment to illness and depression if present; and (3) weekly team meetings with a palliative care specialist, cardiologist, and primary care physician focused on medical recommendations to primary care providers (PCPs, physicians or nurse practitioners) to improve symptoms. Study subjects were outpatients with chronic HF from a Veterans Affairs hospital (n=15) and a university hospital (n=2). Measurements included feasibility (cohort retention rate, medical recommendation implementation rate, missing data, quality of care) and acceptability (an end-of-study semi-structured participant interview). Participants were male with a median age of 63 years. One withdrew early, and there were <5% missing data. Overall, 85% of 87 collaborative care team medical recommendations were implemented. All participants who screened positive for depression were either treated for depression or thought not to have a depressive disorder. In the qualitative interviews, patients reported a positive experience and provided several constructive critiques. The CASA intervention was feasible based on participant enrollment, cohort retention, implementation of medical recommendations, minimal missing data, and acceptability. Several intervention changes were made based on participant feedback.

Abstract

This study sought to examine the long-term outcomes of patients hospitalized with heart failure and atrial fibrillation. Atrial fibrillation is common among patients hospitalized with heart failure. Associations of pre-existing and new-onset atrial fibrillation with long-term outcomes are unclear. We analyzed 27,829 heart failure admissions between 2006 and 2008 at 281 hospitals in the American Heart Association's Get With The Guidelines-Heart Failure program linked with Medicare claims. Patients were classified as having pre-existing, new-onset, or no atrial fibrillation. Cox proportional hazards models were used to identify factors that were independently associated with all-cause mortality, all-cause readmission, and readmission for heart failure, stroke, and other cardiovascular disease at 1 and 3 years. After multivariable adjustment, pre-existing atrial fibrillation was associated with greater 3-year risks of all-cause mortality (hazard ratio [HR]: 1.14 [99% confidence interval (CI): 1.08 to 1.20]), all-cause readmission (HR: 1.09 [99% CI: 1.05 to 1.14]), heart failure readmission (HR: 1.15 [99% CI: 1.08 to 1.21]), and stroke readmission (HR: 1.20 [99% CI: 1.01 to 1.41]), compared with no atrial fibrillation. There was also a greater hazard of mortality at 1 year among patients with new-onset atrial fibrillation (HR: 1.12 [99% CI: 1.01 to 1.24]). Compared with no atrial fibrillation, new-onset atrial fibrillation was not associated with a greater risk of the readmission outcomes. Stroke readmission rates at 1 year were just as high for patients with preserved ejection fraction as for patients with reduced ejection fraction. Both pre-existing and new-onset atrial fibrillation were associated with greater long-term mortality among older patients with heart failure. Pre-existing atrial fibrillation was associated with greater risk of readmission.

Abstract

Stage D heart failure (HF) is associated with poor prognosis, yet little consensus exists on the care of patients with HF approaching the end of life. Treatment options for end-stage HF range from continuation of guideline-directed medical therapy to device interventions and cardiac transplantation. However, patients approaching the end of life may elect to forego therapies or procedures perceived as burdensome, or to deactivate devices that were implanted earlier in the disease course. Although discussing end-of-life issues such as advance directives, palliative care, or hospice can be difficult, such conversations are critical to understanding patient and family expectations and to developing mutually agreed-on goals of care. Because patients with HF are at risk for rapid clinical deterioration or sudden cardiac death, end-of-life issues should be discussed early in the course of management. As patients progress to advanced HF, the need for such discussions increases, especially among patients who have declined, failed, or been deemed to be ineligible for advanced HF therapies. Communication to define goals of care for the individual patient and then to design therapy concordant with these goals is fundamental to patient-centered care. The objectives of this white paper are to highlight key end-of-life considerations in patients with HF, to provide direction for clinicians on strategies for addressing end-of-life issues and providing optimal patient care, and to draw attention to the need for more research focusing on end-of-life care for the HF population.

Abstract

Preoperative β-blockade has been posited to result in better outcomes for vascular surgery patients by attenuating acute hemodynamic changes associated with stress. However, the incremental effectiveness, if any, of β-blocker usage in blunting heart rate responsiveness for vascular surgery patients who avoid general anesthesia remains unknown. We reviewed an existing database and identified 213 consecutive vascular surgery cases from 2005-2011 conducted without general anesthesia (i.e., under monitored anesthesia care or regional anesthesia) at a tertiary care Veterans Administration medical center and categorized patients based on presence or absence of a preoperative β-blocker prescription. For this series of patients, with the primary outcome of maximum heart rate during the interval between operating room entry and surgical incision, we examined the association of maximal heart rate and preoperative β-blocker usage by performing crude and multivariate linear regression, adjusting for relevant patient factors. Of 213 eligible cases, 137 were prescribed preoperative β-blockers, and 76 were not. The two groups were comparable across baseline patient factors and intraoperative medication doses. The β-blocker group experienced lower maximal heart rates during the period of evaluation compared to the non-β-blocker group (85 ± 22 bpm vs. 98 ± 36 bpm, respectively; p = 0.002). Adjusted linear regression confirmed a statistically significant association between lower maximal heart rate and the use of β-blockers (beta = -11.5; 95% CI [-19.3, -3.7]; p = 0.004). The addition of preoperative β-blockers, even when general anesthesia is avoided, may be beneficial in further attenuating stress-induced hemodynamic changes for vascular surgery patients.

Abstract

B-type natriuretic peptide (BNP) is a marker for heart failure (HF) severity, but its association with hospital readmission is not well defined. We identified all hospital discharges (n=109 875) with a primary diagnosis of HF in the Veterans Affairs Health Care System from 2006 to 2009. We examined the association between admission (n=53 585), discharge (n=24 326), and change in BNP (n=7187) and 30-day readmission for HF or other causes. Thirty-day HF readmission was associated with elevated admission BNP, elevated discharge BNP, and smaller percent change in BNP from admission to discharge. Patients with a discharge BNP ≥ 1000 ng/L had an unadjusted 30-day HF readmission rate over 3 times as high as patients whose discharge BNP was ≤ 200 ng/L (15% vs. 4.1%). BNP improved discrimination and risk classification for 30-day HF readmission when added to a base clinical model, with discharge BNP having the greatest effect (C-statistic, 0.639 to 0.664 [P<0.0001]; net reclassification improvement, 9% [P<0.0001]). In contrast, 30-day readmission for non-HF causes was not associated with BNP levels during index HF hospitalization. In this study of over 50 000 veterans hospitalized with a primary diagnosis of HF, BNP levels measured during hospitalization were associated with 30-day HF readmission, but not readmissions for other causes. These data may help guide future study aimed at identifying the optimal timing for hospital discharge and help allocate high-intensity, HF-specific transitional care interventions to the patients most likely to benefit.

Abstract

The cost-effectiveness of the optimal use of hospital-based acute myocardial infarction (AMI) treatments and their potential impact on coronary heart disease (CHD) mortality in China is not well known. The effectiveness and costs of optimal use of hospital-based AMI treatments were estimated by the CHD Policy Model-China, a Markov-style computer simulation model. Changes in simulated AMI, CHD mortality, quality-adjusted life years, and total healthcare costs were the outcomes. The incremental cost-effectiveness ratio was used to assess projected cost-effectiveness. Optimal use of 4 oral drugs (aspirin, β-blockers, statins, and angiotensin-converting enzyme inhibitors) in all eligible patients with AMI or unfractionated heparin in non-ST-segment-elevation myocardial infarction was a highly cost-effective strategy (incremental cost-effectiveness ratios approximately US $3100 or less). Optimal use of reperfusion therapies in eligible patients with ST-segment-elevation myocardial infarction was moderately cost effective (incremental cost-effectiveness ratio ≤$10,700). Optimal use of clopidogrel for all eligible patients with AMI or primary percutaneous coronary intervention among high-risk patients with non-ST-segment-elevation myocardial infarction in tertiary hospitals alone was less cost effective. Use of all the selected hospital-based AMI treatment strategies together would be cost-effective and reduce the total CHD mortality rate in China by ≈9.6%. Optimal use of most standard hospital-based AMI treatment strategies, especially combined strategies, would be cost effective in China. However, because so many AMI deaths occur outside of the hospital in China, the overall impact on preventing CHD deaths was projected to be modest.

Abstract

Postdischarge adherence and long-term persistence in the use of warfarin among patients with heart failure and atrial fibrillation without contraindications have not been fully described. We identified patients with heart failure and atrial fibrillation who were ≥65 years old, eligible for warfarin, and discharged home from hospitals in the Get With the Guidelines-Heart Failure registry from January 1, 2006, to December 31, 2009. We used linked Medicare prescription drug event data to measure adherence and persistence. The main outcome measures were rates of prescription at discharge, outpatient dispensing, discontinuation, and adherence as measured by the medication possession ratio. We hypothesized that adherence to warfarin would differ according to whether patients received the prescription at discharge. Among 2,691 eligible patients, 1,856 (69.0%) were prescribed warfarin at discharge. Patients prescribed warfarin at discharge had significantly higher prescription fill rates within 90 days (84.5% vs 12.3%; P < .001) and 1 year (91.6% vs 16.8%; P < .001) and significantly higher medication possession ratios (0.78 vs 0.63; P < .001). Among both previous nonusers and existing users, fill rates at 90 days and 1 year and possession ratios were significantly higher among those prescribed warfarin at discharge. One-third of eligible patients with heart failure and atrial fibrillation were not prescribed warfarin at discharge from a heart failure hospitalization, and few started therapy as outpatients. In contrast, most patients who were prescribed warfarin at discharge filled the prescription within 90 days and remained on therapy at 1 year.

Abstract

Despite the U.S. Food and Drug Administration (FDA) warning regarding cognitive impairment, the relationship between statins and cognition remains unknown. To examine the effect of statins on cognition. PubMed, Embase, and Cochrane Library from inception through October 2012; FDA databases from January 1986 through March 2012. Randomized, controlled trials (RCTs) and cohort, case-control, and cross-sectional studies evaluating cognition in patients receiving statins. Two reviewers extracted data, 1 reviewer assessed study risk of bias, and 1 reviewer checked all assessments. Among statin users, low-quality evidence suggested no increased incidence of Alzheimer disease and no difference in cognitive performance related to procedural memory, attention, or motor speed. Moderate-quality evidence suggested no increased incidence of dementia or mild cognitive impairment or any change in cognitive performance related to global cognitive performance scores, executive function, declarative memory, processing speed, or visuoperception. Examination of the FDA postmarketing surveillance databases revealed a low reporting rate for cognitive-related adverse events with statins that was similar to the rates seen with other commonly prescribed cardiovascular medications. The absence of many well-powered RCTs for most outcomes resulted in final strengths of evidence that were low or moderate. Imprecision, inconsistency, and risk of bias also limited the strength of findings. Larger and better-designed studies are needed to draw unequivocal conclusions about the effect of statins on cognition. Published data do not suggest an adverse effect of statins on cognition; however, the strength of available evidence is limited, particularly with regard to high-dose statins.

Abstract

The aim of this report was to characterize the patients, participating centers, and measures of quality of care and outcomes for 5 NCDR (National Cardiovascular Data Registry) programs: 1) ACTION (Acute Coronary Treatment and Intervention Outcomes Network) Registry-GWTG (Get With The Guidelines) for acute coronary syndromes; 2) CathPCI Registry for coronary angiography and percutaneous coronary intervention; 3) CARE (Carotid Artery Revascularization and Endarterectomy) Registry for carotid revascularization; 4) ICD Registry for implantable cardioverter defibrillators; and 5) the PINNACLE (Practice INNovation And CLinical Excellence) Registry for outpatients with cardiovascular disease (CVD). CVD is a leading cause of death and disability in the United States. The quality of care for patients with CVD is suboptimal. National registry programs, such as NCDR, permit assessments of the quality of care and outcomes for broad populations of patients with CVD. For the year 2011, we assessed for each of the 5 NCDR programs: 1) demographic and clinical characteristics of enrolled patients; 2) key characteristics of participating centers; 3) measures of processes of care; and 4) patient outcomes. For selected variables, we assessed trends over time. In 2011 ACTION Registry-GWTG enrolled 119,967 patients in 567 hospitals; CathPCI enrolled 632,557 patients in 1,337 hospitals; CARE enrolled 4,934 patients in 130 hospitals; ICD enrolled 139,991 patients in 1,435 hospitals; and PINNACLE enrolled 249,198 patients (1,436,328 individual encounters) in 74 practices (1,222 individual providers). Data on performance metrics and outcomes, in some cases risk-adjusted with validated NCDR models, are presented. The NCDR provides a unique opportunity to understand the characteristics of large populations of patients with CVD, the centers that provide their care, quality of care provided, and important patient outcomes.

Abstract

We sought to determine if the mortality risk associated with inappropriate ICD shocks is due to the underlying arrhythmia or the shock itself. Shocks delivered from ICDs are associated with increased mortality risk. It is unknown if all patients who experience inappropriate ICD shocks have an increased risk of death. We evaluated survival outcomes in ICD and CRT-D patients enrolled in the LATITUDE remote monitoring system through January 1, 2010. First shock episode rhythms from 3,809 patients who acutely survived the initial shock were adjudicated by seven electrophysiologists. Patients with a shock were matched to patients without a shock (n=3,630) by age at implant, implant year, gender, and device type. The mean age of the study group was 64 ± 13 years, and 78% were male. Compared to no shock, there was increased mortality in those who received their first shock for monomorphic ventricular tachycardia (HR 1.65, p<0.0001), ventricular fibrillation/polymorphic ventricular tachycardia (HR 2.10, p<0.0001), and atrial fibrillation/flutter (HR 1.61, p=0.003). In contrast, mortality following first shocks due to sinus tachycardia and supraventricular tachycardia (HR 0.97, p=0.86) and noise/artifact/oversensing (HR 0.91, p=0.76) was comparable to that in patients without a shock. Compared to no shock, those who received their first shock for ventricular rhythms or atrial fibrillation had an increased risk of death. There was no significant difference in survival after inappropriate shocks for sinus tachycardia or noise/artifact/oversensing. In this study, the adverse prognosis following first shock appears to be more related to the underlying arrhythmia than to an adverse effect from the shock itself.

Abstract

Little is known about how often contextual factors such as patient preferences and competing priorities impact prescribing of guideline-recommended medications, or about the extent to which these factors are documented in medical records and available to performance measurement systems. Mixed-methods study of 295 veterans aged 50 years and older in 4 VA health care systems who had systolic heart failure and were not prescribed a β-blocker and/or an angiotensin-converting enzyme inhibitor or angiotensin-receptor blocker. Reasons for nontreatment were identified from clinic notes and from interviews with 62 primary care clinicians caring for these patients. These reasons were classified using a published taxonomy. Among 295 patients not receiving guideline-recommended drugs for heart failure, chart review identified biomedical reasons for nonprescribing in 42%-58% of patients and contextual reasons in 11%-17%. Clinician interviews identified twice as many reasons for nonprescribing as chart review (mean 1.6 vs. 0.8 reasons per patient, P<0.001). In these interviews, biomedical reasons for nonprescribing were cited in 50%-70% of patients, and contextual reasons in 64%-70%. The most common contextual reasons were comanagement with other clinicians (32%-35% of patients), patient preferences and nonadherence (15%-24%), and clinician belief that the medication is not indicated in the patient (12%-20%). Contextual reasons for not prescribing angiotensin-converting enzyme inhibitors/angiotensin-receptor blockers and β-blockers are present in two-thirds of patients with heart failure who did not receive these medications, yet are poorly documented in medical records. The structure of medical records should be improved to facilitate documentation of contextual reasons for not providing guideline-recommended care.

Abstract

This study sought to examine the associations of hospitalist and cardiologist care of patients with heart failure with outcomes and adherence to quality measures. The hospitalist model of inpatient care has grown nationally, but its associations with quality of care and outcomes of patients hospitalized with heart failure are not known. We analyzed data from the Get With the Guidelines-Heart Failure registry linked to Medicare claims for 2005 through 2008. For each hospital, we calculated the percentage of heart failure hospitalizations for which a hospitalist was the attending physician. We examined outcomes and care quality for patients stratified by rates of hospitalist use. Using multivariable models, we estimated associations between hospital-level use of hospitalists and cardiologists and 30-day risk-adjusted outcomes and adherence to measures of quality care. The analysis included 31,505 Medicare beneficiaries in 166 hospitals. Across hospitals, the use of hospitalists varied from 0% to 83%. After multivariable adjustment, a 10% increase in the use of hospitalists was associated with a slight increase in mortality (risk ratio: 1.03; 95% confidence interval [CI]: 1.00 to 1.06) and decrease in length of stay (0.09 days; 95% CI: 0.02 to 0.16). There was no association with 30-day readmission. Increased use of hospitalists in hospitals with high use of cardiologists was associated with improved defect-free adherence to a composite of heart failure performance measures (risk ratio: 1.03; 95% CI: 1.01 to 1.06). Hospitalist care varied significantly across hospitals for heart failure admissions and was not associated with improved 30-day outcomes. Comanagement by hospitalists and cardiologists may help to improve adherence to some quality measures, but it remains unclear what care model improves 30-day clinical outcomes.

Abstract

Cardiac toxicity is one of the most concerning side effects of anti-cancer therapy. The gain in life expectancy obtained with anti-cancer therapy can be compromised by increased morbidity and mortality associated with its cardiac complications. While radiosensitivity of the heart was initially recognized only in the early 1970s, the heart is regarded in the current era as one of the most critical dose-limiting organs in radiotherapy. Several clinical studies have identified adverse clinical consequences of radiation-induced heart disease (RIHD) on the outcome of long-term cancer survivors. A comprehensive review of potential cardiac complications related to radiotherapy is warranted. An evidence-based review of several imaging approaches used to detect, evaluate, and monitor RIHD is discussed. Recommendations for the early identification and monitoring of cardiovascular complications of radiotherapy by cardiac imaging are also proposed.

Abstract

Although extending the duration of ambulatory electrocardiographic monitoring beyond 24 to 48 hours can improve the detection of arrhythmias, lead-based (Holter) monitors might be limited by patient compliance and other factors. We, therefore, evaluated compliance, analyzable signal time, interval to arrhythmia detection, and diagnostic yield of the Zio Patch, a novel leadless, electrocardiographic monitoring device in 26,751 consecutive patients. The mean wear time was 7.6 ± 3.6 days, and the median analyzable time was 99% of the total wear time. Among the patients with detected arrhythmias (60.3% of all patients), 29.9% had their first arrhythmia and 51.1% had their first symptom-triggered arrhythmia occur after the initial 48-hour period. Compared with the first 48 hours of monitoring, the overall diagnostic yield was greater when data from the entire Zio Patch wear duration were included for any arrhythmia (62.2% vs 43.9%, p <0.0001) and for any symptomatic arrhythmia (9.7% vs 4.4%, p <0.0001). For paroxysmal atrial fibrillation (AF), the mean interval to the first detection of AF was inversely proportional to the total AF burden, with an increasing proportion occurring after 48 hours (11.2%, 10.5%, 20.8%, and 38.0% for an AF burden of 51% to 75%, 26% to 50%, 1% to 25%, and <1%, respectively). In conclusion, extended monitoring with the Zio Patch for ≤14 days is feasible, with high patient compliance, a high analyzable signal time, and an incremental diagnostic yield beyond 48 hours for all arrhythmia types. These findings could have significant implications for device selection, monitoring duration, and care pathways for arrhythmia evaluation and AF surveillance.

Abstract

The benefits of cardiac resynchronization therapy (CRT) in clinical trials were greater among patients with left bundle-branch block (LBBB) or longer QRS duration. To measure associations between QRS duration and morphology and outcomes among patients receiving a CRT defibrillator (CRT-D) in clinical practice. Retrospective cohort study of Medicare beneficiaries in the National Cardiovascular Data Registry's ICD Registry between 2006 and 2009 who underwent CRT-D implantation. Patients were stratified according to whether they were admitted for CRT-D implantation or for another reason, then categorized as having either LBBB or no LBBB and QRS duration of either 150 ms or greater or 120 to 149 ms. All-cause mortality; all-cause, cardiovascular, and heart failure readmission; and complications. Patients underwent follow-up for up to 3 years, with follow-up through December 2011. Among 24 169 patients admitted for CRT-D implantation, 1-year and 3-year mortality rates were 9.2% and 25.9%, respectively. All-cause readmission rates were 10.2% at 30 days and 43.3% at 1 year. Both the unadjusted rate and adjusted risk of 3-year mortality were lowest among patients with LBBB and QRS duration of 150 ms or greater (20.9%), compared with LBBB and QRS duration of 120 to 149 ms (26.5%; adjusted hazard ratio [HR], 1.30 [99% CI, 1.18-1.42]), no LBBB and QRS duration of 150 ms or greater (30.7%; HR, 1.34 [99% CI, 1.20-1.49]), and no LBBB and QRS duration of 120 to 149 ms (32.3%; HR, 1.52 [99% CI, 1.38-1.67]). The unadjusted rate and adjusted risk of 1-year all-cause readmission were also lowest among patients with LBBB and QRS duration of 150 ms or greater (38.6%), compared with LBBB and QRS duration of 120 to 149 ms (44.8%; adjusted HR, 1.18 [99% CI, 1.10-1.26]), no LBBB and QRS duration of 150 ms or greater (45.7%; HR, 1.16 [99% CI, 1.08-1.26]), and no LBBB and QRS duration of 120 to 149 ms (49.6%; HR, 1.31 [99% CI, 1.23-1.40]). There were no observed associations with complications. Among fee-for-service Medicare beneficiaries undergoing CRT-D implantation in clinical practice, LBBB and QRS duration of 150 ms or greater, compared with LBBB and QRS duration less than 150 ms or no LBBB regardless of QRS duration, was associated with lower risk of all-cause mortality and of all-cause, cardiovascular, and heart failure readmissions.

Abstract

OBJECTIVES: To develop a method for risk-standardizing hospital survival after cardiac arrest. BACKGROUND: A foundation with which hospitals can improve quality is to be able to benchmark their risk-adjusted performance against other hospitals, something that cannot currently be done for survival after in-hospital cardiac arrest. METHODS: Within the Get With The Guidelines-Resuscitation registry, we identified 48,841 patients admitted between 2007 and 2010 with an in-hospital cardiac arrest. Using hierarchical logistic regression, we derived and validated a model for survival to hospital discharge and calculated risk-standardized survival rates (RSSRs) for 272 hospitals with at least 10 cardiac arrest cases. RESULTS: The survival rate was 21.0% and 21.2% for the derivation and validation cohorts, respectively. The model had good discrimination (C-statistic 0.74) and excellent calibration. Eighteen variables were associated with survival to discharge, and a parsimonious model contained 9 variables with minimal change in model discrimination. Prior to risk-adjustment, the median hospital survival rate was 20% (IQR: 14%-26%), with a wide range (0%-85%). After adjustment, the distribution of RSSRs was substantially narrower: median of 21% (IQR: 19%-23%; range: 11%-35%). More than half (143 [52.6%]) of hospitals had at least a 10% positive or negative absolute change in percentile rank after risk standardization, and 50 (23.2%) had a ≥20% absolute change in percentile rank. CONCLUSION: We have derived and validated a model to risk-standardize hospital rates of survival for in-hospital cardiac arrest. Use of this model can support efforts to compare hospitals in resuscitation outcomes as a foundation for quality assessment and improvement.
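The risk-standardized survival rate described above follows the usual predicted-over-expected construction: the sum of hospital-specific predicted survival probabilities, divided by the sum of expected probabilities under average hospital effects, multiplied by the overall rate. A minimal sketch with hypothetical patient-level probabilities (not the registry's actual hierarchical model):

```python
def risk_standardized_survival_rate(predicted, expected, overall_rate):
    """RSSR = (sum of hospital-specific predicted survival /
    sum of expected survival under average hospital effects) * overall rate."""
    return (sum(predicted) / sum(expected)) * overall_rate

# Hypothetical hospital with 4 arrest cases: predicted probabilities
# include the hospital's estimated random effect; expected probabilities
# assume an average hospital, so case mix is held constant.
predicted = [0.30, 0.25, 0.20, 0.35]   # with hospital effect
expected = [0.25, 0.22, 0.18, 0.30]    # average-hospital effect
rssr = risk_standardized_survival_rate(predicted, expected, 0.21)
print(round(rssr, 3))  # 0.243
```

A ratio above 1 (here 1.10/0.95) scales this hospital's standardized rate above the overall 21% survival, which is how the narrowing of the hospital distribution after adjustment arises.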

Abstract

Cardiac toxicity is one of the most concerning side effects of anti-cancer therapy. The gain in life expectancy obtained with anti-cancer therapy can be compromised by increased morbidity and mortality associated with its cardiac complications. While radiosensitivity of the heart was initially recognized only in the early 1970s, the heart is regarded in the current era as one of the most critical dose-limiting organs in radiotherapy. Several clinical studies have identified adverse clinical consequences of radiation-induced heart disease (RIHD) on the outcome of long-term cancer survivors. A comprehensive review of potential cardiac complications related to radiotherapy is warranted. An evidence-based review of several imaging approaches used to detect, evaluate, and monitor RIHD is discussed. Recommendations for the early identification and monitoring of cardiovascular complications of radiotherapy by cardiac imaging are also proposed.

Abstract

Treatment with specific beta-blockers and doses recommended by guidelines is often not achieved in practice. We evaluated an intervention directed to the pharmacy to improve prescribing. We conducted a pragmatic cluster-randomized trial, where facilities (n = 12) with patients (n = 220) were the clusters. Eligible patients had a beta-blocker prescription that was not guideline concordant. Level 1 intervention included information to a pharmacist on facility guideline concordance. Level 2 also provided a list of patients not meeting guideline goals. Intervention and follow-up periods were each 6 months. Achievement of full concordance with recommendations was low (4%-5%) in both groups, primarily due to lack of tolerability. However, compared with level 1, the level 2 intervention was associated with 1.9-fold greater odds of improvement in prescribing (95% confidence interval [CI] 1.1-3.2). Level 2 patients also had greater odds of a higher dose (1.9, 95% CI 1.1-3.3). The intervention was aided by the patient lists provided, the electronic medical record system, and staff support. In actual practice, full achievement of guideline goals was low. However, a simple intervention targeting pharmacy moved patients toward guideline goals. As health care systems incorporate electronic medical records, this intervention should have broader feasibility.

Abstract

OBJECTIVES: To compare the 1-year survival for different age strata of intensive care unit (ICU) patients after receipt of packed red blood cell (PRBC) transfusions. BACKGROUND: Despite guidelines documenting risks of PRBC transfusion and data showing that increasing age is associated with ICU mortality, little data exist on whether age alters the transfusion-related risk of decreased survival. METHODS: We retrospectively examined data on 2393 consecutive male ICU patients admitted to a tertiary-care hospital from 2003 to 2009 in age strata: 21-50, 51-60, 61-70, 71-80 and >80 years. We calculated Cox regression models to determine the modifying effect of age on the impact of PRBC transfusion on 1-year survival by using interaction terms between receipt of transfusion and age strata, controlling for type of admission and Charlson co-morbidity indices. We also examined the distribution of admission haematocrit and whether transfusion rates differed by age strata. RESULTS: All age strata experienced statistically similar risks of decreased 1-year survival after receipt of PRBC transfusions. However, patients age >80 were more likely than younger cohorts to have haematocrits of 25-30% at admission and were transfused at approximately twice the rate of each of the younger age strata. DISCUSSION: We found no significant interaction between receipt of red cell transfusion and age, as variables, and survival at 1 year as an outcome.

Abstract

Chronic heart failure (HF) disease management programs have reported inconsistent results and have not included comorbid depression management or specifically focused on improving patient-reported outcomes. The Patient Centered Disease Management (PCDM) trial was designed to test the effectiveness of collaborative care disease management in improving health status (symptoms, functioning, and quality of life) in patients with HF who reported poor HF-specific health status. Patients with an HF diagnosis at four VA Medical Centers were identified through population-based sampling. Patients with a Kansas City Cardiomyopathy Questionnaire (KCCQ, a measure of HF-specific health status) score of

Abstract

Aldosterone antagonist therapy is recommended for selected patients with heart failure and reduced ejection fraction. Adherence to therapy in the transition from hospital to home is not well understood. We identified patients with heart failure and reduced ejection fraction who were ≥65 years old, eligible for aldosterone antagonist therapy, and discharged home from hospitals in the Get With the Guidelines-Heart Failure registry between January 1, 2005, and December 31, 2008. We used Medicare prescription drug event data to measure adherence. Main outcome measures were prescription at discharge, outpatient prescription claim within 90 days, discontinuation, and adherence as measured with the medication possession ratio. We used the cumulative incidence function to estimate rates of initiation and discontinuation. Among 2,086 eligible patients, 561 (26.9%) were prescribed an aldosterone antagonist at discharge. Within 90 days, 78.6% of eligible patients with a discharge prescription filled a prescription for the therapy, compared with 13.0% of eligible patients without a discharge prescription (P < .001). The median medication possession ratio was 0.63 over 1 year of follow-up. Among 634 patients who filled a prescription within 90 days of discharge, 7.9% discontinued therapy within 1 year. Most eligible patients were not prescribed aldosterone antagonist therapy at discharge from a heart failure hospitalization. Eligible patients without a discharge prescription seldom initiated therapy as outpatients. Most patients who were prescribed an aldosterone antagonist at discharge filled the prescription within 90 days and remained on therapy.
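The medication possession ratio used as the adherence measure above is conventionally computed as total days of medication supplied divided by days in the observation period; a minimal sketch under that standard definition (the fill counts are hypothetical):

```python
def medication_possession_ratio(days_supplied, period_days, cap=True):
    """MPR = total days of medication supplied / days in observation period.
    Commonly capped at 1.0 so early refills do not inflate adherence."""
    mpr = sum(days_supplied) / period_days
    return min(mpr, 1.0) if cap else mpr

# Hypothetical patient: four 60-day fills over 365 days of follow-up.
print(round(medication_possession_ratio([60, 60, 60, 60], 365), 2))  # 0.66
```

An MPR near the reported median of 0.63 means the patient had drug on hand for roughly two-thirds of the year, which is how partial adherence is distinguished from outright discontinuation.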

Abstract

The study sought to derive and validate risk-prediction tools from a large nationwide registry linked with Medicare claims data. Few clinical models have been developed utilizing data elements readily available in electronic health records (EHRs) to facilitate "real-time" risk estimation. Heart failure (HF) patients ≥65 years of age hospitalized in the GWTG-HF (Get With The Guidelines-Heart Failure) program were linked with Medicare claims from January 2005 to December 2009. Multivariable models were developed for 30-day mortality after admission, 30-day rehospitalization after discharge, and 30-day mortality/rehospitalization after discharge. Candidate variables were selected based on availability in EHRs and prognostic value. The models were validated in a 30% random sample and separately in patients with reduced and preserved ejection fraction (EF). Among 33,349 patients at 160 hospitals, 3,002 (9.1%) died within 30 days of admission, 7,020 (22.8%) were rehospitalized within 30 days of discharge, and 8,374 (27.2%) died or were rehospitalized within 30 days of discharge. Compared with patients classified as low risk, high-risk patients had significantly higher odds of death (odds ratio [OR]: 8.82, 95% confidence interval [CI]: 7.58 to 10.26), rehospitalization (OR: 1.99, 95% CI: 1.86 to 2.13), and death/rehospitalization (OR: 2.65, 95% CI: 2.44 to 2.89). The 30-day mortality model demonstrated good discrimination (c-index 0.75) while the rehospitalization and death/rehospitalization models demonstrated more modest discrimination (c-indices of 0.59 and 0.62), with similar performance in the validation cohort and for patients with preserved and reduced EF. These predictive models allow for risk stratification of 30-day outcomes for patients hospitalized with HF and may provide a validated, point-of-care tool for clinical decision making.

Abstract

Hospitals are challenged to reduce length of stay (LOS), yet simultaneously reduce readmissions for patients with heart failure (HF). This study investigates whether 30-day rehospitalization or an alternative measure of total inpatient days over an episode of care (EOC) is the best indicator of resource use, HF quality, and outcomes. Using data from the American Heart Association's Get With The Guidelines-Heart Failure Registry linked to Medicare claims, we ranked and compared hospitals by LOS, 30-day readmission rate, and overall EOC metric, defined as all hospital days for an HF admission and any subsequent admissions within 30 days. We divided hospitals into quartiles by 30-day EOC and 30-day readmission rates. We compared performance by EOC and readmission rate quartiles with respect to quality of care indicators and 30-day postdischarge mortality. The population had a mean age of 80 ± 7.95 years, 45% were male, and 82% were white. Hospital-level unadjusted median index LOS and overall EOC were 4.9 (4.2-5.6) and 6.2 (5.3-7.4) days, respectively. Median 30-day readmission rate was 23.2%. Hospital HF readmission rate was not associated with initial hospital LOS, only slightly associated with total EOC rank (r = 0.26, P = .001), and inversely related to HF performance measures. After adjustment, there was no association between 30-day readmission and decreased 30-day mortality. In contrast, better performance on the EOC metric was associated with decreased odds of 30-day mortality. Although hospital 30-day readmission rate was poorly correlated with LOS, quality measures, and 30-day mortality, better performance on the EOC metric was associated with better 30-day survival. Total inpatient days during a 30-day EOC may more accurately reflect overall resource use and better serve as a target for quality improvement efforts.
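The episode-of-care metric defined above (all hospital days for the index HF admission plus any subsequent admissions beginning within 30 days) can be sketched as:

```python
def episode_of_care_days(index_los, readmissions):
    """Total inpatient days over the episode: index length of stay plus
    the length of stay of every readmission starting within 30 days of
    the index discharge. readmissions is a list of
    (days_after_discharge, los) tuples; later admissions are excluded."""
    return index_los + sum(los for start, los in readmissions if start <= 30)

# Hypothetical patient: 5-day index stay, a 3-day readmission on day 12,
# and an admission on day 45 that falls outside the episode.
print(episode_of_care_days(5, [(12, 3), (45, 4)]))  # 8
```

Unlike a binary 30-day readmission flag, this metric gives the same weight to one long rehospitalization as to several short ones, which is why it tracks resource use more closely.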

Abstract

Sudden cardiac death (SCD) is an important cause of death in patients with left ventricular systolic dysfunction. Mineralocorticoid receptor antagonists (MRAs) may attenuate this risk. The objective of this meta-analysis was to assess the impact of MRAs on SCD in patients with left ventricular systolic dysfunction. We systematically searched PubMed, EMBASE, Cochrane, and other databases through March 30, 2012, without language restrictions. We included trials that enrolled patients with left ventricular ejection fraction of ≤45%, randomized subjects to MRAs versus control, and reported outcomes on SCD, total and cardiovascular mortality. Eight published trials that enrolled 11,875 patients met inclusion criteria. Of these, 6 reported data on SCD and cardiovascular mortality, and 7 reported data on total mortality. No heterogeneity was observed among the trials. Patients treated with MRAs had 23% lower odds of experiencing SCD compared with controls (odds ratio, 0.77; 95% confidence interval, 0.66-0.89; P=0.001). Similar reductions were observed in cardiovascular (0.75; 95% confidence interval, 0.68-0.84; P<0.001) and total mortality (odds ratio, 0.74; 95% confidence interval, 0.63-0.86; P<0.001). Although publication bias was observed, the results did not change after a trim and fill test, suggesting that the impact of this bias was likely insignificant. MRAs reduce the risk of SCD in patients with left ventricular systolic dysfunction. Comparative effectiveness studies of MRAs on SCD in usual care, as well as studies evaluating the efficacy of other therapies to prevent SCD in patients receiving optimal MRA therapy, are needed to guide clinical decision-making.

Abstract

We hypothesized that obstructive sleep apnea (OSA) has a dose-dependent impact on mortality in those with ischemic heart disease or previous myocardial injury. We performed a retrospective cohort study of 281 consecutive OSA patients with a history of myocardial injury, as determined by elevated troponin levels, or with known existing ischemic heart disease. We compared survival between those with severe OSA [apnea-hypopnea index (AHI) ≥30] and those with mild to moderate OSA (AHI >5 and <30). Of the 281 patients (mean age 65 years, mean BMI 34, 98% male, 58% with diabetes), 151 patients had mild to moderate OSA and 130 had severe OSA. During a mean follow-up of 4.1 years, there were significantly more deaths in the severe OSA group than in the mild to moderate OSA group [53 deaths (41%) vs. 44 deaths (29%), respectively; p=0.04]. The adjusted hazard ratio for mortality with severe OSA was 1.72 (95% confidence interval 1.01-2.91, p=0.04). The severity of obstructive sleep apnea is associated with increased risk of death, and risk stratification based on OSA severity is relevant even in the diseased cardiac patient.

Abstract

Reducing 30-day heart failure readmission rates is a national priority. Yet, little is known about how hospitals address the problem and whether hospital-based processes of care are associated with reductions in readmission rates. We surveyed 100 randomly selected hospitals participating in the Get With the Guidelines-Heart Failure quality improvement program regarding common processes of care aimed at reducing readmissions. We grouped processes into 3 domains (ie, inpatient care, discharge and transitional care, and general quality improvement) and scored hospitals on the basis of survey responses using processes selected a priori. We used linear regression to examine associations between these domain scores and 30-day risk-standardized readmission rates. Of the 100 participating sites, 28% were academic centers and 64% were community hospitals. The median readmission rate among participating sites (24.0%; 95% CI, 22.6%-25.7%) was comparable with the national average (24.6%; 23.5-25.9). Sites varied substantially in care processes used for inpatient care, education, discharge process, care transitions, and quality improvement. Overall, neither inpatient care nor general quality improvement domains were associated with 30-day readmission rates. Hospitals in the lowest readmission rate quartile had modestly higher discharge and transitional care domain scores (P=0.03). A variety of strategies are used by hospitals in an attempt to improve 30-day readmission rates for patients hospitalized with heart failure. Although more complete discharge and transitional care processes may be modestly associated with lower 30-day readmission rates, most current strategies are not associated with lower readmission rates.

Abstract

Left ventricular ejection fraction (EF) is a key component of heart failure quality measures used within the Department of Veterans Affairs (VA). Our goals were to build a natural language processing system to extract the EF from free-text echocardiogram reports to automate measurement reporting and to validate the accuracy of the system using a comparison reference standard developed through human review. This project was a Translational Use Case Project within the VA Consortium for Healthcare Informatics. We created a set of regular expressions and rules to capture the EF using a random sample of 765 echocardiograms from seven VA medical centers. The documents were randomly assigned to two sets: a set of 275 used for training and a second set of 490 used for testing and validation. To establish the reference standard, two independent reviewers annotated all documents in both sets; a third reviewer adjudicated disagreements. System test results for document-level classification of EF of <40% had a sensitivity (recall) of 98.41%, a specificity of 100%, a positive predictive value (precision) of 100%, and an F measure of 99.2%. System test results at the concept level had a sensitivity of 88.9% (95% CI 87.7% to 90.0%), a positive predictive value of 95% (95% CI 94.2% to 95.9%), and an F measure of 91.9% (95% CI 91.2% to 92.7%). An EF value of <40% can be accurately identified in VA echocardiogram reports. An automated information extraction system can be used to accurately extract EF for quality measurement.
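The regular-expression approach can be illustrated with a single simplified pattern; the production VA system uses a far larger set of expressions and rules, so the pattern and function names below are only a sketch of the technique:

```python
import re

# Simplified illustration of regex-based EF extraction: match a mention
# of EF (as "ejection fraction", "LVEF", or "EF"), then a value or range.
EF_PATTERN = re.compile(
    r"(?:ejection\s+fraction|LVEF|EF)\s*(?:is|of|:|=)?\s*"
    r"(\d{1,2})\s*(?:-|to)?\s*(\d{1,2})?\s*%",
    re.IGNORECASE,
)

def extract_ef(report_text):
    """Return the first EF value found (midpoint if a range), else None."""
    m = EF_PATTERN.search(report_text)
    if not m:
        return None
    low = int(m.group(1))
    high = int(m.group(2)) if m.group(2) else low
    return (low + high) / 2

print(extract_ef("The left ventricular ejection fraction is 35-40%."))  # 37.5
print(extract_ef("LVEF: 55%."))  # 55.0
```

Document-level classification (EF <40% or not) then reduces to thresholding the extracted value, which is why the document-level metrics in the abstract can exceed the concept-level ones.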

Abstract

Heart failure with preserved ejection fraction (EF) is a common syndrome, but trends in treatments and outcomes are lacking. We analyzed data from 275 hospitals in Get With the Guidelines-Heart Failure from January 2005 to October 2010. Patients were stratified by EF as reduced EF (EF <40% [HF-reduced EF]), borderline EF (40%≤EF<50% [HF-borderline EF]), or preserved EF (EF ≥50% [HF-preserved EF]). Using multivariable models, we examined trends in therapies and outcomes. Among 110,621 patients, 50% (55,083) had HF-reduced EF, 14% (15,184) had HF-borderline EF, and 36% (40,354) had HF-preserved EF. From 2005 to 2010, the proportion of hospitalizations for HF-preserved EF increased from 33% to 39% (P<0.0001). In multivariable analyses, use of angiotensin-converting enzyme inhibitors/angiotensin receptor blockers at discharge decreased in all EF groups, and β-blocker use increased. Patients with HF-preserved EF less frequently achieved blood pressure control (adjusted odds ratio, 0.44 versus HF-reduced EF; P<0.001) and were more likely discharged to skilled nursing (adjusted odds ratio, 1.16 versus HF-reduced EF; P<0.001). In-hospital mortality for HF-preserved EF decreased from 3.32% in 2005 to 2.35% in 2010 (adjusted odds ratio, 0.89 per year; P=0.01) but was stable for patients with HF-reduced EF (3.03%-2.83%; adjusted odds ratio, 0.93 per year; P=0.10). Hospitalization for HF-preserved EF is increasing relative to HF-reduced EF. Although in-hospital mortality for patients with HF-preserved EF declined over the study period, an important opportunity remains for identifying evidence-based therapies in patients with HF-preserved EF.

Abstract

Guidelines recommend hospice care as a treatment option for end-stage heart failure (HF) patients. Little is known regarding utilization of hospice care in a contemporary cohort of patients hospitalized with HF and how this may vary by estimated mortality risk. We analyzed HF patients ≥65 years (n = 58,330) from 214 hospitals participating in the Get With the Guidelines-HF program. Univariate analysis comparing patients discharged to hospice versus other patients was performed. Hospice utilization was evaluated for deciles of estimated 90-day mortality risk using a validated model. Multivariate analysis using admission patient and hospital characteristics was also performed to determine factors associated with hospice discharge. There were 1,442 patients discharged to hospice, and rates of referral varied widely by hospital (interquartile range 0-3.7%) as shown in the univariate analysis. Patients discharged to hospice were significantly older and more often white, had lower left ventricular ejection fraction, higher B-type natriuretic peptide, and lower systolic blood pressure on admission. Utilization rates for each decile of 90-day estimated mortality risk ranged from 0.3% to 8.6%. Multivariable analysis found that factors associated with hospice utilization included increased age, low systolic blood pressure on admission, and increased blood urea nitrogen. Hospice utilization remains low among HF patients, even those with the highest predicted risk of death.

Abstract

Despite demonstrated efficacy in randomized trials, aldosterone antagonist therapy is not used in many eligible patients with heart failure. Questions remain about its clinical effectiveness and safety for patients who are underrepresented in randomized trials and those at risk for hyperkalemia. The proposed study will evaluate the effectiveness of aldosterone antagonist therapy in eligible Medicare beneficiaries ≥65 years old hospitalized for heart failure between 2005 and 2008. Data are from the GWTG-HF registry linked with Medicare inpatient and prescription drug event files. We will use inverse probability-weighted estimators to assess differences in mortality, cardiovascular readmission, and readmission for hyperkalemia between patients who receive or do not receive aldosterone antagonist therapy. The initial data set included 33,652 patients; 5,463 (16.2%) met all inclusion criteria. Compared with patients who did not meet the inclusion criteria, patients in the final cohort were more likely to be younger (77.3 vs 80.3 years) and male (63.8% vs 41.3%) and to have ischemic heart failure (74.2% vs 59.5%) (all P < .001). Mortality rates were 24.7% at 1 year and 50.7% at 3 years; cardiovascular readmission rates were 50.1% at 1 year and 65.2% at 3 years. The proposed study will evaluate the clinical effectiveness of aldosterone antagonist therapy in Medicare beneficiaries hospitalized for heart failure with reduced ejection fraction, an underrepresented population in clinical trials. By addressing this evidence gap, the study has the potential to inform clinical decision making and improve patient outcomes.

Abstract

To investigate the association of preoperative β-blocker usage and maximal heart rates observed during the induction of general anesthesia, we performed retrospective descriptive, univariate, and multivariate analyses of electronic hospital and anesthesia medical records at a tertiary-care medical center within the Veterans Health Administration. Subjects were consecutive adult elective and emergent patients presenting for vascular surgery during calendar years 2005 to 2011; no interventions were performed. Of the 430 eligible cases, 218 were prescribed β-blockers, and 212 were not taking β-blockers. The two groups were comparable across baseline patient factors (ie, demographic, morphometric, surgical duration, and surgical procedures) and induction medication doses. The β-blocker group experienced a lower maximal heart rate during the induction of general anesthesia compared with the non-β-blocker group (105 ± 41 beats/min vs 115 ± 45 beats/min, respectively; p < 0.01). Adjusted linear regression found a statistically significant association between lower maximal heart rate and the use of β-blockers (β = -11.1 beats/min, p < 0.01). There was no difference between groups in total intraoperative β-blocker administration. Preoperative β-blockade of vascular surgery patients undergoing general anesthesia is associated with a lower maximal heart rate during anesthetic induction. There may be potential benefits in administering β-blockers to reduce physiologic stress in this surgical population at risk for perioperative cardiac morbidity. Future research should further explore intraoperative hemodynamic effects in light of existing practice guidelines for optimal medication selection, dosage, and heart rate control.

Abstract

Tachycardia has been associated with worse outcomes for patients with heart failure and is also thought to have a direct adverse impact on the myocardium. This report highlights the current evidence for heart rate as both a risk factor and mediator for poor outcome for patients with heart failure. We summarize the large number of studies evaluating heart rate in patients with systolic dysfunction and newer studies that examine patients with preserved systolic function. The effect on outcomes in heart failure of medications known to slow the heart rate, such as β-blockers and the more recently developed drug ivabradine, is discussed. The data clearly show that a high heart rate is a marker of increased mortality. There is also a strong suggestion that a higher heart rate directly worsens outcome and that this can be mitigated by heart rate-reducing medications.

Abstract

Although aortic sclerosis has been associated with an increase in adverse cardiovascular outcomes, no therapy has been proven to slow its progression to overt aortic stenosis (AS). Thus, the hypothesis was assessed that treatment with angiotensin-converting enzyme inhibitors (ACE-Is), angiotensin receptor blockers (ARBs), or statins may be associated with an improvement in the clinical outcome of these patients. A total of 4,105 patients with evidence of aortic sclerosis seen on transthoracic echocardiography (defined as thickening or calcification with a mean valve gradient ≤15 mmHg) was identified. Patients with a sclerotic valve who were treated with ACE-Is/ARBs or statins were followed for a mean period of 1,078 ± 615 days. After adjustment for the propensity to receive ACE-Is/ARBs or statins, mortality, hemodynamic progression to AS, hospitalization for ischemic heart disease (IHD), and congestive heart failure (CHF) were assessed and related to the medical treatment. At baseline, patients with aortic sclerosis who were treated with an ACE-I/ARB or a statin suffered significantly more from comorbidities such as IHD, CHF, hypertension, diabetes, and peripheral arterial disease, when compared to subjects with sclerotic valves not treated with these drugs. After adjustment for confounding factors, treatment with statins was associated with a significant reduction in mortality (odds ratio [OR] 0.73, 95% CI 0.56-0.98, p = 0.001), admission for IHD (OR 0.81, 95% CI 0.66-0.99, p = 0.03), admission for CHF (OR 0.68, 95% CI 0.55-0.85, p = 0.01), and progression to AS (OR 0.64, 95% CI 0.42-0.97, p = 0.03).
While ACE-I treatment resulted in a significant reduction in admission for IHD (OR 0.80, 95% CI 0.65-0.98, p = 0.03) and CHF (OR 0.76, 95% CI 0.62-0.94, p = 0.01), the beneficial trend towards reduced mortality and delayed progression to AS was not significant. Treatment of this patient population with statins led to a significant reduction in mortality and also slowed the progression to AS, an effect that was not statistically significant with ACE-I treatment.

Abstract

Prior studies have demonstrated low use of implantable cardioverter defibrillators (ICDs) as primary prevention, particularly among women and blacks. The degree to which the overall use of ICD therapy and disparities in use have changed is unclear. We examined 11,880 unique patients with a history of heart failure and left ventricular ejection fraction ≤35% who were ≥65 years old and enrolled in the Get With the Guidelines-Heart Failure (GWTG-HF) program from January 2005 through December 2009. We determined the rate of ICD use by year for the overall population and for sex and race groups. From 2005 to 2007, overall ICD use increased from 30.2% to 42.4% and then remained unchanged in 2008 to 2009. After adjustment for potential confounders, ICD use increased significantly in the overall study population during 2005 to 2007 (odds ratio, 1.28; 95% confidence interval, 1.11-1.48 per year; P=0.0008) and in black women (odds ratio, 1.82; 95% confidence interval, 1.28-2.58 per year; P=0.0008), white women (odds ratio, 1.30; 95% confidence interval, 1.06-1.59 per year; P=0.010), black men (odds ratio, 1.54; 95% confidence interval, 1.19-1.99 per year; P=0.0009), and white men (odds ratio, 1.25; 95% confidence interval, 1.06-1.48 per year; P=0.0072). The increase in ICD use was greatest among blacks. In the GWTG-HF quality improvement program, a significant increase in ICD therapy use was observed over time in all sex and race groups. The previously described racial disparities in ICD use were no longer present by the end of the study period; however, sex differences persisted.

Abstract

Prior studies have indicated that the magnitude of risk association of microvolt T-wave alternans (MTWA) testing appears to vary with the population studied. We performed a meta-analysis to determine the ability of MTWA to modify risk assessment of ventricular tachyarrhythmic events (VTEs) and sudden cardiac death (SCD) across a series of patient risk profiles using likelihood ratio (LR) testing, a measure of test performance independent of disease prevalence. We identified original research articles published from January 1990 to January 2011 that investigated spectrally derived MTWA. Ventricular tachyarrhythmic event was defined as total and arrhythmic mortality and nonfatal sustained or implantable cardioverter-defibrillator-treated ventricular tachyarrhythmias. Summary estimates were created for positive and nonnegative MTWA results using a random-effects model and were expressed as positive (LR+) and negative (LR-) LRs. Of 1,534 articles, 20 prospective cohort studies met our inclusion criteria, consisting of 5,945 subjects predominantly with prior myocardial infarction or left ventricular dysfunction. Although there was a modest association between positive MTWA and VTE (relative risk 2.45, 1.58-3.79) and nonnegative MTWA and VTE (3.68, 2.23-6.07), test performance was poor (positive MTWA: LR+ 1.78, LR- 0.43; nonnegative MTWA: LR+ 1.38, LR- 0.56). Subgroup analyses of subjects classified as prior VTE, post-myocardial infarction, SCD-HeFT type, and MADIT-II type showed similarly poor test performance. A negative MTWA result would decrease the annualized risk of VTE from 8.85% to 6.37% in MADIT-II-type patients and from 5.91% to 2.60% in SCD-HeFT-type patients. Despite a modest association, results of spectrally derived MTWA testing do not sufficiently modify the risk of VTE to change clinical decisions.
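Likelihood ratios are defined as LR+ = sensitivity/(1 − specificity) and LR− = (1 − sensitivity)/specificity, and they update risk through odds rather than probabilities. A minimal sketch of the post-test calculation (the 10% pre-test risk below is illustrative; the abstract's annualized figures come from multi-year follow-up, so a single-step update will not reproduce them exactly):

```python
def post_test_risk(pretest_risk, likelihood_ratio):
    """Apply a likelihood ratio to a pre-test probability:
    convert to odds, multiply by the LR, convert back to probability."""
    pre_odds = pretest_risk / (1 - pretest_risk)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Illustrative: the abstract's LR+ of 1.78 for positive MTWA applied to a
# hypothetical 10% pre-test risk of a ventricular tachyarrhythmic event.
print(round(post_test_risk(0.10, 1.78), 3))  # 0.165
```

An LR+ of 1.78 moves a 10% pre-test risk only to about 16.5%, which is the quantitative sense in which the abstract calls the test performance "poor": a useful rule-in test typically needs an LR+ of 10 or more.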

Abstract

The purpose of this study was to determine patient and hospital characteristics associated with 4 measures of quality of inpatient heart failure care used by both the primary payer of heart failure care in the United States (Center for Medicare and Medicaid Services) and the main hospital accrediting organization (The Joint Commission). We used data from the Get With The Guidelines Program for patients hospitalized with heart failure. Eligibility for receiving care based on the Center for Medicare and Medicaid Services performance measures was determined for assessment of left ventricular ejection fraction (LVEF; n = 60,601), use of angiotensin-converting enzyme inhibitors (ACEi) or angiotensin receptor blockers (ARB) if LVEF <40% and no contraindications (24,130), discharge instructions (49,383), and smoking cessation counseling (10,152). Patient and hospital characteristics that were significantly associated with performance measures in univariate analyses were entered into multivariate logistic regression models. Overall, documentation for LVEF assessment was noted in 95%, ACEi/ARB use in 87%, discharge instruction in 82%, and smoking cessation counseling in 91% of eligible patients. In adjusted analyses, older patients and those with evidence of renal failure were significantly less likely to receive each care measure except for discharge instructions (no age effect). Patients with higher body mass index were more likely to receive ACEi/ARB and discharge instructions but less likely to have LVEF documented or to receive smoking cessation counseling. Small hospitals (<200 beds) were less likely to provide each of the performance measures compared with larger hospitals. Recommended heart failure care is less likely in patients with certain characteristics (older age and abnormal renal function) and those cared for in smaller hospitals.
Programs to improve evidence-based care for heart failure should consider interventions specifically targeting and tailored to smaller facilities and patients who are older with comorbidities.

Abstract

Atrial fibrillation is an abnormal heart rhythm characterized by rapid, disorganized activation (fibrillation) of the left and right atria of the heart, and is responsible for 15% of the 700,000 strokes occurring in the United States each year. There are multiple pharmacologic and nonpharmacologic therapies used for stroke prevention in atrial fibrillation, including vitamin K antagonists such as warfarin, antiplatelet agents such as aspirin and clopidogrel, and newer agents such as dabigatran, rivaroxaban, and apixaban. Nonpharmacologic therapies involve excluding the left atrial appendage from the systemic circulation by surgical ligation or excision, percutaneous ligation, or endovascular implantation of a left atrial occlusion device. Because atrial fibrillation-related stroke is preventable, a comparison of the value of these interventions by cost-effectiveness analysis (CEA) could inform clinical and health policy recommendations. In this article, we review the principles of CEA and identify 11 articles that examine the cost effectiveness of stroke prophylaxis strategies in atrial fibrillation. Although most studies evaluate aspirin and warfarin across a variety of atrial fibrillation stroke risk profiles, we also review recent literature on newer pharmacologic therapies such as direct thrombin inhibitors and discuss the potential value of device-based therapies.
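
The central quantity in a CEA is the incremental cost-effectiveness ratio (ICER): the extra cost per quality-adjusted life year (QALY) gained when moving from one strategy to another. A minimal sketch (the costs and QALYs below are invented for illustration, not taken from the reviewed studies):

```python
def icer(cost_new: float, cost_old: float,
         qaly_new: float, qaly_old: float) -> float:
    """Incremental cost-effectiveness ratio, in dollars per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical comparison of a newer anticoagulant against warfarin:
# $8,000 more in lifetime cost for 0.2 additional QALYs.
ratio = icer(cost_new=14_000, cost_old=6_000, qaly_new=8.4, qaly_old=8.2)
# ratio is about $40,000 per QALY gained
```

A strategy is typically called cost effective when its ICER falls below a stated willingness-to-pay threshold (often $50,000 to $100,000 per QALY in US analyses), and cost saving when it is both cheaper and more effective than the comparator.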

Abstract

Noninvasive stress testing might guide the use of aspirin and statins for primary prevention of coronary heart disease, but it is unclear whether such a strategy would be cost effective.

We compared the status quo, in which the current national use of aspirin and statins was simulated, with 3 other strategies: (1) full implementation of Adult Treatment Panel III guidelines, (2) a treat-all strategy in which all intermediate-risk persons received statins (men and women) and aspirin (men only), and (3) a test-and-treat strategy in which all persons with an intermediate risk of coronary heart disease underwent stress testing and those with a positive test were treated with high-intensity statins (men and women) and aspirin (men only). Healthcare costs, coronary heart disease events, and quality-adjusted life years from 2011 to 2040 were projected.

Under a variety of assumptions, the treat-all strategy was the most effective and least expensive strategy. Stress electrocardiography was more effective and less expensive than the other test-and-treat strategies, but it was less expensive than treat-all only if statin cost exceeded $3.16/pill or if testing increased adherence from <22% to >75%. However, stress electrocardiography could be cost effective in persons initially nonadherent to the treat-all strategy if it raised their adherence to 5%, and cost saving if it raised their adherence to 13%.

When generic high-potency statins are available, noninvasive cardiac stress testing to target preventive medications is not cost effective unless it substantially improves adherence.

Abstract

The outcomes of procedures are often better when they are performed by more experienced physicians. We assessed whether the rate of complications after implantable cardioverter-defibrillator (ICD) placement varied with the volume of procedures a physician performed.

We studied 356,515 initial ICD implantations in the National Cardiovascular Data Registry ICD Registry, performed by 4,011 physicians in 1,463 hospitals. We examined the relationship between physician annual ICD implantation volume and in-hospital complications, using hierarchical logistic regression to adjust for patient characteristics, implanting physician certification, hospital characteristics, hospital annual procedure volume, and the clustering of patients within hospitals and by physician. We repeated this analysis for ICD subtypes: single chamber, dual chamber, and biventricular.

There were 10,994 patients (3.1%) with a complication after ICD implantation, and 1,375 died (0.39%). The complication rate decreased with increasing physician procedure volume, from 4.6% in the lowest quartile to 2.9% in the highest quartile (P<0.0001), and the mortality rate decreased from 0.72% to 0.36% (P<0.0001). The inverse relationship between physician procedure volume and complications remained significant after adjusting for patient, physician, and hospital characteristics (OR 1.55 for complications in the lowest-volume quartile compared with the highest; 95% confidence interval, 1.34-1.79; P<0.0001). This inverse relationship was independent of physician specialty and of hospital volume, was consistent across ICD subtypes, and was also evident for in-hospital mortality.

Physicians who implant more ICDs have lower rates of procedural complications and in-hospital mortality, independent of hospital procedure volume, physician specialty, and ICD type.

Abstract

Composite indices of health care performance are an aggregation of underlying individual performance measures and are increasingly being used to rank hospitals. We sought to conduct an observational analysis to determine the influence of opportunity-based and all-or-none composite performance measures on hospital rankings.

We examined 194,245 patients hospitalized with acute myocardial infarction between July 2006 and June 2009 at 334 hospitals participating in the Get With The Guidelines-Coronary Artery Disease (GWTG-CAD) quality improvement program. We analyzed hospital opportunity-based and all-or-none composite scores and 30-day risk-standardized all-cause mortality and readmission rates.

We found that the median calculated opportunity-based score for these hospitals was 95.5 (interquartile range, 90.4, 98.0). The median all-or-none score was 88.9 (interquartile range, 79.7, 94.4). The two scoring methods were significantly correlated with one another (r=0.98, P<0.001). Rankings generated by the two methods were also significantly correlated (r=0.93, P<0.001). The two methods had a modest correlation with the 30-day risk-standardized mortality rate (opportunity-based score: r=-0.25, P<0.001; all-or-none score: r=-0.24, P<0.001). Neither composite measure correlated with the 30-day risk-standardized readmission rate. Over time, the number of hospitals new to the top and bottom quintiles of hospital rankings diminished similarly for both composite measures. When additional performance measures were included in the composite score, the two methods produced similar changes in hospital rankings.

The opportunity-based and all-or-none coronary artery disease composite indices are highly correlated and yield similar rankings of the top and bottom quintiles of hospitals. The two methods show similarly modest correlations with 30-day mortality, but not with readmission.
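
The two composites summarize the same per-patient pass/fail data differently: the opportunity-based score pools all eligible measure opportunities, while the all-or-none score credits a patient only if every eligible measure was delivered. A small sketch with a hypothetical three-patient cohort:

```python
def opportunity_score(patients):
    """Fraction of all eligible measure opportunities that were met."""
    passed = sum(sum(measures) for measures in patients)
    opportunities = sum(len(measures) for measures in patients)
    return passed / opportunities

def all_or_none_score(patients):
    """Fraction of patients who received every measure they were eligible for."""
    return sum(all(measures) for measures in patients) / len(patients)

# Hypothetical cohort: each inner list holds pass/fail results for the
# measures that patient was eligible for (True = measure delivered).
cohort = [[True, True, True], [True, False, True], [True, True]]
opp = opportunity_score(cohort)   # 7 of 8 opportunities met
aon = all_or_none_score(cohort)   # 2 of 3 patients got everything
```

Because a single missed measure fails the whole patient, the all-or-none score is systematically lower, which matches the medians reported above (88.9 versus 95.5).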

Abstract

This study examined the association of hematocrit (Hct) levels measured upon intensive care unit (ICU) admission, and of red blood cell transfusion, with long-term (1-year or 180-day) mortality for both surgical and medical patients.

Administrative and laboratory data were collected retrospectively on 2,393 consecutive medical and surgical male patients admitted to the ICU between 2003 and 2009. We stratified patients based on their median Hct level during the first 24 hours of their ICU stay (Hct < 25.0%, 25% ≤ Hct < 30%, 30% ≤ Hct < 39%, and Hct ≥ 39.0%). An extended Cox regression analysis was conducted to identify the time period after ICU admission (0 to <180, 180 to 365 days) when low Hct (<25.0%) was most strongly associated with mortality. The unadjusted and adjusted relationships between admission Hct level, receipt of a transfusion, and 180-day mortality were assessed using Cox proportional hazards regression modeling.

Patients with an Hct level of less than 25% who were not transfused had the highest mortality risk during the 6 months after ICU admission (hazard ratio [HR], 6.26; 95% confidence interval [CI], 3.05-12.85; p < 0.001) relative to patients with an Hct level of 39.0% or more who were not transfused. Within the subgroup of patients with an Hct level of less than 25%, receipt of a transfusion was associated with a significant reduction in the risk of mortality (HR, 0.40; 95% CI, 0.19-0.85; p = 0.017).

Anemia with an Hct level of less than 25% upon admission to the ICU, in the absence of transfusion, is associated with long-term mortality. Our study suggests that there may be Hct levels below which the transfusion risk-to-benefit imbalance reverses.

Abstract

Early physician follow-up after a heart failure (HF) hospitalization is associated with a lower risk of readmission. However, factors associated with early physician follow-up are not well understood. We identified 30,136 patients with HF aged ≥65 years at 225 hospitals participating in the Organized Program to Initiate Lifesaving Treatment in Hospitalized Patients With Heart Failure (OPTIMIZE) registry or the Get With The Guidelines-Heart Failure (GWTG-HF) registry from January 1, 2003 through December 31, 2006. We linked these clinical data to Medicare claims data for longitudinal follow-up. Using logistic regression models with site-level random effects, we identified predictors of physician follow-up within 7 days of hospital discharge. Overall, 11,420 patients (37.9%) had early physician follow-up. Patients residing in hospital referral regions with higher physician concentration were significantly more likely to have early follow-up (odds ratio 1.29, 95% confidence interval 1.12 to 1.48, for highest vs lowest quartile). Patients in rural areas (0.84, 0.78 to 0.91) and patients with lower socioeconomic status (0.79, 0.74 to 0.85) were less likely to have early follow-up. Women (0.87, 0.83 to 0.91) and black patients (0.84, 0.77 to 0.92) were also less likely to receive early follow-up, as were patients with greater co-morbidity. In conclusion, physician follow-up within 7 days after discharge from a HF hospitalization varied according to regional physician density, rural location, socioeconomic status, gender, race, and co-morbid conditions. Strategies are needed to ensure access among vulnerable populations to this supply-sensitive resource.

Abstract

The aim of this study was to analyze the relationship between payment source and quality of care and outcomes in heart failure (HF). HF is a major cause of morbidity and mortality, and there is a lack of studies assessing the association of payment source with HF quality of care and outcomes.

A total of 99,508 HF admissions from 244 sites between January 2005 and September 2009 were analyzed. Patients were grouped on the basis of payer status (private/health maintenance organization, no insurance, Medicare, or Medicaid), with private/health maintenance organization as the reference group.

The no-insurance group was less likely to receive evidence-based beta-blockers (adjusted odds ratio [OR]: 0.73; 95% confidence interval [CI]: 0.62 to 0.86), an implantable cardioverter-defibrillator (OR: 0.59; 95% CI: 0.50 to 0.70), or anticoagulation for atrial fibrillation (OR: 0.73; 95% CI: 0.61 to 0.87). Similarly, the Medicaid group was less likely to receive evidence-based beta-blockers (OR: 0.86; 95% CI: 0.78 to 0.95) or implantable cardioverter-defibrillators (OR: 0.86; 95% CI: 0.78 to 0.96). Angiotensin-converting enzyme inhibitors or angiotensin receptor blockers and beta-blockers were prescribed less frequently in the Medicare group (OR: 0.89; 95% CI: 0.81 to 0.98). The Medicare, Medicaid, and no-insurance groups had longer hospital stays. Higher adjusted rates of in-hospital mortality were seen in patients with Medicaid (OR: 1.22; 95% CI: 1.06 to 1.41) and in patients with reduced systolic function and no insurance.

Decreased quality of care and outcomes for patients with HF were observed in the no-insurance, Medicaid, and Medicare groups compared with the private/health maintenance organization group.

Abstract

Consistent with the growing national focus on healthcare quality, the American College of Cardiology Foundation (ACCF) and the American Heart Association (AHA) have taken a leadership role over the past decade in developing measures of the quality of cardiovascular care by convening a joint ACCF/AHA Task Force on Performance Measures. The Task Force is charged with identifying the clinical topics appropriate for the development of performance measures and with assembling writing committees composed of clinical and methodological experts in collaboration with appropriate subspecialty societies. The Task Force has also created methodology documents that offer guidance in the development of process, outcome, composite, and efficiency measures. Cardiovascular performance measures using existing ACCF/AHA methodology are based on Class I or Class III guideline recommendations, usually with Level A evidence. These performance measures, based on evidence-based ACCF/AHA guidelines, remain the most rigorous quality measures for both internal quality improvement and public reporting.

However, many of the tools for diagnosis and treatment of cardiovascular disease involve advanced technologies, such as cardiac imaging, for which there are often no underlying guideline documents. Because these technologies affect the quality of cardiovascular care and also have the potential to contribute to cardiovascular health expenditures, there is a need for more critical assessment of the use of technology, including the development of quality and performance measures in areas in which guideline recommendations are absent. The evaluation of quality in the use of cardiovascular technologies requires consideration of multiple parameters that differ from other healthcare processes. The present document describes methodology for the development of 2 new classes of quality measures in these situations: appropriate use measures and structure/safety measures.
Appropriate use measures are based on specific indications, processes, or parameters of care for which high-level evidence and Class I or Class III guideline recommendations may be lacking but that are addressed in ACCF appropriate use criteria documents. Structure/safety measures address structural aspects of the use of healthcare technology (e.g., laboratory accreditation, personnel training, and credentialing) or quality issues related to patient safety when there are neither guideline recommendations nor appropriate use criteria. Although the strength of evidence for appropriate use measures and structure/safety measures may not be as strong as that for formal performance measures, they are quality measures that are otherwise rigorously developed, reviewed, tested, and approved in the same manner as ACCF/AHA performance measures. The ultimate goal of the present document is to provide direction in defining and measuring appropriate use (avoiding not only underuse but also overuse and misuse) and the proper application of cardiovascular technology, and to describe how such appropriate use measures and structure/safety measures might be developed for the purposes of quality improvement and public reporting. It is anticipated that this effort will help shift the national dialogue on the use of cardiovascular technology away from the current concerns about volume and cost alone toward a more holistic emphasis on value.


Abstract

Accumulating evidence suggests that collaborative models of care enhance communication among primary care providers, improving quality of care and outcomes for patients with chronic conditions. We sought to determine whether a multifaceted intervention that used a collaborative care model and was directed through primary care providers would improve symptoms of angina, self-perceived health, and concordance with practice guidelines for managing chronic stable angina.

We conducted a prospective trial, cluster randomized by provider, involving patients with symptomatic ischemic heart disease recruited from primary care clinics at 4 academically affiliated Department of Veterans Affairs health care systems. Primary end points were changes over 12 months in symptoms on the Seattle Angina Questionnaire, self-perceived health, and concordance with practice guidelines.

In total, 183 primary care providers and 703 patients participated in the study. Providers accepted and implemented 91.6% of the 701 recommendations made by collaborative care teams. Almost half were related to medications, including adjustments to β-blockers, long-acting nitrates, and statins. The intervention did not significantly improve symptoms of angina or self-perceived health, although end points favored collaborative care for 10 of 13 prespecified measures. While concordance with practice guidelines improved 4.5% more among patients receiving collaborative care than among those receiving usual care (P < .01), this was mainly because of increased use of diagnostic testing rather than increased use of recommended medications.

A collaborative care intervention was well accepted by primary care providers and modestly improved receipt of guideline-concordant care, but not symptoms or self-perceived health, in patients with stable angina.

Abstract

To determine whether atrial fibrillation (AF) patients with mental health conditions (MHCs) were less likely than AF patients without MHCs to be prescribed warfarin and, if receiving warfarin, to maintain an International Normalized Ratio (INR) within the therapeutic range.

Detailed chart review of AF patients using a Veterans Health Administration (VHA) facility in 2003. For a random sample of 296 AF patients, records identified clinician-diagnosed MHCs (independent variable) and AF-related care in 2003 (dependent variables): receipt of warfarin, INR values below/above key thresholds, and time spent within the therapeutic range (2.0-3.0) or highly out of range. Differences between the MHC and comparison groups were examined using chi-square tests and logistic regression controlling for age and comorbidity.

Among warfarin-eligible AF patients (n = 246), 48.5% of those with MHCs versus 28.9% of those without MHCs were not treated with warfarin (P = .004). Among those receiving warfarin and monitored in VHA, highly supratherapeutic INRs were more common in the MHC group; for example, 27.3% versus 1.6% had any INR >5.0 (P

Abstract

With the advent of cardiac resynchronization therapy, it was unclear what percentage of biventricular pacing would be required to obtain maximal symptomatic and mortality benefit from the therapy. The optimal percentage of biventricular pacing, and the association between the amount of continuous pacing and survival, was unknown.

The purpose of this study was to assess the optimal percentage of biventricular pacing and any association with survival in a large cohort of networked patients. A cohort of 36,935 patients followed up in a remote-monitoring network, the LATITUDE Patient Management system (Boston Scientific Corp., Natick, Massachusetts), was assessed to determine the association between the percentage of biventricular pacing and mortality.

The greatest reduction in mortality was observed when biventricular pacing exceeded 98% of all ventricular beats. Atrial fibrillation and native atrioventricular conduction can limit the achievement of a high degree of biventricular pacing. Incremental increases in mortality benefit were observed with an increasing percentage of biventricular pacing.

Every effort should be made to reduce native atrioventricular conduction with cardiac resynchronization therapy systems in an attempt to achieve biventricular pacing as close to 100% as possible.

Abstract

We evaluated the frequency of appropriate and inappropriate shocks, and survival, in patients using dual-zone versus single-zone programming.

For the ALTITUDE REDUCES study, patients were followed for 1.6 ± 1.1 years. The 12-month incidence of any shock was lower for dual- versus single-zone programmed detection at rates ≥170 bpm and between 170-200 bpm (P < 0.001). Appropriate shock rates at 1 year were also lower with dual-zone programming in these rate intervals (single zone 9.1%, 5.4%, P < 0.001; dual zone 6.7%, 4.7%, P < 0.02). There were no detectable differences between single- and dual-zone shock incidence at detection rates ≥200 bpm (P = 0.14). Inappropriate shock incidence was lower with dual- versus single-zone detection at all detection rates <200 bpm, but not at rates ≥200 bpm (P < 0.001, P = 0.37). The lowest risk of appropriate and inappropriate shock was associated with dual-zone programming and detection rates ≥200 bpm (2.1%). Dual-zone detection was associated with more nonsustained and diverted therapy episodes, but these patients did not have an increased risk of death compared with patients with single-zone programming. Patients programmed to a low detection rate, single-zone detection, and shock-only therapy had the highest preshock mortality risk (P = 0.05).

Shock incidence is lowest with either single- or dual-zone detection ≥200 bpm. For detection rates <200 bpm, dual-zone programming is associated with a reduction in the incidence of total shocks, appropriate shocks, and inappropriate shocks.

Abstract

Process and outcome measures are often used to quantify quality of care in hospitals. Whether these quality measures correlate with one another, and the degree to which hospital provider rankings shift on the basis of the performance metric, is uncertain.

Heart failure patients ≥65 years of age hospitalized in the Get With The Guidelines-Heart Failure registry of the American Heart Association were linked to Medicare claims from 2005 to 2006. Hospitals were ranked by (1) composite adherence scores for 5 heart failure process measures, (2) composite adherence scores for emerging quality measures, (3) risk-adjusted 30-day death after admission, and (4) risk-adjusted 30-day readmission after discharge. Hierarchical models using shrinkage estimates were used to adjust for case mix and hospital volume.

There were 19,483 patients hospitalized from 2005 to 2006 at 153 hospitals. The overall median composite adherence rate for heart failure process measures was 85.8% (25th, 75th percentiles: 77.5, 91.4). Median 30-day risk-adjusted mortality was 9.0% (7.9, 10.4). Median risk-adjusted 30-day readmission was 22.9% (22.1, 23.5). The weighted κ for remaining within the top 20th percentile or bottom 20th percentile was ≤0.15, and the overall Spearman correlation between the different measures of quality of care was ≤0.21. The average shift in ranks was 33 positions (13, 68) when the criterion was changed from 30-day mortality to readmission, and 51 positions (22, 76) when the ranking metric was changed from 30-day mortality to composite process adherence.

Agreement between different methods of ranking hospital-based quality of care and 30-day mortality or readmission rankings was poor. Profiling quality of care will require multidimensional ranking methods and/or additional measures.

Abstract

This study examined the degree to which hospital performance for acute myocardial infarction (AMI) and heart failure (HF) care processes is correlated. Although AMI and HF care processes may be amenable to similar quality improvement interventions, whether performance on the two is indeed correlated, and whether hospitals with consistently superior performance on both care metrics have the best outcomes, remains unknown.

We compared hospital performance on the Centers for Medicare & Medicaid Services AMI and HF core measures in 283 hospitals submitting 10 or more patients to the Get With The Guidelines AMI and HF programs between January 2005 and April 2009.

Median hospital adherence to the AMI and HF composite measures was 93% (interquartile range: 87% to 97%) and 92% (interquartile range: 85% to 96%), respectively, with only a modest correlation between hospital performance on these 2 composite metrics (r = 0.50; 95% confidence interval: 0.41 to 0.58). Hospitals with superior performance on both AMI and HF processes had a significantly longer duration of Get With The Guidelines participation and lower adjusted in-hospital mortality (odds ratio: 0.79; 95% confidence interval: 0.63 to 0.99) for AMI and HF patients, whereas hospitals with superior adherence to either alone had mortality rates similar to hospitals with superior adherence to neither.

Hospitals with consistent, superior performance for both AMI and HF care had significantly lower risk-adjusted mortality than those with superior performance in either alone or in neither. Whether a single scoring system to assess global, rather than condition-specific, quality of cardiovascular care would facilitate more consistent care quality improvement and optimize patient outcomes merits further investigation.

Abstract

This study was undertaken to identify predictors of hospital length of stay (LOS) for heart failure (HF) patients using clinical variables available at the time of admission and hospital characteristics.

A cohort of 70,094 HF patients discharged to home from 246 hospitals participating in the Get With The Guidelines-Heart Failure program was analyzed for admission predictors of LOS. The analysis incorporated patient characteristics (PC) first, then added hospital characteristics (HC), followed by standard laboratory evaluations (SL), including troponin and brain natriuretic peptide (BNP). There were 31,995 patients (45.6%) with LOS < 4 days, 26,750 (38.2%) with LOS 4 to 7 days, and 11,349 (16.2%) with LOS > 7 days. Patients with longer LOS had more comorbidities and a higher severity of disease on admission. Overall, the models explained a modest amount of LOS variation, with an r² of 4.8%; PC were responsible for 1.3% of the variation and, together with SL, explained 2.2% of the variation. HC did not change the variation explained.

Based on admission vital signs and BNP levels, patients with longer LOS have more comorbidities and a higher disease severity. The ability to risk stratify for LOS based on patient admission and hospital characteristics is limited.

Abstract

Hospitalized patients with heart failure and decreased ejection fraction are at substantial risk for mortality and rehospitalization, yet no acute therapies are proven to decrease this risk. Therefore, in-hospital use of medications proved to decrease long-term mortality is a critical strategy to improve outcomes. Although endorsed in guidelines, predictors of initiation and continuation of angiotensin-converting enzyme (ACE) inhibitors/angiotensin receptor blockers (ARBs), β-blockers, and aldosterone antagonists have not been well studied. We assessed noncontraindicated use patterns for the 3 medications using the Get With The Guidelines-Heart Failure (GWTG-HF) registry from February 2009 through March 2010. Medication continuation was defined as treatment on admission and at discharge. Multivariable logistic regression using generalized estimating equations was used to determine factors associated with discharge use.

In total, 9,474 patients were enrolled during the study period. Of those treated before hospitalization, overall continuation rates were 88.5% for ACE inhibitors/ARBs, 91.6% for β-blockers, and 71.9% for aldosterone antagonists. Of patients untreated before admission, 87.4% had an ACE inhibitor/ARB and 90.1% had a β-blocker initiated during hospitalization or at discharge, whereas only 25.2% were started on an aldosterone antagonist. In multivariate analysis, admission therapy was most strongly associated with discharge use (adjusted odds ratios 7.4, 6.0, and 20.9 for ACE inhibitors/ARBs, β-blockers, and aldosterone antagonists, respectively). Western region, younger age, and academic affiliation were also associated with higher discharge use. Although ACE inhibitor/ARB and β-blocker continuation rates were high, aldosterone antagonist use was lower despite potential eligibility. In conclusion, being admitted on evidence-based medications is the most powerful independent predictor of discharge use.

Abstract

Although multiple therapies have been shown to lower mortality in patients with heart failure (HF) and reduced left ventricular ejection fraction, their application in clinical practice has been less than ideal. To date, the potential benefits that could be gained from eliminating these existing treatment gaps through optimal implementation have not been quantified.

Eligibility criteria for each evidence-based HF therapy, the estimated frequency of use/nonuse of specific treatments, the case fatality rates, and the risk reductions due to treatment were obtained from published sources. The numbers of deaths prevented or postponed by each guideline-recommended therapy, and overall, were determined.

Among patients with HF and reduced left ventricular ejection fraction in the United States (n = 2,644,800), the number eligible but not currently treated ranged from 139,749 for hydralazine/isosorbide dinitrate to 852,512 for implantable cardioverter-defibrillators. The number of deaths that could potentially be prevented per year with optimal implementation is 6,516 for angiotensin-converting enzyme inhibitors/angiotensin receptor antagonists; 12,922 for β-blockers; 21,407 for aldosterone antagonists; 6,655 for hydralazine/isosorbide dinitrate; 8,317 for cardiac resynchronization therapy; and 12,179 for implantable cardioverter-defibrillators. If these treatment benefits were additive, optimal implementation of all 6 therapies could potentially prevent 67,996 deaths a year.

A substantial number of HF deaths in the United States could potentially be prevented by optimal implementation of evidence-based therapies. These data underscore the importance of performance improvement efforts to translate evidence-based therapy into routine clinical practice so as to reduce contemporary HF mortality.
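
The deaths-prevented estimates rest on simple arithmetic: the eligible-but-untreated population, times its annual case fatality, times the therapy's relative risk reduction. A sketch with placeholder inputs (the case fatality and risk reduction below are illustrative assumptions, not the paper's source values):

```python
def deaths_prevented(n_eligible_untreated: int,
                     annual_case_fatality: float,
                     relative_risk_reduction: float) -> float:
    """Expected deaths prevented or postponed per year if all eligible
    untreated patients received the therapy."""
    return n_eligible_untreated * annual_case_fatality * relative_risk_reduction

# Illustrative only: 100,000 eligible untreated patients, 10% annual
# case fatality, and a 25% relative risk reduction with therapy.
d = deaths_prevented(100_000, 0.10, 0.25)   # about 2,500 deaths per year
```

Summing such per-therapy estimates, as the abstract does for its 67,996 total, assumes the treatment benefits are additive across therapies, a caveat the authors state explicitly.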

Abstract

Heart failure (HF) is the leading cause of hospitalization among older Americans. Subsequent discharge to skilled nursing facilities (SNF) is not well described. We performed an observational analysis of Medicare beneficiaries ≥65 years of age, discharged alive to an SNF or home after a ≥3-day hospitalization for HF in 2005 and 2006 within the Get With The Guidelines-HF Program. Among 15,459 patients from 149 hospitals, 24.1% were discharged to an SNF, 22.3% to home with home health service, and 53.6% to home with self-care. SNF use varied significantly among hospitals (median, 10.2% versus 33.9% in low versus high tertiles), with rates highest in the Northeast. Patient factors associated with discharge to an SNF included longer length of stay, advanced age, female sex, hypotension, higher ejection fraction, absence of ischemic heart disease, and a variety of comorbidities. Performance measures were modestly lower for patients discharged to an SNF. Unadjusted absolute event rates were higher at 30 days (death, 14.4% versus 4.1%; rehospitalization, 27.0% versus 23.5%) and 1 year (death, 53.5% versus 29.1%; rehospitalization, 76.1% versus 72.2%) after discharge to an SNF versus home, respectively (P<0.0001 for all). After adjustment for measured patient characteristics, discharge to an SNF remained associated with increased death (hazard ratio, 1.76; 95% confidence interval, 1.66 to 1.87) and rehospitalization (hazard ratio, 1.08; 95% confidence interval, 1.03 to 1.14). Discharge to an SNF is common among Medicare patients hospitalized for HF, and these patients face substantial risk for adverse events, with more than half dead within 1 year. These findings highlight the need to better characterize this unique patient population and understand the SNF care they receive.

Procedure Volume and Outcome: You Should Take Into Account Each Hospital (Reply). JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY. Freeman, J. V., Wang, Y., Curtis, J. P., Heidenreich, P. A., Hlatky, M. A. 2011; 57 (16): 1714

Abstract

Black and Hispanic populations are at increased risk for developing heart failure (HF) at a younger age and experience differential morbidity, and possibly differential mortality, compared with whites. Yet there have been insufficient data characterizing the clinical presentation, quality of care, and outcomes of patients hospitalized with HF as a function of race/ethnicity. We analyzed 78,801 patients from 257 hospitals voluntarily participating in the American Heart Association's Get With The Guidelines-HF Program from January 2005 through December 2008. There were 56,266 (71.4%) white, 17,775 (22.6%) black, and 4,760 (6.0%) Hispanic patients. In patients hospitalized with HF, we sought to assess clinical characteristics, adherence to core and other guideline-based HF care measures, and in-hospital mortality as a function of race and ethnicity. Relative to white patients, Hispanic and black patients were significantly younger (median age 78.0, 63.0, and 64.0 years, respectively), had lower left ventricular ejection fractions, and had more diabetes mellitus and hypertension. With few exceptions, the provision of guideline-based care was comparable for black, Hispanic, and white patients. Black and Hispanic patients had lower in-hospital mortality than white patients: black/white odds ratio 0.69, 95% CI 0.62-0.78, P < .001, and Hispanic/white odds ratio 0.81, 95% CI 0.67-0.98, P = .03. Hispanic and black patients hospitalized with HF have more cardiovascular risk factors than white patients; however, they have similar or better in-hospital mortality rates. Within the context of a national HF quality improvement program, HF care was equitable and improved in all racial/ethnic groups over time.

Abstract

Prior studies demonstrating the prognostic value of echocardiographic measures of diastolic function have been limited by sample size, have included only select clinical populations, and have neither incorporated newer measures of diastolic function nor determined their independent prognostic value. The objective of this study was to determine the independent prognostic value of established and new echocardiographic parameters of diastolic function. We included 3,604 consecutive patients referred to 1 of 3 echocardiography laboratories over a 2-year period. We obtained measurements of mitral inflow velocities, pulmonary vein filling pattern, mitral annulus motion (e'), and propagation velocity (Vp). The primary end point was 1-year all-cause mortality. The mean age of the patients was 68 years, and 95% were male. There were 277 deaths during a mean follow-up of 248 ± 221 days. For patients with reduced left ventricular ejection fraction (LVEF), all measured parameters except e' were associated with mortality (P < .05) on univariate analysis. For patients with preserved LVEF, the E-wave velocity was significantly associated with mortality (P < .05) on univariate analysis; the deceleration time/E-wave velocity ratio, Vp, and pulmonary vein filling pattern were borderline significant (P < .10). On multivariate analysis, only Vp was associated with survival in both the reduced (P = .02) and preserved LVEF groups (P = .01). In a large, clinically diverse population, most measures of diastolic function were predictive of all-cause mortality without adjustment for patient characteristics. On multivariate analysis, only Vp was independently associated with total mortality. This association with mortality may be related to factors other than diastolic function and warrants further investigation.

Abstract

Practice guidelines do not recommend use of an implantable cardioverter-defibrillator (ICD) for primary prevention in patients recovering from a myocardial infarction or coronary artery bypass graft surgery, or in those with severe heart failure symptoms or a recent diagnosis of heart failure. We sought to determine the number, characteristics, and in-hospital outcomes of patients who receive a non-evidence-based ICD, and to examine the distribution of these implants by site, physician specialty, and year of procedure. This was a retrospective cohort study of cases submitted to the National Cardiovascular Data Registry-ICD Registry between January 1, 2006, and June 30, 2009, with in-hospital outcomes as the end point. Of 111,707 patients, 25,145 received non-evidence-based ICD implants (22.5%). Patients who received a non-evidence-based ICD compared with those who received an evidence-based ICD had a significantly higher risk of in-hospital death (0.57% [95% confidence interval {CI}, 0.48%-0.66%] vs 0.18% [95% CI, 0.15%-0.20%]; P

Abstract

Administrative claims data are used routinely for risk adjustment and hospital profiling for heart failure outcomes. As clinical data become more readily available, the incremental value of adding clinical data to claims-based models of mortality and readmission is unclear. We linked heart failure hospitalizations from the Get With The Guidelines-Heart Failure registry with Medicare claims data for patients discharged between January 1, 2004, and December 31, 2006. We evaluated the performance of claims-only and claims-clinical regression models for 30-day mortality and readmission, and compared hospital rankings from both models. There were 25,766 patients from 308 hospitals in the mortality analysis, and 24,163 patients from 307 hospitals in the readmission analysis. The claims-clinical mortality model (area under the curve [AUC], 0.761; generalized R² = 0.172) had better fit than the claims-only mortality model (AUC, 0.718; R² = 0.113). The claims-only readmission model (AUC, 0.587; R² = 0.025) and the claims-clinical readmission model (AUC, 0.599; R² = 0.031) had similar performance. Among hospitals ranked as top or bottom performers by the claims-only mortality model, 12% were not ranked similarly by the claims-clinical model. For the claims-only readmission model, 3% of top or bottom performers were not ranked similarly by the claims-clinical model. Adding clinical data to claims data for heart failure hospitalizations significantly improved prediction of mortality and shifted mortality performance rankings for a substantial proportion of hospitals. Clinical data did not meaningfully improve the discrimination of the readmission model and had little effect on performance rankings.
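Model discrimination above is summarized by the area under the ROC curve (AUC): the probability that a randomly chosen patient who had the outcome was assigned a higher predicted risk than one who did not. A self-contained rank-based sketch of that computation (the variable names and values are toy data, not registry data):

```python
def auc(scores, labels):
    """Rank-based AUC: the fraction of (positive, negative) patient pairs in
    which the positive outcome received the higher predicted risk; ties count 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted risks from two models scored on the same 5 patients.
deaths      = [1, 1, 0, 0, 0]
claims_only = [0.6, 0.3, 0.5, 0.2, 0.1]
claims_clin = [0.7, 0.6, 0.5, 0.2, 0.1]
```

Comparing `auc(claims_only, deaths)` with `auc(claims_clin, deaths)` mirrors the abstract's comparison of claims-only versus claims-clinical discrimination.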

Abstract

The study aimed to evaluate the prognostic importance of an incidental small pericardial effusion found on echocardiography. We identified 10,067 consecutive patients undergoing echocardiography at 1 of 3 laboratories. Patients were excluded if they were referred for evaluation of the pericardium (n = 133), had cardiac surgery within the previous 60 days (n = 393), had a moderate or greater pericardial effusion (>1 cm if circumferential, n = 29), had no follow-up (n = 153), or had inadequate visualization of the pericardial space (n = 9). The Social Security Death Index was used to determine survival. A small pericardial effusion was noted in 534 (5.7%) of 9,350 patients. Compared to patients without a small effusion, those with an effusion were slightly older (68 ± 13 vs 67 ± 12 years, P = .02) and had a lower ejection fraction (52% vs 55%, P < .0001). Mortality at 1 year was greater for patients with a small effusion (26%) compared to those without an effusion (11%, P < .0001). After adjustment for demographics, medical history, patient location, and other echocardiographic findings, a small pericardial effusion remained associated with higher mortality (hazard ratio 1.17, 95% CI 1.09-1.28, P = .0002). Of 211 patients with an effusion and follow-up echocardiography (mean 547 days), 136 (60%) had resolution, 63 (28%) showed no change, and 12 (5%) had an increase in size, although no patient developed a large effusion or cardiac tamponade. The presence of a small asymptomatic pericardial effusion is associated with increased mortality.

Abstract

Although the use of implantable cardioverter-defibrillators (ICDs) for the primary prevention of sudden cardiac death varies by sex, race, and hospital, geographic variation in ICD use remains unexplored. Our objective was to quantify regional variation in the utilization of primary prevention ICDs in the United States and to evaluate whether utilization is associated with physician supply or with the proportion of patients meeting trial inclusion criteria. This is a cross-sectional analysis of the Medicare fee-for-service population from the National Cardiovascular Data Registry. Using hospital referral regions, we calculated the age-, sex-, and race-adjusted rates of ICD placement for each region and assessed the correlation between these rates and (1) physician supply and (2) the proportion of patients meeting trial inclusion criteria. Substantial variation was found across quintiles of rate ratios of ICD implantation, ranging from 0.39 to 1.77 (compared with a national mean rate of 1.0). This ratio was not correlated with the supply of cardiologists (R² = 0.01) or electrophysiologists (R² = 0.01), or with the proportion of patients meeting trial inclusion criteria (R² < 0.01). Overall, 13% of all patients receiving ICDs did not meet trial criteria. Marked geographic variation in the use of primary prevention ICDs exists across the United States that is not correlated with physician supply. Although >1 in 10 patients received ICDs outside of trial criteria, this potential overuse did not explain the variation. Future studies should consider underuse or misuse of primary prevention ICDs as causes of geographic variation.
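The quoted R² values are squared Pearson correlations between regional implantation rate ratios and each candidate explanatory variable (e.g., cardiologist supply per region). A minimal pure-Python sketch of that computation (toy inputs, not registry values):

```python
def r_squared(x, y):
    """Square of the Pearson correlation between paired regional observations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical hospital referral regions: adjusted ICD rate ratio vs
# cardiologists per 100,000 residents.
rate_ratio  = [0.4, 0.8, 1.0, 1.3, 1.7]
cardiologists = [5.1, 4.8, 5.3, 4.9, 5.0]
```

An R² near 0, as with the toy vectors above, is the pattern the abstract reports: regional utilization essentially unrelated to physician supply.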

Abstract

Elevated resting heart rates have been associated with increased mortality and morbidity in patients with heart failure and decreased left ventricular ejection fraction (EF). It is unclear, though, whether this association applies to those with heart failure and preserved EF. We determined outcomes for 685 consecutive patients with a prior diagnosis of heart failure and a preserved EF (>50%) documented on echocardiography at 1 of 3 laboratories. Patients with non-sinus rhythm were excluded from the analysis. We determined adjusted mortality rates at 1 year after the echocardiogram. The mean age of the cohort was 70 ± 11 years. Of the 685 included patients, 87% had a history of hypertension, 50% had diabetes, and the mean EF was 60% ± 6%. All-cause mortality at 1 year was significantly lower in the group with a heart rate below 60 beats/min (10%) when compared with the groups with heart rates of 60-70 beats/min (18%), 71-90 beats/min (20%), and >90 beats/min (35%) (P < .0001). After adjustment for patient history, demographics, laboratory values, and echocardiographic findings, the hazard ratios for total mortality (relative to a heart rate of <60 beats/min) were 1.26 (95% CI, 0.88-1.80) for heart rates of 60-69, 1.47 (95% CI, 1.02-2.07) for 70-90, and 2.00 (95% CI, 1.31-3.04) for >90 beats/min (P = .01 across all groups). These data suggest that an elevated resting heart rate is a marker for increased mortality in patients with heart failure and preserved systolic function. Heart rate may be useful in these patients for improved cardiovascular risk assessment.

Abstract

We sought to examine the relationship between hospital implantable cardioverter-defibrillator (ICD) implantation volume and procedural complications in a contemporary, representative population, given that hospitals performing higher volumes of procedures generally have better clinical outcomes. We examined initial ICD implantations between January 2006 and December 2008 at hospitals participating in the NCDR (National Cardiovascular Data Registry) ICD Registry and evaluated the relationship between hospital annual implant volume and in-hospital adverse outcomes. The rate of adverse events declined progressively with increasing procedure volume (p trend < 0.0001). This relationship remained significant (p trend < 0.0001) after adjustment for patient, operator, and hospital characteristics. The volume-outcome relationship was evident for all ICD subtypes, including single-chamber (p trend = 0.004), dual-chamber (p trend < 0.0001), and biventricular ICDs (p trend = 0.02). Patients who have an ICD implanted at a high-volume hospital are less likely to have an adverse event associated with the procedure than patients who have an ICD implanted at a low-volume hospital.

Abstract

To determine whether B-type natriuretic peptide (BNP), handheld ultrasound, and echo interpretation provide accurate and reliable screening for stage B heart failure, one hundred forty-five indigent diabetic patients were prospectively enrolled and their BNP levels measured. Each patient underwent a handheld echo. BNP was correlated with ejection fraction, but not with diastolic dysfunction. The area under the receiver-operating characteristic curve was 0.77. The kappa statistic for reliability in interpreting handheld echoes was 1.0. Results from this study suggest that BNP may serve as a reliable screening tool for stage B heart failure in diabetic populations. Because BNP is an inexpensive blood test, it could be incorporated into the congestive heart failure diagnostic algorithm to determine which patients need imaging studies, namely echocardiography. Handheld echocardiography had perfect interobserver reliability and is a promising alternative screening method.
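The reliability figure above is a kappa statistic, which discounts the agreement two readers would reach by chance; a value of 1.0 indicates perfect agreement. A small sketch of Cohen's kappa for two readers (the ratings below are hypothetical, not study data):

```python
def cohens_kappa(reader1, reader2):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(reader1)
    categories = set(reader1) | set(reader2)
    p_obs = sum(a == b for a, b in zip(reader1, reader2)) / n
    p_chance = sum((reader1.count(c) / n) * (reader2.count(c) / n) for c in categories)
    return (p_obs - p_chance) / (1 - p_chance)

# Two hypothetical observers reading the same handheld echoes;
# identical reads yield kappa = 1.0, as reported in the study.
reads_a = ["abnormal", "normal", "normal", "abnormal", "normal"]
reads_b = ["abnormal", "normal", "normal", "abnormal", "normal"]
kappa = cohens_kappa(reads_a, reads_b)
```

Chance agreement is estimated from each reader's marginal category frequencies, so kappa penalizes agreement that would arise even from independent guessing.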

Abstract

Previous reports have demonstrated that participation in GWTG-CAD, a national quality initiative of the American Heart Association, is associated with improved guideline adherence for patients hospitalized with CAD. We sought to establish whether these benefits of participation in GWTG-CAD were sustained over time. We used the Centers for Medicare and Medicaid Services Hospital Compare database to examine 6 performance measures and one composite score for 3 consecutive 12-month periods, including aspirin and beta-blocker on arrival/discharge, angiotensin-converting enzyme inhibitor (ACE-I) for left ventricular systolic dysfunction (LVSD), and adult smoking cessation counseling. The differences in guideline adherence between the GWTG-CAD hospitals (n = 440, 439, 429) and non-GWTG-CAD hospitals (n = 2,438, 2,268, 2,140) were evaluated for each 12-month period. A multivariate mixed-effects model was used to estimate the independent effect of GWTG-CAD over time, adjusting for hospital characteristics. Compared with non-GWTG hospitals, the GWTG-CAD hospitals demonstrated higher guideline adherence for all 6 performance measures. The largest differences existed for (1) aspirin at arrival (2.3%, 2.1%, and 1.6% for each 12-month period, respectively), (2) aspirin at discharge (3.4%, 2.2%, and 2.3%), (3) beta-blocker at arrival (3.4%, 2.9%, and 2.6%), and (4) beta-blocker at discharge (2.8%, 1.8%, and 1.5%). In multivariate analysis, GWTG-CAD hospitals were independently associated with better adherence for 4 of the 6 measures (the exceptions were ACE-I for LVSD and smoking cessation counseling). Superior performance was also found for the composite measures.
Although there was some narrowing between groups, GWTG-CAD hospitals maintained higher guideline adherence than non-GWTG-CAD hospitals for the entire 3-year period (adjusted differences 1.8%, 1.6%, and 1.4%). Hospitals participating in GWTG-CAD had modestly superior performance on acute cardiac care and secondary prevention measures relative to non-GWTG-CAD hospitals. These benefits of GWTG-CAD participation were sustained over time and independent of hospital characteristics.

Abstract

Studies document better survival in heart failure patients with decreased left ventricular ejection fraction (EF) and higher body mass index (BMI; kg/m²) compared to those with a lower BMI. However, it is unknown whether this "obesity paradox" applies to heart failure patients with preserved EF or whether it extends to the very obese (BMI >35). We determined all-cause mortality for 1,236 consecutive patients with a prior diagnosis of heart failure and a preserved EF (≥50%). Obesity (BMI >30) was noted in 542 patients (44%). The mean age was 71 ± 12 years, but this varied depending on BMI. One-year all-cause mortality decreased with increasing BMI, except at BMI >45, where mortality began to increase (55% if BMI <20, 38% if BMI 20-25, 26% if BMI 26-30, 25% if BMI 31-35, 17% if BMI 36-40, 18% if BMI 41-45, and 25% if BMI >45; P < .001). After adjustment for patient age, history, medications, and laboratory and echocardiographic parameters, the hazard ratios for total mortality (relative to BMI 26-30) were 1.68 (95% CI, 1.04-2.69) for BMI <20, 1.25 (95% CI, 0.92-1.68) for BMI 20-25, 0.99 (95% CI, 0.71-1.36) for BMI 31-35, 0.58 (95% CI, 0.35-0.97) for BMI 36-40, 0.79 (95% CI, 0.44-1.4) for BMI 41-45, and 1.38 (95% CI, 0.74-2.6) for BMI >45 (P < .0001). Low BMI is associated with increased mortality in patients with heart failure and preserved systolic function. However, with a BMI of >45, mortality increased, raising the possibility of a U-shaped relationship between BMI and survival.

Abstract

Little is known about the clinical profile of end-stage renal disease (ESRD) patients who undergo implantable cardioverter-defibrillator (ICD) implantation. This study sought to analyze the risk profile of ESRD patients admitted for ICD implantation. Patients undergoing first-time device implantation in the National Cardiovascular Data Registry ICD Registry from January 1, 2006, to December 31, 2007, were analyzed (n = 164,069). Patients with ESRD (defined as those requiring dialysis) were compared with patients without ESRD. The primary outcome was in-hospital complications. Because the length of hospital stay for ESRD patients was significantly longer (8 vs. 4 days), complications within 2 days of ICD implantation were also examined. The proportion of patients meeting approved indications for ICD implantation was evaluated. ESRD patients (n = 6,851, 4.4%) had higher rates of comorbid medical conditions, major complications, and total complications, and were less likely to receive an ICD for primary prevention. ESRD patients who received ICD implantation for primary prevention were more likely to meet trial criteria. ESRD patients were less likely to receive beta-blockers and angiotensin inhibitors (P

Abstract

Metoprolol succinate, carvedilol, and bisoprolol are approved for use in heart failure. Other beta-blockers have been found to be inferior (metoprolol tartrate) or have not been studied (atenolol). The authors compared all-cause mortality following treatment with atenolol, carvedilol, or metoprolol tartrate in 974 patients with left ventricular ejection fraction ≤40%. The unadjusted mortality at 6 months was lower with atenolol (3.2%) and carvedilol (4.2%) when compared with metoprolol tartrate (7.5%, P ≤ .039). However, patients receiving atenolol were older but had less prior heart failure. After adjustment for the propensity to be treated with atenolol, patients actually treated with atenolol had a significantly lower risk of death compared with those treated with metoprolol tartrate and a comparable outcome to those treated with carvedilol. These results suggest that atenolol may be useful for the treatment of patients with heart failure and highlight the need for a randomized trial comparing atenolol with established beta-blockers.

Abstract

Fewer women than men undergo implantable cardioverter-defibrillator (ICD) implantation for the primary prevention of sudden cardiac death. The criteria used to select patients for ICD implantation may be more permissive for men than for women. We hypothesized that women who undergo primary prevention ICD implantation more often meet strict trial enrollment criteria for this therapy. We studied 59,812 patients in the National Cardiovascular Data Registry ICD Registry undergoing initial primary prevention ICD placement between January 2005 and April 2007. Patients were classified as meeting or not meeting enrollment criteria of either the MADIT-II or SCD-HeFT trials. Multivariable analyses assessed the association between gender and concordance with trial criteria, adjusting for demographic, clinical, and system characteristics. Among the cohort, 27% (n = 16,072) were women. Overall, 85.2% of women and 84.5% of men met enrollment criteria of either trial (P = .05). In multivariable analyses, women were equally likely to meet trial criteria as men (OR 1.04, 95% CI 0.99-1.10). Significantly more women than men met the trial enrollment criteria among patients older than age 65 (86.6% of women vs 85.3% of men, OR 1.11, 95% CI 1.03-1.19), but this difference was not found among younger patients (82.5% of women vs 83.0% of men, OR 0.97, 95% CI 0.89-1.07). In a national cohort undergoing primary prevention ICD implantation, older women were only slightly more likely than men to meet the enrollment criteria for MADIT-II or SCD-HeFT. Relative overutilization in men is not an important explanation for gender differences in ICD implantation.

ACCF/ASNC/ACR/AHA/ASE/SCCT/SCMR/SNM 2009 Appropriate Use Criteria for Cardiac Radionuclide Imaging: A Report of the American College of Cardiology Foundation Appropriate Use Criteria Task Force, the American Society of Nuclear Cardiology, the American College of Radiology, the American Heart Association, the American Society of Echocardiography, the Society of Cardiovascular Computed Tomography, the Society for Cardiovascular Magnetic Resonance, and the Society of Nuclear Medicine. Endorsed by the American College of Emergency Physicians. JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY. Hendel, R. C., Berman, D. S., Di Carli, M. F., Heidenreich, P. A., Henkin, R. E., Pellikka, P. A., Pohost, G. M., Williams, K. A. 2009; 53 (23): 2201-2229

ACCF/ASNC/ACR/AHA/ASE/SCCT/SCMR/SNM 2009 appropriate use criteria for cardiac radionuclide imaging: a report of the American College of Cardiology Foundation Appropriate Use Criteria Task Force, the American Society of Nuclear Cardiology, the American College of Radiology, the American Heart Association, the American Society of Echocardiography, the Society of Cardiovascular Computed Tomography, the Society for Cardiovascular Magnetic Resonance, and the Society of Nuclear Medicine. Circulation. Hendel, R. C., Berman, D. S., Di Carli, M. F., Heidenreich, P. A., Henkin, R. E., Pellikka, P. A., Pohost, G. M., Williams, K. A. 2009; 119 (22): e561-87

Abstract

The American College of Cardiology Foundation (ACCF), along with key specialty and subspecialty societies, conducted an appropriate use review of common clinical scenarios in which cardiac radionuclide imaging (RNI) is frequently considered. This document is a revision of the original Single-Photon Emission Computed Tomography Myocardial Perfusion Imaging (SPECT MPI) Appropriateness Criteria, published 4 years earlier, and was written to reflect changes in test utilization and new clinical data, and to clarify RNI use where omissions or lack of clarity existed in the original criteria. This is in keeping with the commitment to revise and refine appropriate use criteria (AUC) on a frequent basis. The indications for this review were drawn from common applications or anticipated uses, as well as from current clinical practice guidelines. Sixty-seven clinical scenarios were developed by a writing group and scored by a separate technical panel on a scale of 1 to 9 to designate appropriate use, inappropriate use, or uncertain use. In general, use of cardiac RNI for diagnosis and risk assessment in intermediate- and high-risk patients with coronary artery disease (CAD) was viewed favorably, whereas testing in low-risk patients, routine repeat testing, and general screening in certain clinical scenarios were viewed less favorably. Additionally, use for perioperative testing was found to be inappropriate except for highly selected groups of patients. It is anticipated that these results will have a significant impact on physician decision making, test performance, and reimbursement policy, and will help guide future research.

Abstract

Women are at higher risk than men for adverse events with certain invasive cardiac procedures. Our objective was to compare rates of in-hospital adverse events in men and women receiving implantable cardioverter-defibrillator (ICD) therapy in community practice. Using the National Cardiovascular Data Registry ICD Registry, we identified patients undergoing first-time ICD implantation between January 2006 and December 2007. Outcomes included in-hospital adverse events after ICD implantation. Multivariable analysis assessed the association between gender and in-hospital adverse events, with adjustment for demographic, clinical, procedural, physician, and hospital characteristics. Of 161,470 patients, 73% were male and 27% were female. Women were more likely to have a history of heart failure (81% versus 77%, P<0.01), worse New York Heart Association functional status (57% versus 50% in class III and IV, P<0.01), and nonischemic cardiomyopathy (44% versus 27%, P<0.01), and were more likely to receive biventricular ICDs (39% versus 34%, P<0.01). In unadjusted analyses, women were more likely to experience any adverse event (4.4% versus 3.3%, P<0.001) and major adverse events (2.0% versus 1.1%, P<0.001). In multivariable models, women had a significantly higher risk of any adverse event (OR 1.32, 95% CI 1.24 to 1.39) and of major adverse events (OR 1.71, 95% CI 1.57 to 1.86). Women are more likely than men to have in-hospital adverse events related to ICD implantation. Efforts are needed to understand the reasons for higher ICD implantation-related adverse event rates in women and to develop strategies to reduce the risk of these events.
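An unadjusted comparison like the 4.4% versus 3.3% event rates above can be summarized as a 2×2 odds ratio with a Wald confidence interval on the log-odds scale. A minimal sketch (the counts are hypothetical, chosen only to mirror the quoted percentages in imagined cohorts of 1,000 per group; the registry's actual denominators differ):

```python
import math

def odds_ratio_ci(events_a, nonevents_a, events_b, nonevents_b, z=1.96):
    """Odds ratio for group A vs group B, with a Wald 95% CI computed on the
    log scale: SE(log OR) = sqrt(sum of reciprocals of the four cell counts)."""
    or_point = (events_a * nonevents_b) / (nonevents_a * events_b)
    se_log = math.sqrt(1/events_a + 1/nonevents_a + 1/events_b + 1/nonevents_b)
    lower = or_point * math.exp(-z * se_log)
    upper = or_point * math.exp(z * se_log)
    return or_point, lower, upper

# Hypothetical 2x2 table: 44/1000 events in women vs 33/1000 in men.
or_wm, lo_wm, hi_wm = odds_ratio_ci(44, 956, 33, 967)
```

The published ORs (1.32 and 1.71) are multivariable-adjusted, so they would come from a regression model rather than this raw 2×2 calculation.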

Abstract

Racial/ethnic differences in the use of cardiac resynchronization therapy with defibrillator (CRT-D) may result from underprovision or overprovision relative to published guidelines. The purpose of this study was to examine the National Cardiovascular Data Registry (NCDR) ICD Registry for ethnic/racial differences in the use of CRT-D. We studied white, black, and Hispanic patients who received either an implantable cardioverter-defibrillator (ICD) or CRT-D between January 2005 and April 2007. Two multivariate logistic regression models were fit with the following outcome variables: (1) receipt of either an ICD or CRT-D and (2) receipt of CRT-D outside of published guidelines. Of 108,341 registry participants, 22,205 met inclusion criteria for the first analysis and 27,165 met criteria for the second analysis. Multivariate analysis indicated CRT-eligible black (odds ratio [OR] 0.84; 95% confidence interval [CI], 0.75-0.95; P

Abstract

Outpatient care accounts for a significant proportion of total heart failure (HF) expenditures. This observation, plus an expanding list of treatment options, has led to the development of the disease-specific HF clinic. The goals of the HF clinic are to reduce mortality and rehospitalization rates and to improve quality of life for patients with HF through individualized patient care. A variety of staffing configurations can serve to meet these goals. Successful HF clinics require an ongoing commitment of resources, the application of established clinical practice guidelines, an appropriate infrastructure, and a culture of quality assessment. This consensus statement identifies the components of HF clinics, focusing on the systems and procedures most likely to contribute to the consistent application of guidelines and, consequently, optimal patient care. The domains addressed are disease management, functional assessment, quality of life assessment, medical therapy and drug evaluation, device evaluation, nutritional assessment, follow-up, advance planning, communication, provider education, and quality assessment.

Abstract

The assessment of medical practice is evolving rapidly in the United States. An initial focus on structure and process performance measures assessing the quality of medical care is now being supplemented with efficiency measures to quantify the "value" of healthcare delivery. This statement, building on prior work that articulated standards for publicly reported outcomes measures, identifies preferred attributes for measures used to assess efficiency in the allocation of healthcare resources. The attributes identified in this document, combined with the previously published standards, are intended to serve as criteria for assessing the suitability of efficiency measures for public reporting. This statement identifies the following attributes to be considered for publicly reported efficiency measures: integration of quality and cost; valid cost measurement and analysis; minimal incentive to provide poor-quality care; and proper attribution of the measure. The attributes described in this statement are relevant to a wide range of efforts to profile the efficiency of various healthcare providers, including hospitals, healthcare systems, managed-care organizations, physicians, group practices, and others that deliver coordinated care.

Abstract

Because the burden of sudden cardiac death (SCD) is substantial, it is important to use all guideline-driven therapies to prevent SCD. Among those therapies is the implantable cardioverter defibrillator (ICD). When indicated, ICD use is beneficial and cost-effective. Unfortunately, studies suggest that most patients who have indications for this therapy for primary or secondary prevention of SCD are not receiving it. To explore potential reasons for this underuse and to propose potential facilitators for ICD dissemination, the Duke Center for the Prevention of SCD at the Duke Clinical Research Institute (Durham, NC) organized a think tank meeting of experts on this issue. The meeting took place on December 12 and 13, 2007, and it included representatives of clinical cardiology, cardiac electrophysiology, general internal medicine, economics, health policy, the US Food and Drug Administration, the Centers for Medicare and Medicaid Services, the Agency for Healthcare Research and Quality, and the device and pharmaceutical industry. Although the meeting was funded by industry participants, this article, summarizing the presentations and discussions that occurred at the meeting, presents the expert opinion of the authors.

Abstract

Growth hormone (GH) is required to maintain normal cardiac structure and function and has a positive effect on cardiac remodeling in experimental and possibly human disease. Cardiac resistance to GH develops in the uremic state, perhaps predisposing to the characteristic cardiomyopathy associated with uremia. It was hypothesized that administration of low-dosage GH may have a salutary effect on the cardiac remodeling process in uremia, but because high levels of GH have adverse cardiac effects, administration of high-dosage GH may worsen uremic cardiomyopathy. In rats with chronic renal failure, quantitative cardiac morphology revealed a decrease in total capillary length and capillary length density and an increase in mean intercapillary distance and fibroblast volume density. Low-dosage GH prevented these changes. Collagen and TGF-beta immunostaining, increased in chronic renal failure, were also reduced by GH, suggesting a mechanism for its salutary action. Low-dosage GH also prevented thickening of the carotid artery but did not affect aortic pathology. In contrast, high-dosage GH worsened several of these variables. These results suggest that low-dosage GH may benefit the heart and possibly the carotid arteries in chronic renal failure.

Abstract

Heart failure (HF) disease management programs have shown impressive reductions in hospitalizations and mortality, but in studies limited to short time frames and high-risk patient populations. Current guidelines thus only recommend disease management targeted to high-risk patients with HF. This study applied a new technique to infer the degree to which clinical trials have targeted patients by risk, based on observed rates of hospitalization and death. A Markov model was used to assess the incremental life expectancy and cost of providing disease management to patients ranging from high to low risk. Sensitivity analyses of various long-term scenarios and of reduced effectiveness in low-risk patients were also considered. The incremental cost-effectiveness ratio of extending coverage to all patients was $9700 per life-year gained in the base case. In aggregate, universal coverage almost quadrupled life-years saved as compared to coverage of only the highest quintile of risk. A worst-case analysis with simultaneous conservative assumptions yielded an incremental cost-effectiveness ratio of $110,000 per life-year gained. In a probabilistic sensitivity analysis, 99.74% of possible incremental cost-effectiveness ratios were
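The base-case figure in the abstract above is an incremental cost-effectiveness ratio: the extra cost of the broader coverage strategy divided by the extra life expectancy it yields. A minimal sketch of that calculation; the function name and input numbers are hypothetical illustrations, not values from the study:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra dollars spent
    per extra unit of effect (here, life-years gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: the broader strategy costs $4,850 more per patient
# and adds 0.5 life-years -> $9,700 per life-year gained
print(icer(14850.0, 10000.0, 3.0, 2.5))  # 9700.0
```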

Abstract

Innovation of medical technology is a major driving force behind the increase in medical expenditures in developed countries. Previous studies identified that the diffusion of medical technology varied across countries according to the characteristics of regulatory policy and payment systems. Based on Rogers' diffusion of innovations theory, this study aimed to see how local practice norms, the evolving nature of diffusing technology, and local clinical needs, in addition to differences in politico-economic systems, would affect the process of innovation diffusion. Taking the case of coronary stenting, an innovative therapeutic technology of the early 1990s, we provided a case study of hospital-based data from two high-tech teaching hospitals in Japan and the US for discussion. Stenting began to be widely used in both countries when complementary new technology improved its clinical efficacy, but the diffusion process still differed between the two hospitals due to (1) distinctive payment systems for hospitals and physicians, (2) practice norms in favor of percutaneous intervention rather than bypass surgery, shaped by payment incentives and cultural attitudes, and (3) local patients' clinical characteristics that the technology had to be tailored for. The case study described the diffusion of stent technology as a dynamic process between patients, physicians, hospitals, health care systems, and technology under global and local conditions.

Abstract

Although the clinical efficacy of implantable cardioverter-defibrillators (ICDs) has been convincingly demonstrated in clinical trials, the impact of ICDs on health care costs and recipients' quality of life (QOL) is less certain. The existing medical research on the health care costs and QOL effects of ICDs was reviewed and summarized. Medline and the Institute for Scientific Information's Web of Knowledge were searched for publications reporting costs of care and QOL assessments of ICD recipients. Unpublished and non-peer-reviewed "gray" publications were excluded. Reports were included if they reported primary, original patient data that were collected after 1993, when nonthoracotomy defibrillators entered clinical practice. Two reviewers independently evaluated publications for relevance and quality, abstracted study data, and summarized the findings. Excessive heterogeneity among studies prevented formal meta-analysis, so a narrative synthesis was performed, and key themes were identified from the published research. There were limited published data on the costs of ICD care, especially for the primary prevention of sudden cardiac death. The published research on ICD QOL lacked large, multicenter, longitudinal studies. Many ICD QOL studies were performed in small numbers of patients at single centers. Initial ICD implantation costs ranged (in 2006 United States dollars) from $28,500 to $55,200, with annual follow-up costs ranging from $4,800 to $17,000. QOL was higher for ICD recipients than for patients treated with antiarrhythmic drugs, but there was a substantial prevalence of anxiety, depression, and "loss of control" in ICD recipients, particularly in those who had received ICD shocks. In conclusion, ICD implantation remains costly but may be becoming less expensive over time, and ICD recipients' QOL is significantly affected by their devices.

Are registry hospitals different? A comparison of patients admitted to hospitals of a commercial heart failure registry with those from national and community cohorts. AMERICAN HEART JOURNAL. Heidenreich, P. A., Fonarow, G. C. 2006; 152 (5): 935-939

Abstract

Clinical registries have been created to address questions that are difficult to answer with clinical trials. However, the applicability of registry findings to the general population has been questioned because of concerns over potential bias in the selection of participating hospitals. The purpose of this study was to determine if patients admitted to hospitals participating in a heart failure registry (ADHERE) are comparable with patients admitted to other hospitals, including those admitted to Framingham area hospitals. We used a 20% random sample of all Medicare patients discharged during 1984 to 2001 to determine rates of hospitalization, procedure use, and survival after a first admission for heart failure (none in the prior 3 years). Hospitals were classified as participating in the ADHERE registry (n = 189), located within or near Framingham, MA (n = 9), or other (n = 5541). A total of 725,702 first admissions were identified, including 80,338 to ADHERE hospitals and 1716 to Framingham area hospitals. Minimal differences in patient characteristics were noted between patients admitted to ADHERE and non-ADHERE hospitals, although patients admitted to Framingham area hospitals were more likely to be white (95%) than were patients admitted to ADHERE (84%) or other hospitals (87%, P < .0001). Mortality at 1 year was 35.8% for ADHERE, 36.2% for other hospitalized patients, and 32.9% for Framingham patients (P < .0001). Rehospitalization for heart failure at 90 days was 13.0% following admission to ADHERE hospitals, 13.0% for other hospitals, and 16.4% for Framingham hospitals (P = .0004). After adjustment for patient characteristics, differences in outcome between ADHERE and non-ADHERE hospitals remained minimal. Patients admitted with heart failure to ADHERE registry hospitals had similar baseline characteristics and outcomes to other patients.

Abstract

A wide variety of instruments have been used to assess the functional capabilities and health status of patients with chronic heart failure (HF), but it is not known how well these tests are correlated with one another, nor which one has the best association with measured exercise capacity. Forty-one patients with HF were assessed with commonly used functional, health status, and quality of life measures, including maximal cardiopulmonary exercise testing, the Duke Activity Status Index (DASI), the Veterans Specific Activity Questionnaire (VSAQ), the Kansas City Cardiomyopathy Questionnaire (KCCQ), and 6-minute walk distance. Pretest clinical variables, including age, resting pulmonary function tests (forced expiratory volume in 1 s and forced vital capacity), and ejection fraction (EF), were also considered. The association between performance on these functional tools, clinical variables, and exercise test responses, including peak VO2 and the VO2 at the ventilatory threshold, was determined. Peak oxygen uptake (VO2) was significantly related to VO2 at the ventilatory threshold (r = 0.76, P < .001) and estimated METs from treadmill speed and grade (r = 0.72, P < .001), but had only a modest association with 6-minute walk performance (r = 0.49, P < .01). The functional questionnaires had modest associations with peak VO2 (r = 0.37, P < .05 and r = 0.26, NS for the VSAQ and DASI, respectively). Of the components of the KCCQ, peak VO2 was significantly related only to quality of life score (r = 0.46, P < .05). Six-minute walk performance was significantly related to KCCQ physical limitation (r = 0.53, P < .01) and clinical summary (r = 0.44, P < .05) scores. Among pretest variables, only age and EF were significantly related to peak VO2 (r = -0.58 and 0.46, respectively, P < .01). Multivariately, age and KCCQ quality of life score were the only significant predictors of peak VO2, accounting for 72% of the variance in peak VO2. Commonly used functional measures, symptom tools, and quality of life assessments for patients with HF are poorly correlated with one another and are only modestly associated with exercise test responses. These findings suggest that exercise test responses, non-exercise test estimates of physical function, and quality of life indices reflect different facets of health status in HF and one should not be considered a surrogate for another.

Abstract

Despite dramatic changes in heart failure management during the 1990s, little is known about the national heart failure mortality trends during this time period, particularly among the elderly. The purpose of this study was to determine temporal trends in outcomes of elderly patients with heart failure between 1992 and 1999. We analyzed a national sample of 3,957,520 Medicare beneficiaries aged 65 years or more who were hospitalized with heart failure between 1992 and 1999, assessing temporal trends in 30-day and 1-year all-cause mortality and 30-day and 6-month all-cause hospital readmission. In risk-adjusted analyses, mortality and readmission for each year between 1994 and 1999 were compared with the referent year of 1993. Crude 30-day and 1-year mortality decreased slightly (range for 1992-1999: 11.0%-10.3% and 32.5%-31.7%, respectively), whereas 30-day and 6-month readmission increased (10.2%-13.8% and 35.4%-40.3%, respectively). After risk adjustment, there was no change in 30-day mortality between 1993 and 1999 (eg, for 1999 vs 1993, odds ratio [OR] 1.01, 95% confidence interval [CI], 1.00-1.02). One-year mortality was lower in 1994 compared with 1993 (OR 0.91, 95% CI, 0.90-0.92), but data from subsequent years suggested no continuous improvement after 1994 (1999 vs 1993: OR 0.93, 95% CI, 0.92-0.94). Thirty-day readmission increased (1999 vs 1993: OR 1.09, 95% CI, 1.07-1.10), but there was no change in 6-month readmission (1999 vs 1993: OR 1.00, 95% CI, 0.99-1.01). We found no substantial improvement in mortality and hospital readmission during the 1990s among elderly patients hospitalized with heart failure. These findings suggest that recent innovations in heart failure management have not yet translated into better outcomes in this population.

Abstract

Heart failure (HF) guidelines recommend treatment with multiple medications to improve survival, functioning, and quality of life. Yet, HF treatments can be costly, resulting in significant economic burden for some patients. To date, there are few data on the impact of patients' perceived difficulties in affording medical care on their health outcomes. Comprehensive clinical data, health status, and the perceived economic burden of 539 HF outpatients from 13 centers were assessed at baseline and 1 year later. Health status was quantified with the Kansas City Cardiomyopathy Questionnaire overall summary score. Cross-sectional and longitudinal (1-year) analyses were conducted comparing the health status of patients with and without self-reported economic burden. Patients with economic burden had significantly lower health status scores at both baseline and 1 year later. Although baseline perceptions of economic burden were associated with poorer health status, patients' perceived difficulty affording medical care at 1 year was a more important determinant of lower 1-year health status. HF patients reporting difficulty affording their medical care had lower perceived health status than those reporting little to no economic burden. More research is needed to further evaluate this association and to determine whether addressing perceived economic difficulties affording health care can improve HF patients' health status.

Abstract

The inferior vena cava (IVC) morphology is often used to estimate right atrial pressure; however, the association of IVC morphology and outcome is poorly described. We evaluated 4383 consecutive outpatients (98% men) undergoing echocardiography at 1 of 3 Veterans Affairs laboratories. Of the 3729 with adequate images, 3295 (88%) had a normal IVC (< 2 cm), 358 (10%) had a dilated IVC that collapsed at least 50% with inspiration, and 76 (2%) had a dilated IVC that did not collapse. Compared with patients with a normal IVC, those with a dilated IVC were older (66 +/- 13 vs 69 +/- 12 years if dilated with collapse and 70 +/- 12 years if dilated without collapse, P = .0005) and were more likely to have a history of heart failure (11% vs 18% if dilated with collapse and 38% if dilated without collapse, P < .0001). The 90-day and 1-year survival rates were 99% and 95% for those with a normal IVC, 98% and 91% for those with a dilated IVC with collapse, and 89% and 67% for those with a dilated IVC without collapse (P < .0001). After adjustment for clinical and echocardiographic characteristics including left and right ventricular function and pulmonary artery pressure, a dilated IVC without collapse remained associated with increased mortality: hazard ratio 1.43 (1.29-1.57 compared with a normal IVC, P < .0001). A dilated IVC without collapse with inspiration is associated with worse survival in men independent of a history of heart failure, other comorbidities, ventricular function, and pulmonary artery pressure.

Abstract

The purpose of this study was to determine whether baseline physical examination and history are useful in identifying patients with cardiac edema as defined by echocardiography, and to compare survival for patients with cardiac and noncardiac causes of edema. Physical examination and history data can help to identify patients with edema who have significant cardiac disease. We reviewed the medical records of 278 consecutive patients undergoing echocardiography for evaluation of peripheral edema. We classified cardiac edema as the presence of any of the following: left ventricular ejection fraction < 45%, systolic pulmonary artery pressure > 45 mmHg, reduced right ventricular function, enlarged right ventricle, and a dilated inferior vena cava. The mean age of the 243 included patients was 67 +/- 12 years and 92% were male. A cardiac cause of edema was found in 56 (23%). Independent predictors of a cardiac cause of edema included chronic obstructive pulmonary disease (COPD, odds ratio [OR] 1.74, 95% confidence interval [CI] 1.14-2.60) and crackles (OR 1.98, 95% CI 1.26-3.10). The specificity for a cardiac cause of edema was high (91% for COPD, 93% for crackles); however, the sensitivity was quite low (27% for COPD, 24% for crackles). Compared with patients without a cardiac cause of edema, those with a cardiac cause had increased mortality (25 vs. 8% at 2 years, p < 0.01), even after adjustment for other characteristics (hazard ratio 1.55, 95% CI 1.08-2.24). A cardiac cause of edema is difficult to predict based on history and examination and is associated with high mortality.

Abstract

Mediastinal irradiation is known to cause cardiac disease, but its effect on left ventricular diastolic function is unknown. The purpose of this study was to determine the prevalence of diastolic dysfunction and its association with prognosis in asymptomatic patients after mediastinal irradiation. We recruited 294 patients who had received at least 35 Gy to the mediastinum for treatment of Hodgkin disease. Each patient underwent resting echocardiography, stress echocardiography, and nuclear scintigraphy. Survival free from cardiac events was determined during 3.2 years of follow-up. The mean age of the included patients was 42 years, and 49% were male. Adequate measurements of diastolic function were obtained in 282 (97%) patients. Diastolic dysfunction was considered mild in 26 (9%) and moderate in 14 (5%). Exercise-induced ischemia was more common in patients with diastolic dysfunction (23%) than those with normal diastolic function (11%, P = .008). After adjustment for patient demographics, clinical characteristics, and radiation history, patients with diastolic dysfunction had worse event-free survival than patients with normal function (hazard ratio 1.66, 95% CI 1.06-2.4). There is a high prevalence of diastolic dysfunction in asymptomatic patients after mediastinal irradiation, and the presence of diastolic dysfunction is associated with stress-induced ischemia and a worse prognosis. Screening with Doppler echocardiography may be helpful in identifying patients at risk for subsequent cardiac events.

Abstract

Although monitoring the clinical status of patients with heart failure rests at the core of clinical medicine, the ability of different techniques to reflect clinical change has not been evaluated. This study sought to describe changes in various measures of disease status associated with gradations of clinical change. A prospective, 14-center cohort of 476 outpatients was assessed at baseline and 6 +/- 2 weeks to compare changes in 7 heart failure measures with clinically observed change. Measures included health status instruments (the Kansas City Cardiomyopathy Questionnaire [KCCQ], Short Form-12, and EQ-5D), physician-assessed functional class (New York Heart Association [NYHA]), an exercise test (6-minute walk), patient weight, and a biomarker (B-type natriuretic peptide). Cardiologists, blinded to all measures except weight and NYHA, categorized clinical change ranging from large deterioration to large improvement. The KCCQ, NYHA, and 6-minute walk test were most sensitive to clinical change. For patients with large, moderate, and small deteriorations, the KCCQ decreased by 25 +/- 16, 17 +/- 14, and 5.3 +/- 11 points, respectively. For patients with small, moderate, and large improvements, the KCCQ increased by 5.7 +/- 16, 10.5 +/- 16, and 22.3 +/- 16 points, respectively (P < .01 for all compared with the no change group). New York Heart Association class and 6-minute walk distance were significantly different for those with moderate and large changes (P < .05), but neither revealed a difference between those with small versus no clinical deterioration. The KCCQ had the highest c statistic for monitoring individual patients, followed by NYHA and 6-minute walk. The KCCQ, followed by the NYHA and the 6-minute walk test, most accurately reflected clinical change in patients with heart failure.

Abstract

Although B-type natriuretic peptide (BNP) levels have been proposed as a means of assessing disease severity in patients with heart failure, it is not known if BNP levels are correlated with health status (symptom burden, functional limitation, and quality of life). We studied 342 outpatients with systolic heart failure from 14 centers at baseline and 6 +/- 2 weeks with BNP levels and the Kansas City Cardiomyopathy Questionnaire (KCCQ), a heart-failure-specific health status instrument. We assessed the correlation between KCCQ scores and BNP at baseline and changes in KCCQ according to changes in BNP levels between baseline and follow-up. Mean baseline BNP levels were 379 +/- 387 pg/mL and mean KCCQ summary scores were 62 +/- 23 points. Although baseline BNP and KCCQ were both associated with New York Heart Association classification (P < .001 for both), BNP and KCCQ were not correlated (r(2) = 0.008, P = .15). There was no significant relationship between changes in BNP and KCCQ regardless of the threshold used to define a clinically meaningful BNP change. For example, using a >50% BNP change threshold, KCCQ improved by 3.7 +/- 14.2 in patients with decreasing BNP, improved by 1.7 +/- 13.6 in patients with no BNP change, and improved by 1.0 +/- 13.4 in patients with increasing BNP (P = .6). BNP and health status are not correlated in outpatients with heart failure in the short term. This suggests that these measures may assess different aspects of heart failure severity, and that physiologic measures do not reflect patients' perceptions of the impact of heart failure on their health status.

Abstract

The aim of this study was to evaluate which right ventricular (RV) echocardiographic parameter best mirrors the clinical status of patients with pulmonary arterial hypertension. Patients with pulmonary arterial hypertension on epoprostenol therapy were identified via a hospital registry. Twenty patients (16 females, 4 males) were included in the study, 9 with primary pulmonary hypertension and 11 with other diseases. Echocardiograms before therapy and at 22.7 (+/-9.3) months into therapy were compared. The right ventricular myocardial performance index (RVMPI) was measured as the sum of the isometric contraction time and the isometric relaxation time divided by the right ventricular ejection time. Other measures included peak tricuspid regurgitation jet velocity (TRV), pulmonary artery systolic pressure (PASP), pulmonary valve velocity time integral (PVVTI), PASP/PVVTI (as an index of total pulmonary resistance), and symptoms by New York Heart Association (NYHA) functional class. Echo parameters of right ventricular function were analyzed before and during therapy. There was significant improvement of NYHA class in patients following epoprostenol therapy (P < 0.0001). Peak tricuspid regurgitant jet velocity (pre 4.2 +/- 0.6 m/sec, post 3.8 +/- 0.7 m/sec, P = 0.02) and PASP/PVVTI (pre 6.7 +/- 3.3 mmHg/m per second, post 4.8 +/- 2.2 mmHg/m per second, P < 0.0001) were significantly improved during treatment. RVMPI did not improve (pre 0.6 +/- 0.3, post 0.6 +/- 0.3, P = 0.54). Changes in NYHA class did not correlate with changes in RVMPI (P = 0.33) or changes in PASP/PVVTI (P = 0.58). Despite significant improvements in TRV, PASP/PVVTI, and NYHA class, there was no significant change in RVMPI on epoprostenol therapy. Changes in right ventricular indices were not correlated with changes in NYHA class.
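The RVMPI described above is a simple ratio of Doppler time intervals. A minimal sketch of the computation; the interval values in the example are hypothetical, not measurements from the study:

```python
def rv_mpi(ict_ms, irt_ms, et_ms):
    """Myocardial performance index: (isometric contraction time +
    isometric relaxation time) / ejection time, all in the same units."""
    return (ict_ms + irt_ms) / et_ms

# Hypothetical intervals: 80 ms + 100 ms over a 300 ms ejection time
print(rv_mpi(80, 100, 300))  # 0.6
```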

Abstract

The study was designed to determine whether racial disparity in utilization of the implantable cardioverter-defibrillator (ICD) has improved over time, and whether small-area geographic variation in ICD utilization contributed to national levels of racial disparity. Although racial disparities in cardiac procedures have been well documented, it is unknown whether there has been improvement over time. Low ICD utilization rates in predominantly black geographic areas may have exacerbated national levels of disparity. Discharge abstracts from elderly black and white Medicare beneficiaries hospitalized with ventricular arrhythmias from 1990 to 2000 were analyzed to determine if ICD implantation occurred within 90 days of initial hospitalization. Multivariate logistic regression models were constructed to assess the relationship between ICD implantation, year of admission, and the percentage of black inhabitants in each patient's county of hospitalization while controlling for clinical, hospital, and demographic characteristics. There was improvement in ICD implantation racial disparity: in the period 1990 to 1992, black patients had an odds ratio of 0.52 (95% confidence interval [CI] 0.42 to 0.64) for receiving an ICD compared with whites. However, by 1999 to 2000, the odds ratio for blacks had risen to 0.69 (95% CI 0.61 to 0.78) (test-for-trend p = 0.01). Approximately 20% of this trend could be explained by reduction in geographic variation in ICD use between areas with larger black and predominantly white populations. Rates of ICD implants became more equal among whites and blacks during the 1990s, although persistent disparity remained at the decade's end. Geographic equalization in cardiovascular procedure rates may be an essential mechanism in rectifying disparities in health care.

Abstract

Sub-acute thrombosis is a serious complication of coronary artery stenting. Clopidogrel plus aspirin is the accepted prophylactic regimen, but has yet to be proven superior to ticlopidine plus aspirin, and a new regimen combining cilostazol and aspirin has been introduced. We conducted a meta-analysis of all trials that compared ≥2 oral antithrombotic strategies in patients undergoing coronary stent placement to determine which treatment optimally prevents adverse cardiac events in the 30 days following stent insertion. We used meta-regression to compare all strategies to a shared control strategy: ticlopidine plus aspirin. We also compared randomized trials to historically controlled and other non-randomized trials. We conducted sensitivity analysis and subgroup analysis to assess for possible heterogeneity. In comparison to ticlopidine plus aspirin, the odds ratios for cardiac events, with 95% confidence intervals, were: aspirin alone, 4.29 (3.09-5.97); coumadin plus aspirin, 2.65 (2.18-3.21); clopidogrel plus aspirin, 1.06 (0.86-1.31); cilostazol plus aspirin, 0.73 (0.47-1.14). Among trials that compared clopidogrel plus aspirin to ticlopidine plus aspirin, historically controlled trials were statistically distinct from randomized trials. The analysis of cilostazol was sensitive to the small size of the included studies. Neither clopidogrel plus aspirin nor cilostazol plus aspirin can be statistically distinguished from ticlopidine plus aspirin for the prevention of adverse cardiac events in the 30 days after stenting. A randomized trial including cilostazol is warranted.
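Per-trial odds ratios like those above are conventionally pooled on the log scale with inverse-variance weights. The study's meta-regression against a shared control is more elaborate; the following is only a generic fixed-effect sketch, and recovering standard errors from the reported 95% CIs is an illustrative assumption, not the paper's method:

```python
import math

def pooled_or(ors, ci_lows, ci_highs):
    """Fixed-effect inverse-variance pooling of odds ratios.
    The SE of each log-OR is recovered from its 95% CI as
    (ln(upper) - ln(lower)) / (2 * 1.96)."""
    log_ors = [math.log(o) for o in ors]
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96)
           for l, u in zip(ci_lows, ci_highs)]
    weights = [1.0 / se ** 2 for se in ses]
    pooled_log = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
    return math.exp(pooled_log)

# With a single trial, the pooled estimate is just that trial's OR
print(round(pooled_or([4.29], [3.09], [5.97]), 2))  # 4.29
```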

Abstract

The classic examination finding of aortic regurgitation is a diastolic murmur. Aortic regurgitation is more likely to be associated with a systolic than with a diastolic murmur during routine screening by a noncardiologist physician. In all, 243 asymptomatic patients (mean age 42 +/- 10 years) with no known cardiac disease but at risk for aortic valve disease due to prior mediastinal irradiation (≥35 Gy) underwent auscultation by a noncardiologist followed by echocardiography. A systolic murmur was considered benign if it was grade ≤II/VI, not holosystolic, was not heard at the apex, did not radiate to the carotids, and was not associated with a diastolic murmur. Of the patients included, 122 (49%) were male, and 86 (35%) had aortic regurgitation, which was trace in 20 (8%), mild in 52 (21%), and moderate in 14 (6%). A systolic murmur was common in patients with aortic regurgitation, occurring in 12 (86%) with moderate, 26 (50%) with mild, 6 (30%) with trace, and 27 (17%) with no aortic regurgitation (p < 0.0001). The systolic murmurs were classified as benign in 21 (78%) patients with mild and 8 (67%) with moderate aortic regurgitation. Diastolic murmurs were rare, occurring in two (14%) with moderate, two (4%) with mild, and three (2%) with no aortic regurgitation (p = 0.15). An isolated systolic murmur is a common auscultatory finding by a noncardiologist in patients with moderate or milder aortic regurgitation. A systolic murmur in patients at risk for aortic valve disease should prompt a more thorough physical examination for aortic regurgitation.

Abstract

While the beneficial effect of exercise capacity on mortality is well-accepted, its effect on health-care costs remains uncertain. This study investigates the relationship between exercise capacity and health-care costs. The Veterans Affairs Health Care System recently implemented a Decision Support System that provides data on patterns of care, patient outcomes, workload, and costs. Total inpatient and outpatient costs were derived from existing administrative and clinical data systems, were adjusted for relative value units, and were expressed in relative cost units. We used univariable and multivariable analyses to evaluate total costs in the year following a standard exercise test. Costs were compared with exercise capacity estimated in metabolic equivalents (METs), other test results, and clinical variables for 881 consecutive patients who were referred for clinical reasons for treadmill testing at the Palo Alto Veterans Affairs Health Care System facility between October 1, 1998, and September 30, 2000. The patients had a mean age of 59 years, 95% were men, and 74% were white. Eight patients (< 1%) died during the year of follow-up. Exercise testing showed an average maximum heart rate of 138 beats/min, 8.2 METs, and a peak Borg scale of 17. In unadjusted analysis, costs were incrementally lower by an average of 5.4% per MET increase (p < 0.001). In a multivariable analysis adjusting for demographic variables, treadmill test performance and results, and clinical history, METs were found to be the most significant predictor of cost (F-statistic, 21.8; p < 0.001). These findings are consistent with the hypothesis that exercise capacity is inversely associated with health-care costs.
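If the reported 5.4% average reduction per MET is treated as compounding multiplicatively, the expected relative cost for a given fitness gain can be sketched as follows. The compounding assumption and the example gain are illustrative, not the study's actual regression model:

```python
def relative_cost(delta_mets, drop_per_met=0.054):
    """Expected 1-year cost relative to baseline, assuming the average
    5.4% reduction per MET compounds multiplicatively across METs."""
    return (1.0 - drop_per_met) ** delta_mets

# A patient 3 METs fitter would be expected to cost roughly 15% less
print(round(relative_cost(3), 3))  # 0.847
```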

Abstract

Although cardiac devices have been found to reduce symptoms and mortality rates in appropriate patient populations, the implications of certain important risks, such as infection, are incompletely understood. The purpose of this study was to use a large population-based database to define the population that is at risk for cardiac device infections, determine the prevalence of device infections, and study changes in the rates of cardiac device implantation and infection in the past decade. Patients with cardiac device implantations and infections were identified with claims files from the Health Care Financing Administration for Medicare beneficiaries from January 1, 1990, through December 31, 1999. Rates of implantation of cardiac devices were determined. Time trend analyses were performed to determine the significance of the observed change in rates. Cardiac device implantation rates increased from 3.26 implantations per 1000 beneficiaries in 1990 to 4.64 implantations per 1000 beneficiaries in 1999, which represents an increase of 42% in 10 years (P for trend

Impact of tricuspid regurgitation on long-term survival. JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY. Nath, J., Foster, E., Heidenreich, P. A. 2004; 43 (3): 405-409

Abstract

The goal of this study was to examine mortality associated with tricuspid regurgitation (TR) after controlling for left ventricular ejection fraction (LVEF), right ventricular (RV) dilation and dysfunction, and pulmonary artery systolic pressure (PASP). Tricuspid regurgitation is a frequent echocardiographic finding; however, its association with prognosis is unclear. We retrospectively identified 5,223 patients (age 66.5 ± 12.8 years; predominantly male) undergoing echocardiography at one of three Veterans Affairs Medical Center laboratories over a period of four years. Follow-up data were available for up to four years (mean 498 ± 402 days). Kaplan-Meier and proportional hazards methods were used to compare differences in survival among TR grades. Mortality increased with increasing severity of TR. The one-year survival was 91.7% with no TR, 90.3% with mild TR, 78.9% with moderate TR, and 63.9% with severe TR. Moderate or greater TR was associated with increased mortality regardless of PASP (hazard ratio [HR] 1.31, 95% confidence interval [CI] 1.16 to 1.49 for PASP >40 mm Hg; HR 1.32, 95% CI 1.05 to 1.62 for PASP ≤40 mm Hg) and LVEF (HR 1.49, 95% CI 1.34 to 1.66 for EF <50%; HR 1.54, 95% CI 1.37 to 1.71 for EF ≥50%). When adjusted for age, LVEF, inferior vena cava size, and RV size and function, survival was worse for patients with moderate (HR 1.17, 95% CI 0.96 to 1.42) and severe TR (HR 1.31, 95% CI 1.05 to 1.66) than for those with no TR. We conclude that increasing TR severity is associated with worse survival in men regardless of LVEF or pulmonary artery pressure. Severe TR is associated with a poor prognosis, independent of age, biventricular systolic function, RV size, and dilation of the inferior vena cava.
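Survival estimates of the kind compared across TR grades above come from the Kaplan-Meier estimator. A minimal pure-Python sketch; the follow-up times and event flags in the example are invented for illustration, not the study's data.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times:  follow-up time for each patient
    events: 1 if the patient died at that time, 0 if censored
    Returns a list of (time, S(time)) pairs at each observed event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]  # all subjects leaving at t
        deaths = sum(tied)
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(tied)
        i += len(tied)
    return curve

# Hypothetical follow-up times (years) and death indicators for a tiny cohort
curve = kaplan_meier([1, 1, 2, 3, 4], [1, 0, 1, 0, 1])
```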

Abstract

Ethnic differences in the relationship between access to health care and survival are difficult to define because of many confounding factors, such as socioeconomic status and baseline differences in health. Because the Veterans Affairs health care system offers health care largely without financial considerations, it provides an ideal setting in which to identify and understand ethnic differences in health outcomes. Previous studies in this area have lacked clinical and cardiovascular data with which to adjust for baseline differences in patients' health. Data were collected from consecutive men referred for resting electrocardiography (ECG) (n = 41,087) or exercise testing (n = 6,213) during 12 years. We compared ethnic differences in survival between whites, blacks, and Hispanics after considering baseline differences in age and hospitalization status. We also adjusted for electrocardiogram abnormalities and cardiac risk factors, exercise test results, and cardiovascular comorbidities. White patients tended to be older and had more baseline comorbidities and cardiovascular interventions when they presented for testing. White patients had increased mortality rates compared with blacks and Hispanics. In the ECG population, after adjusting for demographics and baseline electrocardiogram abnormalities, Hispanics had improved survival compared with whites and blacks. In the exercise test population, after adjusting for the same factors, as well as for the presence of cardiovascular comorbidities, cardiac risk factors, and exercise test findings, Hispanics also exhibited improved survival compared with the other two ethnicities. There were no differences in mortality rates between whites and blacks. Our findings demonstrate that the health care provided to veterans referred for routine ECG or exercise testing is not associated with poorer survival in ethnic minorities.

Abstract

This study was designed to evaluate the potential benefit of screening previously irradiated patients with echocardiography. Mediastinal irradiation is known to cause cardiac disease; however, the prevalence of asymptomatic cardiac disease and the potential for intervention before symptom development are unknown. We recruited 294 asymptomatic patients (mean age 42 ± 9 years, 49% men, mean mantle irradiation dose 43 ± 0.3 Gy) treated with at least 35 Gy to the mediastinum for Hodgkin's disease. After providing written consent, each patient underwent electrocardiography and transthoracic echocardiography. Valvular disease was common and increased with time following irradiation. Patients who had received irradiation more than 20 years before evaluation had significantly more mild or greater aortic regurgitation (60% vs. 4%, p < 0.0001), moderate or greater tricuspid regurgitation (4% vs. 0%, p = 0.06), and aortic stenosis (16% vs. 0%, p = 0.0008) than those who had received irradiation within 10 years. The number needed to screen to detect one candidate for endocarditis prophylaxis was 13 (95% confidence interval [CI] 7 to 44) for patients treated within 10 years and 1.6 (95% CI 1.3 to 1.9) for those treated at least 20 years ago. Compared with the Framingham Heart Study population, mildly reduced left ventricular fractional shortening (<30%) was more common (36% vs. 3%), and age- and gender-adjusted left ventricular mass was lower (90 ± 27 g/m² vs. 117 g/m²) in irradiated patients. There is a high prevalence of asymptomatic heart disease in general, and aortic valvular disease in particular, following mediastinal irradiation. Screening echocardiography should be considered for patients with a history of mediastinal irradiation.
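The "number needed to screen" figures quoted above are simply the reciprocal of the detection proportion. A one-function sketch; the counts in the example are hypothetical, not the study's data.

```python
def number_needed_to_screen(cases_found, n_screened):
    """Patients who must be screened to detect one case: 1 / detection rate."""
    return n_screened / cases_found

# Hypothetical example: 10 prophylaxis candidates among 130 screened -> NNS 13
nns = number_needed_to_screen(10, 130)
```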

Abstract

It is unknown whether white and black Medicare beneficiaries have different rates of cardiac procedure utilization or long-term survival after cardiac arrest. A total of 5,948 elderly Medicare beneficiaries (5,429 white and 519 black) were identified who survived to hospital discharge between 1990 and 1999 after admission for cardiac arrest. Demographic, socioeconomic, and clinical information about these patients was obtained from Medicare administrative files, the US census, and the American Hospital Association's annual institutional survey. A Cox proportional hazards model that included demographic and clinical predictors indicated a hazard ratio for mortality of 1.30 (95% CI 1.09 to 1.55) for blacks aged 66 to 74 years compared with whites of the same age. The addition of cardiac procedures to this model lowered the hazard ratio for blacks to 1.23 (95% CI 1.03 to 1.46). In analyses stratified by race, implantable cardioverter-defibrillators (ICDs) had a mortality hazard ratio of 0.53 (95% CI 0.45 to 0.62) for white patients and 0.50 (95% CI 0.27 to 0.91) for black patients. Logistic regression models that compared procedure rates between races indicated odds ratios for blacks aged 66 to 74 years of 0.58 (95% CI 0.36 to 0.94) to receive an ICD and 0.50 (95% CI 0.34 to 0.75) to receive either revascularization or an ICD. There is racial disparity in long-term mortality among elderly cardiac arrest survivors. Both black and white patients benefited from ICD implantation, but blacks were less likely to undergo this potentially life-saving procedure. Lower rates of cardiac procedures may explain in part the lower survival rates among black patients.

Abstract

This study sought to assess the effect of angiotensin-converting enzyme (ACE) inhibitors and beta-blockers on all-cause mortality in patients with left ventricular (LV) systolic dysfunction according to gender, race, and the presence of diabetes. Major randomized clinical trials have established that ACE inhibitors and beta-blockers have life-saving benefits in patients with LV systolic dysfunction. Most patients enrolled in these trials were Caucasian men; whether an equal effect is achieved in women, non-Caucasians, and patients with major comorbidities has not been established. The authors performed a meta-analysis of published and individual patient data from the 12 largest randomized clinical trials of ACE inhibitors and beta-blockers to produce random-effects estimates of mortality for subgroups. Data support beneficial reductions in all-cause mortality for the use of beta-blockers in men and women, the use of ACE inhibitors and some beta-blockers in black and white patients, and the use of ACE inhibitors and beta-blockers in patients with or without diabetes. Women with symptomatic LV systolic dysfunction probably benefit from ACE inhibitors, but women with asymptomatic LV systolic dysfunction may not have reduced mortality when treated with ACE inhibitors (pooled relative risk = 0.96; 95% confidence interval: 0.75 to 1.22). The pooled estimate of three beta-blocker studies supports a beneficial effect in black patients with heart failure, but one study assessing bucindolol reported a nonsignificant increase in mortality. ACE inhibitors and beta-blockers provide life-saving benefits in most of the subpopulations assessed. Women with asymptomatic LV systolic dysfunction may not achieve a mortality benefit when treated with ACE inhibitors.

Abstract

Most patients come to the catheterization laboratory without prior functional tests, which makes the cost-effective treatment of patients with intermediate coronary lesions a practical challenge. We developed a decision model to compare the long-term costs and benefits of three strategies for treating patients with an intermediate coronary lesion and no prior functional study: 1) deferring the decision for percutaneous coronary intervention (PCI) to obtain a nuclear stress imaging study (NUC strategy); 2) measuring fractional flow reserve (FFR) at the time of angiography to help guide the decision for PCI (FFR strategy); and 3) stenting all intermediate lesions (STENT strategy). On the basis of the literature, we estimated that 40% of intermediate lesions would produce ischemia, that 70% of patients treated with PCI and 30% of patients treated medically would be free of angina after 4 years, and that the quality-of-life adjustment for living with angina was 0.9 (1.0 = perfect health). We estimated the cost of FFR to be $761, the cost of nuclear stress imaging to be $1,093, and the cost of medical treatment for angina to be $1,775 per year. The extra cost of splitting the angiogram and PCI as dictated by the NUC strategy was $3,886, based on hospital cost-accounting data. Sensitivity and threshold analyses were performed to determine which variables affected our results. The FFR strategy saved $1,795 per patient compared with the NUC strategy and $3,830 compared with the STENT strategy. Quality-adjusted life expectancy was similar among the three strategies (NUC-FFR = 0.8 quality-adjusted life-days; FFR-STENT = 6 quality-adjusted life-days). Compared with the FFR strategy, the NUC strategy was expensive (>$800,000 per quality-adjusted life-year gained). Both screening strategies were superior to (less costly than, with better outcomes than) the STENT strategy.
Sensitivity analysis indicated that the NUC strategy would become attractive (<$50,000 per quality-adjusted life-year compared with FFR) only if the specificity of nuclear stress imaging were >25% better than that of FFR. Our results were not altered significantly by changing the other assumptions. In patients with an intermediate coronary lesion and no prior functional study, measuring FFR to guide the decision to perform PCI may lead to significant cost savings compared with performing nuclear stress imaging or simply stenting lesions in all patients.

Abstract

Patients with end-stage renal disease are known to have decreased survival after myocardial infarction, but the association of less severe renal dysfunction with survival after myocardial infarction is unknown. To determine how patients with renal insufficiency are treated during hospitalization for myocardial infarction, and to determine the association of renal insufficiency with survival after myocardial infarction, we conducted a cohort study of 130,099 elderly patients with myocardial infarction hospitalized at nongovernment hospitals in the United States between April 1994 and July 1995. Patients were categorized according to initial serum creatinine level: no renal insufficiency (creatinine level <1.5 mg/dL [<132 µmol/L]; n = 82,455), mild renal insufficiency (creatinine level 1.5 to 2.4 mg/dL [132 to 212 µmol/L]; n = 36,756), or moderate renal insufficiency (creatinine level 2.5 to 3.9 mg/dL [221 to 345 µmol/L]; n = 10,888). Vital status up to 1 year after discharge was obtained from Social Security records. Compared with patients with no renal insufficiency, patients with moderate renal insufficiency were less likely to receive aspirin, beta-blockers, thrombolytic therapy, angiography, and angioplasty during hospitalization. One-year mortality was 24% in patients with no renal insufficiency, 46% in patients with mild renal insufficiency, and 66% in patients with moderate renal insufficiency (P < 0.001). After adjustment for patient and treatment characteristics, mild (hazard ratio, 1.68 [95% CI, 1.63 to 1.73]) and moderate (hazard ratio, 2.35 [CI, 2.26 to 2.45]) renal insufficiency were associated with substantially elevated risk for death during the first month of follow-up. This increased mortality risk continued until 6 months after myocardial infarction. Renal insufficiency was an independent risk factor for death in elderly patients after myocardial infarction. Targeted interventions may be needed to improve treatment for this high-risk population.

Abstract

Sudden cardiac death is a prominent feature of the natural history of heart disease. The efficacy of antiarrhythmic drugs and devices in preventing sudden death and reducing total mortality is uncertain. We reviewed randomized trials and quantitative overviews of type I and type III antiarrhythmic drugs. We also reviewed the randomized trials of implantable cardioverter defibrillators and combined their outcomes in a quantitative overview. Randomized trials of type I antiarrhythmic agents used as secondary prevention after myocardial infarction show an overall 21% increase in mortality rate. Randomized trials of amiodarone suggest a 13% to 19% decrease in mortality rate, and sotalol has been effective in several small trials. Trials of pure type III agents, however, have shown no mortality benefit. An overview of implantable defibrillator trials shows a 24% reduction in mortality rate (CI 15%-33%) compared with alternative therapy, most often amiodarone. Amiodarone is effective in reducing the total mortality rate by 13% to 19%, and the implantable defibrillator reduces the mortality rate by a further 24%.

Abstract

Implantable cardioverter defibrillators (ICDs) effectively prevent sudden cardiac death, but selection of appropriate patients for implantation is complex. We evaluated whether risk stratification based on risk of sudden cardiac death alone was sufficient to predict the effectiveness and cost-effectiveness of the ICD. We developed a Markov model to evaluate the cost-effectiveness of ICD implantation compared with empiric amiodarone treatment. The model incorporated mortality rates from sudden and nonsudden cardiac death and noncardiac death, and costs for each treatment strategy. We based our model inputs on data from randomized clinical trials, registries, and meta-analyses. We assumed that the ICD reduced total mortality rates by 25% relative to amiodarone. The relationship between the cost-effectiveness of the ICD and the total annual cardiac mortality rate is U-shaped; cost-effectiveness becomes unfavorable at both low and high total cardiac mortality rates. If the annual total cardiac mortality rate is 12%, the cost-effectiveness of the ICD varies from $36,000 per quality-adjusted life-year (QALY) gained when the ratio of sudden to nonsudden cardiac death is 4, to $116,000 per QALY gained when the ratio is 0.25. The cost-effectiveness of ICD use relative to amiodarone depends on total cardiac mortality rates as well as the ratio of sudden to nonsudden cardiac death. Studies of candidate diagnostic tests for risk stratification should distinguish patients who die suddenly from those who die nonsuddenly, not just patients who die suddenly from those who live.
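The Markov cohort structure behind cost-effectiveness estimates like these can be sketched in a few lines. Everything numeric below except the 25% relative mortality reduction is invented for illustration, and `cohort_run` and `icer` are hypothetical helpers, not the authors' model.

```python
def cohort_run(annual_mortality, annual_cost, upfront_cost, utility=0.8,
               discount=0.03, years=20):
    """Run a two-state (alive/dead) Markov cohort; return (QALYs, costs),
    both discounted. All parameter values here are illustrative."""
    alive, qalys, costs = 1.0, 0.0, float(upfront_cost)
    for year in range(years):
        d = 1 / (1 + discount) ** year       # discount factor for this cycle
        qalys += alive * utility * d          # quality-adjusted time accrued
        costs += alive * annual_cost * d      # ongoing care costs
        alive *= 1 - annual_mortality         # transition to the dead state
    return qalys, costs

def icer(q1, c1, q2, c2):
    """Incremental cost-effectiveness ratio of strategy 1 over strategy 2."""
    return (c1 - c2) / (q1 - q2)

# ICD assumed to cut mortality 25% relative to amiodarone (per the abstract),
# with hypothetical cost inputs.
q_amio, c_amio = cohort_run(0.12, 5_000, 0)
q_icd, c_icd = cohort_run(0.12 * 0.75, 5_000, 30_000)
dollars_per_qaly = icer(q_icd, c_icd, q_amio, c_amio)
```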

Abstract

To determine the effect of patient refusal on racial and sex differences in the use of coronary angiography and in outcomes among elderly patients with acute myocardial infarction, we included Medicare beneficiaries admitted to hospitals performing coronary angiography from February 1994 through July 1995. In-hospital use and refusal of coronary angiography were determined and adjusted for patient, hospital, and physician characteristics. Of 124,691 patients, 53,671 (43%) underwent angiography during hospitalization and 2,881 (2.3%) refused. Patients refusing angiography were more likely to be female (odds ratio [OR] = 1.37; 95% confidence interval [CI]: 1.23 to 1.53), black (OR = 1.26 vs. whites; 95% CI: 1.02 to 1.56), and older (OR = 2.25 per 10-year increase; 95% CI: 2.05 to 2.43) than patients who underwent angiography. Angiography use was lower in blacks (OR = 0.78; 95% CI: 0.72 to 0.83) than in whites, and lower in women (OR = 0.83; 95% CI: 0.80 to 0.86) than in men. Increased refusal explained 6% (95% CI: -3% to 15%) of the difference in angiography use between whites and blacks, and 16% (95% CI: 10% to 22%) of the difference between men and women. After adjustment for patient characteristics, refusal of angiography was not associated with worse survival at 1 year (OR = 0.99; 95% CI: 0.82 to 1.20). Among Medicare beneficiaries, elderly female and black patients are more likely to refuse angiography than are male and white patients. However, patient refusal is uncommon and accounts for only a small fraction of the racial and sex differences in use of angiography after myocardial infarction.

Abstract

Published data on the risk of colorectal neoplasia in patients with ulcerative colitis with and without primary sclerosing cholangitis are conflicting. A meta-analysis was performed to synthesize available publications and to compare the risk of colorectal neoplasia in patients with ulcerative colitis with and without primary sclerosing cholangitis. Using MEDLINE and manual search methods, studies were identified that compared the risk of colorectal neoplasia (dysplasia and carcinoma) in patients with ulcerative colitis with and without primary sclerosing cholangitis. In addition, citations were reviewed in relevant articles and proceedings from gastroenterology meetings, and investigators were contacted when data were incomplete. The summary odds ratio (OR) was then calculated for the risk of colorectal neoplasia in patients with ulcerative colitis and primary sclerosing cholangitis compared with that in patients with ulcerative colitis without primary sclerosing cholangitis. Eleven studies met all eligibility criteria for the meta-analysis. Patients with ulcerative colitis and primary sclerosing cholangitis are at increased risk of colorectal dysplasia and carcinoma compared with patients with ulcerative colitis alone (OR 4.79, 95% CI 3.58 to 6.41 with the Mantel-Haenszel method; OR 5.11, 95% CI 3.15 to 8.29 with the DerSimonian and Laird method). This increased risk is present even when the risk of colorectal carcinoma alone is considered (OR 4.09, 95% CI 2.89 to 5.76, and OR 4.26, 95% CI 2.80 to 6.48, using the Mantel-Haenszel and DerSimonian and Laird methods, respectively). Patients with ulcerative colitis and primary sclerosing cholangitis have a significantly higher risk for the development of colorectal neoplasia than patients with ulcerative colitis but not primary sclerosing cholangitis. More intensive colonoscopic surveillance should be considered for patients with ulcerative colitis and primary sclerosing cholangitis.
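The Mantel-Haenszel summary OR named here is a fixed-effect pooling of per-study 2×2 tables. A minimal sketch; the two study tables below are fabricated for illustration, not the meta-analysis data.

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel pooled odds ratio over 2x2 tables.
    Each table is (a, b, c, d) = (exposed cases, exposed non-cases,
                                  unexposed cases, unexposed non-cases)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Two hypothetical studies, each with an OR above 1
studies = [(10, 40, 5, 95), (8, 22, 10, 110)]
pooled = mantel_haenszel_or(studies)
```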

Abstract

Clinical trials have shown that implantable cardioverter defibrillators (ICDs) improve survival in patients with sustained ventricular arrhythmias. This Markov model-based cost-utility analysis, taking a societal perspective over a lifetime horizon, determined the efficacy necessary to make prophylactic ICD or amiodarone therapy cost-effective, compared with no treatment, in patients with past myocardial infarction who did not have sustained ventricular arrhythmia. Survival, cardiac death, and inpatient costs were estimated from the Myocardial Infarction Triage and Intervention registry; other data were derived from the literature. Outcomes were life-years, quality-adjusted life-years (QALYs), costs, number needed to treat, and incremental cost-effectiveness. Compared with no treatment, ICD use led to the greatest QALYs and the highest expenditures; amiodarone use resulted in intermediate QALYs and costs. To obtain acceptable cost-effectiveness thresholds (≤$75,000/QALY), ICDs had to reduce arrhythmic death by 50% and amiodarone had to reduce total death by 7% in patients with depressed ejection fraction. For moderate efficacies, in patients with ejection fractions ≤0.3, 0.31 to 0.4, and >0.4, the cost-effectiveness of amiodarone compared with no therapy was $43,100/QALY, $66,500/QALY, and $132,500/QALY, respectively, and the cost-effectiveness of ICD compared with amiodarone was $71,800/QALY, $195,700/QALY, and $557,900/QALY, respectively. Use of an ICD or amiodarone in patients with past myocardial infarction and severely depressed left ventricular function may provide substantial clinical benefit at an acceptable cost. These results highlight the importance of clinical trials of ICDs in patients with low ejection fractions who have had myocardial infarction.

Abstract

This study was designed to compare the prognostic value of an abnormal troponin level across studies of patients with non-ST elevation acute coronary syndromes (ACS). Risk stratification for patients with suspected ACS is important for determining the need for hospitalization and the intensity of treatment. We identified clinical trials and cohort studies of consecutive patients with suspected ACS without ST-elevation from 1966 through 1999. We excluded studies limited to patients with acute myocardial infarction and studies not reporting mortality or troponin results. Seven clinical trials and 19 cohort studies reported data for 5,360 patients with a troponin T test and 6,603 with a troponin I test. Patients with a positive troponin (I or T) had significantly higher mortality than those with a negative test (5.2% vs. 1.6%, odds ratio [OR] 3.1). Cohort studies demonstrated a greater difference in mortality between patients with a positive versus a negative troponin I (8.4% vs. 0.7%, OR 8.5) than did clinical trials (4.8% if positive, 2.1% if negative, OR 2.6; p = 0.01). The prognostic value of a positive troponin T was also slightly greater for cohort studies (11.6% mortality if positive, 1.7% if negative, OR 5.1) than for clinical trials (3.8% if positive, 1.3% if negative, OR 3.0; p = 0.2). In patients with non-ST elevation ACS, the short-term odds of death are increased three- to eightfold for patients with an abnormal troponin test. Data from clinical trials suggest a lower prognostic value for troponin than do data from cohort studies.
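The reported ORs are ratios of odds derived from mortality proportions; note that a pooled OR from weighted studies (3.1 here) need not equal the OR computed from the crude overall proportions. A short sketch of that arithmetic:

```python
def odds(p):
    """Convert a proportion to odds."""
    return p / (1 - p)

def odds_ratio(p_pos, p_neg):
    """Odds ratio for mortality in test-positive vs. test-negative patients."""
    return odds(p_pos) / odds(p_neg)

# Crude OR from the abstract's overall proportions (5.2% vs. 1.6% mortality);
# study-level weighting explains why the reported pooled OR (3.1) differs.
crude = odds_ratio(0.052, 0.016)
```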

Abstract

Randomized trials comparing medical and surgical therapies for the treatment of chronic stable angina were completed in the early 1980s. Therapies developed since then have decreased mortality and myocardial infarction rates from coronary artery disease. Using decision analysis and incorporating current recommendations for treatment, we simulated a trial comparing coronary artery bypass graft surgery and medical therapy. A Markov decision analysis model was constructed to compare the 5-year and 10-year outcomes of a simulated trial of medical therapy versus bypass surgery for chronic stable angina. Baseline data were obtained from a meta-analysis of trials comparing the two treatments. Data on risk reduction from contemporary therapies were obtained from randomized trials and meta-analyses. All subgroups experienced modest gains in survival with current therapies. At 5 years, the survival rate was 90% in the medical group (an absolute gain of 6%) and 94% in the surgical group (an absolute gain of 4%). Similar results were obtained for patients with triple-vessel disease. Among patients with a low ejection fraction, the 5-year survival rate was 85% for medical patients and 92% for surgical patients. Sensitivity analyses did not substantially affect the conclusions. Advances in the treatment of chronic stable angina have improved outcomes both for patients treated initially with surgery and for those treated initially with medical therapy. The improvements were of similar magnitude in both groups, so the fundamental conclusions of the bypass trials are unchanged.

Abstract

Beta-blockers are underused in patients who have had myocardial infarction (MI), despite the proven efficacy of these agents. New evidence indicates that beta-blockers can have benefit in patients with conditions that have been considered relative contraindications. Understanding the consequences of underuse of beta-blockers is important because of the implications for current policy debates over quality-of-care measures and Medicare prescription drug coverage. To examine the potential health and economic impact of increased use of beta-blockers in patients who have had MI, we used the Coronary Heart Disease (CHD) Policy Model, a computer-simulation Markov model of CHD in the US population, to estimate the epidemiological impact and cost-effectiveness of increasing beta-blocker use from current to target levels among survivors of MI aged 35 to 84 years. Simulations included one cohort of MI survivors in 2000 followed up for 20 years and 20 successive annual cohorts of all first-MI survivors in 2000-2020. Mortality and morbidity from CHD were derived from published meta-analyses and recent studies. The analysis used a societal perspective. Outcomes were prevented MIs, CHD mortality, life-years gained, and cost per quality-adjusted life-year (QALY) gained in 2000-2020. Initiating beta-blocker use in all MI survivors except those with absolute contraindications in 2000, and continuing treatment for 20 years, would result in 4,300 fewer CHD deaths, 3,500 MIs prevented, and 45,000 life-years gained compared with current use. The incremental cost per QALY gained would be $4,500. If this increase in beta-blocker use were implemented in all first-MI survivors annually over 20 years, beta-blockers would save $18 million and result in 72,000 fewer CHD deaths, 62,000 MIs prevented, and 447,000 life-years gained.
Sensitivity analyses demonstrated that the cost-effectiveness of beta-blocker therapy would always be less than $11,000 per QALY gained, even under unfavorable assumptions, and may even be cost saving. Restricting beta-blockers only to ideal patients (those without absolute or relative contraindications) would reduce the epidemiological impact of beta-blocker therapy by about 60%. Our simulation indicates that increased use of beta-blockers after MI would lead to impressive gains in health and would be potentially cost saving. JAMA. 2000;284:2748-2754.

Abstract

Radiofrequency ablation is an established but expensive treatment option for many forms of supraventricular tachycardia. Most cases of supraventricular tachycardia are not life-threatening; the goal of therapy is therefore to improve the patient's quality of life. This Markov model, taking a societal perspective over patient lifetime, compared the cost-effectiveness of radiofrequency ablation with that of medical management of supraventricular tachycardia. Costs were estimated from a major academic hospital and the literature, and treatment efficacy was estimated from reports of clinical studies at major medical centers. Probabilities of clinical outcomes were estimated from the literature. To account for the effect of radiofrequency ablation on quality of life, assessments by patients who had undergone the procedure were used. The modeled cohort comprised symptomatic patients who experienced 4.6 unscheduled visits per year to an emergency department or a physician's office while receiving long-term drug therapy for supraventricular tachycardia. The strategies compared were initial radiofrequency ablation, long-term antiarrhythmic drug therapy, and treatment of acute episodes of arrhythmia with antiarrhythmic drugs; outcomes were costs, quality-adjusted life-years, life-years, and marginal cost-effectiveness ratios. Among patients who have monthly episodes of supraventricular tachycardia, radiofrequency ablation was the most effective and least expensive therapy and therefore dominated the drug therapy options. Radiofrequency ablation improved quality-adjusted life expectancy by 3.10 quality-adjusted life-years and reduced lifetime medical expenditures by $27,900 compared with long-term drug therapy.
Long-term drug therapy was more effective and had lower costs than episodic drug therapy. The findings were highly robust over substantial variations in assumptions about the efficacy and complication rate of radiofrequency ablation, including analyses in which the complication rate was tripled and efficacy was decreased substantially. Radiofrequency ablation substantially improves quality of life and reduces costs when it is used to treat highly symptomatic patients. Although the benefit of radiofrequency ablation has not been studied in less symptomatic patients, a small improvement in quality of life is sufficient to give preference to radiofrequency ablation over drug therapy.

Abstract

To determine the effect of treatment by a cardiologist on mortality of elderly patients with acute myocardial infarction (AMI, heart attack), we accounted for both measured confounding, using risk-adjustment techniques, and residual unmeasured confounding, using instrumental variables (IV) methods. Medical chart data and longitudinal administrative hospital records and death records were obtained for 161,558 patients aged ≥65 admitted to a nonfederal acute care hospital with AMI from April 1994 to July 1995. Our principal measure of significant cardiologist treatment was whether a patient was admitted by a cardiologist. We used supplemental data to explore whether our analysis would differ substantially under alternative definitions of significant cardiologist treatment. This retrospective cohort study compared results using least squares (LS) multivariate regression with results from IV methods that accounted for additional unmeasured patient characteristics. Primary outcomes were 30-day and one-year mortality; secondary outcomes included treatment with medications and revascularization procedures. Medical charts for the initial hospital stay of each AMI patient underwent comprehensive abstraction, including dates of hospitalization, admitting physician, demographic characteristics, comorbid conditions, severity of clinical presentation, electrocardiographic and other diagnostic test results, contraindications to therapy, and treatments before and after AMI. Patients admitted by cardiologists had fewer comorbid conditions and less severe AMIs. These patients had a 10 percent (95 percent CI: 9.5-10.8 percent) lower absolute mortality rate at one year. After multivariate adjustment with LS regression, the adjusted mortality difference was 2 percent (95 percent CI: 1.4-2.6 percent).
Using IV methods to provide additional adjustment for unmeasured differences in risk, we found an even smaller, statistically insignificant association between physician specialty and one-year mortality (relative risk [RR] 0.96; 0.88-1.04). Patients admitted by a cardiologist were also significantly more likely to have a cardiologist consultation within the first day of admission and during the initial hospital stay, and had a significantly larger share of their physician bills for inpatient treatment from cardiologists. IV analysis of treatments showed that patients treated by cardiologists were more likely to undergo revascularization procedures and to receive thrombolytic therapy, aspirin, and calcium channel blockers, but less likely to receive beta-blockers. In a large population of elderly patients with AMI, we found significant treatment differences but no significant incremental mortality benefit associated with treatment by cardiologists.

Abstract

Critically ill patients often pose special diagnostic problems to the clinician, intensified by limited physical examination findings and difficulty in transportation to imaging suites. Mechanical ventilation and the limited ability to position the patient make transthoracic echocardiography difficult. Transesophageal echocardiographic (TEE) imaging, however, is well suited to the critical care patient and is frequently used to evaluate hemodynamic status, the presence of vegetations, a cardioembolic source, and an intracardiac cause of hypoxemia. Using proper precautions, TEE can be performed safely in unstable patients and frequently leads to important changes in management.

Abstract

To evaluate power Doppler imaging as a possible screening examination for carotid artery stenosis. In the principal pilot study, a prospective, blinded comparison of power Doppler imaging with duplex Doppler imaging, the reference-standard method, was conducted in 100 consecutive patients routinely referred for carotid artery imaging at a large, private multispecialty clinic. In the validation pilot study, a prospective, blinded comparison of power Doppler imaging with digital subtraction angiography, the reference-standard method, was conducted in 20 consecutive patients routinely referred at a teaching hospital. Using conservative assumptions, the authors performed cost-effectiveness analysis. Power Doppler imaging produced diagnostic-quality images in 89% of patients. When the images of the patients with nondiagnostic examinations were regarded as positive, power Doppler imaging had an area under the receiver operating characteristic curve, A(z), of 0.87, a sensitivity of 70%, and a specificity of 91%. The validation study results were very similar. The cost-effectiveness of screening, with duplex Doppler imaging as the definitive diagnostic examination when indicated, followed by endarterectomy, was $47,000 per quality-adjusted life-year. The A(z) value for power Doppler imaging compares well with that for mammography, a generally accepted screening examination, and with most other imaging examinations. Power Doppler imaging is likely to be a reasonably accurate and cost-effective screening examination for carotid artery stenosis in asymptomatic populations.

Abstract

The cost of medical care in the United States continues to spiral upward, partly as a result of new technological breakthroughs that promise improved length of life and quality of life for patients. But how good are these treatments in everyday practice? How do we make policies for adopting innovations that improve outcome but also increase costs? Cost-effectiveness studies are designed to answer these questions. They reveal important aspects of a particular medical decision and inform treatment choices by systematically analyzing the relationships between the costs and outcomes of alternative health care interventions. This article provides an introduction to the field of cost-effectiveness analysis and describes an approach to interpreting the rapidly proliferating cost-effectiveness literature.
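The comparisons described above reduce to one core quantity: the incremental cost-effectiveness ratio (ICER), the extra cost per extra unit of health effect (commonly dollars per quality-adjusted life-year). A minimal sketch, with purely hypothetical figures:

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra dollars spent
    per extra unit of health effect (e.g., per QALY gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical example: a new therapy costs $30,000 more and adds
# 0.5 quality-adjusted life-years versus standard care.
print(icer(80_000, 4.5, 50_000, 4.0))  # → 60000.0
```

An intervention whose ICER falls below a stated willingness-to-pay threshold (often cited around $50,000 per QALY in this literature) is generally considered cost-effective.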

Effect of a home monitoring system on hospitalization and resource use for patients with heart failure. AMERICAN HEART JOURNAL. Heidenreich, P. A., Ruggerio, C. M., Massie, B. M. 1999; 138 (4): 633-640

Abstract

Heart failure has a large medical and economic impact on the elderly. Past studies have shown that high-intensity multidisciplinary interventions at academic medical centers can reduce future hospitalizations. Our pilot study examined the effects of a low-intensity monitoring program on hospitalizations and cost of care for patients with heart failure treated by community physicians. We enrolled 68 patients with heart failure (mean age 73 +/- 13 years, 53% male) monitored by 31 physicians in a multidisciplinary program of patient education, daily self-monitoring, and physician notification of abnormal weight gain, vital signs, and symptoms. Comparisons of medical claims were made between the patients who received the intervention and a control group of 86 patients matched to the intervention group on medical claims during the preceding year. Compared with the prior year, medical claims per year decreased in the intervention group ($8500 +/- $13,000 to $7400 +/- $11,400), whereas they increased in the control group ($9200 +/- $15,000 to $18,800 +/- $34,000, P

Abstract

We sought to determine the appropriate use of echocardiography for patients with suspected endocarditis. We constructed a decision tree and Markov model using published data to simulate the outcomes and costs of care for patients with suspected endocarditis. Transesophageal imaging was optimal for patients who had a prior probability of endocarditis that is observed commonly in clinical practice (4% to 60%). In our base-case analysis (a 45-year-old man with a prior probability of endocarditis of 20%), use of transesophageal imaging improved quality-adjusted life expectancy (QALYs) by 9 days and reduced costs by $18 per person compared with the use of transthoracic echocardiography. Sequential test strategies that reserved the use of transesophageal echocardiography for patients who had an inadequate transthoracic study provided similar QALYs compared with the use of transesophageal echocardiography alone, but cost $230 to $250 more. For patients with prior probabilities of endocarditis greater than 60%, the optimal strategy is to treat for endocarditis without reliance on echocardiography for diagnosis. Patients with a prior probability of less than 2% should receive treatment for bacteremia without imaging. Transthoracic imaging was optimal for only a narrow range of prior probabilities (2% or 3%) of endocarditis. The appropriate use of echocardiography depends on the prior probability of endocarditis. For patients whose prior probability of endocarditis is 4% to 60%, initial use of transesophageal echocardiography provides the greatest quality-adjusted survival at a cost that is within the range for commonly accepted health interventions.

Abstract

Congestive heart failure is the most common cause of hospitalization for the older population. A previous study demonstrated that rehospitalizations, which occur in 30% to 50% of elderly patients, can be prevented with intensive multidisciplinary intervention. A pilot study was designed to determine whether a less intensive program with patient education materials, automated reminders for medication compliance, self-monitoring of daily weights and vital signs, and facilitated telephone communication with a nurse-monitor could reduce hospitalizations and whether this benefit could be extended to younger outpatients. Twenty-seven male patients (mean age 62 years) with New York Heart Association class II to IV congestive heart failure caused by dilated cardiomyopathy underwent follow-up with an independent service, which provided the primary cardiologist with information concerning changes in vital signs or symptoms. The number of hospitalizations and hospital days during a mean of 8.5 months in the program was compared patient by patient with the number during the equivalent period before entrance into the program. The number of hospitalizations for cardiovascular diagnoses and hospital days was reduced from 0.6 to 0.2 (p = 0.09) per patient-year of follow-up and from 7.8 to 0.7 days per patient per year (p < 0.05). Hospitalizations for all causes fell from 0.8 to 0.4 per patient per year (p = not significant) and from 9.5 to 0.8 days per patient per year (p < 0.05). The greatest absolute and relative benefit was observed among patients with more severe congestive heart failure. The most frequent indication for intervention was an increase in weight, which was managed with adjustment of diuretic dosages. This preliminary experience suggests that close telephone monitoring by personnel from an independent service can prevent hospitalizations for heart failure among both recently discharged patients and ambulatory outpatients and among both elderly and middle-aged persons.

Abstract

We developed a decision-support system for evaluation of treatment alternatives for supraventricular and ventricular arrhythmias. The system uses independent decision models that evaluate the costs and benefits of treatment for recurrent atrioventricular-node reentrant tachycardia (AVNRT), and of therapies to prevent sudden cardiac death (SCD) in patients at risk for life-threatening ventricular arrhythmias. Each of the decision models is accessible through a web-based interface that enables remote users to browse the model's underlying evidence and to perform analyses of effectiveness, cost effectiveness, and sensitivity to input variables. Because the web-based interface is independent of the models, we can extend the functionality of the system by adding decision models. This system illustrates that the use of a library of web-accessible decision models provides decision support economically to widely dispersed users.

Abstract

We sought to evaluate the current evidence for an effect of beta-blockade treatment on mortality in patients with congestive heart failure (CHF). Although numerous small studies have suggested a benefit with beta-blocker therapy in patients with heart failure, a clear survival benefit has not been demonstrated. A recent combined analysis of several studies with the alpha- and beta-adrenergic blocking agent carvedilol demonstrated a significant survival advantage; however, the total number of events was small. Furthermore, it is unclear if previous studies with other beta-blockers are consistent with this finding. Randomized clinical trials of beta-blockade treatment in patients with CHF from January 1975 through February 1997 were identified using a MEDLINE search and a review of reports from scientific meetings. Studies were included if mortality was reported during 3 or more months of follow-up. We identified 35 reports, 17 of which met the inclusion criteria. These studies included 3,039 patients with follow-up ranging from 3 months to 2 years. Beta-blockade was associated with a trend toward mortality reduction in 13 studies. When all 17 reports were combined, beta-blockade significantly reduced all-cause mortality (random effect odds ratio [OR] 0.69, 95% confidence interval [CI] 0.54 to 0.88). A trend toward greater treatment effect was noted for nonsudden cardiac death (OR 0.58, 95% CI 0.40 to 0.83) compared with sudden cardiac death (OR 0.84, 95% CI 0.59 to 1.2). Similar reductions in mortality were observed for patients with ischemic (OR 0.69, 95% CI 0.49 to 0.98) and nonischemic cardiomyopathy (OR 0.69, 95% CI 0.47 to 0.99). The survival benefit was greater for trials of the drug carvedilol (OR 0.54, 95% CI 0.36 to 0.81) than for noncarvedilol drugs (OR 0.82, 95% CI 0.60 to 1.12); however, the difference did not reach statistical significance (p = 0.10). Pooled evidence suggests that beta-blockade reduces all-cause mortality in patients with CHF.
Additional trials are required to determine whether carvedilol differs in its effect from other agents.
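A random-effects pooled odds ratio of the kind reported above is commonly computed with the DerSimonian-Laird estimator, which adds a between-study variance component to each study's weight. The sketch below uses hypothetical per-study log odds ratios and variances, not the trial data from the meta-analysis:

```python
import math

def dersimonian_laird(log_ors, variances):
    """Random-effects pooled estimate (DerSimonian-Laird method).
    Inputs are per-study log odds ratios and their sampling variances;
    returns (pooled odds ratio, between-study variance tau^2)."""
    w = [1 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))  # Cochran's Q
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w_re = [1 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    return math.exp(pooled), tau2

# Hypothetical log odds ratios and variances from five small trials
pooled_or, tau2 = dersimonian_laird(
    [-0.5, -0.1, -0.4, 0.1, -0.6], [0.04, 0.09, 0.06, 0.12, 0.05]
)
print(f"pooled OR {pooled_or:.2f}, tau^2 {tau2:.3f}")
```

When the between-study variance is zero, the estimate collapses to the fixed-effect (inverse-variance weighted) pooled odds ratio.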

Abstract

To determine the impact of echocardiography on the use of antibiotic prophylaxis in patients with suspected mitral valve prolapse (MVP). We evaluated 147 consecutive patients who were referred for "rule out mitral valve prolapse" to a university hospital echocardiography laboratory. Chart review and phone contact were used to determine the demographic characteristics of the patients; past diagnosis of MVP, symptoms, and exam at referral; practice specialty of the referring MD; echocardiographic findings; and change in prophylaxis usage as a result of the echocardiogram (ECHO). Prophylaxis was considered to be indicated if the echocardiogram demonstrated MVP with at least mild regurgitation or abnormal thickening of at least one mitral leaflet. Based on the ECHO, a change in antibiotic prophylaxis was indicated in 20 of 147 (14%) patients, including initiation of prophylaxis in 6 and discontinuation of prophylaxis in 14. However, only 4 of 20 patients (20%) actually changed their prophylaxis habits, leading to an actual yield of 4 management changes per 131 ECHOs ordered (3%). This corresponded to 1 change in management per $36,250 in hospital and physician costs. Younger age, female gender, and presence of symptoms were associated with a benign ECHO. Indications for a change in management were not significantly different between physician specialties: 18% for generalists (internal medicine and family practice), 12% for cardiologists, and 7% for other specialists, P = 0.3. In patients referred for evaluation of MVP, echocardiography infrequently resulted in changes in antibiotic prophylaxis management and was associated with significant expense.

Abstract

The Asymptomatic Carotid Atherosclerosis Study (ACAS) showed that carotid endarterectomy was beneficial for symptom-free patients with carotid stenosis of 60% or more. This finding raises the question of whether widespread screening to identify cases of asymptomatic carotid stenosis should be implemented. To determine whether a screening program to identify cases of asymptomatic carotid stenosis would be a cost-effective strategy for stroke prevention, we performed a cost-effectiveness analysis using published data from clinical trials in a general population of asymptomatic 65-year-old men. Patients who were screened for carotid disease with duplex Doppler ultrasonography were compared with patients who were not screened. If ultrasonography found significant carotid stenosis (≥60%), disease was confirmed by angiography before carotid endarterectomy was done. Outcomes were quality-adjusted life-years, costs, and marginal cost-effectiveness ratios. When the conditions and results of ACAS were modeled and it was assumed that the survival advantage produced by endarterectomy would last for 30 years, the lifetime marginal cost-effectiveness of screening relative to no screening was $120,000 per quality-adjusted life-year. Sensitivity analysis showed that marginal cost-effectiveness decreased to $50,000 or less per quality-adjusted life-year only under implausible conditions (for example, if a free screening instrument with perfect test characteristics was used or an asymptomatic population with a 40% prevalence of carotid stenosis was found). Surgery offers a real but modest absolute reduction in the rate of stroke at a substantial cost. A program to identify candidates for endarterectomy by screening asymptomatic populations for carotid stenosis costs more per quality-adjusted life-year than is usually considered acceptable.

Abstract

Implantable cardioverter defibrillators (ICDs) are remarkably effective in terminating ventricular arrhythmias, but they are expensive and the extent to which they extend life is unknown. The marginal cost-effectiveness of ICDs relative to amiodarone has not been clearly established. To compare the cost-effectiveness of a third-generation implantable ICD with that of empirical amiodarone treatment for preventing sudden cardiac death in patients at high or intermediate risk, a Markov model was used to evaluate the health and economic outcomes of patients who received an ICD, amiodarone, or a sequential regimen that reserved the ICD for patients who had an arrhythmia during amiodarone treatment. Outcomes were life-years gained, quality-adjusted life-years gained, costs, and marginal cost-effectiveness. For the base-case analysis, it was assumed that treatment with an ICD would reduce the total mortality rate by 20% to 40% at 1 year compared with amiodarone and that the ICD generator would be replaced every 4 years. In high-risk patients, if an ICD reduces total mortality by 20%, patients who receive an ICD live for 4.18 quality-adjusted life-years and have a lifetime expenditure of $88,400. Patients receiving amiodarone live for 3.68 quality-adjusted life-years and have a lifetime expenditure of $51,000. The marginal cost-effectiveness of an ICD relative to amiodarone is $74,400 per quality-adjusted life-year saved. If an ICD reduces mortality by 40%, the cost-effectiveness of ICD use is $37,300 per quality-adjusted life-year saved. Both the choice of therapy (an ICD or amiodarone) and the cost-effectiveness ratio are sensitive to assumptions about quality of life. Use of an ICD will cost more than $50,000 per quality-adjusted life-year gained unless it reduces all-cause mortality by 30% or more relative to amiodarone. Current evidence does not definitively support or exclude a benefit of this magnitude, but ongoing randomized trials have sufficient statistical power to do so.
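The Markov-model logic described above can be sketched as a toy two-state (alive/dead) cohort simulation that accumulates discounted QALYs and costs under each therapy and then takes the incremental ratio. All parameters below are hypothetical placeholders, not the study's published inputs:

```python
def markov_cea(p_die, rr, utility, cost_year, cost_upfront,
               horizon=30, disc=0.03):
    """Toy two-state (alive/dead) Markov cohort model. The therapy
    multiplies the annual death probability p_die by relative risk rr.
    Returns discounted (QALYs, total cost) over the horizon."""
    alive, qalys, cost = 1.0, 0.0, float(cost_upfront)
    for t in range(horizon):
        df = 1 / (1 + disc) ** t           # discount factor for year t
        qalys += alive * utility * df      # quality-adjusted survival
        cost += alive * cost_year * df     # ongoing therapy cost
        alive *= 1 - p_die * rr            # annual transition to death
    return qalys, cost

# Hypothetical inputs (not the study's actual parameters):
q_amio, c_amio = markov_cea(0.12, 1.0, 0.8, 5_000, 0)
q_icd,  c_icd  = markov_cea(0.12, 0.7, 0.8, 3_000, 30_000)
icer = (c_icd - c_amio) / (q_icd - q_amio)
print(f"ICER ${icer:,.0f} per QALY")
```

The same structure, with calibrated transition probabilities, utilities, and costs, yields the lifetime expenditure and QALY figures the abstract reports.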

Abstract

To determine the clinical variables that affect the prognosis of critically ill patients with sustained unexplained hypotension, and to develop a prognostic scoring system based on clinical data available at the onset of hypotension, we conducted a prospective cohort study in the intensive care units (ICUs) of an academic medical center. One hundred one adult ICU patients with sustained (> 60 mins) unexplained hypotension were studied. Using the initial 50 patients (derivation set), a prognostic score was developed that was then tested in the next 51 patients (validation set). There were no interventions. The main outcome variable was death or hospital discharge. The overall hospital mortality in the combined sets was 58%. Using a multivariable model, we identified three independent (p < .05) predictors of hospital mortality: the Acute Physiology and Chronic Health Evaluation (APACHE) II score at the time of hypotension, the time from hospital admission to the hypotensive episode, and hospital admission for surgery or treatment of malignancy. These variables were weighted and combined to create a Hypotension Score, which separated patients in the combined sets into three prognostic groups: a) Hypotension Score of < 40, mortality 7% (n = 27); b) Hypotension Score of 40 to 64, mortality 70% (n = 50); and c) Hypotension Score of ≥65, mortality 92% (n = 24). The area under the receiver operating characteristic curve was .85 for the derivation set and .83 for the validation set vs. .76 for the APACHE II score alone. The prognosis of hypotension in the critical care setting is highly variable, but can be predicted from patient characteristics.
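A weighted prognostic score of this kind can be sketched as follows. The weights here are hypothetical (the abstract does not report the published coefficients), while the cut points and group mortalities are taken from the abstract:

```python
def hypotension_score(apache_ii, days_to_hypotension, surgical_or_cancer,
                      w_apache=1.5, w_days=2.0, w_dx=10.0):
    """Illustrative weighted sum of the three predictors; the weights
    are hypothetical, not the study's published coefficients."""
    return (w_apache * apache_ii
            + w_days * days_to_hypotension
            + w_dx * surgical_or_cancer)

def risk_group(score):
    # Cut points and group mortalities from the abstract: <40, 40-64, >=65
    if score < 40:
        return "low (7% mortality)"
    if score < 65:
        return "intermediate (70% mortality)"
    return "high (92% mortality)"

print(risk_group(hypotension_score(apache_ii=20, days_to_hypotension=3,
                                   surgical_or_cancer=1)))
```

Stratifying a continuous score into a few clinically meaningful bands, as here, trades some discrimination for bedside usability.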

Abstract

Percutaneous balloon mitral valvuloplasty (PBMV) is an effective means of palliating mitral stenosis, but it sometimes leads to adverse clinical outcomes and exorbitant in-hospital costs. Because echocardiographic score is known to be predictive of clinical outcome in patients undergoing PBMV, we examined whether it could also be used to predict in-hospital cost. Preprocedure echocardiographic scores, baseline clinical characteristics, and total in-hospital costs were examined among 45 patients who underwent PBMV between January 1, 1992, and January 1, 1994. Patients ranged in age from 18 to 71 years and had preprocedure echocardiographic scores that ranged from 4 to 12. Following PBMV, mean mitral valve area increased from 1.1 +/- 0.3 to 2.4 +/- 0.6 cm2 (p = 0.0001), and mean pressure gradient decreased from 18.3 +/- 5.9 to 6.7 +/- 2.7 mm Hg (p = 0.0001). In-hospital cost for the 45 patients ranged from $3,591 to $70,975 (mean $9,417; median $5,311). Univariate and multiple linear regression analyses demonstrated that among the variables examined, echocardiographic score (p = 0.0007), age (p = 0.01), and preprocedure mitral valve gradient (p = 0.03) were associated with in-hospital cost. Regression modeling suggested that every increase in preprocedure echocardiographic score of one grade was associated with an increase in in-hospital cost of $2,663. Because echocardiographic score is predictive of both clinical outcome and in-hospital cost, we conclude that patients with elevated scores should be considered for alternative therapy.

Abstract

This study was designed to determine the diagnostic value of the 12-lead ECG for pericardial effusion and cardiac tamponade. In this cross-sectional study at a university hospital, we reviewed, in a blinded manner, 12-lead ECGs from 136 hospitalized patients with echocardiographically diagnosed pericardial effusions (12 of whom had cardiac tamponade) and from 19 control subjects without effusions. We examined the diagnostic value of three ECG signs: low voltage, PR segment depression, and electrical alternans. We found that all three ECG signs were specific but not sensitive for pericardial effusion (specificity, 89 to 100%; sensitivity, 1 to 17%) and cardiac tamponade (specificity, 86 to 99%; sensitivity, 0 to 42%). None of the ECG signs was associated with pericardial effusions of all sizes, but low voltage was associated with large and moderate pericardial effusions (odds ratio = 2.5; 95% confidence interval [CI] = 0.9 to 6.5; p = 0.06) and with cardiac tamponade (odds ratio = 4.7; 95% CI = 1.1 to 21.0; p = 0.004). In contrast, PR segment depression was associated only with cardiac tamponade (odds ratio = 2.0; 95% CI = 1.0 to 4.0; p = 0.05), while electrical alternans was not associated with either pericardial effusion or cardiac tamponade. Low voltage and PR segment depression are ECG signs that are suggestive, but not diagnostic, of pericardial effusion and cardiac tamponade. Because these ECG findings cannot reliably identify these conditions, we conclude that the 12-lead ECG is poorly diagnostic of pericardial effusion and cardiac tamponade.
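The sensitivity, specificity, and odds ratios reported above all derive from a 2x2 table of test sign versus disease status. A minimal sketch, using hypothetical counts rather than the study's data:

```python
def diagnostics(tp, fp, fn, tn):
    """Sensitivity, specificity, and odds ratio from a 2x2 table:
    tp/fp/fn/tn = true positive, false positive, false negative,
    true negative counts."""
    sens = tp / (tp + fn)                 # sign present given disease
    spec = tn / (tn + fp)                 # sign absent given no disease
    odds_ratio = (tp * tn) / (fp * fn)    # cross-product ratio
    return sens, spec, odds_ratio

# Hypothetical counts: a sign present in 20 of 120 patients with
# effusion and 1 of 19 controls (illustrative, not the study's data)
sens, spec, or_ = diagnostics(tp=20, fp=1, fn=100, tn=18)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, OR {or_:.1f}")
```

As in the study, a sign can be highly specific (rarely present without disease) yet so insensitive that its absence rules nothing out.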

Abstract

The hospital charts and billing records of 250 consecutive admissions for percutaneous transluminal coronary angioplasty (PTCA) at a university hospital were reviewed. Clinical characteristics, performing physician, angiographic features of the dilated lesion, procedural outcome, length of stay, and total and departmental hospital costs were recorded for each patient. We identified several independent predictors of hospital cost, including the physician ($4,400 difference between highest- and lowest-cost physician, p=0.004), age ($790 increase per 10-year increase in age, p=0.002), urgency of the procedure ($4,100 increase for urgent vs elective, p < 0.001), and combined angiography and PTCA ($850 increase vs separate angiography, p=0.04). Independent predictors of catheterization laboratory cost included the physician ($1,280 difference between highest- and lowest-cost physician, p=0.03), American College of Cardiology/American Heart Association lesion type B2 or C ($320 increase, p=0.03), and combined angiography and PTCA ($430 increase, p=0.003). Expensive operators used more catheterization laboratory resources than inexpensive operators; however, there were no significant differences in success rate or need for emergent bypass surgery between physicians. PTCA cost is determined by both patient characteristics and the performing physician. The increase in cost due to the physician was not explained by patient variables, lesion characteristics, success rate, or complications.

Abstract

We demonstrated the use of the World Wide Web for the presentation and explanation of a medical decision model. We put on the web a treatment model developed as part of the Cardiac Arrhythmia and Risk of Death Patient Outcomes Research Team (CARD PORT). To demonstrate the advantages of our web-based presentation, we critiqued both the conventional paper-based and the web-based formats of this decision-model presentation with reference to an accepted published guide to understanding clinical decision models. A web-based presentation provides a useful supplement to paper-based publications by allowing authors to present their model in greater detail, to link model inputs to the primary evidence, and to disseminate the model to peer investigators for critique and collaborative modeling.

Abstract

Although pericardial effusion is known to be common among patients infected with HIV, the incidence of pericardial effusion and its relation to survival have never been described. To evaluate the incidence of pericardial effusion and its relation to mortality in HIV-positive subjects, 601 echocardiograms were performed on 231 subjects recruited over a 5-year period (inception cohort: 59 subjects with asymptomatic HIV, 62 subjects with AIDS-related complex, and 74 subjects with AIDS; 21 HIV-negative healthy gay men; and 15 subjects with non-HIV end-stage medical illness). Echocardiograms were performed every 3 to 6 months (82% had follow-up studies). Sixteen subjects were diagnosed with effusions (prevalence of effusion for AIDS subjects entering the study was 5%). Thirteen subjects developed effusions during follow-up; 12 of these were subjects with AIDS (incidence, 11%/y). The majority of effusions (80%) were small and asymptomatic. The survival of AIDS subjects with effusions was significantly shorter (36% at 6 months) than survival for AIDS subjects without effusions (93% at 6 months). This shortened survival remained significant (relative risk, 2.2; P = .01) after adjustment for lead time bias and was independent of CD4 count and albumin level. There is a high incidence of pericardial effusion in patients with AIDS, and the presence of an effusion is associated with shortened survival. The development of an effusion in the setting of HIV infection suggests end-stage HIV disease (AIDS).

Abstract

Three-dimensional (3-D) reconstruction of acquired tomographic images in adults has recently been described. With an adaptation of this technique, we performed 3-D reconstruction of transabdominal images of the abdominal aorta to test the hypotheses that 3-D reconstruction of the abdominal aorta is feasible and that 3-D images have incremental value over 2-D in the detection of atheromatous plaque. Twenty-one patients undergoing contrast aortography (Aogram) for clinical indications (1 abdominal aortic (AA) aneurysm, 5 peripheral vascular disease, 1 renal artery stenosis, 14 renal donors) were studied using a 5-MHz annular array probe fitted to a mechanical registration device. In 13 of 21 patients, adequate 2-D ultrasound slices were acquired around a 180 degrees rotation and stored as a volumetric data set using a dedicated computer, and 3-D images were reconstructed off-line. Three-dimensional and planar images were blindly compared with Aograms using the following scale: grade 1, normal; grade 2, increased echodensity of the intimal surface; grade 3, local intimal thickening and/or luminal irregularity; and grade 4, protruding mass. Analogous 3-D images were produced in all 13 patients, with branching vessels visible in 3 of 13. In 10 patients, the Aogram was interpreted as normal. Compared with the Aogram, blindly interpreted 3-D images correctly identified normal AA in 8 of 10 and atherosclerotic plaque (grade 3 or 4) in 2 of 3. Discordant results were present in 2 of 10 normal aortas and 1 of 3 diseased aortas. When 2-D (planar) images were compared with Aograms, 8 of 10 normal AAs were identified, as were 3 of 3 aortas with grade 3 or 4 plaque. Thus, in 2 patients, 3-D and planar images suggested atherosclerotic changes not seen by Aogram. Transabdominal 3-D imaging of the abdominal aorta is a feasible technique.
Early data suggest that 3-D imaging may distinguish normal from moderate to severe disease, but currently has no demonstrable incremental value over conventional 2-D images. These early results in a small number of patients suggest that this promising technique warrants further evaluation.

Abstract

This study sought to determine the prognostic yield and utility of transesophageal echocardiography in critically ill patients with unexplained hypotension. Transesophageal echocardiography is increasingly utilized in the intensive care setting and is particularly suited for the evaluation of hypotension; however, the prognostic yield of transesophageal echocardiography in these patients is unknown. We prospectively studied 61 adult patients in the intensive care unit with sustained (> 60 min) unexplained hypotension. Both transthoracic and transesophageal echocardiography were performed, and results were immediately disclosed to the primary physician, who reported any resulting changes in management. Patients were classified on the basis of transesophageal echocardiographic findings into one of three prognostic groups: 1) nonventricular (valvular, pericardial) cardiac limitation to cardiac output; 2) ventricular failure; and 3) noncardiac systemic disease (hypovolemia or low systemic vascular resistance, or both). Primary end points were death or discharge from the intensive care unit. A transesophageal echocardiographic diagnosis of nonventricular limitation to cardiac output was associated with improved survival to discharge from the intensive care unit (81%) versus a diagnosis of ventricular disease (41%) or hypovolemia/low systemic vascular resistance (44%, p = 0.03). Twenty-nine (64%) of 45 transthoracic echocardiographic studies were inadequate compared with 2 (3%) of 61 transesophageal echocardiographic studies (p < 0.001). Transesophageal echocardiography contributed new clinically significant diagnoses (not seen with transthoracic echocardiography) in 17 patients (28%), leading to operation in 12 (20%). Transesophageal echocardiography makes a clinically important contribution to the diagnosis and management of unexplained hypotension and predicts prognosis in the critical care setting.

Abstract

Velocity-encoded cine-magnetic resonance imaging (VEC-MRI) is a new method for quantitation of blood flow with the potential to measure high-velocity jets across stenotic valves. The objective of this study was to evaluate the ability of VEC-MRI to measure transmitral velocity in patients with mitral stenosis. Sixteen patients with known mitral stenosis were studied. A 1.5 Tesla superconducting magnet was used to obtain velocity-encoded images in the left ventricular short-axis plane. Images were obtained throughout the cardiac cycle at 3 consecutive slices beginning proximal to the mitral coaptation point. To determine the optimal slice thickness for MRI, both 10 mm and 5 mm thicknesses were used. Echocardiography including continuous-wave Doppler was performed on every patient within 2 hours of MRI. Peak velocity was determined for both VEC-MRI and Doppler-echo images. Two observers independently measured the VEC-MRI mitral inflow velocities. Of the 16 patients, imaging data were incomplete in only 1 study, and all images were adequate for analysis. Strong correlations were found for measurements of mitral valve gradient for both 10 mm (peak r = 0.89, mean r = 0.84) and 5 mm (peak r = 0.82, mean r = 0.95) slice thicknesses. Measurements of peak velocity with VEC-MRI (10 mm) agreed well with Doppler: mean 1.46 m/s, mean of differences (Doppler minus MRI) 0.38 m/s, standard deviation of differences 0.2 m/s. These findings suggest that VEC-MRI can noninvasively determine the severity of mitral stenosis.

Abstract

The feasibility of velocity-encoded cine nuclear magnetic resonance (NMR) imaging to measure regurgitant volume and regurgitant fraction in patients with mitral regurgitation was evaluated. Velocity-encoded cine NMR imaging has been reported to provide accurate measurement of the volume of blood flow in the ascending aorta and through the mitral annulus. Therefore, we hypothesized that the difference between mitral inflow and aortic systolic flow provides the regurgitant volume in the setting of mitral regurgitation. Using velocity-encoded cine NMR imaging at a magnet field strength of 1.5 T and color Doppler echocardiography, 19 patients with isolated mitral regurgitation and 10 normal subjects were studied. Velocity-encoded cine NMR images were acquired in the short-axis plane of the ascending aorta and from the short-axis plane of the left ventricle at the level of the mitral annulus. Two independent observers measured the ascending aortic flow volume and left ventricular inflow volume to calculate the regurgitant volume as the difference between left ventricular inflow volume and aortic flow volume, and the regurgitant fraction was calculated. Using accepted criteria of color flow Doppler imaging and spectral analysis, the severity of mitral regurgitation was qualitatively graded as mild, moderate or severe and compared with regurgitant volume and regurgitant fraction, as determined by velocity-encoded cine NMR imaging. In normal subjects the regurgitant volume was -6 +/- 345 ml/min (mean +/- SD). In patients with mild, moderate and severe mitral regurgitation, the regurgitant volume was 156 +/- 203, 1,384 +/- 437 and 4,763 +/- 2,449 ml/min, respectively. In normal subjects the regurgitant fraction was 0.7 +/- 6.1%. In patients with mild, moderate and severe mitral regurgitation, the regurgitant fraction was 3.1 +/- 3.4%, 24.5 +/- 8.9% and 48.6 +/- 7.6%, respectively.
The regurgitant fraction correlated well with the echocardiographic severity of mitral regurgitation (r = 0.87). Interobserver reproducibilities for regurgitant volume and regurgitant fraction were excellent (r = 0.99, SEE = 238 ml; r = 0.98, SEE = 4.1%, respectively). These findings suggest that velocity-encoded cine NMR imaging can be used to estimate regurgitant volume and regurgitant fraction in patients with mitral regurgitation and can discriminate patients with moderate or severe mitral regurgitation from normal subjects and patients with mild regurgitation. It may be useful for monitoring the effect of therapy intended to reduce the severity of mitral regurgitation.
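The regurgitant indices described above reduce to simple arithmetic on the two measured flow volumes. A minimal sketch of that calculation (the flow values below are hypothetical, not patient data from the study):

```python
# Regurgitant volume = LV inflow - aortic outflow; fraction = volume / inflow.
# Illustrative values only; the study measured these volumes with VEC NMR.

def regurgitant_indices(mitral_inflow_ml: float, aortic_flow_ml: float):
    """Return (regurgitant volume, regurgitant fraction)."""
    rv = mitral_inflow_ml - aortic_flow_ml   # ml/min regurgitating backward
    rf = rv / mitral_inflow_ml               # fraction of inflow regurgitated
    return rv, rf

rv, rf = regurgitant_indices(mitral_inflow_ml=8000.0, aortic_flow_ml=5000.0)
print(rv, round(100 * rf, 1))  # 3000.0 ml/min, 37.5 %
```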

Abstract

Sonicated albumin microspheres, a digital ultrasound system, and a mathematical model for flow were used to determine whether blood flow in the canine kidney could be assessed with contrast ultrasound. Albunex ultrasound contrast microspheres were injected into the aorta while ultrasound images of the kidney and aorta were recorded simultaneously. Ultrasound data were obtained during contrast injections at 93 different renal blood flow rates in nine dogs. Contrast dose was calibrated to ultrasound system response for both aortic and renal images. A linear relationship between microbubble concentration used and pixel intensity was established (r = 0.89 for aortic images and r = 0.91 for renal images). Renal blood flow was manipulated from baseline by means of a hydraulic renal artery occluder and by intravenous dopamine or fenoldopam infusion. Blood flow calculated with contrast ultrasonography was compared with direct measurement obtained with an electromagnetic flow probe at each flow rate. Direct measurement correlated with rates calculated with contrast ultrasonography (r = 0.84, 95% confidence limits from 0.75 to 0.90). Overall, calculations tended to overestimate absolute flow measurements, and overestimation of flow tended to be greater during pharmacologically manipulated flow rates. We conclude that changes and trends in renal blood flow can be serially assessed in vivo with contrast ultrasonography, but technical limitations of present commercial ultrasound systems preclude absolute quantification at this time.

Abstract

Contrast echocardiography has been used for qualitative assessment of cardiac function, and its potential for quantitative assessment of blood flow is being explored. With the development of an ultrasound contrast agent capable of passage through the microcirculation, a mathematical model based on classic dye dilution theory, and a digital ultrasound acquisition system, absolute quantitation of myocardial perfusion may be feasible. This study validates the mathematical model in a simple in vitro tube system. Flow was delivered at variable rates through an in vitro tube system while a longitudinal section was imaged with a modified commercial ultrasound scanner. Albunex contrast agent was injected, and videointensity data were captured and analyzed offline. Time-intensity curves were generated, and flow was calculated by use of a mathematical model derived from classic dye dilution mathematics. For 39 different flow rates, ranging from 9.2 to 110 ml/s, a correlation coefficient of r = 0.928 (p < 0.001) with a slope of 0.97 was calculated. We conclude that (1) contrast ultrasonography is capable of quantitative determination of flow in an in vitro system, and (2) a mathematical model based on dye dilution theory can be used to calculate flow with accuracy and precision.
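Classic dye dilution theory, which the model above is derived from, rests on the Stewart-Hamilton relation: flow equals injected indicator dose divided by the area under the downstream concentration-time curve (with calibrated videointensity standing in for concentration). A minimal sketch of that relation, using a hypothetical triangular bolus curve rather than the study's time-intensity data:

```python
# Stewart-Hamilton sketch: flow = dose / AUC of the concentration-time curve.
# The triangular curve below is hypothetical, for illustration only.

def dye_dilution_flow(dose_mg, times_s, conc_mg_per_ml):
    """Flow (ml/s) via trapezoidal area under the concentration-time curve."""
    auc = sum(0.5 * (conc_mg_per_ml[i] + conc_mg_per_ml[i + 1])
              * (times_s[i + 1] - times_s[i])
              for i in range(len(times_s) - 1))  # mg*s/ml
    return dose_mg / auc

t = [i * 0.1 for i in range(101)]                          # 0 to 10 s
c = [(ti if ti <= 5.0 else 10.0 - ti) * 0.1 for ti in t]   # peaks at 0.5 mg/ml
flow = dye_dilution_flow(25.0, t, c)                       # AUC = 2.5 mg*s/ml
print(round(flow, 2))  # 10.0 ml/s
```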

Abstract

Myocardial contrast echocardiography has been found to be a safe and useful technique for evaluating relative changes in myocardial perfusion and delineating areas at risk. Although earlier contrast agents required direct delivery into the coronary arteries or aortic root, a new echocardiographic contrast agent, sonicated albumin microspheres (Albunex), has been found to cross the pulmonary circulation in experimental models. To determine the safety and preliminary efficacy of intravenous injections of Albunex in humans, 71 patients at three independent medical institutions underwent two-dimensional echocardiographic examination before, during and after the administration of three intravenous doses of Albunex, ranging from 0.01 to 0.12 ml/kg body weight. All patients provided a complete history and underwent physical and neurologic examination and laboratory and electrocardiographic evaluation before the injections; all evaluations (except for the history) were repeated at 2 h and 3 days after the injections of Albunex. The efficacy of the injections was qualitatively assessed by two independent blinded observers using a grading system of 0 to +3, with 0 indicating an absence of contrast effect and +3 indicating full opacification of the cavities examined. All injections were well tolerated and no serious side effects were noted in any of the patients. Irrespective of dose group, a cavity opacification greater than or equal to +2 was seen in the right ventricle in 212 (88%) of 240 injections and in the left ventricle in 151 (63%) of 240 injections as judged by the independent observers. The degree of ventricular cavity opacification appeared to be dose- and concentration-related. (ABSTRACT TRUNCATED AT 250 WORDS)

Abstract

The ability of contrast echocardiography to assess regional myocardial perfusion during cardiopulmonary bypass in a dog model of coronary artery bypass surgery was evaluated. Sonicated Renografin-76 microbubbles (meglumine diatrizoate and sodium diatrizoate) were injected into the aortic root proximal to an aortic occlusion clamp while dogs were on cardiopulmonary bypass, with the heart arrested in diastole. Echocardiographic contrast-enhanced regions of myocardial perfusion were easily visualized. Differences in contrast-enhanced myocardial regions depended on coronary artery occlusion or patency. The contrast-enhanced images of myocardial perfusion showed that, for a given myocardial segment, the presence or absence of contrast effect reliably predicted occlusion or patency of the supplying vessel (P less than .01). In the future, contrast echocardiography may allow the direct assessment of regional myocardial perfusion in the operating room.

Abstract

Contrast ultrasonography, employing tracers that behave like red blood cells, is a promising technique for studying regional blood flow distribution. The aim of this note is to quantitate renal blood flow in the dog using contrast ultrasonography. Mathematical formulae derived from classical dye-dilution theory are applied. Ten different renal blood flow levels (ranging from 16 to 125 ml/min) were obtained by means of mechanical (stenosis and reperfusion) and pharmacological interventions (intravenous infusion of adrenaline, noradrenaline and fenoldopam). Renal blood flow was measured by electromagnetic flow meter and simultaneously calculated by contrast ultrasonography. The correlation coefficient between measured and calculated flow was 0.92 (p less than 0.01). Contrast ultrasonography is a technique capable of measuring renal blood flow over a wide range of flow levels.

Abstract

Folate deficiency has been associated with dysplasia in human cancer models. Patients with ulcerative colitis commonly have decreased folate levels, which are partially due to sulfasalazine, a competitive inhibitor of folate absorption. To study the effect of folate supplementation on the risk of dysplasia or cancer (neoplasia) in ulcerative colitis, records from 99 patients with pancolitis of greater than 7 yr duration who were enrolled in a surveillance program were reviewed. Thirty-five patients with neoplasia were compared with 64 patients in whom dysplasia was never found to determine the effect of folate supplementation on the rate of development of neoplasia using case-control methodology. At the time of the index colonoscopy, patients with neoplasia were older (43 +/- 11 vs. 39 +/- 12 yr) and had disease of longer duration (20 +/- 8 vs. 15 +/- 7 yr, p less than 0.05). Folate supplementation was associated with a 62% lower incidence of neoplasia compared with individuals not receiving supplementation (odds ratio, 0.38; 95% confidence interval, 0.12-1.20). There was no appreciable change in this effect when models were fit to adjust for sulfasalazine dose, duration of disease, age at symptom onset, prednisone dose, sulfa allergy, sex, race, or family history of colon cancer. The statistical power of the association between folate supplementation and neoplasia was 72%. Correction of risk factors before the development of neoplasia may prevent this serious complication. Pending a larger case-control study, folate supplementation during sulfasalazine administration is recommended to possibly prevent the complication of dysplasia or cancer in ulcerative colitis.
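The case-control odds ratio reported above comes from a 2x2 exposure-by-outcome table. A minimal sketch of that standard calculation, with the Woolf (log) method for the confidence interval; the counts below are hypothetical for illustration and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf 95% CI for a 2x2 table:
       a = exposed cases, b = unexposed cases,
       c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (folate-supplemented vs. not, neoplasia vs. none)
or_, lo, hi = odds_ratio_ci(a=5, b=30, c=20, d=44)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```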