Research & Scholarship

Current Research and Scholarly Interests

Clinical Trials

Rituximab in Progressive Immunoglobulin A (IgA) Nephropathy (Recruiting)

This study concerns IgA nephropathy, a form of kidney disease characterized by the presence
of blood and protein in the urine. The study is designed to determine whether the medication
rituximab can reduce protein in the patient's urine.
Hypothesis: In patients with progressive IgA nephropathy, an intravenous infusion of 1000 mg
of rituximab on Days 1 and 15 and Days 168 and 182 is superior to conventional therapy in
reducing 24-hour proteinuria and slowing progression of chronic kidney disease.

The Systolic Blood Pressure Intervention Trial (SPRINT) is a large-scale, NIH-sponsored
randomized trial of approximately 9250 adults aged 50 years or older at high cardiovascular
risk. The study is designed to recruit 45% of the study population with chronic kidney
disease (CKD). The trial will test the effects of a low systolic blood pressure (SBP) goal of
<120 mm Hg versus the standard goal of <140 mm Hg on a primary composite of cardiovascular
events and death. One of the pre-specified secondary outcomes is the progression of kidney
disease. In this ancillary study, named SPRINT - Factors affecting Atherosclerosis STudy
(FAST), the investigators plan to take advantage of the unique opportunities afforded by the
parent study to examine issues of significant public health importance.
This is an observational study in SPRINT participants that will mechanistically examine the
factors affecting the progression of atherosclerosis in CKD.

This study compares the safety and efficacy of standard medical care plus the PMX cartridge
versus standard medical care alone, based on mortality at 28 days in subjects with septic
shock who have high levels of endotoxin.

Abstract

While data from the latter part of the twentieth century consistently showed that immigrants to high-income countries faced higher cardio-metabolic risk than their counterparts in low- and middle-income countries, urbanization and associated lifestyle changes may be changing these patterns, even for conditions considered to be advanced manifestations of cardio-metabolic disease (e.g., chronic kidney disease [CKD]). Using cross-sectional data from the Center for cArdiometabolic Risk Reduction in South Asia (CARRS, n = 5294) and Mediators of Atherosclerosis in South Asians Living in America (MASALA, n = 748) studies, we investigated whether the prevalence of CKD is similar among Indians living in Indian and U.S. cities. We compared crude, age-, waist-to-height ratio-, and diabetes-adjusted CKD prevalence differences. Among participants identified to have CKD, we compared management of risk factors for its progression. Overall age-adjusted prevalence of CKD was similar in MASALA (14.0% [95% CI 11.8-16.3]) and CARRS (10.8% [95% CI 10.0-11.6]). Among men, the prevalence difference was low (prevalence difference 1.8 [95% CI -1.6, 5.3]) and remained low after adjustment for age, waist-to-height ratio, and diabetes status (-0.4 [-3.2, 2.5]). The adjusted prevalence difference was higher among women (prevalence difference 8.9 [4.8, 12.9]), but this was driven entirely by a higher prevalence of albuminuria among women in MASALA. Severity of CKD, i.e., the degree of albuminuria and the proportion of participants with reduced glomerular filtration rate, was higher in CARRS for both men and women. Fewer participants with CKD in CARRS were effectively treated: 4% of CARRS versus 51% of MASALA participants with CKD had A1c < 7%, and 7% of CARRS versus 59% of MASALA participants had blood pressure < 140/90 mmHg. Our analysis applies only to urban populations.
Demographic differences, particularly in educational attainment, among participants in the two studies are a potential source of bias. The prevalence of CKD among Indians living in Indian and U.S. cities is similar. Persons with CKD living in Indian cities face a higher likelihood of progressing to end-stage renal disease, since they have more severe kidney disease and little evidence of risk factor management.
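As a rough illustration of the crude (unadjusted) comparison described above, a prevalence difference and its Wald-style 95% confidence interval can be computed directly from two reported prevalences and sample sizes. This is only a sketch of the crude calculation, not the authors' adjusted method (which additionally accounted for age, waist-to-height ratio, and diabetes status):

```python
import math

def prevalence_difference(p1, n1, p2, n2, z=1.96):
    """Crude prevalence difference with a Wald 95% CI.

    p1, p2 are prevalences as fractions; n1, n2 are sample sizes.
    Returns (difference, lower bound, upper bound).
    """
    diff = p1 - p2
    # Standard error of a difference of two independent proportions.
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Overall CKD prevalences reported in the abstract:
# MASALA 14.0% of 748 vs CARRS 10.8% of 5294.
diff, lo, hi = prevalence_difference(0.140, 748, 0.108, 5294)
print(f"{diff*100:.1f} ({lo*100:.1f}, {hi*100:.1f})")  # 3.2 (0.6, 5.8)
```

Note that this crude overall difference is wider than the sex-stratified, covariate-adjusted differences the study actually reports.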

Abstract

Most patients with end-stage kidney disease value their health-related quality of life (HRQoL) and want to know how it will be affected by their dialysis modality. We extended the findings of two prior clinical trial reports to estimate the effects of frequent compared to conventional hemodialysis on additional measures of HRQoL. The Daily Trial randomly assigned 245 patients to receive frequent (six times per week) or conventional (three times per week) in-center hemodialysis. The Nocturnal Trial randomly assigned 87 patients to receive frequent nocturnal (six times per week) or conventional (three times per week) home hemodialysis. All patients were on conventional hemodialysis prior to randomization, with an average feeling thermometer score of 70 to 75 (a visual analog scale from 0 to 100 where 100 is perfect health), an average general health scale score of 40 to 47 (a score from 0 to 100 where 100 is perfect health), and an average dialysis session recovery time of 2 to 3 hours. Outcomes are reported as the between-treatment group differences in one-year change in HRQoL measures and analyzed using linear mixed effects models. After one year in the Daily Trial, patients assigned to frequent in-center hemodialysis reported a higher feeling thermometer score, better general health, and a shorter recovery time after a dialysis session compared to standard thrice-weekly dialysis. After one year in the Nocturnal Trial, patients assigned to frequent home hemodialysis also reported a shorter recovery time after a dialysis session, but no statistically significant difference in their feeling thermometer or general health scores compared to standard home dialysis schedules. Thus, patients receiving daily or nocturnal hemodialysis on average recovered approximately one hour earlier from a frequent compared to conventional hemodialysis session. Patients treated in an in-center dialysis facility reported better HRQoL with frequent compared to conventional hemodialysis.

Abstract

The Medicare program insures >80% of patients with ESRD in the United States. An emphasis on reducing outpatient dialysis costs has motivated consolidation among dialysis providers, with two for-profit corporations now providing dialysis for >70% of patients. It is unknown whether industry consolidation has affected patients' ability to choose among competing dialysis providers. We identified patients receiving in-center hemodialysis at the start of 2001 and 2011 from the national ESRD registry and ascertained dialysis facility ownership. For each hospital service area, we determined the maximum distance within which 90% of patients traveled to receive dialysis in 2001. We compared the numbers of competing dialysis providers within that same distance between 2001 and 2011. Additionally, we examined the Herfindahl-Hirschman Index, a metric of market concentration ranging from near zero (perfect competition) to one (monopoly), for each hospital service area. Between 2001 and 2011, the number of uniquely owned competing providers decreased by 8%. However, increased facility entry into markets to meet rising demand for care offset the effect of provider consolidation on the number of choices available to patients. The number of dialysis facilities in the United States increased by 54%, and patients experienced an average 10% increase in the number of competing proximate facilities from which they could choose to receive dialysis (P < 0.001). Local markets were highly concentrated in both 2001 and 2011 (mean Herfindahl-Hirschman Index = 0.46; SD = 0.2 for both years), but overall market concentration did not materially change. In summary, a decade of consolidation in the United States dialysis industry did not (on average) limit patient choice or result in more concentrated local markets.
However, because dialysis markets remained highly concentrated, it will be important to understand whether market competition affects prices paid by private insurers, access to dialysis care, quality of care, and associated health outcomes.
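The Herfindahl-Hirschman Index used above is simply the sum of squared market shares across the providers in a market. A minimal sketch with hypothetical shares (the 0-to-1 scale matches the abstract; antitrust practice often uses percentage-point shares instead, yielding values up to 10,000):

```python
def herfindahl_hirschman(shares):
    """Sum of squared market shares: near 0 under perfect
    competition, exactly 1.0 for a monopoly."""
    return sum(s * s for s in shares)

# Hypothetical local dialysis market: two large chains hold 45%
# and 35% of patients; two independents hold 10% each.
print(herfindahl_hirschman([0.45, 0.35, 0.10, 0.10]))  # ~0.345, highly concentrated
print(herfindahl_hirschman([1.0]))                     # monopoly: 1.0
```

Because squaring weights large providers disproportionately, two markets with the same number of facilities can have very different HHI values, which is why the abstract reports both provider counts and the index.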

Abstract

Estimates of the incidence of radiocontrast-associated nephropathy vary widely and suffer from misclassification of the cause of AKI and confounding. Using the Nationwide Inpatient Sample, we created multiple estimates of the risk of radiocontrast-associated nephropathy among adult patients hospitalized in the United States in 2009. First, we stratified patients according to the presence or absence of 12 relatively common diagnoses associated with AKI and evaluated the rate of AKI between strata. Next, we created a logistic regression model, controlling for comorbidity and acuity of illness, to estimate the risk of AKI associated with radiocontrast administration within each stratum. Finally, we performed an analysis stratified by the degree of preexisting comorbidity. In general, patients who received radiocontrast did not develop AKI at a meaningfully higher rate. Adjusted only for the complex survey design, patients to whom radiocontrast was and was not administered developed AKI at rates of 5.5% and 5.6%, respectively. After controlling for comorbidity and acuity of illness, radiocontrast administration was associated with an odds ratio for AKI of 0.93 (95% confidence interval, 0.88 to 0.97). In conclusion, the risk of radiocontrast-associated nephropathy may be overstated in the literature and overestimated by clinicians. More accurate AKI risk estimates may improve clinical decision-making when attempting to balance the potential benefits of radiocontrast-enhanced imaging and the risk of AKI.

Abstract

In 2004, the Centers for Medicare & Medicaid Services changed reimbursement for physicians and advanced practitioners caring for patients receiving hemodialysis from a capitated to a tiered fee-for-service system, encouraging increased face-to-face visits. This early version of a pay-for-performance initiative targeted a care process: more frequent provider visits in hemodialysis. Although more frequent provider visits in hemodialysis are associated with fewer hospitalizations and rehospitalizations, it is unknown whether encouraging more frequent visits through reimbursement policy also yielded these benefits. We used a retrospective cohort interrupted time-series study design to examine whether the 2004 nephrologist reimbursement reform led to reduced hospitalizations and rehospitalizations, and we used published data to estimate a range of annual economic costs associated with more frequent visits. The cohort comprised Medicare beneficiaries in the United States receiving hemodialysis in the 2 years prior to and the 2 years following reimbursement reform; outcomes were the odds of hospitalization and of 30-day hospital readmission, for all causes and for fluid overload, and costs in US dollars. We found no significant change in all-cause hospitalization or rehospitalization and slight reductions in fluid overload hospitalization and rehospitalization following reimbursement reform; the estimated economic cost associated with additional visits ranged from $13 to $87 million per year, depending on who (physicians or advanced practitioners) spent additional time visiting patients and how much additional effort was involved. Due to limited information about how much additional time providers spent seeing patients after reimbursement reform, we could only examine a range of potential economic costs associated with the reform. In summary, a Medicare reimbursement policy designed to encourage more frequent visits during outpatient hemodialysis may have been costly. The policy was associated with fewer hospitalizations and rehospitalizations for fluid overload, but had no effect on all-cause hospitalizations or rehospitalizations.

Abstract

Secondary hyperparathyroidism contributes to extraskeletal complications in chronic kidney disease. We evaluated the effect of the intravenous calcimimetic etelcalcetide on serum parathyroid hormone (PTH) concentrations in patients receiving hemodialysis. Two parallel, phase 3, randomized, placebo-controlled treatment trials were conducted in 1023 patients receiving hemodialysis with moderate to severe secondary hyperparathyroidism. Trial A was conducted in 508 patients at 111 sites in the United States, Canada, Europe, Israel, Russia, and Australia from March 12, 2013, to June 12, 2014; trial B was conducted in 515 patients at 97 sites in the same countries from March 12, 2013, to May 12, 2014. Patients received intravenous etelcalcetide (n = 503) or placebo (n = 513) after each hemodialysis session for 26 weeks. The primary efficacy end point was the proportion of patients achieving greater than 30% reduction from baseline in mean PTH during weeks 20-27. A secondary efficacy end point was the proportion of patients achieving mean PTH of 300 pg/mL or lower. The mean age of the 1023 patients was 58.2 (SD, 14.4) years and 60.4% were men. Mean PTH concentrations at baseline and during weeks 20-27 were 849 and 384 pg/mL vs 820 and 897 pg/mL in the etelcalcetide and placebo groups, respectively, in trial A; corresponding values were 845 and 363 pg/mL vs 852 and 960 pg/mL in trial B. Patients randomized to etelcalcetide were significantly more likely to achieve the primary efficacy end point: in trial A, 188 of 254 (74.0%) vs 21 of 254 (8.3%; P < .001), for a difference in proportions of 65.7% (95% CI, 59.4%-72.1%), and in trial B, 192 of 255 (75.3%) vs 25 of 260 (9.6%; P < .001), for a difference in proportions of 65.7% (95% CI, 59.3%-72.1%).
Patients randomized to etelcalcetide were also significantly more likely to achieve a PTH level of 300 pg/mL or lower: in trial A, 126 of 254 (49.6%) vs 13 of 254 (5.1%; P < .001), for a difference in proportions of 44.5% (95% CI, 37.8%-51.2%), and in trial B, 136 of 255 (53.3%) vs 12 of 260 (4.6%; P < .001), for a difference in proportions of 48.7% (95% CI, 42.1%-55.4%). In trials A and B, respectively, patients receiving etelcalcetide had more muscle spasms (12.0% and 11.1% vs 7.1% and 6.2% with placebo), nausea (12.4% and 9.1% vs 5.1% and 7.3%), and vomiting (10.4% and 7.5% vs 7.1% and 3.1%). Among patients receiving hemodialysis with moderate to severe secondary hyperparathyroidism, use of etelcalcetide compared with placebo resulted in greater reduction in serum PTH over 26 weeks. Further studies are needed to assess clinical outcomes as well as longer-term efficacy and safety. clinicaltrials.gov Identifiers: NCT01788046.

Abstract

Secondary hyperparathyroidism contributes to extraskeletal calcification and is associated with all-cause and cardiovascular mortality. Control is suboptimal in the majority of patients receiving hemodialysis. An intravenously (IV) administered calcimimetic could improve adherence and reduce adverse gastrointestinal effects. We evaluated the relative efficacy and safety of the IV calcimimetic etelcalcetide and the oral calcimimetic cinacalcet. A randomized, double-blind, double-dummy active clinical trial compared IV etelcalcetide vs oral placebo and oral cinacalcet vs IV placebo in 683 patients receiving hemodialysis with serum parathyroid hormone (PTH) concentrations higher than 500 pg/mL on active therapy at 164 sites in the United States, Canada, Europe, Russia, and New Zealand. Patients were enrolled from August 2013 to May 2014, with end of follow-up in January 2015. Patients received IV etelcalcetide and oral placebo (n = 340) or oral cinacalcet and IV placebo (n = 343) for 26 weeks. The IV study drug was administered 3 times weekly with hemodialysis; the oral study drug was administered daily. The primary efficacy end point was noninferiority of etelcalcetide at achieving more than a 30% reduction from baseline in mean predialysis PTH concentrations during weeks 20-27 (noninferiority margin, 12.0%). Secondary end points included superiority in achieving biochemical end points (>50% and >30% reduction in PTH) and self-reported nausea or vomiting. The mean (SD) age of the trial participants was 54.7 (14.1) years and 56.2% were men. Etelcalcetide was noninferior to cinacalcet on the primary end point. The estimated difference in proportions of patients achieving reduction in PTH concentrations of more than 30% between the 198 of 343 patients (57.7%) randomized to receive cinacalcet and the 232 of 340 patients (68.2%) randomized to receive etelcalcetide was -10.5% (95% CI, -17.5% to -3.5%, P for noninferiority,

Abstract

For almost 50 years, ebolaviruses and related filoviruses have been repeatedly reemerging across the vast equatorial belt of the African continent to cause epidemics of highly fatal hemorrhagic fever. The 2013-2015 West African epidemic, by far the most geographically extensive, most fatal, and longest lasting epidemic in Ebola's history, presented an enormous international public health challenge, but it also provided insights into Ebola's pathogenesis and natural history, clinical expression, treatment, prevention, and control. Growing understanding of ebolavirus pathogenetic mechanisms and important new clinical observations of the disease course provide fresh clues about prevention and treatment approaches. Although viral cytopathology and immune-mediated cell damage in ebolavirus disease often result in severe compromise of multiple organs, tissue repair and organ function recovery can be expected if patients receive supportive care with fluids and electrolytes; maintenance of oxygenation and tissue perfusion; and respiratory, renal, and cardiovascular support. Major challenges for managing future Ebola epidemics include establishment of early and aggressive epidemic control and earlier and better patient care and treatment in remote, resource-poor areas where Ebola typically reemerges. In addition, it will be important to further develop Ebola vaccines and to adopt policies for their use in epidemic and pre-epidemic situations.

Abstract

Biosimilars are biologic medicines highly similar to the reference product with no meaningful clinical differences in terms of safety, purity, and potency. All biologic medicines are produced by living cells, resulting in an inherent heterogeneity in their higher order structures and post-translational modifications. In 2010, the US Congress enacted legislation to streamline the approval process for biosimilars of products losing patent protection, with the goal of decreasing costs and improving patient access to therapeutically important but expensive biologic agents. In 2015, the US Food and Drug Administration approved the first biosimilar agent through this pathway. Approval of additional biosimilar agents in the United States, including those used by nephrologists, is anticipated. Given the relative lack of knowledge regarding biosimilars and their approval process and a lack of trust by the nephrology community regarding their safety and efficacy, the National Kidney Foundation conducted a symposium, Introduction of Biosimilar Therapeutics Into Nephrology Practice in the U.S., September 17 to 18, 2015. Issues related to manufacturing, the regulatory approval process, interchangeability, substitution/switching, nomenclature, and clinician and patient awareness and acceptance were examined. This report summarizes the main discussions at the symposium, highlights several controversies, and makes recommendations related to public policy, professional and patient education, and research needs.

Abstract

The Evaluation of Cinacalcet HCl Therapy to Lower Cardiovascular Events (EVOLVE) clinical trial evaluated the effects of cinacalcet on clinical events in patients with secondary hyperparathyroidism (sHPT) who were on hemodialysis. Health-related quality of life (HRQoL) was assessed by a generic, preference-based health outcome measure (EQ-5D) at scheduled visits and after a study event. Here, we report the HRQoL analysis from EVOLVE. We assessed changes in HRQoL from baseline to scheduled visits, and estimated the acute (3 mo) and chronic (beyond 3 mo) effects of sHPT-related events on HRQoL using generalized estimating equation analysis, controlling for baseline HRQoL and randomized assignment. Data on HRQoL were available for 3547 of 3883 subjects, with 1650 events in the placebo arm and 1502 in the cinacalcet arm. At the study end, no difference in change from baseline HRQoL was observed in the direct comparison of EQ-5D by treatment arm. The regression analysis showed significant effects of events on HRQoL and a modest positive effect of cinacalcet. Estimated quality-adjusted life-year gains were of similar magnitude based on the observed data or the predictions from the model, with only a small gain in precision from the predicted analysis. By contrast with a conventional comparison, the regression analysis demonstrated large decrements in HRQoL after events and a modest improvement in HRQoL with cinacalcet. As randomized controlled trials are rarely powered to detect differences in HRQoL, a prespecified regression analysis may be acceptable to improve precision of the estimated effects and to understand their origin.

Abstract

Introduction End-stage renal disease is associated with elevations in circulating prolactin concentrations, but the association of prolactin concentrations with intermediate health outcomes and the effects of hemodialysis frequency on changes in serum prolactin have not been examined. Methods The FHN Daily and Nocturnal Dialysis Trials compared the effects of conventional thrice-weekly hemodialysis with in-center daily hemodialysis (6 days/week) and nocturnal home hemodialysis (6 nights/week) over 12 months and obtained measures of health-related quality of life, self-reported physical function, mental health, and cognition. Serum prolactin concentrations were measured at baseline and 12-month follow-up in 70% of the FHN Trial cohort to examine the associations between serum prolactin concentrations and physical, mental, and cognitive function and the effects of hemodialysis frequency on serum prolactin. Findings Among 177 Daily Trial and 60 Nocturnal Trial participants with baseline serum prolactin measurements, the median serum prolactin concentration was 65 ng/mL (25th-75th percentile 48-195 ng/mL) and 81% had serum prolactin concentrations >30 ng/mL. While serum prolactin was associated with sex (higher in women), we observed no association between baseline serum prolactin and age, dialysis vintage, or baseline measures of physical, mental, and cognitive function. Furthermore, there was no significant effect of hemodialysis frequency on serum prolactin in either of the two trials. Discussion Serum prolactin concentrations were elevated in the large majority of patients with ESRD, but were not associated with several measures of health status. Circulating prolactin levels also do not appear to decrease in response to more frequent hemodialysis over a one-year period.

Abstract

Introduction The use of administrative data to capture 30-day readmission rates in end-stage renal disease is challenging because Medicare combines claims from acute care, inpatient rehabilitation (IRF), and long-term care hospital stays into a single "Inpatient" file. For data prior to 2012, the United States Renal Data System does not contain the variables necessary to easily identify different facility types, making it likely that prior studies have inaccurately estimated 30-day readmission rates. Methods For this report, we developed two methods (a "simple method" and a "rehabilitation-adjusted method") to identify acute care, IRF, and long-term care hospital stays from United States Renal Data System claims data, and compared them to methods used in previously published reports. Findings We found that prior methods overestimated 30-day readmission rates by up to 12.3% and overestimated average 30-day readmission costs by up to 11%. In contrast, the simple and rehabilitation-adjusted methods overestimated 30-day readmission rates by 0.1% and average 30-day readmission costs by 1.8%. The rehabilitation-adjusted method also accurately identified 96.8% of IRF stays. Discussion Prior research has likely provided inaccurate estimates of 30-day readmissions in patients undergoing dialysis. In the absence of data on specific facility types, particularly when using data prior to 2012, future researchers could employ our method to more accurately characterize 30-day readmission rates and associated outcomes in patients with end-stage renal disease.

Abstract

The risk of hospital readmission in acute kidney injury survivors is not well understood. We estimated the proportion of acute kidney injury patients who were rehospitalized within 30 days and identified characteristics associated with hospital readmission. We conducted a population-based study of patients who survived a hospitalization complicated by acute kidney injury from 2003 to 2013 in Ontario, Canada. The primary outcome was 30-day hospital readmission. We used a propensity score model to match patients with and without acute kidney injury, and a Cox proportional hazards model with death as a competing risk to identify predictors of 30-day readmission. We identified 156,690 patients who were discharged from 197 hospitals after an episode of acute kidney injury. In the subsequent 30 days, 27,457 (18%) patients were readmitted; 15,988 (10%) visited the emergency department and 7480 (5%) died. We successfully matched 111,778 patients with acute kidney injury 1:1 to patients without acute kidney injury. The likelihood of 30-day readmission was higher in acute kidney injury patients than those without acute kidney injury (hazard ratio [HR] 1.53; 95% confidence interval [CI], 1.50-1.57). Factors most strongly associated with 30-day rehospitalization were the number of hospitalizations in the preceding year (adjusted HR 1.45 for ≥2 hospitalizations; 95% CI, 1.40-1.51) and receipt of inpatient chemotherapy (adjusted HR 1.44; 95% CI, 1.32-1.58). One in 5 patients who survive a hospitalization complicated by acute kidney injury is readmitted in the next 30 days. Better strategies are needed to identify and care for acute kidney injury survivors in the community.

Abstract

Introduction Home hemodialysis has not been widely adopted despite superior outcomes relative to conventional in-center hemodialysis. Patients receiving home hemodialysis experience high rates of technique failure owing to machine complexity, training burden, and the inability to master treatments independently. Methods We conducted human factors testing on 15 health care professionals (HCPs) and 15 patients upon release of the defined training program on the Tablo Hemodialysis System. Each participant completed one training and one testing session conducted in a simulated clinical environment. Training sessions lasted <3 hours for HCPs and <4 hours for patients, with an hour break between sessions for knowledge decay. During the testing session, we recorded participant behavior and data according to standard performance and safety-based criteria. Findings Of 15 HCPs, 10 were registered nurses and 5 were patient care technicians, with a broad range of dialysis work experience and no limitations other than visual correction. Of 15 patients (average age 48 years), 13 reported no limitations and two reported modest limitations (partial deafness and blindness in one eye, respectively). The average error rate was 4.4 per session for HCPs and 2.9 per session for patients, out of a total possible 1,710 opportunities for error. Despite having received minimal training, neither HCPs nor patients committed safety-related errors that required mitigation; rather, we noted only minor errors and operational difficulties. Discussion The Tablo Hemodialysis System is easy to use, and may help to enable self-care and home hemodialysis in settings heretofore associated with high rates of technique failure.

Abstract

Erythropoiesis-stimulating agents (ESAs) are commonly used to treat anemia in patients with CKD, including those receiving dialysis, although clinical trials have identified risks associated with ESA use. We evaluated the effects of changes in dialysis payment policies and product labeling instituted in 2011 on mortality and major cardiovascular events across the United States dialysis population in an open cohort study of patients on dialysis from January 1, 2005, through December 31, 2012, with Medicare as primary payer. We compared observed rates of death and major cardiovascular events in 2011 and 2012 with expected rates calculated on the basis of rates in 2005-2010, accounting for differences in patient characteristics and influenza virulence. An abrupt decline in erythropoietin dosing and hemoglobin concentration began in late 2010. Observed rates of all-cause mortality, cardiovascular mortality, and myocardial infarction in 2011 and 2012 were consistent with expected rates. During 2012, observed rates of stroke, venous thromboembolic disease (VTE), and heart failure were lower than expected (absolute deviation from trend per 100 patient-years [95% confidence interval]: -0.24 [-0.08 to -0.37] for stroke, -2.43 [-1.35 to -3.70] for VTE, and -0.77 [-0.28 to -1.27] for heart failure), although non-ESA-related changes in practice and Medicare payment penalties for rehospitalization may have confounded the results. This initial evidence suggests that action taken to mitigate risks associated with ESA use and changes in payment policy did not result in a relative increase in death or major cardiovascular events and may reflect improvements in stroke, VTE, and heart failure.

Abstract

To assess whether patient factors, such as age and preoperative kidney function, were associated with receipt of partial nephrectomy in a national integrated healthcare system. We identified patients treated with a radical or partial nephrectomy from 2002 to 2014 in the Veterans Health Administration. We examined associations among patient age, sex, race or ethnicity, multimorbidity, baseline kidney function, tumor characteristics, and receipt of partial nephrectomy. We estimated the odds of receiving a partial nephrectomy and assessed interactions between covariates and the year of surgery to explore whether patient factors associated with partial nephrectomy changed over time. In our cohort of 14,186 patients, 4508 (31.2%) received a partial nephrectomy. Use of partial nephrectomy increased from 17% in 2002 to 32% in 2008 and to 38% in 2014. Patient race or ethnicity, age, tumor stage, and year of surgery were independently associated with receipt of partial nephrectomy. Black veterans had significantly increased odds of receipt of partial nephrectomy, whereas older patients had significantly reduced odds. Partial nephrectomy utilization increased for all groups over time, but older patients and patients with worse baseline kidney function showed the least increase in odds of partial nephrectomy. Although the utilization of partial nephrectomy increased for all groups, the greatest increase occurred in the youngest patients and those with the highest baseline kidney function. These trends warrant further investigation to ensure that patients at the highest risk of impaired kidney function are considered for partial nephrectomy whenever possible.

Abstract

A better understanding of overall survival among patients with clinically localized prostate cancer (PCa) in the US Veterans Health Administration (VHA) is critical to inform PCa treatment decisions, especially in light of data from the Prostate Intervention Versus Observation Trial (PIVOT). We sought to describe patterns of survival for all patients with clinically localized PCa treated by the VHA. We created an analytic cohort of 35,954 patients with clinically localized PCa diagnosed from 1995 to 2001, approximating the PIVOT inclusion criteria (age of diagnosis ≤75 yr and clinical stage T2 or lower). Mean patient age was 65.9 yr, and median follow-up was 161 mo. Overall, 22.5% of patients were treated with surgery, 16.6% were treated with radiotherapy, and 23.1% were treated with androgen deprivation. Median survival of the entire cohort was 14 yr (25th-75th percentile range: 7.9-20 yr). Among patients who received treatment with curative intent, median survival was 17.9 yr following surgery and 12.9 yr following radiotherapy. One-third of patients died within 10 yr of diagnosis compared with nearly half of the participants in PIVOT. This finding sounds a note of caution when generalizing the mortality data from PIVOT to VHA patients and those in the community. More than one-third of patients diagnosed with clinically localized prostate cancer treated through the US Veterans Health Administration from 1995 to 2001 died within 10 yr of their diagnosis. Caution should be used when generalizing the estimates of competing mortality data from PIVOT.

Abstract

The sustained economic growth in Bangladesh during the previous decade has created a substantial middle-class population, who have adequate income to spend on food, clothing, and lifestyle management. Along with the improvements in living standards has also come a negative impact on health for the middle class. The study objective was to assess sex differences in obesity prevalence, diet, and physical activity among urban middle-class Bangladeshis. In this cross-sectional study, conducted in 2012, we randomly selected 402 adults from Mohammedpur, Dhaka. The sampling technique was multi-stage random sampling. We used standardized questionnaires for data collection and measured height, weight, and waist circumference. Mean age (standard deviation) was 49.4 (12.7) years. The prevalence of both generalized (79% vs. 53%) and central obesity (85% vs. 42%) was significantly higher in women than in men. Women reported spending more time watching TV and less time walking than men.

Abstract

Secondary hyperparathyroidism is common among patients with ESRD. Although medical therapy for secondary hyperparathyroidism has changed dramatically over the last decade, rates of parathyroidectomy for secondary hyperparathyroidism across the United States population are unknown. We examined temporal trends in rates of parathyroidectomy, in-hospital mortality, length of hospital stay, and costs of hospitalization. Using the Healthcare Cost and Utilization Project's Nationwide Inpatient Sample, a representative national database on hospital stay regardless of age and payer in the United States, we identified parathyroidectomies for secondary hyperparathyroidism from 2002 to 2011. Data from the US Renal Data System reports were used to calculate the rate of parathyroidectomy. We identified 32,971 parathyroidectomies for secondary hyperparathyroidism between 2002 and 2011. The overall rate of parathyroidectomy was approximately 5.4/1000 patients (95% confidence interval [95% CI], 5.0/1000 to 6.0/1000). The rate decreased from 2003 (7.9/1000 patients; 95% CI, 6.2/1000 to 9.6/1000), reached a nadir in 2005 (3.3/1000 patients; 95% CI, 2.6/1000 to 4.0/1000), increased again through 2006 (5.4/1000 patients; 95% CI, 4.4/1000 to 6.4/1000), and remained stable since that time. Rates of in-hospital mortality decreased from 1.7% (95% CI, 0.8% to 2.6%) in 2002 to 0.8% (95% CI, 0.1% to 1.6%) in 2011 (P for trend <0.001). In-hospital mortality rates were significantly higher in patients with heart failure (odds ratio [OR], 4.23; 95% CI, 2.59 to 6.91) and peripheral vascular disease (OR, 4.59; 95% CI, 2.75 to 7.65) and lower among patients with prior kidney transplantation (OR, 0.20; 95% CI, 0.06 to 0.65). Despite the use of multiple medical therapies, rates of parathyroidectomy for secondary hyperparathyroidism have not declined in recent years.

Abstract

Among patients receiving hemodialysis, abnormalities in calcium regulation have been linked to an increased risk of cardiovascular events. Cinacalcet lowers serum calcium concentrations through its effect on parathyroid hormone secretion and has been hypothesized to reduce the risk of cardiovascular events. In observational cohort studies, prescriptions of low dialysate calcium concentration and larger observed serum-dialysate calcium gradients have been associated with higher risks of in-dialysis facility or peri-dialytic sudden cardiac arrest. We performed this study to examine the risks associated with dialysate calcium and serum-dialysate gradients among participants in the Evaluation of Cinacalcet Hydrochloride Therapy to Lower Cardiovascular Events (EVOLVE) trial. In EVOLVE, 3883 hemodialysis patients were randomized 1:1 to cinacalcet or placebo. Dialysate calcium was administered at the discretion of treating physicians. We examined whether baseline dialysate calcium concentration or the serum-dialysate calcium gradient modified the effect of cinacalcet on the following adjudicated endpoints: (1) primary composite endpoint (death or first non-fatal myocardial infarction, hospitalization for unstable angina, heart failure, or peripheral vascular event); (2) cardiovascular death; and (3) sudden death. In EVOLVE, use of higher dialysate calcium concentrations was more prevalent in Europe and Latin America compared with North America. There was a significant fall in serum calcium concentration in the cinacalcet group; dialysate calcium concentrations were changed infrequently in both groups. There was no association between baseline dialysate calcium concentration or serum-dialysate calcium gradient and the endpoints examined. Neither the baseline dialysate calcium nor the serum-dialysate calcium gradient significantly modified the effects of cinacalcet on the outcomes examined. 
The effects of cinacalcet on cardiovascular death and major cardiovascular events are not altered by the dialysate calcium prescription and serum-dialysate calcium gradient.

Abstract

The capacity of electronic health record (EHR) data to guide targeted surveillance in chronic kidney disease (CKD) is unclear. We sought to leverage EHR data for predicting risk of progressing from CKD to end-stage renal disease (ESRD) to help inform surveillance of CKD among vulnerable patients from the healthcare safety-net. We conducted a retrospective cohort study of adults (n = 28,779) with CKD who received care within 2 regional safety-net health systems during 1996-2009 in the Western United States. The primary outcomes were progression to ESRD and death as ascertained by linkage with United States Renal Data System and Social Security Administration Death Master files, respectively, through September 29, 2011. We evaluated the performance of 3 models which included demographic, comorbidity and laboratory data to predict progression of CKD to ESRD in conditions commonly targeted for disease management (hypertension, diabetes, chronic viral diseases and severe CKD) using traditional discriminatory criteria (AUC) and recent criteria intended to guide population health management strategies. Overall, 1730 persons progressed to end-stage renal disease and 7628 died during median follow-up of 6.6 years. Performance of risk models incorporating common EHR variables was highest in hypertension, intermediate in diabetes and chronic viral diseases, and lowest in severe CKD. Surveillance of persons who were in the highest quintile of ESRD risk yielded 83-94%, 74-95%, and 75-82% of cases who progressed to ESRD among patients with hypertension, diabetes and chronic viral diseases, respectively. Similar surveillance yielded 42-71% of ESRD cases among those with severe CKD. Discrimination in all conditions was universally high (AUC ≥0.80) when evaluated using traditional criteria. Recently proposed discriminatory criteria account for varying risk distribution and, when applied to common clinical conditions, may help to inform surveillance of CKD in diverse populations.

Abstract

Hypertension is a risk factor for the development of cardiovascular and kidney disease, but treatment can substantially reduce risks. Many patients avoid antihypertensive medications because of fear of side-effects. Although associations between antihypertensives and sexual dysfunction in men have been documented, it remains unclear whether antihypertensives are associated with sexual dysfunction in women. We conducted a cross-sectional analysis of baseline data from women in the Systolic Blood Pressure Intervention Trial (SPRINT) to evaluate the relation between antihypertensive medication class and two outcomes: sexual activity and sexual function. SPRINT enrolled individuals aged 50 and older with hypertension at high risk for cardiovascular disease. A subset of participants completed questionnaires regarding quality of life, including sexual function. Antihypertensive class was determined by medications taken at baseline. Of 690 women in the quality of life subset of SPRINT, 183 (26.5%) were sexually active. There were no significant differences in sexual activity among women taking one or more antihypertensives and women not taking any. Women taking an angiotensin-converting enzyme inhibitor or angiotensin receptor blocker had higher odds of sexual activity [odds ratio 1.66 (1.12-4.27), P = 0.011]. Among sexually active women, the prevalence of sexual dysfunction was high (52.5%). No class of medication was associated with sexual dysfunction in the multivariable model. Angiotensin-converting enzyme inhibitor or angiotensin receptor blocker use was associated with higher odds of sexual activity. Although prevalence of sexual dysfunction was high, no single class of antihypertensive medication was associated with sexual dysfunction.

Abstract

Patients with end-stage renal disease can receive dialysis at home or in-center. In 2004, CMS reformed physician payment for in-center hemodialysis care from a capitated to a tiered fee-for-service model, augmenting physician payment for frequent in-center visits. We evaluated whether payment reform influenced dialysis modality assignment in a cohort of patients starting dialysis in the United States in the 3 years before and the 3 years after payment reform. We conducted difference-in-difference analyses comparing patients with traditional Medicare coverage (who were affected by the policy) to others with Medicare Advantage (who were unaffected by the policy). We also examined whether the policy had a more pronounced influence on dialysis modality assignment in areas with lower costs of traveling to dialysis facilities. Patients with traditional Medicare coverage experienced a 0.7% (95% CI, 0.2%-1.1%; P = .003) reduction in the absolute probability of home dialysis use following payment reform compared with patients with Medicare Advantage. Patients living in areas with larger dialysis facilities (where payment reform made in-center hemodialysis comparatively more lucrative for physicians) experienced a 0.9% (95% CI, 0.5%-1.4%; P < .001) reduction in home dialysis use following payment reform compared with patients living in areas with smaller facilities (where payment reform made in-center hemodialysis comparatively less lucrative for physicians). The transition from a capitated to a tiered fee-for-service payment model for in-center hemodialysis care resulted in fewer patients receiving home dialysis. This area of policy failure highlights the importance of considering unintended consequences of future physician payment reform efforts.
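The difference-in-difference estimate described above is simple arithmetic: the change in home-dialysis use among traditional-Medicare patients minus the change among Medicare Advantage patients. A minimal sketch (the group percentages below are made-up illustrative numbers, not trial data):

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Change in the treated group minus change in the control group."""
    return (treat_post - treat_pre) - (ctrl_pre - ctrl_pre) - (ctrl_post - ctrl_pre)

# Hypothetical home-dialysis rates (%) before/after the 2004 payment reform
effect = diff_in_diff(treat_pre=1.8, treat_post=1.1,   # traditional Medicare
                      ctrl_pre=2.0, ctrl_post=2.0)     # Medicare Advantage
print(round(effect, 1))  # -0.7, i.e. an absolute reduction in home dialysis use
```

Because the control group's trend is subtracted out, any secular trend affecting both groups equally does not bias the estimate; that is the core of the design.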

Abstract

The Frequent Hemodialysis Network Daily Trial randomized 245 patients to receive six (frequent) or three (conventional) in-center hemodialysis sessions per week for 12 months. As reported previously, frequent in-center hemodialysis yielded favorable effects on the coprimary composite outcomes of death or change in left ventricular mass and death or change in self-reported physical health. Here, we determined the long-term effects of the 12-month frequent in-center hemodialysis intervention. We determined the vital status of patients over a median of 3.6 years (10%-90% range, 1.5-5.3 years) after randomization. Using an intention to treat analysis, we compared the mortality hazard in randomized groups. In a subset of patients from both groups, we reassessed left ventricular mass and self-reported physical health a year or more after completion of the intervention; 20 of 125 patients (16%) randomized to frequent hemodialysis died during the combined trial and post-trial observation periods in contrast to 34 of 120 patients (28%) randomized to conventional hemodialysis. The relative mortality hazard for frequent versus conventional hemodialysis was 0.54 (95% confidence interval, 0.31 to 0.93); with censoring of time after kidney transplantation, the relative hazard was 0.56 (95% confidence interval, 0.32 to 0.99). Bayesian analysis suggested a relatively high probability of clinically significant benefit and a very low probability of harm with frequent hemodialysis. In conclusion, a 12-month frequent in-center hemodialysis intervention significantly reduced long-term mortality, suggesting that frequent hemodialysis may benefit selected patients with ESRD.

Abstract

To change a particular quality of care outcome within a system, quality improvement initiatives must first understand the causes contributing to the outcome. After the causes of a particular outcome are known, changes can be made to address these causes and change the outcome. Using the example of home dialysis (home hemodialysis and peritoneal dialysis), this article within this Moving Points feature on quality improvement will provide health care professionals with the tools necessary to analyze the steps contributing to certain outcomes in health care quality and develop ideas that will ultimately lead to their resolution. The tools used to identify the main contributors to a quality of care outcome will be described, including cause and effect diagrams, Pareto analysis, and process mapping. We will also review common change concepts and brainstorming activities to identify effective change ideas. These methods will be applied to our home dialysis quality improvement project, providing a practical example that other kidney health care professionals can replicate at their local centers.

Abstract

To achieve sustainable change, quality improvement initiatives must become the new way of working rather than something added on to routine clinical care. However, most organizational change is not maintained. In this next article in this Moving Points in Nephrology feature on quality improvement, we provide health care professionals with strategies to sustain and support quality improvement. Threats to sustainability may be identified both at the beginning of a project and when it is ready for implementation. The National Health Service Sustainability Model is reviewed as one example to help identify issues that affect long-term success of quality improvement projects. Tools to help sustain improvement include process control boards, performance boards, standard work, and improvement huddles. Process control and performance boards are methods to communicate improvement results to staff and leadership. Standard work is a written or visual outline of current best practices for a task and provides a framework to ensure that changes that have improved patient care are consistently and reliably applied to every patient encounter. Improvement huddles are short, regular meetings among staff to anticipate problems, review performance, and support a culture of improvement. Many of these tools rely on principles of visual management, which keep systems transparent and simple so that every staff member can rapidly distinguish normal from abnormal working conditions. Even when quality improvement methods are properly applied, the success of a project still depends on contextual factors. Context refers to aspects of the local setting in which the project operates. Context affects resources, leadership support, data infrastructure, team motivation, and team performance. For these reasons, the same project may thrive in a supportive context and fail in a different context. 
To demonstrate the practical applications of these quality improvement principles, these principles are applied to a hypothetical quality improvement initiative that aims to promote home dialysis (home hemodialysis and peritoneal dialysis).

Abstract

This article will demonstrate how to conduct a quality improvement project using the change idea generated in "How To Use Quality Improvement Tools in Clinical Practice: How To Diagnose Solutions to a Quality of Care Problem" by Dr. Ziv Harel and colleagues in this Moving Points feature. This change idea involves the introduction of a nurse educator into a CKD clinic with a goal of increasing rates of patients performing dialysis independently at home (home hemodialysis or peritoneal dialysis). Using this example, we will illustrate a Plan-Do-Study-Act (PDSA) cycle in action and highlight the principles of rapid cycle change methodology. We will then discuss the selection of outcome, process, and balancing measures, and the practicalities of collecting these data in the clinic environment. We will also introduce the PDSA worksheet as a practical way to oversee the progress of a quality improvement project. Finally, we will demonstrate how run charts are used to visually illustrate improvement in real time, and how this information can be used to validate achievement, respond appropriately to challenges the project may encounter, and prove the significance of results. This article aims to provide readers with a clear and practical framework upon which to trial their own ideas for quality improvement in the clinical setting.

Abstract

Quality improvement involves a combined effort among health care staff and stakeholders to diagnose and treat problems in the health care system. However, health care professionals often lack training in quality improvement methods, which makes it challenging to participate in improvement efforts. This article familiarizes health care professionals with how to begin a quality improvement project. The initial steps involve forming an improvement team that possesses expertise in the quality of care problem, leadership, and change management. Stakeholder mapping and analysis are useful tools at this stage, and these are reviewed to help identify individuals who might have a vested interest in the project. Physician engagement is a particularly important component of project success, and the knowledge that patients/caregivers can offer as members of a quality improvement team should not be overlooked. After a team is formed, an improvement framework helps to organize the scientific process of system change. Common quality improvement frameworks include Six Sigma, Lean, and the Model for Improvement. These models are contrasted, with a focus on the Model for Improvement, because it is widely used and applicable to a variety of quality of care problems without advanced training. It involves three steps: setting aims to focus improvement, choosing a balanced set of measures to determine if improvement occurs, and testing new ideas to change the current process. These new ideas are evaluated using Plan-Do-Study-Act cycles, where knowledge is gained by testing changes and reflecting on their effect. To show the real world utility of the quality improvement methods discussed, they are applied to a hypothetical quality improvement initiative that aims to promote home dialysis (home hemodialysis and peritoneal dialysis). This provides an example that kidney health care professionals can use to begin their own quality improvement projects.

Abstract

The Systolic Blood Pressure Intervention Trial (SPRINT) is testing whether a lower systolic blood pressure (BP) target of 120 mm Hg leads to a reduction in cardiovascular morbidity and mortality among hypertensive, nondiabetic adults. Because there may be detrimental effects of intensive BP control, particularly in older, frail adults, we sought to characterize frailty within SPRINT to address ongoing questions about the ability of large-scale trials to enroll representative samples of noninstitutionalized, community-dwelling, older adults. We constructed a 36-item frailty index (FI) in 9,306 SPRINT participants, classifying participants as fit (FI ≤ 0.10), less fit (0.10 < FI ≤ 0.21), or frail (FI > 0.21). Recurrent event models were used to evaluate the association of the FI with the incidence of self-reported falls, injurious falls, and all-cause hospitalizations. The distribution of the FI was comparable with what has been observed in population studies, with 2,570 (27.6%) participants classified as frail. The median FI was 0.18 (interquartile range = 0.14 to 0.24) in participants aged 80 years and older (N = 1,159), similar to the median FI of 0.17 reported for participants in the Hypertension in the Very Elderly Trial. In multivariable analyses, a 1% increase in the FI was associated with increased risk for self-reported falls (hazard ratio [HR] = 1.030), injurious falls (HR = 1.035), and all-cause hospitalizations (HR = 1.038) (all p values < .0001). Large clinical trials assessing treatments to reduce cardiovascular disease risk, such as SPRINT, can enroll heterogeneous populations of older adults, including the frail elderly, comparable with general population cohorts.
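A deficit-accumulation frailty index of this kind is simply the proportion of measured deficits present, with the category cut-points applied afterward. A minimal sketch using the cut-points reported in the abstract (the deficit list itself is a placeholder, not the actual 36 SPRINT items):

```python
def frailty_index(deficits):
    """Proportion of deficits present; each item scored 0 (absent) to 1 (present)."""
    return sum(deficits) / len(deficits)

def classify(fi):
    # Cut-points from the abstract: fit (FI <= 0.10), less fit (0.10 < FI <= 0.21),
    # frail (FI > 0.21)
    if fi <= 0.10:
        return "fit"
    elif fi <= 0.21:
        return "less fit"
    return "frail"

# Hypothetical participant with 9 of 36 deficits present
fi = frailty_index([1] * 9 + [0] * 27)  # 0.25
print(classify(fi))  # frail
```

Fractional scores (e.g. 0.5 for a partially present deficit) fit the same formula, which is why the FI behaves as a continuous risk gradient in the recurrent-event models described above.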

Abstract

Unlike conventional time-to-event analysis of composite endpoints in clinical trials, the "win ratio" method allows for flexibility in prioritizing their components. Here, we compare the EVOLVE trial findings using the win ratio with those from time-to-event analysis. The intervention was randomization to cinacalcet or placebo; the outcome was the primary composite endpoint combining all-cause mortality and non-fatal myocardial infarction, hospitalization for unstable angina, heart failure, and peripheral vascular events. In an unadjusted analysis, we paired each participant from the cinacalcet arm with every participant from the placebo arm within randomization strata. Pairs were classified as "winners" or "losers," according to which participant died first during the shared follow-up time, or experienced the next-ranked event first. We ranked non-fatal events in two ways: 1) all ranked evenly; and 2) prioritized by their effect on health-related quality of life. The win ratio equaled the total winners divided by total losers. Further analyses were conducted where the win ratio was stratified by, or adjusted for, age. The unadjusted win ratio for the primary composite endpoint was 1.09 (95% CI 0.97 to 1.21), a statistically non-significant result which supports the primary trial result - unadjusted hazard ratio 0.93 (95% CI 0.85 to 1.02). Age-stratified analyses showed a nominally significant benefit of cinacalcet (win ratio 1.14, 95% CI 1.04 to 1.26). Ranking of non-fatal outcomes by their relative effects on quality of life did not materially alter the results. The win ratio method corroborated the findings of EVOLVE based on conventional time-to-event analysis. EVOLVE ClinicalTrials.gov number: NCT00345839.
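The pairing logic described above can be sketched directly: every treatment-arm participant is compared with every control-arm participant over their shared follow-up, first on death and then on the next-ranked non-fatal event, and the win ratio is total wins over total losses. A simplified sketch (no stratification, a two-level hierarchy, and illustrative field names):

```python
from math import inf  # inf marks "event never observed"

def compare_pair(a, b):
    """+1 if a wins, -1 if a loses, 0 if tied, within the pair's shared follow-up."""
    shared = min(a["followup"], b["followup"])
    for endpoint in ("death", "nonfatal"):  # ranked hierarchy: death first
        ta, tb = a[endpoint], b[endpoint]
        in_a, in_b = ta <= shared, tb <= shared
        if in_a and in_b and ta != tb:
            return 1 if ta > tb else -1    # both had the event: later one wins
        if in_a != in_b:
            return -1 if in_a else 1       # only one had the event: they lose
        # neither had it (or an exact tie): fall through to the next endpoint
    return 0

def win_ratio(treated, control):
    wins = losses = 0
    for a in treated:
        for b in control:
            result = compare_pair(a, b)
            wins += result == 1
            losses += result == -1
    return wins / losses

treated = [{"death": inf, "nonfatal": inf, "followup": 6},
           {"death": 2,   "nonfatal": inf, "followup": 6}]
control = [{"death": 4,   "nonfatal": inf, "followup": 6},
           {"death": inf, "nonfatal": inf, "followup": 6}]
print(win_ratio(treated, control))  # 0.5
```

Reordering the tuple of endpoints is all it takes to re-prioritize components, which is the flexibility the method offers over a conventional composite.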

Abstract

Despite superior outcomes and lower associated costs, relatively few patients with end-stage renal disease undergo self-care or home hemodialysis. Few studies have examined patient- and physician-specific barriers to self-care and home hemodialysis in the modern era. The degree to which innovative technology might facilitate the adoption of these modalities is unknown. We surveyed 250 patients receiving in-center hemodialysis and 51 board-certified nephrologists to identify key barriers to adoption of self-care and home hemodialysis. Overall, 172 (69%) patients reported that they were "likely" or "very likely" to consider self-care hemodialysis if they were properly trained on a new hemodialysis system designed for self-care or home use. Nephrologists believed that patients were capable of performing many dialysis-relevant tasks, including: weighing themselves (98%), wiping down the chair and machine (84%), clearing alarms during treatment (53%), taking vital signs (46%), and cannulating vascular access (41%), but thought that patients would be willing to do the same in only 69%, 34%, 31%, 29%, and 16% of cases, respectively. The reasons nephrologists believe patients are hesitant to pursue self-care or home hemodialysis do not match, in content or priority, the reasons reported by patients themselves. Self-care and home hemodialysis offer several advantages to patients and dialysis providers. Overcoming real and perceived barriers with new technology, education and coordinated care will be required for these modalities to gain traction in the coming years.

Abstract

Frailty is common among patients on dialysis and increases vulnerability to dependency and death. We examined the predictive ability of frailty on the basis of physical performance and self-reported function in participants of a US Renal Data System special study that enrolled a convenience sample of 771 prevalent patients on hemodialysis from 14 facilities in the Atlanta and northern California areas from 2009 to 2011. Performance-based frailty was assessed using direct measures of grip strength (weakness) and gait speed along with weight loss, exhaustion, and low physical activity; poor self-reported function was substituted for weakness and slow gait speed in the self-reported function-based definition. For both definitions, patients meeting three or more criteria were considered frail. The mean age of 762 patients included in analyses was 57.1±14.2 years old; 240 patients (31%) met the physical performance-based definition of frailty, and 396 (52%) met the self-reported function-based definition. There were 106 deaths during 1.7 (interquartile range, 1.4-2.4) years of follow-up. After adjusting for demographic and clinical characteristics, the hazard ratio (HR) for mortality for the performance-based definition (2.16; 95% confidence interval [95% CI], 1.41 to 3.29) was slightly higher than that of the self-reported function-based definition (HR, 1.93; 95% CI, 1.24 to 3.00). Patients who met the self-report-based definition but not the physical performance definition of frailty (n=192) were not at statistically significantly higher risk of mortality than those who were not frail by either definition (n=330; HR, 1.41; 95% CI, 0.81 to 2.45), but those who met both definitions of frailty (n=204) were at significantly higher risk (HR, 2.46; 95% CI, 1.51 to 4.01). Frailty, defined using either direct tests of physical performance or self-reported physical function, was associated with higher mortality among patients receiving hemodialysis. 
Future studies are needed to determine the utility of assessing frailty in clinical practice.

Abstract

Thiazides and thiazide-type diuretics are recommended as first-line agents for the treatment of hypertension, but contemporary information on their use in clinical practice is lacking. We examined patterns and correlates of thiazide prescription in a cross-sectional analysis of baseline data from participants enrolled in the Systolic Blood Pressure Intervention Trial (SPRINT). We examined baseline prescription of thiazides in 7582 participants receiving at least 1 antihypertensive medication by subgroup, and used log-binomial regression to calculate adjusted prevalence ratios for thiazide prescription (versus no thiazide). Forty-three percent of all participants were prescribed a thiazide at baseline, but among participants prescribed a single agent, the proportion was only 16%. The prevalence of thiazide prescription differed significantly by demographic factors, with younger participants, women, and blacks all having higher adjusted prevalence of thiazide prescription than other corresponding subgroups. Participants in the lowest category of kidney function (estimated glomerular filtration rate <30 mL/min per 1.73 m2) were half as likely to be prescribed a thiazide as participants with preserved kidney function. In conclusion, among persons with hypertension and heightened cardiovascular risk, we found that thiazide prescription varied significantly by demographics and kidney disease status, despite limited evidence about relative differences in effectiveness.

Abstract

The effect of the calcimimetic cinacalcet on cardiovascular disease in patients undergoing hemodialysis with secondary hyperparathyroidism was assessed in the Evaluation of Cinacalcet Hydrochloride Therapy to Lower Cardiovascular Events trial. This was the largest (in size) and longest (in duration) randomized controlled clinical trial undertaken in this population. During planning, execution, analysis, and reporting of the trial, many lessons were learned, including those related to the use of a composite cardiovascular primary endpoint, definition of endpoints (particularly heart failure and severe unremitting hyperparathyroidism), importance of age for optimal stratification at randomization, use of unadjusted and adjusted intention-to-treat analysis for the primary outcome, how to respond to a lower-than-predicted event rate during the trial, development of a prespecified analytic plan that accounted for nonadherence and for cointerventions that diminished the power of the trial to observe a treatment effect, determination of the credibility of a subgroup effect, use of adverse effects database to investigate rare diseases, collection of blood for biomarker measurement not designated before trial initiation, and interpretation of the benefits-to-harms ratio for individual patients. It is likely that many of these issues will arise in the planning of future trials in CKD.

Abstract

Patients with end-stage renal disease often have derangements in calcium and phosphorus homeostasis and resultant secondary hyperparathyroidism (sHPT), which may contribute to the high prevalence of arterial stiffness and hypertension. We conducted a secondary analysis of the Evaluation of Cinacalcet Hydrochloride Therapy to Lower Cardiovascular Events (EVOLVE) trial, in which patients receiving hemodialysis with sHPT were randomly assigned to receive cinacalcet or placebo. We sought to examine whether the effect of cinacalcet on death and major cardiovascular events was modified by baseline pulse pressure as a marker of arterial stiffness, and whether cinacalcet yielded any effects on blood pressure. As reported previously, an unadjusted intention-to-treat analysis failed to conclude that randomization to cinacalcet reduces the risk of the primary composite end point (all-cause mortality or non-fatal myocardial infarction, heart failure, hospitalization for unstable angina or peripheral vascular event). However, after prespecified adjustment for baseline characteristics, patients randomized to cinacalcet experienced a nominally significant 13% lower adjusted risk (95% CI, 4-20%) of the primary composite end point. The effect of cinacalcet was not modified by baseline pulse pressure (P interaction = 0.44). In adjusted models, at 20 weeks cinacalcet resulted in a 2.2 mm Hg larger average decrease in systolic blood pressure (P=0.002) and a 1.3 mm Hg larger average decrease in diastolic blood pressure (P=0.002) compared with placebo. In summary, in the EVOLVE trial, the effect of cinacalcet on death and major cardiovascular events was independent of baseline pulse pressure.

Abstract

Interventional trials have used either the Modification of Diet in Renal Disease (MDRD) or Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation for determination of estimated glomerular filtration rate (eGFR) to define whether participants have stages 3-5 chronic kidney disease (CKD). The equation used to calculate eGFR may influence the number and characteristics of participants designated as having CKD. We examined the classification of CKD at baseline using both equations in the Systolic Blood Pressure Intervention Trial (SPRINT). eGFR was calculated at baseline using fasting serum creatinine values from a central laboratory. Among 9,308 participants with baseline CKD classification using the 4-variable MDRD equation specified in the SPRINT protocol, 681 (7.3%) participants were reclassified to a less advanced CKD stage (higher eGFR) and 346 (3.7%) were reclassified to a more advanced CKD stage (lower eGFR) when the CKD-EPI equation was used to calculate eGFR. For eGFRs <90 ml/min/1.73 m2, participants <75 years were more likely to be reclassified to a less advanced CKD stage; this reclassification was more likely to occur in non-blacks rather than blacks. Participants aged ≥75 years were more likely to be reclassified to a more advanced than a less advanced CKD stage, regardless of baseline CKD stage. Reclassification of baseline CKD status (eGFR <60 ml/min/1.73 m2) occurred in 3% of participants. Use of the MDRD equation led to a higher percentage of participants being classified as having CKD stages 3-4. Younger and non-black participants were more likely to be reclassified as not having CKD using the CKD-EPI equation.
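The two estimating equations differ only in functional form. A sketch of both, with serum creatinine in mg/dl (the coefficients below are the commonly published IDMS-traceable 4-variable MDRD and 2009 CKD-EPI creatinine values; treat them as assumptions to verify against the SPRINT protocol before reuse):

```python
def egfr_mdrd(scr, age, female, black):
    """4-variable MDRD study equation, ml/min/1.73 m2 (published coefficients)."""
    egfr = 175 * scr ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def egfr_ckd_epi(scr, age, female, black):
    """2009 CKD-EPI creatinine equation, ml/min/1.73 m2 (published coefficients)."""
    kappa = 0.7 if female else 0.9      # sex-specific creatinine threshold
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr / kappa, 1) ** alpha
            * max(scr / kappa, 1) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# A 60-year-old non-black man with creatinine 1.0 mg/dl gets a higher eGFR
# from CKD-EPI than from MDRD, consistent with reclassification toward less
# advanced CKD stages among younger participants.
print(egfr_ckd_epi(1.0, 60, False, False) > egfr_mdrd(1.0, 60, False, False))  # True
```

The CKD-EPI equation's two-piece creatinine spline (the min/max terms around kappa) is what flattens its estimates at low creatinine, which drives most of the reclassification described above.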

Abstract

Parathyroidectomy is the only curative therapy for patients with primary hyperparathyroidism. However, the incidence, correlates, and consequences of parathyroidectomy for primary hyperparathyroidism across the entire US population are unknown. We evaluated temporal trends in rates of inpatient parathyroidectomy for primary hyperparathyroidism, and associated in-hospital mortality, length of stay, and costs. We used the Healthcare Cost and Utilization Project Nationwide Inpatient Sample (NIS) from 2002-2011. Parathyroidectomies for primary hyperparathyroidism were identified using International Classification of Diseases, Ninth Revision codes. Unadjusted and age- and sex-adjusted rates of inpatient parathyroidectomy for primary hyperparathyroidism were derived from the NIS and the annual US Census. We estimated 109,583 parathyroidectomies for primary hyperparathyroidism between 2002 and 2011. More than half (55.4%) of patients were younger than age 65, and more than three-quarters (76.8%) were female. The overall rate of inpatient parathyroidectomy was 32.3 cases per million person-years. The adjusted rate decreased from 2004 (48.3 cases/million person-years) to 2007 (31.7 cases/million person-years) and remained stable thereafter. Although inpatient parathyroidectomy rates declined over time across all geographic regions, a steeper decline was observed in the South compared to other regions. Overall in-hospital mortality rates were 0.08%: 0.02% in patients younger than 65 years and 0.14% in patients 65 years and older. Inpatient parathyroidectomy rates for primary hyperparathyroidism have declined in recent years.

Abstract

In previous reports of the Frequent Hemodialysis Network trials, frequent hemodialysis (HD) reduced extracellular fluid (ECF) and left ventricular mass (LVM), with more pronounced effects observed among patients with low urine volume (UVol). We analyzed the effect of frequent HD on interdialytic weight gain (IDWG) and a time-integrated estimate of ECF load (TIFL). We also explored whether volume and sodium loading contributed to the change in LVM over the study period. Treatment effects on volume parameters were analyzed for modification by UVol and the dialysate-to-serum sodium gradient. Predictors of change in LVM were determined using linear regression. Frequent HD reduced IDWG and TIFL in the Daily Trial. Among patients with UVol <100 ml/day, reduction in TIFL was associated with LVM reduction. This suggests that achievement of better volume control could attenuate changes in LVM associated with mortality and cardiovascular morbidity. TIFL may prove more useful than IDWG alone in guiding HD practice.

Abstract

Previous economic evaluations of cinacalcet in patients with secondary hyperparathyroidism (sHPT) relied on the combination of surrogate end points in clinical trials and epidemiologic studies. The objective was to conduct an economic evaluation of cinacalcet on the basis of the EValuation Of Cinacalcet HCl Therapy to Lower CardioVascular Events (EVOLVE) trial from a US payer perspective. We developed a semi-Markov model to assess the cost-effectiveness of cinacalcet in addition to conventional therapy, compared with conventional therapy alone, in patients with moderate-to-severe sHPT receiving hemodialysis. We used treatment effect estimates from the unadjusted intent-to-treat (ITT) analysis and prespecified covariate-adjusted ITT analysis as our main analyses. We assessed model sensitivity to variations in individual inputs and overall decision uncertainty through probabilistic sensitivity analyses. The incremental cost-effectiveness ratio (ICER) for cinacalcet was $61,705 per life-year and $79,562 per quality-adjusted life-year (QALY) gained using the covariate-adjusted ITT analysis. Probabilistic sensitivity analysis suggested a 73.2% chance of the ICER being below a willingness-to-pay threshold of $100,000. Treatment effects from the unadjusted ITT analysis yielded an ICER of $115,876 per QALY. The model was most sensitive to the treatment effect on mortality. In the unadjusted ITT analysis, cinacalcet does not represent a cost-effective use of health care resources when applying a willingness-to-pay threshold of $100,000 per QALY. When using the covariate-adjusted ITT treatment effect, which represents the least biased estimate, however, cinacalcet is a cost-effective therapy for patients with moderate-to-severe sHPT on hemodialysis.
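The ICER figures above follow from simple arithmetic: incremental cost divided by incremental effect, compared against a willingness-to-pay threshold. A minimal sketch, using hypothetical costs and QALYs chosen only to illustrate the calculation (they are not the EVOLVE model's inputs):

```python
def icer(cost_tx, cost_ctrl, effect_tx, effect_ctrl):
    """Incremental cost-effectiveness ratio: extra cost per unit of extra effect."""
    return (cost_tx - cost_ctrl) / (effect_tx - effect_ctrl)

# Hypothetical lifetime cost and QALY totals per patient, for illustration only.
ratio = icer(cost_tx=250_000, cost_ctrl=170_000, effect_tx=4.0, effect_ctrl=3.0)
# ratio -> 80000.0 dollars per QALY gained; below a $100,000/QALY
# willingness-to-pay threshold, the treatment would be deemed cost-effective.
```

Because the denominator is an incremental effect, small changes in the mortality treatment effect move the ICER sharply, which is why the model was most sensitive to that input.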

Abstract

This study aims to determine whether menopausal symptoms differed between women with chronic kidney disease (CKD) and women without CKD, and whether CKD modified associations of late vasomotor symptoms (VMS) with mortality and/or cardiovascular events. CKD, defined as an estimated glomerular filtration rate lower than 60 mL/min/1.73 m² (using the Chronic Kidney Disease Epidemiology Collaboration equation), was determined in 17,891 postmenopausal women, aged 50 to 79 years at baseline, in the multiethnic Women's Health Initiative cohort. Primary outcomes were presence, severity, and timing/duration of VMS (self-reported hot flashes and night sweats) at baseline. We used polytomous logistic regression to test for associations among CKD and four VMS categories (no VMS; early VMS, present before menopause but not at study baseline; late VMS, present only at study baseline; persistent VMS, present before menopause and at study baseline) and Cox regression to determine whether CKD modified associations between late VMS and mortality or cardiovascular events. Women with CKD (1,017 of 17,891; mean estimated glomerular filtration rate, 50.7 mL/min/1.73 m²) were more likely to have had menopause before age 45 years (26% vs 23%, P = 0.02) but were less likely to experience VMS (38% vs 46%, P < 0.001) than women without CKD. Women with CKD were not more likely than women without CKD to experience late VMS. Late VMS (hazard ratio, 1.16; 95% CI, 1.04-1.29) and CKD (hazard ratio, 1.74; 95% CI, 1.54-1.97) were each independently associated with increased risk for mortality, but CKD did not modify the association of late VMS with mortality (Pinteraction = 0.53), coronary heart disease (Pinteraction = 0.12), or stroke (Pinteraction = 0.68). Women with mild CKD experience earlier menopause and fewer VMS than women without CKD.

Abstract

Few data are available regarding the long-term mortality rate for patients receiving nocturnal home hemodialysis. Posttrial observational study. Frequent Hemodialysis Network (FHN) Nocturnal Trial participants who consented to extended follow-up. The FHN Nocturnal Trial randomly assigned 87 individuals to 6-times-weekly home nocturnal hemodialysis or 3-times-weekly hemodialysis for 1 year. Patients were enrolled starting in March 2006 and follow-up was completed by May 2010. After the 1-year trial concluded, FHN Nocturnal participants were free to modify their hemodialysis prescription. We obtained dates of death and kidney transplantation through July 2011 using linkage to the US Renal Data System and queries of study centers. We used log-rank tests and Cox regression to relate mortality to the initial randomization assignment. Median follow-up for the trial and posttrial observational period was 3.7 years. In the nocturnal arm, there were 2 deaths during the 12-month trial period and an additional 12 deaths during the extended follow-up. In the conventional arm, the numbers of deaths were 1 and 4, respectively. In the nocturnal dialysis group, the overall mortality HR was 3.88 (95% CI, 1.27-11.79; P=0.01). Using as-treated analysis with a 12-month running treatment average, the HR for mortality was 3.06 (95% CI, 1.11-8.43; P=0.03). Six-month running treatment data analysis showed an HR of 1.12 (95% CI, 0.44-3.22; P=0.7). These results should be interpreted cautiously due to a surprisingly low (0.03 deaths/patient-year) mortality rate for individuals randomly assigned to conventional home hemodialysis, low statistical power for the mortality comparison due to the small sample size, and the high rate of hemodialysis prescription changes. Patients randomly assigned to nocturnal hemodialysis had a higher mortality rate than those randomly assigned to conventional dialysis. The implications of this result require further investigation.

Abstract

Medicare reimbursement policy encourages frequent provider visits for patients with ESRD undergoing hemodialysis. We hypothesized that patients seen more frequently by their nephrologist or advanced practitioner within the first 90 days of hemodialysis are more likely to undergo surgery to create an arteriovenous (AV) fistula or place an AV graft. We selected 35,959 patients aged ≥67 years starting hemodialysis in the United States from a national registry. We used multivariable regression to evaluate the associations between mean visit frequency and AV fistula creation or graft placement in the first 90 days of hemodialysis. We conducted an instrumental variable analysis to test the sensitivity of our findings to potential bias from unobserved characteristics. One additional visit per month in the first 90 days of hemodialysis was associated with a 21% increase in the odds of AV fistula creation or graft placement during that period (95% confidence interval, 19% to 24%), corresponding to an average 4.5% increase in absolute probability. An instrumental variable analysis demonstrated similar findings. Excluding visits in months when patients were hospitalized, one additional visit per month was associated with a 10% increase in odds of vascular access surgery (95% confidence interval, 8% to 13%). In conclusion, patients seen more frequently by care providers in the first 90 days of hemodialysis undergo earlier AV fistula creation or graft placement. Payment policies that encourage more frequent visits to patients at key clinical time points may yield more favorable health outcomes than policies that operate irrespective of patients' health status.
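The abstract distinguishes a 21% increase in odds from a 4.5% increase in absolute probability; the translation between the two depends on the baseline probability. A minimal sketch of the mechanics, using a hypothetical baseline rather than the study's actual base rate:

```python
def prob_after_or(p0, odds_ratio):
    """Apply an odds ratio to a baseline probability; return the new probability."""
    odds0 = p0 / (1 - p0)          # probability -> odds
    odds1 = odds_ratio * odds0     # multiply odds by the odds ratio
    return odds1 / (1 + odds1)     # odds -> probability

# Hypothetical 20% baseline probability of access surgery, for illustration only.
p0 = 0.20
p1 = prob_after_or(p0, 1.21)       # a 21% increase in odds
delta = p1 - p0                    # about 0.032: far smaller than 0.21,
                                   # because odds and probability scale differently
```

This is why reporting both the odds ratio and the corresponding absolute probability change, as the abstract does, is more informative than either alone.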

Abstract

Patients with kidney disease have disordered bone and mineral metabolism, including elevated serum concentrations of fibroblast growth factor-23 (FGF23). These elevated concentrations are associated with cardiovascular and all-cause mortality. The objective was to determine the effects of the calcimimetic cinacalcet (versus placebo) on reducing serum FGF23 and whether changes in FGF23 are associated with death and cardiovascular events. This was a secondary analysis of a randomized clinical trial comparing cinacalcet to placebo in addition to conventional therapy (phosphate binders/vitamin D) in patients receiving hemodialysis with secondary hyperparathyroidism (intact parathyroid hormone ≥300 pg/mL). The primary study end point was time to death or a first nonfatal cardiovascular event (myocardial infarction, hospitalization for angina, heart failure, or a peripheral vascular event). This analysis included 2985 patients (77% of randomized) with serum samples at baseline and 2602 patients (67%) with samples at both baseline and week 20. The results demonstrated that a significantly larger proportion of patients randomized to cinacalcet had ≥30% (68% versus 28%) reductions in FGF23. Among patients randomized to cinacalcet, a ≥30% reduction in FGF23 between baseline and week 20 was associated with a nominally significant reduction in the primary composite end point (relative hazard, 0.82; 95% confidence interval, 0.69-0.98), cardiovascular mortality (relative hazard, 0.66; 95% confidence interval, 0.50-0.87), sudden cardiac death (relative hazard, 0.57; 95% confidence interval, 0.37-0.86), and heart failure (relative hazard, 0.69; 95% confidence interval, 0.48-0.99). Treatment with cinacalcet significantly lowers serum FGF23. Treatment-induced reductions in serum FGF23 are associated with lower rates of cardiovascular death and major cardiovascular events. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00345839.

Abstract

India is experiencing an alarming rise in the burden of noncommunicable diseases, but data on the incidence of chronic kidney disease (CKD) are sparse. Using the Center for Cardiometabolic Risk Reduction in South Asia surveillance study (a population-based survey of Delhi and Chennai, India), we estimated overall and age-, sex-, city-, and diabetes-specific prevalence of CKD, and defined the distribution of the study population by the Kidney Disease: Improving Global Outcomes (KDIGO) classification scheme. The likelihood of cardiovascular events in participants with and without CKD was estimated by the Framingham and Interheart Modifiable Risk Scores. Of the 12,271 participants, 80% had complete data on serum creatinine and albuminuria. The prevalence of CKD and albuminuria, age standardized to the World Bank 2010 world population, was 8.7% (95% confidence interval: 7.9-9.4%) and 7.1% (6.4-7.7%), respectively. Nearly 80% of patients with CKD had an abnormally high hemoglobin A1c (5.7% and above). Based on KDIGO guidelines, 6.0%, 1.0%, and 0.5% of study participants were at moderate, high, and very high risk, respectively, for experiencing CKD-associated adverse outcomes. The cardiovascular risk scores placed a greater proportion of patients with CKD in the high-risk categories for experiencing cardiovascular events when compared with participants without CKD. Thus, 1 in 12 individuals living in two of India's largest cities has evidence of CKD, with features that put them at high risk for adverse outcomes.

Abstract

Trimethylamine N-oxide (TMAO) is a product of metabolism of phosphatidylcholine (lecithin) and carnitine by the intestinal microbiome. Elevated serum concentrations of TMAO have been linked to adverse cardiovascular outcomes in the general population. We examined correlates of serum TMAO and the relations among serum TMAO concentrations, all-cause mortality, and cardiovascular mortality and hospitalizations in a nationally derived cohort of patients new to hemodialysis (HD). We quantified serum TMAO by liquid chromatography and online tandem mass spectrometry and assessed nutritional and cardiovascular risk factors in 235 patients receiving HD and measured TMAO in pooled serum from healthy controls. We analyzed time to death and time to cardiovascular death or hospitalization using Cox proportional hazards regression. Serum TMAO concentrations of patients undergoing HD (median, 43 μM/L; 25th-75th percentile, 28-67 μM/L) were elevated compared with those with normal or near-normal kidney function (1.41 ± 0.49 μM/L). TMAO was directly correlated with serum albumin (Spearman rank correlation, 0.24; 95% CI, 0.12-0.35; P

Abstract

The capacity of risk prediction to guide management of CKD in underserved health settings is unknown. We conducted a retrospective cohort study of 28,779 adults with nondialysis-requiring CKD who received health care in two large safety net health systems during 1996-2009 and were followed for ESRD through September of 2011. We developed and evaluated the performance of ESRD risk prediction models using recently proposed criteria designed to inform population health approaches to disease management: proportion of cases followed and proportion that needs to be followed. Overall, 1730 persons progressed to ESRD during follow-up (median follow-up=6.6 years). ESRD risk for time frames up to 5 years was highly concentrated among relatively few individuals. A predictive model using five common variables (age, sex, race, eGFR, and dipstick proteinuria) performed similarly to more complex models incorporating extensive sociodemographic and clinical data. Using this model, 80% of individuals who eventually developed ESRD were among the 5% of cohort members at the highest estimated risk for ESRD at 1 year. Similarly, a program that followed 8% and 13% of individuals at the highest ESRD risk would have included 80% of those who eventually progressed to ESRD at 3 and 5 years, respectively. In this underserved health setting, a simple five-variable model accurately predicts most cases of ESRD that develop within 5 years. Applying risk prediction using a population health approach may improve CKD surveillance and management of vulnerable groups by directing resources to a small subpopulation at highest risk for progressing to ESRD.
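The "proportion of cases followed" criterion used above is straightforward to compute from predicted risks and observed outcomes: rank the cohort by predicted risk, follow the top fraction, and count what share of eventual cases falls inside that group. A sketch on a toy cohort (the risks and outcomes below are invented for illustration, not study data):

```python
def proportion_of_cases_followed(risks, outcomes, follow_fraction):
    """Fraction of eventual cases captured by following the highest-risk fraction.

    risks:    predicted risk per person
    outcomes: 1 if the person eventually developed the outcome (e.g., ESRD), else 0
    """
    n_follow = max(1, int(round(follow_fraction * len(risks))))
    ranked = sorted(zip(risks, outcomes), key=lambda rc: rc[0], reverse=True)
    followed = ranked[:n_follow]
    cases_followed = sum(case for _, case in followed)
    return cases_followed / sum(outcomes)

# Toy cohort of 10 people with 2 eventual ESRD cases, both at high predicted risk.
risks    = [0.90, 0.80, 0.10, 0.05, 0.05, 0.04, 0.03, 0.02, 0.02, 0.01]
outcomes = [1,    1,    0,    0,    0,    0,    0,    0,    0,    0   ]
# Following only the top 20% of the cohort captures 100% of the cases here,
# mirroring the paper's finding that risk is concentrated in a small subgroup.
```

When risk is highly concentrated, as in the cohort described above, a small follow-up fraction captures most eventual cases, which is the rationale for the population-health targeting strategy.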

Abstract

Fractures are frequent in patients receiving hemodialysis. We tested the hypothesis that cinacalcet would reduce the rate of clinical fractures in patients receiving hemodialysis using data from the Evaluation of Cinacalcet HCl Therapy to Lower Cardiovascular Events trial, a placebo-controlled trial that randomized 3883 hemodialysis patients with secondary hyperparathyroidism to receive cinacalcet or placebo for ≤64 months. This study was a prespecified secondary analysis of the trial, whose primary end point was all-cause mortality and non-fatal cardiovascular events; one of the secondary end points was first clinical fracture event. Clinical fractures were observed in 255 of 1935 (13.2%) patients randomized to placebo and 238 of 1948 (12.2%) patients randomized to cinacalcet. In an unadjusted intention-to-treat analysis, the relative hazard for fracture (cinacalcet versus placebo) was 0.89 (95% confidence interval [95% CI], 0.75 to 1.07). After adjustment for baseline characteristics and multiple fractures, the relative hazard was 0.83 (95% CI, 0.72 to 0.98). Using a prespecified lag-censoring analysis (a measure of actual drug exposure), the relative hazard for fracture was 0.72 (95% CI, 0.58 to 0.90). When participants were censored at the time of cointerventions (parathyroidectomy, transplant, or provision of commercial cinacalcet), the relative hazard was 0.71 (95% CI, 0.58 to 0.87). Fracture rates were higher in older compared with younger patients, and the effect of cinacalcet appeared more pronounced in older patients. In conclusion, using an unadjusted intention-to-treat analysis, cinacalcet did not reduce the rate of clinical fracture. However, when accounting for differences in baseline characteristics, multiple fractures, and/or events prompting discontinuation of study drug, cinacalcet reduced the rate of clinical fracture by 16%-29%.

Abstract

The calcimimetic cinacalcet reduced the risk of death or cardiovascular (CV) events in older, but not younger, patients with moderate to severe secondary hyperparathyroidism (HPT) who were receiving hemodialysis. To determine whether the lower risk in younger patients might be due to lower baseline CV risk and more frequent use of cointerventions that reduce parathyroid hormone (kidney transplantation, parathyroidectomy, and commercial cinacalcet use), this study examined the effects of cinacalcet in older (≥65 years, n=1005) and younger (<65 years, n=2878) patients. Evaluation of Cinacalcet HCl Therapy to Lower Cardiovascular Events (EVOLVE) was a global, multicenter, randomized placebo-controlled trial in 3883 prevalent patients on hemodialysis, whose outcomes included death, major CV events, and development of severe unremitting HPT. The age subgroup analysis was prespecified. Older patients had higher baseline prevalence of diabetes mellitus and CV comorbidity. Annualized rates of kidney transplantation and parathyroidectomy were >3-fold higher in younger relative to older patients and were more frequent in patients randomized to placebo. In older patients, the adjusted relative hazard (95% confidence interval) for the primary composite (CV) end point (cinacalcet versus placebo) was 0.70 (0.60 to 0.81); in younger patients, the relative hazard was 0.97 (0.86 to 1.09). Corresponding adjusted relative hazards for mortality were 0.68 (0.51 to 0.81) and 0.99 (0.86 to 1.13). Reduction in the risk of severe unremitting HPT was similar in both groups. In the EVOLVE trial, cinacalcet decreased the risk of death and of major CV events in older, but not younger, patients with moderate to severe HPT who were receiving hemodialysis. Effect modification by age may be partly explained by differences in underlying CV risk and differential application of cointerventions that reduce parathyroid hormone.

Abstract

Intention-to-treat (ITT) analysis is widely used to establish efficacy in randomized clinical trials. However, in a long-term outcomes study where non-adherence to study drug is substantial, the on-treatment effect of the study drug may be underestimated using the ITT analysis. The analyses presented herein are from the EVOLVE trial, a double-blind, placebo-controlled, event-driven cardiovascular outcomes study conducted to assess whether a treatment regimen including cinacalcet compared with placebo in addition to other conventional therapies reduces the risk of mortality and major cardiovascular events in patients receiving hemodialysis with secondary hyperparathyroidism. Pre-specified sensitivity analyses were performed to assess the impact of non-adherence on the estimated effect of cinacalcet. These analyses included lag-censoring, inverse probability of censoring weights (IPCW), rank preserving structural failure time model (RPSFTM) and iterative parameter estimation (IPE). The relative hazard (cinacalcet versus placebo) of mortality and major cardiovascular events was 0.93 (95% confidence interval 0.85, 1.02) using the ITT analysis; 0.85 (0.76, 0.95) using lag-censoring analysis; 0.81 (0.70, 0.92) using IPCW; 0.85 (0.66, 1.04) using RPSFTM and 0.85 (0.75, 0.96) using IPE. These analyses, while not providing definitive evidence, suggest that the intervention may have an effect while subjects are receiving treatment. The ITT method remains the established method to evaluate efficacy of a new treatment; however, additional analyses should be considered to assess the on-treatment effect when substantial non-adherence to study drug is expected or observed.

Abstract

Uncontrolled secondary hyperparathyroidism (sHPT) in patients with ESRD is a risk factor for calcific uremic arteriolopathy (CUA; calciphylaxis). Adverse event reports collected during the Evaluation of Cinacalcet HCl Therapy to Lower Cardiovascular Events trial were used to determine the frequency of CUA in patients receiving hemodialysis who had moderate to severe sHPT, as well as the effects of cinacalcet versus placebo. CUA events were collected while patients were receiving the study drug. Among the 3861 trial patients who received at least one dose of the study drug, 18 patients randomly assigned to placebo and six assigned to cinacalcet developed CUA (unadjusted relative hazard, 0.31; 95% confidence interval [95% CI], 0.13 to 0.79; P=0.014). Corresponding cumulative event rates (95% CI) at year 4 were 0.011% (0.006% to 0.018%) and 0.005% (0.002% to 0.010%). By multivariable analysis, other factors associated with CUA included female sex, higher body mass index, higher diastolic BP, and history of dyslipidemia or parathyroidectomy. Median (10%, 90% percentile) plasma parathyroid hormone concentrations proximal to the report of CUA were 796 (225, 2093) pg/ml and 410 (71, 4957) pg/ml in patients randomly assigned to placebo and cinacalcet, respectively. Active use of vitamin K antagonists was recorded in 11 of 24 patients with CUA (nine randomly assigned to placebo and two to cinacalcet), in contrast to 5%-7% at any one time point in patients in whom CUA was not reported. Cinacalcet appeared to reduce the incidence of CUA in hemodialysis recipients who have moderate to severe sHPT.

Abstract

Infection is a common cause of hospitalization in adults receiving hemodialysis. Limited data are available about downstream events resulting from or following these hospitalizations. Retrospective cohort study using the US Renal Data System. Medicare beneficiaries initiating in-center hemodialysis therapy in 2005 to 2008. Demographics, dual Medicare/Medicaid eligibility, body mass index, comorbid conditions, initial vascular access type, nephrology care prior to dialysis therapy initiation, residence in a care facility, tobacco use, biochemical measures, and type of infection. 30-day hospital readmission or death following first infection-related hospitalization. 60,270 Medicare beneficiaries had at least one hospitalization for infection. Of those who survived the initial hospitalization, 15,113 (27%) were readmitted and survived the 30 days following hospital discharge, 1,624 (3%) were readmitted to the hospital and then died within 30 days of discharge, and 2,425 (4%) died without hospital readmission. Complications related to dialysis access, sepsis, and heart failure accounted for 12%, 9%, and 7% of hospital readmissions, respectively. Factors associated with higher odds of 30-day readmission or death without readmission included non-Hispanic ethnicity, lower serum albumin level, inability to ambulate or transfer, limited nephrology care prior to dialysis therapy, and specific types of infection. In comparison, older age, select comorbid conditions, and institutionalization had stronger associations with death without readmission than with readmission. Findings limited to Medicare beneficiaries receiving in-center hemodialysis. Hospitalizations for infection among patients receiving in-center hemodialysis are associated with exceptionally high rates of 30-day hospital readmission and death without readmission.

Abstract

Previous reports of the longitudinal association between achieved blood pressure (BP) and end-stage renal disease (ESRD) among patients with chronic kidney disease (CKD) have not incorporated time-updated BP with appropriate covariate adjustment. To assess the association between baseline and time-updated systolic blood pressure (SBP) with CKD progression. Observational, prospective cohort study (ClinicalTrials.gov: NCT00304148). 7 U.S. clinical centers. Patients in the Chronic Renal Insufficiency Cohort Study (n = 3708) followed for a median of 5.7 years (25th to 75th percentile, 4.6 to 6.7 years). The mean of 3 seated SBP measurements made up the visit-specific SBP. Time-updated SBP was the mean of that and all previous visits. Outcomes were ESRD and the composite end point of ESRD or halving of the estimated glomerular filtration rate. Analyses investigating baseline and time-updated SBP used Cox proportional hazards models and marginal structural models, respectively. Systolic blood pressure was 130 mm Hg or greater at all visits in 19.2% of patients. The hazard ratio for ESRD among patients with SBP of 130 to 139 mm Hg, compared with SBP less than 120 mm Hg, was 1.46 (95% CI, 1.13 to 1.88) using only baseline data and 2.37 (CI, 1.48 to 3.80) using time-updated data. Among patients with SBP of 140 mm Hg or greater, corresponding hazard ratios were 1.46 (CI, 1.18 to 1.88) and 3.37 (CI, 2.26 to 5.03) for models using only baseline data and those using time-updated data, respectively. Blood pressure was measured once annually, and the cohort was not a random sample. Time-updated SBP greater than 130 mm Hg was more strongly associated with CKD progression than analyses based on baseline SBP. National Institute of Diabetes and Digestive and Kidney Diseases.
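The time-updated SBP defined above is a cumulative mean: at each visit, the average of the current visit-specific SBP and all earlier ones. A minimal sketch of that bookkeeping, with hypothetical visit values:

```python
def time_updated_sbp(visit_sbps):
    """Cumulative mean SBP: at each visit, the mean of that and all prior visits."""
    updated, total = [], 0.0
    for i, sbp in enumerate(visit_sbps, start=1):
        total += sbp
        updated.append(total / i)
    return updated

# Hypothetical visit-specific SBPs (each itself the mean of 3 seated readings,
# per the study design): the time-updated series smooths visit-to-visit swings.
visits = [150.0, 140.0, 130.0]
# time_updated_sbp(visits) -> [150.0, 145.0, 140.0]
```

Because the cumulative mean reflects sustained exposure rather than a single reading, it is the quantity fed into the marginal structural models in the time-updated analyses.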

Abstract

Medicare reimbursement policy encourages frequent provider visits to patients with ESRD undergoing hemodialysis. This study sought to determine whether more frequent face-to-face provider (physician and advanced practitioner) visits lead to more procedures and therapeutic interventions aimed at preserving arteriovenous fistulas and grafts, improved vascular access outcomes, and fewer related hospitalizations. Multivariable regression was used to evaluate the association between provider visit frequency and interventions aimed at preserving vascular access, vascular access survival, hospitalization for vascular access infection, and outpatient antibiotic use in a cohort of 63,488 Medicare beneficiaries receiving hemodialysis in the United States. Medicare claims were used to identify the type of vascular access used, access-related events, and vascular access failure. One additional provider visit per month was associated with a 13% higher odds of receiving an intervention to preserve vascular access (95% confidence interval [95% CI], 12% to 14%) but was not associated with vascular access survival (hazard ratio, 1.01; 95% CI, 0.99 to 1.03). One additional provider visit was associated with a 9% (95% CI, 5% to 14%) lower odds of hospitalization for vascular access infection and a corresponding 9% (95% CI, 5% to 14%) higher odds of outpatient intravenous antibiotic administration. However, the associated changes in absolute probabilities of hospitalization and antibiotic administration were small. More frequent face-to-face provider visits were associated with more procedures and therapeutic interventions aimed at preserving vascular access, but not with prolonged vascular access survival, and with only a small decrease in hospitalization for vascular access infection.

Abstract

Most patients with diabetic kidney disease (DKD) experience disease progression despite receiving standard care therapy. Oxidative stress is associated with DKD severity and risk of progression, but currently approved therapies do not directly attenuate the pathologic consequences of oxidative stress. GS-4997 is a once-daily, oral molecule that inhibits Apoptosis Signal-regulating Kinase 1 (ASK1), a key mediator of the deleterious effects of oxidative stress. We describe the rationale and design of a Phase 2 placebo-controlled clinical trial investigating the effects of GS-4997 in patients with type 2 diabetes mellitus (T2DM) and stage 3/4 DKD receiving standard of care therapy. Approximately 300 subjects will be randomized in a stratified manner, based on estimated glomerular filtration rate (eGFR) and urine albumin-to-creatinine ratio, to one of four arms in this dose-ranging study. The primary endpoint is change in eGFR at 48 weeks, and the key secondary endpoint is change in albuminuria. Guided by the biology of oxidative stress signaling through ASK1, the biology of DKD pathogenesis, and solid statistical methods, the decisions made for this Phase 2 study regarding the study population, efficacy outcomes, treatment period, and statistical methods represent innovative attempts to resolve challenges specific to DKD study design.

Abstract

To demonstrate how expanding services covered by a "bundled payment" can also expand variation in the costs of treating patients under the bundle, using the Medicare dialysis program as an example. Observational claims-based study of 197,332 Medicare hemodialysis beneficiaries enrolled for at least one quarter during 2006-2008. We estimated how resource utilization (all health services, dialysis-related services, and medications) changes with intensity of secondary hyperparathyroidism (sHPT) treatment. Using Medicare claims, a patient-quarter level dataset was constructed, including a measure of sHPT treatment intensity. Under the existing, narrow dialysis bundle, utilization of covered services is relatively constant across treatment intensity groups; under a broader bundle, it rises more rapidly with treatment intensity. The broader Medicare dialysis bundle reimburses providers uniformly, even though patients treated more intensively for sHPT cost more to treat. Absent any payment adjustments or efforts to ensure quality, this flat payment schedule may encourage providers to avoid high-intensity patients or reduce their treatment intensity. The first incentive harms efficiency. The second may improve or worsen efficiency, depending on whether it reduces appropriate or inappropriate treatment.

Abstract

A phase 3 randomized clinical trial was designed to test whether bardoxolone methyl, a nuclear factor erythroid-2-related factor 2 (Nrf2) activator, slows progression to end-stage renal disease in patients with stage 4 chronic kidney disease and type 2 diabetes mellitus. The trial was terminated because of an increase in heart failure in the bardoxolone methyl group; many of the events were clinically associated with fluid retention. We randomized 2,185 patients with type 2 diabetes mellitus (T2DM) and stage 4 chronic kidney disease (CKD) (estimated glomerular filtration rate 15 to <30 mL/min/1.73 m²) to once-daily bardoxolone methyl (20 mg) or placebo. We used classification and regression tree analysis to identify baseline factors predictive of heart failure or fluid overload events. Elevated baseline B-type natriuretic peptide and previous hospitalization for heart failure were identified as predictors of heart failure events; bardoxolone methyl increased the risk of heart failure by 60% in patients with these risk factors. For patients without these baseline characteristics, the risk for heart failure events among bardoxolone methyl- and placebo-treated patients was similar (2%). The same risk factors were also identified as predictors of fluid overload and appeared to be related to other serious adverse events. Bardoxolone methyl contributed to events related to heart failure and/or fluid overload in a subpopulation of susceptible patients with an increased risk for heart failure at baseline. Careful selection of participants and vigilant monitoring of the study drug will be required in any future trials of bardoxolone methyl to mitigate the risk of heart failure and other serious adverse events.

Abstract

Premature cardiovascular disease limits the duration and quality of life on long-term hemodialysis. The objective of this study was to define the frequency of fatal and nonfatal cardiovascular events attributable to atherosclerotic and nonatherosclerotic mechanisms, risk factors for these events, and the effects of cinacalcet, using adjudicated data collected during the EValuation of Cinacalcet HCl Therapy to Lower CardioVascular Events (EVOLVE) Trial. EVOLVE was a randomized, double-blind, placebo-controlled clinical trial that randomized 3883 hemodialysis patients with moderate to severe secondary hyperparathyroidism to cinacalcet or matched placebo for up to 64 months. For this post hoc analysis, the outcome measure was fatal and nonfatal cardiovascular events reflecting atherosclerotic and nonatherosclerotic cardiovascular diseases. During the trial, 1518 patients experienced an adjudicated cardiovascular event, including 958 attributable to nonatherosclerotic disease. Of 1421 deaths during the trial, 768 (54%) were due to cardiovascular disease. Sudden death was the most frequent fatal cardiovascular event, accounting for 24.5% of overall mortality. Combining fatal and nonfatal cardiovascular events, randomization to cinacalcet reduced the rates of sudden death and heart failure. Patients randomized to cinacalcet experienced fewer nonatherosclerotic cardiovascular events (adjusted relative hazard 0.84, 95% CI 0.74 to 0.96), while the effect of cinacalcet on atherosclerotic events did not reach statistical significance. Accepting the limitations of post hoc analysis, any benefits of cinacalcet on cardiovascular disease in the context of hemodialysis may result from attenuation of nonatherosclerotic processes. Unique identifier: NCT00345839; URL: ClinicalTrials.gov.

Abstract

Coronary stenting in patients on dialysis has increased by nearly 50% over the past decade, despite heightened risks of associated stent thrombosis and bleeding relative to the general population. We examined clopidogrel, prasugrel, or ticlopidine use after percutaneous coronary intervention (PCI) with stenting in patients on dialysis. We conducted 3-, 6-, and 12-month landmark analyses to test the hypothesis that thienopyridine discontinuation prior to those time points would be associated with higher risks of death, myocardial infarction, or repeat revascularization, and a lower risk of major bleeding episodes, compared with continued thienopyridine use. Using the US Renal Data System, we identified 8458 patients on dialysis with Medicare Parts A+B+D undergoing PCI with stenting between July 2007 and December 2010. Ninety-nine percent of all thienopyridine prescriptions were for clopidogrel. At 3 months, 82% of patients who received drug-eluting stents (DES) had evidence of thienopyridine use. These proportions fell to 62% and 40% at 6 and 12 months, respectively. In patients who received a bare-metal stent (BMS), 70%, 34%, and 26% of patients had evidence of thienopyridine use at 3, 6, and 12 months, respectively. In patients who received a DES, there was a suggestion of higher risks of death or myocardial infarction associated with thienopyridine discontinuation in the 3-, 6-, and 12-month landmark analyses, but no higher risk of major bleeding episodes. In patients who received a BMS, there were no differences in death or cardiovascular events, and possibly a lower risk of major bleeding with thienopyridine discontinuation in the 3- and 6-month landmark analyses. The majority of patients on dialysis who undergo PCI discontinue thienopyridines before 1 year regardless of stent type. While not definitive, these data suggest that longer-term thienopyridine use may be of benefit to patients on dialysis who undergo PCI with DES.

Abstract

A well-accepted definition of frailty includes measurements of physical performance, which may limit its clinical utility. In a cross-sectional study of prevalent adult patients receiving hemodialysis in 14 centers around San Francisco and Atlanta in 2009-2011, we compared prevalence and patient characteristics under a frailty definition that uses self-reported function with those under the classic performance-based definition, and we developed a modified self-report-based definition. In the self-report-based definition, a score lower than 75 on the Physical Function scale of the 36-Item Short Form Health Survey (SF-36) was substituted for gait speed and grip strength in the classic definition; a modified self-report definition with optimized Physical Function score cutoff points was derived in a development (one-half) cohort and validated in the other half. The reference standard was performance-based frailty, defined as at least three of the following: weight loss, weakness, exhaustion, low physical activity, and slow gait speed. In total, 387 (53%) patients were frail based on self-reported function, of whom 209 (29% of the cohort) met the performance-based definition. Only 23 (3%) met the performance-based definition of frailty only. The self-report definition had 90% sensitivity, 64% specificity, 54% positive predictive value, 93% negative predictive value, and 72.5% overall accuracy. Intracellular water per kilogram of body weight and serum albumin, prealbumin, and creatinine levels were highest among nonfrail individuals, intermediate among those who were frail by self-report, and lowest among those who also were frail by performance. Age, percentage of body fat, and C-reactive protein level followed an opposite pattern.
The modified self-report definition had better accuracy (84%; 95% CI, 79%-89%) and superior specificity (88%) and positive predictive value (67%). Our study did not address prediction of outcomes. Patients who meet the self-report-based but not the performance-based definition of frailty may represent an intermediate phenotype. A modified self-report definition can improve the accuracy of a questionnaire-based method of defining frailty.
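The reported test characteristics can be reproduced from the counts in the abstract. A minimal sketch, assuming a total cohort of about 730 (inferred from 387 patients being 53% of the cohort; this total is not stated explicitly in the abstract):

```python
# 2x2 comparison of the self-report frailty definition against the
# performance-based reference, using counts reported in the abstract.
# The cohort total of 730 is an inference (387 = 53% of ~730).
tp = 209                  # frail by both self-report and performance
fp = 387 - 209            # frail by self-report only
fn = 23                   # frail by performance only
tn = 730 - tp - fp - fn   # frail by neither

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
accuracy = (tp + tn) / (tp + fp + fn + tn)

print(f"sensitivity={sensitivity:.0%} specificity={specificity:.0%} "
      f"PPV={ppv:.0%} NPV={npv:.0%} accuracy={accuracy:.1%}")
```

Run as written, this recovers the abstract's 90% sensitivity, 64% specificity, 54% PPV, 93% NPV, and 72.5% accuracy.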

Abstract

A focus of health care reform has been on reducing 30-day hospital readmissions. Patients with ESRD are at high risk for hospital readmission. It is unknown whether more monitoring by outpatient providers can reduce hospital readmissions in patients receiving hemodialysis. In nationally representative cohorts of patients in the United States receiving in-center hemodialysis between 2004 and 2009, we used a quasi-experimental (instrumental variable) approach to assess the relationship between the frequency of provider visits to patients receiving hemodialysis following hospital discharge and the probability of rehospitalization. We then used a multivariable regression model and published hospitalization data to estimate the cost savings and number of hospitalizations that could be prevented annually with additional provider visits to patients in the month following hospitalization. In the main cohort (n=26,613), one additional provider visit in the month following hospital discharge was estimated to reduce the absolute probability of 30-day hospital readmission by 3.5% (95% confidence interval, 1.6% to 5.3%). The reduction in 30-day hospital readmission ranged from 0.5% to 4.9% in an additional four cohorts tested, depending on population density around facilities, facility profit status, and patient Medicaid eligibility. At current Medicare reimbursement rates, the effort to visit patients one additional time in the month following hospital discharge could lead to 31,370 fewer hospitalizations and save $240 million per year. In conclusion, more frequent physician visits following hospital discharge are estimated to reduce rehospitalizations in patients undergoing hemodialysis. Incentives for closer outpatient monitoring following hospital discharge could lead to substantial cost savings.

Abstract

Accurate prognostic models could inform treatment decisions for older adults with end-stage renal disease who are considering dialysis and might identify patients more appropriate for conservative care or hospice. In a cohort of patients aged ≥67 years commencing dialysis in the United States between January 1, 2008, and June 30, 2009, we compared the discrimination of three existing instruments (the Liu index, the French Renal Epidemiology and Information Network score, and hospice eligibility criteria) for the prediction of 6-month mortality. We estimated the odds of death associated with each prognostic index using logistic regression with and without adjustment for age. Predictive indices were compared using the concordance ("c") statistic. Of 44,109 eligible patients, 10,289 (23.3%) died within 6 months of dialysis initiation. The c-statistics for the Liu, Renal Epidemiology and Information Network, hospice eligibility criteria, and combined Liu/hospice eligibility criteria scores without and with age were 0.62/0.65, 0.63/0.66, 0.65/0.68, and 0.68/0.70, respectively. Discrimination was poorer at older ages, especially for the Liu and Renal Epidemiology and Information Network scores. Although sensitivity was poor, a Renal Epidemiology and Information Network score ≥9 or a hospice eligibility criteria score ≥3 had relatively high specificity. Existing prognostic indices based on administrative data perform poorly with respect to prediction of 6-month mortality in older patients with end-stage renal disease commencing dialysis.
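The c-statistic used above is the probability that, of a randomly chosen decedent/survivor pair, the decedent has the higher prognostic score. A minimal illustrative implementation (not the study's code; the toy data below are invented):

```python
def c_statistic(scores, died):
    """Concordance (c-) statistic: the fraction of decedent/survivor
    pairs in which the decedent has the higher prognostic score
    (ties count as one half of a concordant pair)."""
    dead = [s for s, d in zip(scores, died) if d]
    alive = [s for s, d in zip(scores, died) if not d]
    concordant = sum((d > a) + 0.5 * (d == a) for d in dead for a in alive)
    return concordant / (len(dead) * len(alive))

# Toy example: higher scores perfectly separate deaths from survivors,
# so the c-statistic is 1.0; the indices in the study achieved 0.62-0.70.
scores = [2, 5, 3, 8, 7, 1]
died   = [0, 1, 0, 1, 1, 0]
print(c_statistic(scores, died))
```

A c-statistic of 0.5 corresponds to chance discrimination, which is why values of 0.62-0.70 are considered poor to modest.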

Abstract

To identify homeless people with chronic kidney disease (CKD) who were at highest risk for end-stage renal disease (ESRD), we studied 982 homeless and 15,674 domiciled people with CKD receiving public health care. We developed four risk prediction models for the primary outcome of ESRD. Overall, 71 homeless and 888 domiciled people progressed to ESRD during follow-up (median: 6.6 years). Homeless people with CKD experienced significantly higher incidence rates of ESRD than their poor but domiciled peers. Most homeless people who developed progressive CKD were readily identifiable well before ESRD using a prediction model with five common variables. We estimated that a program following homeless people in the highest decile of ESRD risk would have captured 64%-85% of those who eventually progressed to ESRD within five years. Thus, an approach targeting homeless people at high risk for ESRD appears feasible and could reduce the substantial morbidity and costs incurred by this highly vulnerable group.

Abstract

Delivered dialysis dose in continuous renal replacement therapy (CRRT) depends on circuit efficacy, which is influenced in part by the anticoagulation strategy. We evaluated the association of the anticoagulation strategy used with solute clearance efficacy, circuit longevity, bleeding complications, and mortality. We analyzed data from 1740 sessions of 24 h in length among 244 critically ill patients with at least 48 h on CRRT. Regional citrate, heparin, or saline flushes were variably used to prevent or attenuate filter clotting. We calculated delivered dose using the standardized Kt/Vurea. We monitored filter efficacy by calculating effluent urea nitrogen/blood urea nitrogen ratios. Filter longevity was significantly higher with citrate (median 48, interquartile range [IQR] 20.3-75.0 hours) than with heparin (5.9, IQR 8.5-27.0 hours) or no anticoagulation (17.5, IQR 9.5-32 hours; P < 0.0001). Delivered dose was highest in treatments where citrate was employed. Bleeding complications were similar across the three groups (P = 0.25). Compared with no anticoagulation, the odds of death were higher with heparin use (odds ratio [OR] 1.82, 95% confidence interval [CI] 1.02-3.32; P = 0.033), but not with citrate (OR 1.02, 95% CI 0.54-1.96; P = 0.53). Relative to heparin or no anticoagulation, the use of regional citrate for anticoagulation in CRRT was associated with significantly prolonged filter life and increased filter efficacy with respect to delivered dialysis dose. Rates of bleeding complications, transfusions, and mortality were similar across the three groups. While these and other data suggest that citrate anticoagulation may offer superior technical performance to heparin or no anticoagulation, adequately powered clinical trials comparing alternative anticoagulation strategies should be performed to evaluate overall safety and efficacy.
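The filter-efficacy metric named above is simple to compute: the effluent urea nitrogen/blood urea nitrogen ratio approaches 1.0 when the effluent is fully saturated and falls as the membrane clots. A minimal sketch with invented measurements (the values below are illustrative, not the study's data):

```python
def filter_saturation(effluent_urea_n, blood_urea_n):
    """Effluent urea nitrogen / blood urea nitrogen ratio.
    ~1.0 means the effluent is fully saturated (filter performing well);
    a falling ratio signals clotting and loss of membrane efficacy."""
    return effluent_urea_n / blood_urea_n

# Hypothetical paired measurements (mg/dL) during a CRRT session:
print(filter_saturation(38.0, 40.0))  # near-complete saturation
print(filter_saturation(24.0, 40.0))  # degraded filter performance
```

This is why, in the study, longer filter life with citrate translated into a higher delivered dose: clearance per liter of effluent stays close to its maximum only while the filter remains patent.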

Abstract

To examine the structure, processes, and outcomes of American dialysis facilities that predominantly treat racial-ethnic minority patients, we performed a secondary analysis of data from all patients who initiated dialysis during 2005-2008 in the United States. In this retrospective cohort study, we examined the associations of the racial-ethnic composition of the dialysis facility with facility-level survival and achievement of performance targets for anemia and dialysis adequacy. We obtained dialysis facility- and patient-level data from the national data registry of patients with end-stage renal disease. We linked these data with clinical performance measures from the Centers for Medicare and Medicaid Services. Overall, minority-serving facilities were markedly larger, more often community based, and less likely to offer home dialysis than facilities serving predominantly white patients. A significantly higher proportion of minority-serving dialysis facilities exhibited worse than expected survival as compared with facilities serving predominantly white patients (p < .001 for each). However, clinical performance measures for anemia and dialysis adequacy were similar across minority-serving status. While minority-serving facilities generally met dialysis performance targets mandated by Medicare, they exhibited worse than expected patient survival.

Abstract

Patients receiving hemodialysis often perceive their caregivers to be overburdened. We hypothesized that increasing hemodialysis frequency would result in higher patient perceptions of burden on their unpaid caregivers. In two separate trials, 245 patients were randomized to receive in-center daily hemodialysis (6 days/week) or conventional hemodialysis (3 days/week), while 87 patients were randomized to receive home nocturnal hemodialysis (6 nights/week) or home conventional hemodialysis for 12 months. Changes in overall mean scores over time on the 10-question Cousineau perceived burden scale were compared. In total, 173 of 245 (70%) and 80 of 87 (92%) randomized patients in the Daily and Nocturnal Trials, respectively, reported having an unpaid caregiver at baseline or during follow-up. Relative to in-center conventional dialysis, the 12-month change in mean perceived burden score with in-center daily hemodialysis was -2.1 (95% confidence interval, -9.4 to +5.3; P=0.58). Relative to home conventional dialysis, the 12-month change in mean perceived burden score with home nocturnal dialysis was +6.1 (95% confidence interval, -0.8 to +13.1; P=0.08). After multiple imputation for missing data in the Nocturnal Trial, the relative difference between home nocturnal and home conventional hemodialysis was +9.4 (95% confidence interval, +0.55 to +18.3; P=0.04). In the Nocturnal Trial, changes in perceived burden were inversely correlated with adherence to dialysis treatments (Pearson r=-0.35; P=0.02). Relative to conventional hemodialysis, in-center daily hemodialysis did not result in higher perceptions of caregiver burden. There was a trend toward higher perceived caregiver burden among patients randomized to home nocturnal hemodialysis. These findings may have implications for the adoption of and adherence to frequent nocturnal hemodialysis.

Abstract

IMPORTANCE Anemia is common in patients with advanced chronic kidney disease. Whereas the treatment of anemia in patients with end-stage renal disease (ESRD) has attracted considerable attention, relatively little is known about patterns and trends in the anemia care received by patients before they start maintenance dialysis or undergo preemptive kidney transplantation. OBJECTIVE To determine the trends in anemia treatment received by Medicare beneficiaries approaching ESRD. DESIGN, SETTING, AND PARTICIPANTS Closed cohort study in the United States using national ESRD registry data (US Renal Data System) of patients 67 years or older who initiated maintenance dialysis or underwent preemptive kidney transplantation between 1995 and 2010. All eligible patients had uninterrupted Medicare (A+B) coverage for at least 2 years before ESRD. EXPOSURE Time, defined as calendar year of incident ESRD. MAIN OUTCOMES AND MEASURES Use of erythropoiesis-stimulating agents (ESA), intravenous iron supplements, and blood transfusions in the 2 years prior to ESRD; hemoglobin concentration at the time of ESRD. We used multivariable modified Poisson regression to estimate utilization prevalence ratios (PRs). RESULTS Records of 466,803 patients were analyzed. The proportion of patients with incident ESRD receiving any ESA in the 2 years before increased from 3.2% in 1995 to a peak of 40.8% in 2007; thereafter, ESA use decreased modestly to 35.0% in 2010 (compared with 1995; PR, 9.85 [95% CI, 9.04-10.74]). Among patients who received an ESA, median time from first recorded ESA use to ESRD increased from 120 days in 1995 to 337 days in 2010. Intravenous iron administration increased from 1.2% (1995) to 12.3% (2010; PR, 9.20 [95% CI, 7.97-10.61]). The proportion of patients receiving any blood transfusions increased monotonically from 20.6% (1995) to 40.3% (2010; PR, 1.88 [95% CI, 1.82-1.95]).
Mean hemoglobin concentrations were 9.5 g/dL in 1995, increased to a peak of 10.3 g/dL in 2006, and then decreased moderately to 9.9 g/dL in 2010. CONCLUSIONS AND RELEVANCE Between 1995 and 2010, older adults approaching ESRD were increasingly more likely to be treated with ESAs and to receive intravenous iron supplementation, but also more likely to receive blood transfusions.

Abstract

Estimating dietary intake is challenging in patients with chronic diseases. The aim of this study was to calibrate the Block Brief 2000 food frequency questionnaire (BFFQ) using 3-day food diary records among patients on dialysis. Data from 3-day food diary records from 146 patients new to dialysis were reviewed and entered into the National Cancer Institute self-administered 24-hour dietary recall (ASA24), a web-based dietary interview system. The information was then re-entered, omitting foods reported in the diaries that were not in the BFFQ, to generate a "BFFQ-restricted" set of intakes. We modeled each major dietary component (i.e., energy [total calories], protein, carbohydrate, and fat) separately using linear regression. The main independent variables were the BFFQ-restricted food diary estimates, computed as the average of the 3 days of diaries restricted to items included in the BFFQ, with the unrestricted 3-day food diary averages as the dependent variables. The BFFQ-restricted diary energy estimate of 1,325 ± 545 kcal was 87% of the energy intake in the full food diary (1,510.3 ± 510.4 kcal, P < .0001). The BFFQ-restricted diary carbohydrate intake was 83% of the full food diary (156.7 ± 78.7 g vs. 190.4 ± 72.7 g, P < .0001). The BFFQ-restricted fat intake was 90% of the full-diary-reported fat intake (50.1 ± 24.1 g vs. 56.4 ± 21.6 g, P < .0001). Daily protein intake assessments were not statistically different between the BFFQ-restricted diary and the full diary assessment (63.1 ± 28.5 g vs. 64.1 ± 21.4 g, P = .60). The associations between BFFQ-restricted diary intake and unrestricted intake were linear.
Three-day diary-reported intake could be estimated from BFFQ-restricted intake with r² ranging from 0.36 to 0.56 (P < .0001 for energy [total calories], protein, carbohydrate, and fat). Final equations did not include adjustments for age, sex, or race because the patterns of association were not significantly different. Energy and macronutrient estimates from the BFFQ are lower than estimates from 3-day food diaries, but simple calibration equations can be used to approximate total intake from BFFQ responses.

Abstract

To examine the patient, tumor, and temporal factors associated with receipt of renal mass biopsy (RMB) in a contemporary nationally representative sample, we queried the Surveillance, Epidemiology, and End Results-Medicare data set for incident cases of renal cell carcinoma diagnosed between 1992 and 2007. We tested for associations among receipt of RMB and patient and tumor characteristics, type of therapy, and procedure type. Temporal trends in receipt of RMB were characterized over the study period. Approximately 1 in 5 (20.7%) patients diagnosed with renal cell carcinoma (n = 24,702) underwent RMB before initiating therapy. There was a steady and modest increase in RMB utilization, with the highest utilization (30%) occurring in the final study year. Of patients who underwent radical (n = 15,666) or partial (n = 2211) nephrectomy, 17% and 20%, respectively, underwent RMB in advance of surgery. Sixty-five percent of patients who underwent ablation (n = 314) underwent RMB before or in conjunction with the procedure. Roughly half of patients (50.4%) treated with systemic therapy alone underwent RMB. Factors independently associated with use of RMB included younger age, black race, Hispanic ethnicity, tumor size <7 cm, and metastatic disease at presentation. At present, most patients who eventually undergo radical or partial nephrectomy do not undergo RMB, whereas most patients who eventually undergo ablation or systemic therapy do. The optimal use of RMB in the evaluation of kidney tumors has yet to be determined.

Abstract

It is currently unknown whether any secular trends exist in the incidence and outcomes of hip fracture in kidney transplant recipients (KTR). We identified first-time KTR (1997-2010) who had >1 year of Medicare coverage and no recorded history of hip fracture. New hip fractures were identified from corresponding diagnosis and surgical procedure codes. Outcomes studied included time to hip fracture, type of surgery received, and 30-day mortality. Of 69,740 KTR transplanted in 1997-2010, 597 experienced a hip fracture event during 155,341 person-years of follow-up, for an incidence rate of 3.8 per 1000 person-years. While unadjusted hip fracture incidence did not change, strong confounding by case mix was present. Using year of transplantation as a continuous variable, the hazard ratio (HR) for hip fracture in 2010 compared with 1997, adjusted for demographic, dialysis, comorbid, and most transplant-related factors, was 0.56 (95% confidence interval [CI]: 0.41-0.77). Adjusting for baseline immunosuppression modestly attenuated the HR (0.68; 95% CI: 0.47-0.99). The 30-day mortality was 2.2 (95% CI: 1.3-3.7) per 100 events. In summary, hip fractures remain an important complication after kidney transplantation. Since 1997, case-mix adjusted posttransplant hip fracture rates have declined substantially. Changes in immunosuppressive therapy appear to be partly responsible for these favorable findings.

Abstract

Studies of frailty among patients on hemodialysis have relied on definitions that substitute self-reported functioning for measures of physical performance and omit weight loss or substitute alternate criteria. We examined the association between body composition and a definition of frailty that includes measured physical performance and weight loss in a cross-sectional analysis of 638 adult patients receiving maintenance hemodialysis at 14 centers. Frailty was defined as having at least three of the following characteristics: weight loss, weakness, exhaustion, low physical activity, and slow gait speed. We performed logistic regression with body mass index (BMI) and bioelectrical impedance spectroscopy (BIS)-derived estimates of intracellular water (ICW), fat mass, and extracellular water (ECW) as the main predictors, and age, sex, race, and comorbidity as covariates. Overall, 30% of participants were frail. Older age (odds ratio [OR], 1.31 per 10 years; 95% confidence interval [95% CI], 1.14 to 1.50), diabetes (OR, 1.65; 95% CI, 1.13 to 2.40), higher fat mass (OR, 1.18; 95% CI, 1.02 to 1.37), and higher ECW (OR, 1.33; 95% CI, 1.20 to 1.47) were associated with higher odds of frailty. Higher ICW was associated with lower odds of frailty (OR, 0.80 per kg; 95% CI, 0.73 to 0.87). The addition of BMI data did not change the area under the receiver operating characteristic curve (AUC, 0.66 versus 0.66; P=0.71), but the addition of BIS data did change the AUC (AUC=0.72; P<0.001). Thus, individual components of body composition, but not BMI, associate strongly with frailty in this cohort of patients receiving hemodialysis.

Abstract

Few studies have examined the changes in lipoproteins over time and how inflammation is associated with lipoprotein concentrations among patients with end-stage renal disease on dialysis. One possible explanation for the association of low LDL cholesterol concentration with adverse outcomes is that inflammation reduces selected apolipoprotein concentrations. Serum samples were collected from a subsample of patients enrolled in the Comprehensive Dialysis Study every 3 months for up to 1 year. We examined the relation between temporal patterns in levels of inflammatory markers and changes in apolipoproteins (apo) A1 and B and the apo B/A1 ratio using linear mixed effects modeling, adjusting for potential confounders. We enrolled 266 participants from 56 dialysis facilities. The mean age was 62 years, 45% were women, and 26% were black. Apo A1 was lower among patients with higher Quetelet's (body mass) index (BMI), diabetes mellitus, and atherosclerosis. Apo B was lower among older patients, patients with higher serum creatinine, and patients with lower BMI. Over the course of a year, apo A1 changed inversely with serum concentrations of the acute phase proteins C-reactive protein (CRP) and α1-acid glycoprotein (α1AG), while apo B did not. Changes in α1AG were more strongly associated with changes in apolipoprotein concentrations than were changes in CRP; increases in α1AG were associated with decreases in apo A1 and increases in the apo B/A1 ratio. Changes in inflammatory markers were associated with changes in apo A1, but not apo B, over 1 year, suggesting that reductions in high-density lipoprotein cholesterol are associated with inflammation, either of which could mediate cardiovascular risk, but not supporting a hypothesis linking the risk associated with low levels of apo B-containing lipoproteins to inflammation.

Abstract

Visit-to-visit blood pressure variability (VTV-BPV) is an independent risk factor for cardiovascular events and death in the general population. We sought to determine the association of VTV-BPV with outcomes in patients on hemodialysis, using data from a National Institutes of Health-sponsored randomized trial (the HEMO study). We used the coefficient of variation (CV) and the average real variability in systolic blood pressure (SBP) as metrics of VTV-BPV. In all, 1844 out of 1846 randomized subjects had at least three visits with SBP measurements and were included in the analysis. Median follow-up was 2.5 years (interquartile range 1.3-4.3 years), during which time there were 869 deaths from any cause and 408 (adjudicated) cardiovascular deaths. The mean pre-dialysis SBP CV was 9.9±4.6%. In unadjusted models, we found a 31% higher risk of death from any cause per 10% increase in VTV-BPV. This association was attenuated after multivariable adjustment but remained statistically significant. Similarly, we found a 28% higher risk of cardiovascular death per 10% increase in VTV-BPV, which was attenuated and no longer statistically significant in fully adjusted models. The associations among VTV-BPV, death, and cardiovascular death were modified by baseline SBP. In a diverse, well-dialyzed cohort of patients on maintenance hemodialysis, VTV-BPV, assessed using metrics of variability in pre-dialysis SBP, was associated with a higher risk of all-cause mortality and a trend toward higher risk of cardiovascular mortality, particularly in patients with a lower baseline SBP. (Journal of Human Hypertension advance online publication, 27 June 2013; doi:10.1038/jhh.2013.49.)
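The two variability metrics named above are straightforward to compute from a patient's visit-to-visit readings. A minimal sketch with invented SBP values (illustrative only, not study data):

```python
# Two metrics of visit-to-visit SBP variability, as named in the abstract:
# coefficient of variation (CV) and average real variability (ARV).
import statistics

def cv_percent(sbp):
    """Coefficient of variation: SD as a percentage of the mean."""
    return 100 * statistics.stdev(sbp) / statistics.mean(sbp)

def average_real_variability(sbp):
    """Mean absolute difference between consecutive visits; unlike the
    CV, this is sensitive to the ordering of the readings."""
    return sum(abs(b - a) for a, b in zip(sbp, sbp[1:])) / (len(sbp) - 1)

sbp = [150, 138, 162, 144, 156]  # hypothetical pre-dialysis SBP, mmHg
print(round(cv_percent(sbp), 1))        # percent
print(average_real_variability(sbp))    # mmHg
```

Note that at least three readings are needed for these metrics to be meaningful, which matches the study's inclusion criterion of three or more visits with SBP measurements.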

Abstract

The relation between the quantity of many healthcare services delivered and health outcomes is uncertain. In January 2004, the Centers for Medicare and Medicaid Services introduced a tiered fee-for-service system for patients on hemodialysis, creating an incentive for providers to see patients more frequently. We analyzed the effect of this change on patient mortality, transplant wait-listing, and costs. While mortality rates for Medicare beneficiaries on hemodialysis declined after reimbursement reform, mortality declined more - or was no different - among patients whose providers were not affected by the economic incentive. Similarly, improved placement of patients on the kidney transplant waitlist was no different among patients whose providers were not affected by the economic incentive; payments for dialysis visits increased 13.7% in the year following reform. The payment system designed to increase provider visits to hemodialysis patients increased Medicare costs with no evidence of a benefit on survival or kidney transplant listing.

Abstract

Frailty is a multidimensional phenotype that describes declining physical function and a vulnerability to adverse outcomes in the setting of physical stress such as illness or hospitalization. Phase angle is a composite measure of tissue resistance and reactance measured via bioelectrical impedance analysis (BIA). Whether phase angle is associated with frailty and mortality in the general population is unknown. To evaluate associations among phase angle, frailty, and mortality, we analyzed a population-based survey, the Third National Health and Nutrition Examination Survey (1988-1994), including 4,667 persons aged 60 and older. Frailty was defined according to a set of criteria derived from a definition previously described and validated. A narrow phase angle (the lowest quintile) was associated with a four-fold higher odds of frailty among women and a three-fold higher odds of frailty among men, adjusted for age, sex, race-ethnicity, and comorbidity. Over a 12-year follow-up period, the adjusted relative hazard for mortality associated with a narrow phase angle was 2.4 (95% confidence interval [95% CI] 1.8 to 3.1) in women and 2.2 (95% CI 1.7 to 2.9) in men. A narrow phase angle was significantly associated with mortality even among participants with little or no comorbidity. Analyses of BIA and frailty were cross-sectional; BIA was not measured serially, and incident frailty during follow-up was not assessed. Participants examined at home were excluded from analysis because they did not undergo BIA. In conclusion, a narrow phase angle is associated with frailty and mortality independent of age and comorbidity.

Abstract

The vast majority of US dialysis facilities are for-profit, and profit status has been associated with processes of care and outcomes in patients on dialysis. This study examined whether dialysis facility profit status was associated with the rate of hospitalization in patients starting dialysis. This was a retrospective cohort study of Medicare beneficiaries starting dialysis between 2005 and 2008, using data from the US Renal Data System. All-cause hospitalization was examined and compared between for-profit and nonprofit dialysis facilities through 2009 using Poisson regression. Companion analyses examined cause-specific hospitalizations likely to be influenced by dialysis facility practices, including hospitalizations for heart failure and volume overload, access complications, or hyperkalemia. The cohort included 150,642 patients. Of these, 12,985 (9%) were receiving care in nonprofit dialysis facilities. In adjusted models, patients receiving hemodialysis in for-profit facilities had a 15% (95% confidence interval [95% CI], 13% to 18%) higher relative rate of hospitalization compared with those in nonprofit facilities. Among patients receiving peritoneal dialysis, the rate of hospitalization in for-profit versus nonprofit facilities was not significantly different (relative rate, 1.07; 95% CI, 0.97 to 1.17). Patients on hemodialysis receiving care in for-profit dialysis facilities had a 37% (95% CI, 31% to 44%) higher rate of hospitalization for heart failure or volume overload and a 15% (95% CI, 11% to 20%) higher rate of hospitalization for vascular access complications. Hospitalization rates were significantly higher for patients receiving hemodialysis in for-profit compared with nonprofit dialysis facilities.

Abstract

End-stage renal disease is associated with reduced heart rate variability (HRV), components of which generally are associated with advanced age, diabetes mellitus and left ventricular hypertrophy. We hypothesized that daily in-center hemodialysis (HD) would increase HRV. The Frequent Hemodialysis Network (FHN) Daily Trial randomized 245 patients to receive 12 months of six versus three times per week in-center HD. Two hundred and seven patients had baseline Holter recordings. HRV measures were calculated from 24-h Holter electrocardiograms at both baseline and 12 months in 131 patients and included low-frequency power (LF, a measure of sympathetic modulation), high-frequency power (HF, a measure of parasympathetic modulation) and standard deviation (SD) of the R-R interval (SDNN, a measure of beat-to-beat variation). Baseline to Month 12 change in LF was augmented by 50% [95% confidence interval (95% CI) 6.1-112%, P = 0.022] and LF + HF was augmented by 40% (95% CI 3.3-88.4%, P = 0.03) in patients assigned to daily hemodialysis (DHD) compared with conventional HD. Changes in HF and SDNN were similar between the randomized groups. The effects of DHD on LF were attenuated by advanced age and diabetes mellitus (predefined subgroups). Changes in HF (r = -0.20, P = 0.02) and SDNN (r = -0.18, P = 0.04) were inversely associated with changes in left ventricular mass (LVM). DHD increased the LF component of HRV. Reduction of LVM by DHD was associated with increased vagal modulation of heart rate (HF) and with increased beat-to-beat heart rate variation (SDNN), suggesting an important functional correlate to the structural effects of DHD on the heart in uremia.
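Of the HRV measures named above, SDNN is the simplest to compute: the standard deviation of the normal-to-normal R-R intervals over the recording (LF and HF require spectral analysis of the same series). A minimal sketch with made-up intervals:

```python
import statistics

def sdnn_ms(rr_intervals_ms):
    """SDNN: standard deviation of normal-to-normal R-R intervals (ms),
    a time-domain index of overall beat-to-beat heart rate variability."""
    return statistics.stdev(rr_intervals_ms)

# Hypothetical Holter-derived R-R series (ms)
rr = [800, 810, 790, 820, 780]
print(round(sdnn_ms(rr), 1))  # ~15.8 ms
```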

Abstract

The burden of chronic kidney disease (CKD) will rise in parallel with the growing prevalence of type 2 diabetes mellitus in South Asia but is understudied. Using a cross-sectional survey of adults living in a middle-income neighborhood of Dhaka, Bangladesh, we tested the hypothesis that the prevalence of CKD in this group would approach that of the U.S. and would be strongly associated with insulin resistance. We enrolled 402 eligible adults (>30 years old) after performing a multi-stage random selection procedure. We administered a questionnaire and collected fasting serum samples and urine samples. We used the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation to estimate glomerular filtration rate, and sex-specific cutoffs for albuminuria: >1.9 mg/mmol (17 mg/g) for men and >2.8 mg/mmol (25 mg/g) for women. We assessed health-related quality of life using the Medical Outcomes Study Short Form-12 (SF-12). A total of 357 (89%) participants with serum samples comprised the analytic cohort. Mean age was 49.5 (± 12.7) years. Chronic kidney disease was evident in 94 (26%). Of the participants with CKD, 58 (62%) had albuminuria only. A participant with insulin resistance had a 3.6-fold increase in odds of CKD (95% confidence interval 2.1 to 6.4). Participants with stage 3 or more advanced CKD reported a decrement in the Physical Health Composite score of the SF-12 compared with participants without CKD. We found an alarmingly high prevalence of CKD, particularly CKD associated with insulin resistance, in middle-income, urban Bangladeshis.
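The eGFR and albuminuria criteria used in the study can be sketched as follows; the 2009 CKD-EPI creatinine equation coefficients are standard, and the albuminuria cutoffs are the sex-specific thresholds quoted above (17 mg/g for men, 25 mg/g for women). Example values are hypothetical:

```python
def ckd_epi_egfr(scr_mg_dl: float, age_years: float, female: bool,
                 black: bool = False) -> float:
    """2009 CKD-EPI creatinine equation, mL/min/1.73 m^2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def has_ckd(egfr: float, acr_mg_per_g: float, female: bool) -> bool:
    """CKD if eGFR < 60 mL/min/1.73 m^2 or albuminuria above the
    sex-specific cutoff (17 mg/g men, 25 mg/g women)."""
    return egfr < 60.0 or acr_mg_per_g > (25.0 if female else 17.0)

# Hypothetical participant: 50-year-old man, creatinine 1.2 mg/dL, ACR 30 mg/g
e = ckd_epi_egfr(1.2, 50, female=False)   # ~70 mL/min/1.73 m^2
print(has_ckd(e, 30.0, female=False))     # True (albuminuria only)
```

The final example mirrors the study's most common CKD pattern: preserved eGFR with albuminuria above the cutoff.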

Abstract

BACKGROUND: The prevalence of kidney stone disease is rising along with increasing rates of obesity, type 2 diabetes mellitus (T2DM), and metabolic syndrome. OBJECTIVE: To investigate the associations among the presence and severity of T2DM, glycemic control, and insulin resistance with kidney stone disease. DESIGN, SETTING, AND PARTICIPANTS: We performed a cross-sectional analysis of all adult participants in the 2007-2010 National Health and Nutrition Examination Survey (NHANES). A history of kidney stone disease was obtained by self-report. T2DM was defined by self-reported history, T2DM-related medication usage, and reported diabetic comorbidity. Insulin resistance was estimated using fasting plasma insulin (FPI) levels and the homeostasis model assessment of insulin resistance (HOMA-IR) definition. We classified glycemic control using glycosylated hemoglobin A1c (HbA1c) and fasting plasma glucose (FPG) levels. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: Odds ratios (OR) for having kidney stone disease were calculated for each individual measure of T2DM severity. Logistic regression models were fitted adjusting for age, sex, race/ethnicity, smoking history, and the Quételet index (body mass index), as well as laboratory values and components of metabolic syndrome. RESULTS AND LIMITATIONS: Correlates of kidney stone disease included a self-reported history of T2DM (OR: 2.44; 95% confidence interval [CI], 1.84-3.25) and history of insulin use (OR: 3.31; 95% CI, 2.02-5.45). Persons with FPG levels 100-126 mg/dl and >126 mg/dl had increased odds of having kidney stone disease (OR 1.28; 95% CI, 0.95-1.72; and OR 2.29; 95% CI, 1.68-3.12, respectively). Corresponding results for persons with HbA1c 5.7-6.4% and ≥6.5% were OR 1.68 (95% CI, 1.17-2.42) and OR 2.82 (95% CI, 1.98-4.02), respectively. When adjusting for patient factors, a history of T2DM, the use of insulin, FPI, and HbA1c remained significantly associated with kidney stone disease. The cross-sectional design limits causal inference. CONCLUSIONS: Among persons with T2DM, more-severe disease is associated with a heightened risk of kidney stones.
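The HOMA-IR index referenced above is computed from fasting glucose and insulin; in conventional US units it is (FPG mg/dL × FPI μU/mL) / 405. A minimal sketch (the example values are illustrative; the threshold defining "insulin resistant" varies by study):

```python
def homa_ir(fpg_mg_dl: float, fpi_uU_ml: float) -> float:
    """Homeostasis model assessment of insulin resistance:
    (fasting glucose mg/dL x fasting insulin uU/mL) / 405."""
    return fpg_mg_dl * fpi_uU_ml / 405.0

# Hypothetical fasting labs: glucose 100 mg/dL, insulin 10 uU/mL
print(round(homa_ir(100.0, 10.0), 2))  # 2.47
```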

Abstract

Although inhibitors of the renin-angiotensin-aldosterone system can slow the progression of diabetic kidney disease, the residual risk is high. Whether nuclear factor (erythroid-derived 2)-related factor 2 activators further reduce this risk is unknown. We randomly assigned 2185 patients with type 2 diabetes mellitus and stage 4 chronic kidney disease (estimated glomerular filtration rate [GFR], 15 to <30 ml per minute per 1.73 m(2) of body-surface area) to bardoxolone methyl, at a daily dose of 20 mg, or placebo. The primary composite outcome was end-stage renal disease (ESRD) or death from cardiovascular causes. The sponsor and the steering committee terminated the trial on the recommendation of the independent data and safety monitoring committee; the median follow-up was 9 months. A total of 69 of 1088 patients (6%) randomly assigned to bardoxolone methyl and 69 of 1097 (6%) randomly assigned to placebo had a primary composite outcome (hazard ratio in the bardoxolone methyl group vs. the placebo group, 0.98; 95% confidence interval [CI], 0.70 to 1.37; P=0.92). In the bardoxolone methyl group, ESRD developed in 43 patients, and 27 patients died from cardiovascular causes; in the placebo group, ESRD developed in 51 patients, and 19 patients died from cardiovascular causes. A total of 96 patients in the bardoxolone methyl group were hospitalized for heart failure or died from heart failure, as compared with 55 in the placebo group (hazard ratio, 1.83; 95% CI, 1.32 to 2.55; P<0.001). Estimated GFR, blood pressure, and the urinary albumin-to-creatinine ratio increased significantly and body weight decreased significantly in the bardoxolone methyl group, as compared with the placebo group. Among patients with type 2 diabetes mellitus and stage 4 chronic kidney disease, bardoxolone methyl did not reduce the risk of ESRD or death from cardiovascular causes. A higher rate of cardiovascular events with bardoxolone methyl than with placebo prompted termination of the trial. (Funded by Reata Pharmaceuticals; BEACON ClinicalTrials.gov number, NCT01351675.)

Abstract

Higher left ventricular volume is associated with death in patients with ESRD. This work investigated the effects of frequent hemodialysis on ventricular volumes and left ventricular remodeling. The Frequent Hemodialysis Network daily trial randomized 245 patients to 12 months of six times per week versus three times per week in-center hemodialysis; the Frequent Hemodialysis Network nocturnal trial randomized 87 patients to 12 months of six times per week nocturnal hemodialysis versus three times per week predominantly home-based hemodialysis. Left and right ventricular end systolic and diastolic volumes, left ventricular mass, and ejection fraction at baseline and end of the study were ascertained by cardiac magnetic resonance imaging. The ratio of left ventricular mass to left ventricular end diastolic volume was used as a surrogate marker of left ventricular remodeling. In each trial, the effect of frequent dialysis on left or right ventricular end diastolic volume was tested between predefined subgroups. In the daily trial, frequent hemodialysis resulted in significant reductions in left ventricular end diastolic volume (-11.0% [95% confidence interval, -16.1% to -5.5%]), left ventricular end systolic volume (-14.8% [-22.7% to -6.2%]), and right ventricular end diastolic volume (-11.6% [-19.0% to -3.6%]), and a trend toward reduced right ventricular end systolic volume (-11.3% [-21.4% to 0.1%]), compared with conventional therapy. The magnitude of reduction in left and right ventricular end diastolic volumes with frequent hemodialysis was accentuated among patients with residual urine output <100 ml/d (P value [interaction]=0.02). In the nocturnal trial, there were no significant changes in left or right ventricular volumes. The frequent dialysis interventions had no substantial effect on the ratio of left ventricular mass to left ventricular end diastolic volume in either trial. In summary, frequent in-center hemodialysis reduces left and right ventricular end systolic and end diastolic volumes as well as left ventricular mass, but it does not affect left ventricular remodeling.

Abstract

Environmental and behavioural factors are thought to contribute to all-cause mortality. Here, we develop a method to systematically screen and validate the potential independent contributions to all-cause mortality of 249 environmental and behavioural factors in the National Health and Nutrition Examination Survey (NHANES). We used Cox proportional hazards regression to associate 249 factors with all-cause mortality while adjusting for sociodemographic factors on data in the 1999-2000 and 2001-02 surveys (median 5.5 follow-up years). We controlled for multiple comparisons with the false discovery rate (FDR) and validated significant findings in the 2003-04 survey (median 2.8 follow-up years). We selected the 249 factors from the set of all possible factors based on their presence in both the 1999-2002 and 2003-04 surveys and linkage with at least 20 deceased participants. We evaluated the correlation pattern of validated factors and built a multivariable model to identify their independent contribution to mortality. We identified seven environmental and behavioural factors associated with all-cause mortality, including serum and urinary cadmium, serum lycopene levels, smoking (3-level factor) and physical activity. In a multivariable model, only physical activity, past smoking, smoking in the participant's home and lycopene were independently associated with mortality. Together, these factors explained 2.1% of the variance of all-cause mortality after adjusting for demographic and socio-economic factors. Our association study suggests that, of the set of 249 factors in NHANES, physical activity, smoking, serum lycopene and serum/urinary cadmium are associated with all-cause mortality, as identified in previous studies, after controlling for multiple hypotheses and validating in an independent survey. Whereas other NHANES factors may be associated with mortality, detecting them may require larger cohorts with longer follow-up. It is possible to use a systematic association study to prioritize risk factors for further investigation.
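Controlling the false discovery rate across 249 simultaneous Cox models, as described above, is typically done with the Benjamini-Hochberg step-up procedure. A minimal sketch (toy p-values, not the study's):

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Benjamini-Hochberg step-up procedure: indices of tests declared
    significant while controlling the false discovery rate at q."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    n_signif = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / m * q:
            n_signif = rank  # step-up: keep the largest qualifying rank
    return sorted(order[:n_signif])

print(benjamini_hochberg([0.01, 0.30, 0.02, 0.03]))  # [0, 2, 3]
```

Note the step-up logic: every test ranked at or below the largest qualifying rank is declared significant, even if an intermediate p-value misses its own threshold.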

Abstract

Erectile dysfunction (ED) is more common in men with type 2 diabetes mellitus (T2DM), obesity, and/or the metabolic syndrome (MetS). The aim of this study was to investigate the associations among proxy measures of diabetic severity and the presence of MetS with ED in a nationally representative U.S. data sample. We performed a cross-sectional analysis of adult participants in the 2001-2004 National Health and Nutrition Examination Survey. ED was ascertained by self-report. T2DM severity was defined by calculated measures of glycemic control and insulin resistance (IR). IR was estimated using fasting plasma insulin (FPI) levels and the homeostasis model assessment of IR (HOMA-IR) definition. We classified glycemic control using hemoglobin A1c (HbA1c) and fasting plasma glucose (FPG) levels. MetS was defined by the American Heart Association and National Heart, Lung, and Blood Institute criteria. Logistic regression models, adjusted for sociodemographics, risk factors, and comorbidities, were fitted for each measure of T2DM severity, MetS, and the presence of ED. Proxy measures of glycemic control and IR were associated with ED. Participants with FPG between 100-126 mg/dL (5.6-7 mmol/L) and ≥126 mg/dL (≥7 mmol/L) had higher odds of ED: odds ratio (OR) 1.22 (confidence interval or CI, 0.83-1.80) and OR 2.68 (CI, 1.48-4.86), respectively. Participants with HbA1c 5.7-6.4% (38.8-46.4 mmol/mol) and ≥6.5% (47.5 mmol/mol) had higher odds of ED (OR 1.73 [CI, 1.08-2.76] and 3.70 [CI, 2.19-6.27], respectively). When FPI and HOMA-IR were evaluated by tertiles, there was a graded relation among participants in the top tertile. In multivariable models, a strong association remained between HbA1c and ED (OR 3.19 [CI, 1.13-9.01]). MetS was associated with >2.5-fold increased odds of self-reported ED (OR 2.55 [CI, 1.85-3.52]). In summary, poor glycemic control, impaired insulin sensitivity, and the MetS are associated with a heightened risk of ED.

Abstract

Although several studies have shown poorer survival among individuals with 25-hydroxy (OH) vitamin D deficiency, data on patients receiving dialysis are limited. Using data from the Comprehensive Dialysis Study (CDS), we tested the hypothesis that patients new to dialysis with low serum concentrations of 25-OH vitamin D would experience higher mortality and hospitalizations. The CDS is a prospective cohort study. We recruited participants from 56 dialysis units located throughout the United States. We obtained data on demographics, comorbidities, and laboratory values from the CDS Patient Questionnaire as well as the Medical Evidence Form (CMS form 2728). Participants provided baseline serum samples for 25-OH vitamin D measurements. We ascertained time to death and first hospitalization as well as number of first-year hospitalizations via the U.S. Renal Data System standard analysis files. We used Cox proportional hazards models to determine the association between 25-OH vitamin D tertiles and survival and hospitalization. For number of hospitalizations in the first year, we used negative binomial regression. The analytic cohort was composed of 256 patients with Patient Questionnaire data and 25-OH vitamin D concentrations. The mean age of participants was 62 (±14.0) years, and mean follow-up was 3.8 years. Patients with 25-OH vitamin D concentrations in the lowest tertile (<10.6 ng/mL) at the start of dialysis experienced higher mortality (adjusted hazard ratio 1.75, 95% confidence interval [CI] 1.03-2.97) as well as hospitalization (adjusted hazard ratio 1.76, 95% CI 1.24-2.49). Patients in the lower 2 tertiles (<15.5 ng/mL) experienced a higher rate of hospitalizations in the first year (incidence rate ratio 1.70 [95% CI 1.06-2.72] for the middle tertile, 1.66 [95% CI 1.10-2.51] for the lowest tertile). In summary, we found a sizeable increase in mortality and hospitalization for patients on dialysis with severe 25-OH vitamin D deficiency.

Abstract

Sensitization to human leukocyte antigen (HLA) from red blood cell (RBC) transfusion is poorly quantified and is based on outdated, insensitive methods. The objective was to evaluate the effect of transfusion on the breadth, magnitude and specificity of HLA antibody formation using sensitive and specific methods. Transfusion, demographic and clinical data from the US Renal Data System were obtained for patients on dialysis awaiting primary kidney transplant who had ≥2 HLA antibody measurements using the Luminex single-antigen bead assay. One cohort included patients with a transfusion (n = 50) between two antibody measurements matched with up to four nontransfused patients (n = 155) by age, sex, race and vintage (time on dialysis). A second, crossover cohort (n = 25) included patients with multiple antibody measurements before and after transfusion. We studied changes in HLA antibody mean fluorescence intensity (MFI) and calculated panel reactive antibody (cPRA). In the matched cohort, 10 of 50 (20%) transfused versus 6 of 155 (4%) nontransfused patients had an increase of >3000 MFI in ≥10 HLA antibodies (P = 0.0006); 6 of 50 (12%) transfused patients had such an increase in ≥30 antibodies (P = 0.0007). In the crossover cohort, the number of HLA antibodies increasing >1000 and >3000 MFI was higher in the transfused versus the control period (P = 0.03 and P = 0.008, respectively). Using a ≥3000 MFI threshold, cPRA significantly increased in both matched (P = 0.01) and crossover (P = 0.002) transfused patients. Among prospective primary kidney transplant recipients, RBC transfusion results in clinically significant increases in HLA antibody strength and breadth, which adversely affect the opportunity for future transplant.
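The cPRA summarizes antibody breadth as the percentage of donors expected to carry at least one unacceptable HLA antigen. Registry calculators derive this from population haplotype frequencies; the panel-counting sketch below is a simplified approximation, and the donor typings and antigen names are hypothetical:

```python
def cpra_percent(donor_panel, unacceptable_antigens):
    """Simplified calculated panel reactive antibody (cPRA): percentage of
    a reference donor pool carrying >=1 unacceptable HLA antigen.
    Illustrative only; registry cPRA uses haplotype frequency tables."""
    unacceptable = set(unacceptable_antigens)
    incompatible = sum(1 for typing in donor_panel if unacceptable & set(typing))
    return 100.0 * incompatible / len(donor_panel)

# Hypothetical four-donor pool; patient has an unacceptable antibody to B8
panel = [("A1", "B8"), ("A2", "B7"), ("A3", "B8"), ("A2", "B44")]
print(cpra_percent(panel, {"B8"}))  # 50.0
```

A post-transfusion rise in antibody breadth (more antigens crossing the MFI threshold) raises cPRA and thereby shrinks the compatible donor pool.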

Abstract

Type 2 diabetes mellitus (T2DM) is the most important contributing cause of end-stage renal disease (ESRD) worldwide. Bardoxolone methyl, a nuclear factor-erythroid-2-related factor 2 activator, augments the estimated glomerular filtration rate (eGFR). The Bardoxolone methyl EvAluation in patients with Chronic kidney disease and type 2 diabetes mellitus: the Occurrence of renal eveNts (BEACON) trial was designed to establish whether bardoxolone methyl slows or prevents progression to ESRD. Herein, we describe baseline characteristics of the BEACON population. BEACON is a randomized, double-blind, placebo-controlled clinical trial in 2185 patients with T2DM and chronic kidney disease stage 4 (eGFR between 15 and 30 mL/min/1.73 m(2)) designed to test the hypothesis that bardoxolone methyl, added to guideline-recommended treatment including inhibitors of the renin-angiotensin-aldosterone system, slows or prevents progression to ESRD or cardiovascular death compared with placebo. Baseline characteristics (mean or percentage) of the population include age 68.5 years, female 43%, Caucasian 78%, eGFR 22.5 mL/min/1.73 m(2) and systolic/diastolic blood pressure 140/70 mmHg. The median urinary albumin:creatinine ratio was 320 mg/g, and the frequency of micro- and macroalbuminuria was 30 and 51%, respectively. Anemia, abnormalities in markers of bone metabolism and elevations in cardiovascular biomarkers were frequently observed. A history of cardiovascular disease was present in 56%, neuropathy in 47% and retinopathy in 41% of patients. The BEACON trial enrolled a population heretofore unstudied in an international randomized controlled trial. Enrolled patients had numerous co-morbid conditions and exhibited multiple laboratory abnormalities, highlighting the critical need for new therapies to optimize management of these conditions.

Abstract

Among patients receiving maintenance dialysis, weight loss at any body mass index is associated with mortality. However, it is not known whether weight changes before dialysis initiation are associated with mortality and, if so, what risks are associated with weight gain or loss. Linking data from the US Renal Data System to a national registry of nursing home residents, this study identified 11,090 patients who started dialysis between January of 2000 and December of 2006. Patients were categorized according to weight measured between 3 and 6 months before dialysis initiation and the percentage change in body weight before dialysis initiation (divided into quintiles). The outcome was mortality within 1 year of starting dialysis. There were 361 patients (3.3%) who were underweight (Quételet's [body mass] index <18.5 kg/m(2)) and 4046 patients (36.5%) who were obese (body mass index ≥30 kg/m(2)) before dialysis initiation. The median percentage change in body weight before dialysis initiation was -6% (interquartile range, -13% to 1%). There were 6063 deaths (54.7%) over 1 year of follow-up. Compared with patients with minimal weight changes (-3% to 3%, quintile 4), patients with weight loss ≥15% (quintile 1) had a 35% higher risk for mortality (95% confidence interval, 1.25 to 1.47), whereas patients with weight gain ≥4% (quintile 5) had a 24% higher risk for mortality (95% confidence interval, 1.14 to 1.35), adjusted for baseline body mass index and other confounders. Among nursing home residents, changes in body weight in advance of dialysis initiation are associated with significantly higher 1-year mortality.

Abstract

AKI affects approximately 2%-7% of hospitalized patients and >35% of critically ill patients. Survival after AKI may be described as having an acute phase (including an initial hyperacute component) followed by a convalescent phase, which may itself have early and late components. Data from the Veterans Affairs/National Institutes of Health Acute Renal Failure Trial Network (ATN) study were used to model mortality risk among patients with dialysis-requiring AKI. This study assumed that the mortality hazard can be described by a piecewise log-linear function with change points. Using an average likelihood method, the authors tested for the number of change points in a piecewise log-linear hazard model. The maximum likelihood approach to locate the change point(s) was then adopted, and associated parameters and standard errors were estimated. There were 1124 ATN participants with follow-up to 1 year. The mortality hazard of AKI decreased over time, with inflections in the rate of decrease at days 4, 42, and 148 and the sharpest change at day 42. The daily rate of decline in the log of the hazard for death was 0.220 over the first 4 days, 0.046 between day 4 and day 42, 0.017 between day 42 and day 148, and 0.003 between day 148 and day 365. There appear to be two major phases of mortality risk after AKI: an early phase extending over the first 6 weeks and a late phase from 6 weeks to 1 year. Within the first 42 days, this can be further divided into hyperacute (days 1-4) and acute (days 4-42) phases. After 42 days, there appear to be early (days 42-148) and late (after day 148) convalescent phases. These findings may help to inform the design of AKI clinical trials and assist critical care physicians in prognostic stratification.
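The fitted model is a piecewise log-linear hazard: within each segment the log hazard declines at a constant daily rate, with inflections at the change points. A sketch using the reported point estimates (change points at days 4, 42, and 148; daily declines 0.220, 0.046, 0.017, and 0.003); the baseline log hazard is arbitrary here:

```python
def log_hazard(t_days: float, log_h0: float = 0.0,
               change_points=(4.0, 42.0, 148.0),
               slopes=(-0.220, -0.046, -0.017, -0.003)) -> float:
    """Piecewise log-linear hazard evaluated at day t: the log hazard
    declines at slopes[i] per day within segment i."""
    log_h, prev = log_h0, 0.0
    for cp, slope in zip(change_points, slopes):
        if t_days <= cp:
            return log_h + slope * (t_days - prev)
        log_h += slope * (cp - prev)
        prev = cp
    return log_h + slopes[-1] * (t_days - prev)

# Decline in log hazard over the hyperacute phase (days 0-4): 4 * 0.220
print(round(log_hazard(0.0) - log_hazard(4.0), 2))  # 0.88
```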

Abstract

In the TEMPO (Tolvaptan Efficacy and Safety in Management of Autosomal Dominant Polycystic Kidney Disease and Its Outcomes) trial, tolvaptan significantly reduced expansion of kidney volume and loss of kidney function. To determine how the benefits of tolvaptan seen in TEMPO may relate to longer-term health outcomes, such as progression to end-stage renal disease (ESRD) and death, and to cost-effectiveness, we built a decision-analytic model based on published literature from 1993 to 2012. The model followed persons with early autosomal dominant polycystic kidney disease over a lifetime horizon from a societal perspective; patients received tolvaptan therapy until death, development of ESRD, or liver complications, or no tolvaptan therapy. Outcome measures included median age at ESRD onset, life expectancy, discounted quality-adjusted life-years and lifetime costs (in 2010 U.S. dollars), and incremental cost-effectiveness ratios. Tolvaptan prolonged the median age at ESRD onset by 6.5 years and increased life expectancy by 2.6 years. At $5760 per month, tolvaptan cost $744,100 per quality-adjusted life-year gained compared with standard care. For patients with autosomal dominant polycystic kidney disease that progressed more slowly, the cost per quality-adjusted life-year gained was even greater. Although TEMPO followed patients for 3 years, the main analysis assumed that clinical benefits persisted over patients' lifetimes. Assuming that the benefits of tolvaptan persist in the longer term, the drug may slow progression to ESRD and reduce mortality rates; however, barring an approximately 95% reduction in price, its cost-effectiveness does not compare favorably with many other commonly accepted medical interventions. (Funded by the National Institutes of Health and the Agency for Healthcare Research and Quality.)
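The cost-per-QALY figure is an incremental cost-effectiveness ratio (ICER): incremental lifetime cost divided by incremental QALYs relative to the comparator strategy. A minimal sketch with hypothetical inputs (not the model's actual values):

```python
def icer(cost_new: float, cost_std: float,
         qaly_new: float, qaly_std: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per
    quality-adjusted life-year gained versus standard care."""
    return (cost_new - cost_std) / (qaly_new - qaly_std)

# Hypothetical strategies: $100,000 vs $20,000 lifetime cost; 10.5 vs 10.0 QALYs
print(icer(100_000, 20_000, 10.5, 10.0))  # 160000.0 per QALY gained
```

A price cut lowers only the numerator, which is why the abstract frames cost-effectiveness in terms of the price reduction needed to reach conventional willingness-to-pay thresholds.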

Abstract

Contrast-induced AKI (CI-AKI) is a common condition associated with serious, adverse outcomes. CI-AKI may be preventable because its risk factors are well characterized and the timing of renal insult is commonly known in advance. Intravenous (IV) fluids and N-acetylcysteine (NAC) are two of the most widely studied preventive measures for CI-AKI. Despite a multitude of clinical trials and meta-analyses, the most effective type of IV fluid (sodium bicarbonate versus sodium chloride) and the benefit of NAC remain unclear. Careful review of published trials of these interventions reveals design limitations that contributed to their inconclusive findings. Such design limitations include the enrollment of small numbers of patients, increasing the risk for type I and type II statistical errors; the use of surrogate primary endpoints defined by small increments in serum creatinine, which are associated with, but not necessarily causally related to serious, adverse, patient-centered outcomes; and the inclusion of low-risk patients with intact baseline kidney function, yielding low event rates and reduced generalizability to a higher-risk population. The Prevention of Serious Adverse Events following Angiography (PRESERVE) trial is a randomized, double-blind, multicenter trial that will enroll 8680 high-risk patients undergoing coronary or noncoronary angiography to compare the effectiveness of IV isotonic sodium bicarbonate versus IV isotonic sodium chloride and oral NAC versus oral placebo for the prevention of serious, adverse outcomes associated with CI-AKI. This article discusses key methodological issues of past trials investigating IV fluids and NAC and how they informed the design of the PRESERVE trial.

Abstract

BACKGROUND AND OBJECTIVES: Patients with ESRD experience a fivefold higher incidence of hip fracture than the age- and sex-matched general population. Despite multiple changes in the treatment of CKD mineral bone disorder, little is known about long-term trends in hip fracture incidence, treatment patterns, and outcomes in patients on dialysis. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Fourteen annual cohorts (1996-2009) of older patients (≥67 years) initiating dialysis in the United States were studied. Eligible patients had Medicare fee-for-service coverage for ≥2 years before dialysis initiation and were followed for ≥3 years for a first hip fracture. Type of treatment (internal fixation or partial or total hip replacement) was ascertained along with 30-day mortality. Cox and modified Poisson regressions were used to describe trends in study outcomes. RESULTS: This study followed 409,040 patients over 607,059 person-years, during which time 17,887 hip fracture events were recorded (29.3 events/1000 person-years). Compared with patients incident for ESRD in 1996, adjusted hip fracture rates increased until the 2004 cohort (+41%) and declined thereafter. Surgical treatment included internal fixation in 56%, partial hip replacement in 29%, and total hip replacement in 2%, which remained essentially unchanged over time; 30-day mortality after hip fracture declined from 20% (1996) to 16% (2009). CONCLUSIONS: Hip fracture incidence rates remain higher today than in patients reaching ESRD in 1996, despite multiple purported improvements in the management of CKD mineral bone disorder. Although recent declines in incidence and steady declines in associated short-term mortality are encouraging, hip fractures remain among the most common and consequential noncardiovascular complications of ESRD.
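The crude event rate quoted in the results is events divided by accumulated person-time, scaled to 1000 person-years. A minimal sketch with hypothetical counts:

```python
def rate_per_1000_py(events: int, person_years: float) -> float:
    """Crude incidence rate per 1000 person-years of follow-up."""
    return 1000.0 * events / person_years

# Hypothetical follow-up: 150 first hip fractures over 5120 person-years
print(round(rate_per_1000_py(150, 5120.0), 1))  # 29.3
```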

Abstract

BACKGROUND AND OBJECTIVES: Geographic and other variations in medical practices lead to differences in medical costs, often without a clear link to health outcomes. This work examined variation in the frequency of physician visits to patients receiving hemodialysis to measure the relative importance of provider practice patterns (including those patterns linked to geographic region) and patient health in determining visit frequency. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: This work analyzed a nationally representative 2006 database of patients receiving hemodialysis in the United States. A variation decomposition analysis of the relative importance of facility, geographic region, and patient characteristics (including demographics, socioeconomic status, and indicators of health status) in explaining physician visit frequency variation was conducted. Finally, the associations between facility, geographic, and patient characteristics and provider visit frequency were measured using multivariable regression. RESULTS: Patient characteristics accounted for only 0.9% of the total visit frequency variation. Accounting for case-mix differences, patients' hemodialysis facilities explained about 24.9% of visit frequency variation, of which 9.3% was explained by geographic region. Visit frequency was more closely associated with many facility and geographic characteristics than with indicators of health status. More recent dialysis initiation and recent hospitalization were associated with decreased visit frequency. CONCLUSIONS: In hemodialysis, provider visit frequency depends more on geography and facility location and characteristics than on patients' health status or acuity of illness. The magnitude of variation unrelated to patient health suggests that provider visit frequency practices do not reflect optimal management of patients on dialysis.

Abstract

Frequent hemodialysis can alter volume status, blood pressure, and the concentration of osmotically active solutes, each of which might affect residual kidney function (RKF). In the Frequent Hemodialysis Network Daily and Nocturnal Trials, we examined the effects of assignment to six compared with three-times-per-week hemodialysis on follow-up RKF. In both trials, baseline RKF was inversely correlated with number of years since onset of ESRD. In the Nocturnal Trial, 63 participants had non-zero RKF at baseline (mean urine volume 0.76 liter/day, urea clearance 2.3 ml/min, and creatinine clearance 4.7 ml/min). In those assigned to frequent nocturnal dialysis, these indices were all significantly lower at month 4 and were mostly so at month 12 compared with controls. In the frequent dialysis group, urine volume had declined to zero in 52% and 67% of patients at months 4 and 12, respectively, compared with 18% and 36% in controls. In the Daily Trial, 83 patients had non-zero RKF at baseline (mean urine volume 0.43 liter/day, urea clearance 1.2 ml/min, and creatinine clearance 2.7 ml/min). Here, treatment assignment did not significantly influence follow-up levels of the measured indices, although the range in baseline RKF was narrower, potentially limiting power to detect differences. Thus, frequent nocturnal hemodialysis appears to promote a more rapid loss of RKF, the mechanism of which remains to be determined. Whether RKF also declines with frequent daily treatment could not be determined.
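The residual urea and creatinine clearances reported in the trials come from timed urine collections via the standard clearance formula C = (U × V) / P. A minimal sketch (all example values hypothetical):

```python
def renal_clearance_ml_min(urine_conc: float, urine_volume_ml: float,
                           collection_min: float, plasma_conc: float) -> float:
    """Renal clearance (mL/min) from a timed urine collection:
    C = U x V / P, with urine and plasma concentrations in the same units."""
    urine_flow_ml_min = urine_volume_ml / collection_min
    return urine_conc * urine_flow_ml_min / plasma_conc

# Hypothetical 24-h collection (1440 min): 760 mL urine, urine urea 300 mg/dL,
# plasma urea 60 mg/dL -> residual urea clearance ~2.6 mL/min
print(round(renal_clearance_ml_min(300.0, 760.0, 1440.0, 60.0), 1))  # 2.6
```

Both the clearance and the raw urine volume decline toward zero as residual kidney function is lost, which is why the trials track each separately.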

Abstract

The authors sought to evaluate the cost-effectiveness of statins for primary prevention of myocardial infarction (MI) and stroke in patients with chronic kidney disease (CKD). Patients with CKD have an elevated risk of MI and stroke. Although HMG-CoA reductase inhibitors ("statins") may prevent cardiovascular events in patients with non-dialysis-requiring CKD, adverse drug effects and competing risks could materially influence net effects and clinical decision-making. We developed a decision-analytic model of CKD and cardiovascular disease (CVD) to determine the cost-effectiveness of low-cost generic statins for primary CVD prevention in men and women with hypertension and mild-to-moderate CKD. Outcomes included MI and stroke rates, discounted quality-adjusted life-years (QALYs) and lifetime costs (2010 USD), and incremental cost-effectiveness ratios. For 65-year-old men with moderate hypertension and mild-to-moderate CKD, statins reduced the combined rate of MI and stroke, yielded 0.10 QALYs, and increased costs by $1,800 ($18,000 per QALY gained). For patients with lower baseline cardiovascular risks, health and economic benefits were smaller; for 65-year-old women, statins yielded 0.06 QALYs and increased costs by $1,900 ($33,400 per QALY gained). Results were sensitive to rates of rhabdomyolysis and drug costs. Statins are less cost-effective when obtained at average retail prices, particularly in patients at lower CVD risk. Although statins reduce absolute CVD risk in patients with CKD, the increased risk of rhabdomyolysis and the competing risks associated with progressive CKD partly offset these gains. Low-cost generic statins appear cost-effective for primary prevention of CVD in patients with mild-to-moderate CKD and hypertension.

Abstract

Despite high mortality and low levels of physical activity (PA) among patients starting dialysis, the link between low PA and mortality has not been carefully evaluated. The Comprehensive Dialysis Study was a prospective cohort study that enrolled patients who started dialysis between June 2005 and June 2007 in a random sample of dialysis facilities in the United States. The Human Activity Profile (HAP) was administered to estimate PA among 1554 ambulatory patients enrolled in the Comprehensive Dialysis Study. Patients were followed until death or September 30, 2009, and the major outcome was all-cause mortality. The average age was 59.8 (SD 14.2) years; 55% of participants were male, 28% were black, and 56% had diabetes mellitus. The majority (57.3%) had low fitness estimated from the HAP score. The median follow-up was 2.6 (interquartile range, 2.2-3.1) years. The association between PA and mortality was linear across the range of scores (1-94). After multivariable adjustment, lower adjusted activity score on the HAP was associated with higher mortality (hazard ratio, 1.30; 95% confidence interval, 1.23-1.39 per 10 points). Patients in the lowest level of fitness experienced a 3.5-fold (95% confidence interval, 2.54-4.89) increase in risk of death compared with those with average or above fitness. Low levels of PA are strongly associated with mortality among patients new to dialysis. Interventions aimed at preserving or enhancing PA should be prospectively tested.

Abstract

Chronic kidney disease (CKD) associated with type 2 diabetes mellitus constitutes a global epidemic complicated by considerable renal and cardiovascular morbidity and mortality, despite the provision of inhibitors of the renin-angiotensin-aldosterone system (RAAS). Bardoxolone methyl, a synthetic triterpenoid that reduces oxidative stress and inflammation through Nrf2 activation and inhibition of NF-κB, was previously shown to increase estimated glomerular filtration rate (eGFR) in patients with CKD associated with type 2 diabetes mellitus. To date, no antioxidant or anti-inflammatory therapy has proved successful at slowing the progression of CKD. Herein, we describe the design of Bardoxolone Methyl Evaluation in Patients with Chronic Kidney Disease and Type 2 Diabetes: the Occurrence of Renal Events (BEACON), a multinational, multicenter, double-blind, randomized, placebo-controlled Phase 3 trial designed to determine whether long-term administration of bardoxolone methyl (on a background of standard therapy, including RAAS inhibitors) safely reduces renal and cardiac morbidity and mortality. The primary composite endpoint is time to first occurrence of either end-stage renal disease or cardiovascular death. Secondary endpoints include the change in eGFR and time to occurrence of cardiovascular events. BEACON will be the first event-driven trial to evaluate the effect of an oral antioxidant and anti-inflammatory drug in advanced CKD.

Abstract

The development of acute kidney injury (AKI) after cardiac surgery is associated with significant mortality, morbidity, and cost. The last decade has seen major changes in the complexity of cardiac surgical candidates and in the number and type of cardiac surgical procedures being performed. Using data from the Nationwide Inpatient Sample, we determined the annual rates of AKI, AKI requiring dialysis (AKI-D), and inpatient mortality after cardiac surgery in the United States in the years 1999 through 2008. Inpatient mortality with AKI and AKI-D decreased from 27.9% and 45.9%, respectively, in 1999 to 12.8% and 35.3%, respectively, in 2008. Compared with 1999, the odds of AKI and AKI-D in 2008, adjusted for demographic and clinical factors, were 3.30 (95% confidence interval [CI]: 2.89 to 3.77) and 2.23 (95% CI: 1.78 to 2.80), respectively. Corresponding adjusted odds of death associated with AKI and AKI-D were 0.31 (95% CI: 0.26 to 0.36) and 0.47 (95% CI: 0.34 to 0.65), respectively. Taken together, the attributable risks for death after cardiac surgery associated with AKI and AKI-D increased from 30% and 5%, respectively, in 1999 to 47% and 14%, respectively, in 2008. In sum, despite improvements in individual patient outcomes over the decade 1999 to 2008, the population contribution of AKI and AKI-D to inpatient mortality after surgery increased over the same period.

Abstract

Symptoms of sleep and mood disturbances are common among patients on dialysis and are associated with significant decrements in survival and health-related quality of life. We used data from the Comprehensive Dialysis Study (CDS) to examine the association of self-reported physical activity with self-reported symptoms of insomnia, restless legs syndrome (RLS), and depression in patients new to dialysis. The CDS collected data on physical activity, functional status, and health-related quality of life from 1678 patients on either peritoneal (n = 169) or hemodialysis (n = 1509). The Human Activity Profile was used to measure self-reported physical activity. Symptoms were elicited in the following manner: insomnia using three questions designed to capture difficulty in initiating or maintaining sleep, RLS using three questions based on the National Institutes of Health workshop, and depression using the two-item Patient Health Questionnaire. We obtained data on symptoms of insomnia and depression for 1636, and on symptoms of RLS for 1622 (>98%) patients. Of these, 863 (53%) reported one of three insomnia symptoms as occurring at a persistent frequency. Symptoms of RLS and depression occurred in 477 (29%) and 451 (28%) of patients, respectively. The Adjusted Activity Score of the Human Activity Profile was inversely correlated with all three conditions in models adjusting for demographics, comorbid conditions, and laboratory variables. Sleep and mood disturbances were commonly reported in our large, diverse cohort of patients new to dialysis. Patients who reported lower levels of physical activity were more likely to report symptoms of insomnia, RLS, and depression.

Abstract

Impaired kidney function is an established predictor of mortality after acute nonvariceal upper gastrointestinal bleeding (ANVUGIB); however, which factors are associated with mortality after ANVUGIB among patients undergoing dialysis is unknown. We examined the associations of demographic characteristics, dialysis-specific features, and comorbid conditions with short-term mortality after ANVUGIB among patients on dialysis. Design: Retrospective cohort study. Setting: United States Renal Data System (USRDS), a nation-wide registry of patients with end-stage renal disease. Participants: All ANVUGIB episodes identified by validated algorithms in Medicare-covered patients between 2003 and 2007. Measurements: Demographic characteristics and comorbid conditions from 1 year of billing claims prior to each bleeding event. We used logistic regression extended with generalized estimating equations methods to model the associations between risk factors and 30-day mortality following ANVUGIB events. From 2003 to 2007, we identified 40,016 eligible patients with 50,497 episodes of ANVUGIB. Overall 30-day mortality was 10.7% (95% CI: 10.4-11.0). Older age, white race, longer dialysis vintage, peritoneal dialysis (vs. hemodialysis), and hospitalized (vs. outpatient) episodes were independently associated with a higher risk of 30-day mortality. Most but not all comorbid conditions were associated with death after ANVUGIB. The joint ability of all factors captured to discriminate mortality was modest (c=0.68). We identified a profile of risk factors for 30-day mortality after ANVUGIB among patients on dialysis that was distinct from what had been reported in non-dialysis populations. Specifically, peritoneal dialysis and more years since initiation of dialysis were independently associated with short-term death after ANVUGIB.

Abstract

Disorders of mineral metabolism, including secondary hyperparathyroidism, are thought to contribute to extraskeletal (including vascular) calcification among patients with chronic kidney disease. It has been hypothesized that treatment with the calcimimetic agent cinacalcet might reduce the risk of death or nonfatal cardiovascular events in such patients. In this clinical trial, we randomly assigned 3883 patients with moderate-to-severe secondary hyperparathyroidism (median level of intact parathyroid hormone, 693 pg per milliliter [10th to 90th percentile, 363 to 1694]) who were undergoing hemodialysis to receive either cinacalcet or placebo. All patients were eligible to receive conventional therapy, including phosphate binders, vitamin D sterols, or both. The patients were followed for up to 64 months. The primary composite end point was the time until death, myocardial infarction, hospitalization for unstable angina, heart failure, or a peripheral vascular event. The primary analysis was performed on the basis of the intention-to-treat principle. The median duration of study-drug exposure was 21.2 months in the cinacalcet group, versus 17.5 months in the placebo group. The primary composite end point was reached in 938 of 1948 patients (48.2%) in the cinacalcet group and 952 of 1935 patients (49.2%) in the placebo group (relative hazard in the cinacalcet group vs. the placebo group, 0.93; 95% confidence interval, 0.85 to 1.02; P=0.11). Hypocalcemia and gastrointestinal adverse events were significantly more frequent in patients receiving cinacalcet. In an unadjusted intention-to-treat analysis, cinacalcet did not significantly reduce the risk of death or major cardiovascular events in patients with moderate-to-severe secondary hyperparathyroidism who were undergoing dialysis. (Funded by Amgen; EVOLVE ClinicalTrials.gov number, NCT00345839.)

Abstract

Inflammation and oxidative stress are hallmarks and mediators of the progression of CKD. Bardoxolone methyl, a potent activator of the nuclear factor erythroid 2-related factor 2 (Nrf2)-mediated antioxidant and anti-inflammatory response, increases estimated GFR and decreases BUN, serum phosphorus, and uric acid concentrations in patients with moderate to severe CKD. However, it also increases albuminuria, which is associated with inflammation and disease progression. Therefore, we investigated whether this bardoxolone methyl-induced albuminuria may result from the downregulation of megalin, a protein involved in the tubular reabsorption of albumin and lipid-bound proteins. Administration of bardoxolone methyl to cynomolgus monkeys significantly decreased the protein expression of renal tubular megalin, which inversely correlated with the urine albumin-to-creatinine ratio. Moreover, daily oral administration of bardoxolone methyl to monkeys for 1 year did not lead to any adverse effects on renal histopathologic findings but did reduce serum creatinine and BUN, as observed in patients with CKD. Finally, the bardoxolone methyl-induced decrease in megalin corresponded with pharmacologic induction of renal Nrf2 targets, including NAD(P)H:quinone oxidoreductase 1 enzyme activity and glutathione content. This result indicates that Nrf2 may have a role in megalin regulation. In conclusion, these data suggest that the increase in albuminuria that accompanies bardoxolone methyl administration may result, at least in part, from reduced expression of megalin, which seems to occur without adverse effects and with strong induction of Nrf2 targets.

Abstract

Recent studies have focused on the association between dialysate sodium (Na+) prescriptions and interdialytic weight gain (IDWG). We report on a case series of 13 patients undergoing conventional, thrice-weekly in-center hemodialysis with an individualized dialysate Na+ prescription. Individualized dialysate Na+ was achieved in all patients through a stepwise weekly reduction of the standard dialysate Na+ prescription (140 mEq/L) by 2-3 mEq/L until reaching a Na+ gradient of -2 mEq/L (dialysate Na+ minus average plasma Na+ over the preceding 3 months). Interdialytic weight gain, with and without indexing to dry weight (IDWG%), blood pressure, and the proportion of treatments with cramps, intradialytic hypotension (drop in systolic blood pressure >30 mmHg), and intradialytic hypotension requiring an intervention were reviewed. At the beginning of the observation period, the pre-hemodialysis (HD) plasma Na+ concentration ranged from 130 to 141 mEq/L. When switched from the standard to the individualized dialysate Na+ concentration, IDWG% decreased from 3.4% ± 1.6% to 2.5% ± 1.0% (P = 0.003) with no change in pre- or post-HD systolic or diastolic blood pressures (all P > 0.05). We found no significant change in the proportion of treatments with cramps (6% vs. 13%), intradialytic hypotension (62% vs. 65%), or intradialytic hypotension requiring an intervention (29% vs. 33%). Individualized reduction of dialysate Na+ reduces IDWG% without significantly increasing the frequency of cramps or hypotension.
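The titration rule described above (step down weekly from the standard 140 mEq/L until the dialysate-minus-plasma gradient reaches -2 mEq/L) can be sketched as a simple schedule. This is illustrative only, assuming a fixed 2 mEq/L weekly decrement (the report allowed 2-3 mEq/L); the function and argument names are not from the study:

```python
def dialysate_na_schedule(avg_plasma_na_meq_l, start=140, weekly_step=2):
    """Weekly dialysate Na+ prescriptions, stepping down from the standard
    140 mEq/L until the Na+ gradient (dialysate minus average plasma Na+)
    reaches -2 mEq/L."""
    target = avg_plasma_na_meq_l - 2  # gradient of -2 mEq/L
    schedule = [start]
    while schedule[-1] > target:
        # Decrease weekly, but never overshoot the individualized target.
        schedule.append(max(schedule[-1] - weekly_step, target))
    return schedule

# A patient whose 3-month average plasma Na+ is 136 mEq/L would step
# 140 -> 138 -> 136 -> 134 mEq/L over successive weeks.
```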

Abstract

The lack of reliable human proxies for minor (ie, non-HLA) histocompatibility loci hampers the ability to leverage these factors toward improving transplant outcomes. Despite conflicting reports of the effect of donor-recipient sex mismatch on renal allografts, the association between acute rejection of renal allografts and the development of human alloantibodies to the male H-Y antigen suggested to us that donor-recipient sex mismatch deserved re-evaluation. To evaluate whether the relationships between donor sex and allograft failure differed by recipient sex, we studied recipients of deceased-donor (n = 125,369) and living-donor (n = 63,139) transplants in the United States Renal Data System. Using Cox proportional hazards models stratified by donor type, we estimated the association between donor-recipient sex mismatch and death-censored allograft failure with adjustment for known risk factors, with and without the use of multiple imputation methods to account for potential bias and/or loss of efficiency due to missing data. The advantage afforded by male donor kidneys was more pronounced among male than among female recipients (8% vs 2% relative risk reduction; interaction P < 0.01). This difference is of the order of magnitude of several other risk factors affecting donor selection decisions. Donor-recipient sex mismatch affects renal allograft survival in a direction consistent with immune responses to sexually determined minor histocompatibility antigens. Our study provides a paradigm for clinical detection of markers for minor histocompatibility loci.

Abstract

Some propose using phosphate binders in the CKD population given the association between higher levels of phosphorus and mortality, but their safety and efficacy in this population are not well understood. Here, we aimed to determine the effects of phosphate binders on parameters of mineral metabolism and vascular calcification among patients with moderate to advanced CKD. We randomly assigned 148 patients with estimated GFR=20-45 ml/min per 1.73 m(2) to calcium acetate, lanthanum carbonate, sevelamer carbonate, or placebo. The primary endpoint was change in mean serum phosphorus from baseline to the average of months 3, 6, and 9. Serum phosphorus decreased from a baseline mean of 4.2 mg/dl in both active and placebo arms to 3.9 mg/dl with active therapy and 4.1 mg/dl with placebo (P=0.03). Phosphate binders, but not placebo, decreased mean 24-hour urine phosphorus by 22%. Median serum intact parathyroid hormone remained stable with active therapy and increased with placebo (P=0.002). Active therapy did not significantly affect plasma C-terminal fibroblast growth factor 23 levels. Active therapy did, however, significantly increase calcification of the coronary arteries and abdominal aorta (coronary: median increases of 18.1% versus 0.6%, P=0.05; abdominal aorta: median increases of 15.4% versus 3.4%, P=0.03). In conclusion, phosphate binders significantly lower serum and urinary phosphorus and attenuate progression of secondary hyperparathyroidism among patients with CKD who have normal or near-normal levels of serum phosphorus; however, they also promote the progression of vascular calcification. The safety and efficacy of phosphate binders in CKD remain uncertain.

Abstract

In light of the recent trend toward earlier dialysis initiation and its association with mortality among patients with end-stage renal disease, we hypothesized that frailty is associated with higher estimated glomerular filtration rate (eGFR) at dialysis start and may confound the relation between earlier dialysis initiation and mortality. We examined frailty among participants of the Comprehensive Dialysis Study (CDS), a special study of the US Renal Data System, which enrolled incident patients from September 1, 2005, through June 1, 2007. Patients were followed for vital status through September 30, 2009, and for time to first hospitalization through December 31, 2008. We used multivariate logistic regression to model the association of frailty with eGFR at dialysis start and proportional hazards regression to assess the outcomes of death or hospitalization. Among 1576 CDS participants included, the prevalence of frailty was 73%. In multivariate analysis, higher eGFR at dialysis initiation was associated with higher odds of frailty (odds ratio [OR], 1.44 [95% CI, 1.23-1.68] per 5 mL/min/1.73 m²; P < .001). Frailty was independently associated with mortality (hazard ratio [HR], 1.57 [95% CI, 1.25-1.97]; P < .001) and time to first hospitalization (HR, 1.26 [95% CI, 1.09-1.45]; P < .001). While higher eGFR at dialysis initiation was associated with mortality (HR, 1.12 [95% CI, 1.02-1.23] per 5 mL/min/1.73 m²; P = .02), the association was no longer statistically significant after frailty was accounted for (HR, 1.08 [95% CI, 0.98-1.19] per 5 mL/min/1.73 m²; P = .11). Frailty is extremely common among patients starting dialysis in the United States and is associated with higher eGFR at dialysis initiation. Recognition of signs and symptoms of frailty by clinicians may prompt earlier initiation of dialysis and may explain, at least in part, the well-described association between eGFR at dialysis initiation and mortality.

Abstract

Secondary hyperparathyroidism (sHPT) and other abnormalities associated with chronic kidney disease-mineral bone disorder can contribute to dystrophic (including vascular) calcification. Dietary modification and a variety of medications can be used to attenuate the severity of sHPT. However, it is unknown whether any of these approaches can reduce the high risks of death and cardiovascular disease in patients with end-stage renal disease. The Evaluation of Cinacalcet HCl Therapy to Lower Cardiovascular Events (EVOLVE) trial was designed to test the hypothesis that treatment with the calcimimetic agent cinacalcet compared with placebo (on a background of conventional therapy including phosphate binders +/- vitamin D sterols) reduces time to death or non-fatal cardiovascular events (specifically myocardial infarction, unstable angina, heart failure and peripheral arterial disease events) among patients on hemodialysis with sHPT. This report describes baseline characteristics of enrolled subjects with a focus on regional variation. There were 3883 subjects randomized from 22 countries, including the USA, Canada, Australia, three Latin American nations, Russia and 15 European nations. The burden of overt cardiovascular disease at baseline was high (e.g. myocardial infarction 12.4%, heart failure 23.3%). The median plasma parathyroid hormone concentration at baseline was 692 pg/mL (10th to 90th percentile, 363-1694 pg/mL). At baseline, 87.2% of subjects were prescribed phosphate binders and 57.5% were prescribed activated vitamin D derivatives. Demographic data, comorbid conditions and baseline laboratory data varied significantly across regions. EVOLVE enrolled 3883 subjects on hemodialysis with moderate to severe sHPT. Inclusion of subjects from multiple global regions with varying degrees of disease severity will enhance the external validity of the trial results.

Abstract

This study examined the associations between homelessness and clinical outcomes of CKD among adults from the urban healthcare safety net. This retrospective cohort study examined 15,343 adults with CKD stages 3-5 who received ambulatory care during 1996-2005 from the Community Health Network of San Francisco. Main outcome measures were time to ESRD or death and frequency of emergency department visits and hospitalizations. Overall, 858 persons (6%) with CKD stages 3-5 were homeless. Homeless adults were younger, were disproportionately male and uninsured, and suffered from far higher rates of depression and substance abuse compared with adults with stable housing (P<0.001 for all comparisons). Over a median follow-up of 2.8 years (interquartile range=1.4-6.1), homeless adults experienced significantly higher crude risk of ESRD or death (hazard ratio=1.82, 95% confidence interval=1.49-2.22) compared with housed adults. This elevated risk was attenuated but remained significantly higher (adjusted hazard ratio=1.28, 95% confidence interval=1.04-1.58) after controlling for differences in sociodemographics, comorbid conditions, and laboratory variables. Homeless adults were also far more likely to use acute care services (median [interquartile range] number of emergency department visits was 9 [4-20] versus 1 [0-4], P<0.001) than housed counterparts. Homeless adults with CKD suffer from increased morbidity and mortality and use costly acute care services far more frequently than peers who are stably housed. These findings warrant additional inquiry into the unmet health needs of the homeless with CKD to provide appropriate and effective care to this disadvantaged group.

Abstract

We investigated the effects of frequency of hemodialysis on nutritional status by analyzing the data in the Frequent Hemodialysis Network Trial. We compared changes in albumin, body weight, and composition among 245 patients randomized to six or three times per week in-center hemodialysis (Daily Trial) and 87 patients randomized to six times per week nocturnal or three times per week conventional hemodialysis, performed largely at home (Nocturnal Trial). In the Daily Trial, there were no significant differences between groups in changes in serum albumin or the equilibrated protein catabolic rate by 12 months. There was a significant relative decrease in predialysis body weight of 1.5 ± 0.2 kg in the six times per week group at 1 month, but this significantly rebounded by 1.3 ± 0.5 kg over the remaining 11 months. Extracellular water (ECW) decreased in the six times per week compared with the three per week hemodialysis group. There were no significant between-group differences in phase angle, intracellular water, or body cell mass (BCM). In the Nocturnal Trial, there were no significant between-group differences in any study parameter. Any gain in 'dry' body weight corresponded to increased adiposity rather than muscle mass but was not statistically significant. Thus, frequent in-center hemodialysis reduced ECW but did not increase serum albumin or BCM while frequent nocturnal hemodialysis yielded no net effect on parameters of nutritional status or body composition.

Abstract

The Centers for Medicare and Medicaid Services (CMS) Medical Evidence Report (form CMS-2728) queries providers about the timing of the patient's first nephrologist consultation before initiation of dialysis. The monitoring of disease-specific goals in the Healthy People 2020 initiative will use information from this question, but the accuracy of the reported information is unknown. We defined a cohort of 80,509 patients aged ≥67 years who initiated dialysis between July 2005 and December 2008 with ≥2 years of uninterrupted Medicare coverage as their primary payer. The primary referent, determined from claims data, was the first observed outpatient nephrologist consultation; secondary analyses used the earliest nephrology consultation, whether inpatient or outpatient. We used linear regression models to assess the associations between the magnitude of discrepant reporting and patient characteristics, and we tested for any temporal trends. When using the earliest recorded outpatient nephrology encounter, agreement between the two sources of ascertainment was 48.2%, and the κ statistic was 0.29 when we categorized the timing of the visit into four periods (never, <6, 6-12, and >12 months). When we dichotomized the timing of first predialysis nephrology care at >12 or ≤12 months, accuracy was 70% (κ=0.36), but it differed by patient characteristics and declined over time. In conclusion, we found substantial disagreement between information from the CMS Medical Evidence Report and Medicare physician claims on the timing of first predialysis nephrologist care. More-specific instructions may improve reporting and increase the utility of form CMS-2728 for research and public health surveillance.

Abstract

There is no consensus on the optimal method to measure delivered dialysis dose in patients with acute kidney injury (AKI). The use of direct dialysate-side quantification of dose in preference to formal blood-based urea kinetic modeling and simplified blood urea nitrogen (BUN) methods has been recommended for dose assessment in critically ill patients with AKI. We evaluated six different blood-side and dialysate-side methods for dose quantification. We examined data from 52 critically ill patients with AKI requiring dialysis. All patients were treated with pre-dilution CVVHDF and regional citrate anticoagulation. Delivered dose was calculated using blood-side and dialysate-side kinetics. Filter function was assessed during the entire course of therapy by calculating BUN to dialysis fluid urea nitrogen (FUN) ratios every 12 hours. Median daily treatment time was 1,413 min (1,260-1,440). The median observed effluent volume per treatment was 2,355 mL/h (2,060-2,863) (p<0.001). Urea mass removal rate was 13.0 ± 7.6 mg/min. Both EKR (r²=0.250; p<0.001) and KD (r²=0.409; p<0.001) showed a good correlation with actual solute removal. EKR and KD presented a decline in their values that was related to the decrease in filter function assessed by the FUN/BUN ratio. Effluent rate (mL/kg/h) can only empirically provide an estimate of the delivered dose in CRRT. For clinical practice, we recommend that the delivered dose should be measured and expressed as KD. EKR also constitutes a good method for dose comparisons over time and across modalities.
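For orientation, EKR (equivalent renal urea clearance) is conventionally defined as the net urea mass removal rate divided by the time-averaged urea concentration. The sketch below illustrates that textbook definition only, not the study's exact computation; the 50 mg/dL time-averaged BUN is an assumed value for the example, not a figure from the abstract:

```python
def ekr_ml_per_min(urea_removal_mg_per_min, tac_bun_mg_per_dl):
    """Equivalent renal clearance (mL/min): urea mass removal rate
    divided by the time-averaged blood urea nitrogen concentration."""
    tac_mg_per_ml = tac_bun_mg_per_dl / 100.0  # convert mg/dL -> mg/mL
    return urea_removal_mg_per_min / tac_mg_per_ml

# At the reported mean removal rate of 13.0 mg/min and an assumed
# time-averaged BUN of 50 mg/dL, EKR would be 26 mL/min.
```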

Abstract

AKI is an important clinical problem that has become increasingly common. Mortality rates associated with AKI remain high despite advances in supportive care. Patients surviving AKI have increased long-term mortality and appear to be at increased risk of developing CKD and progressing to ESRD. No proven effective pharmacologic therapies are currently available for the prevention or treatment of AKI. Advances in addressing this unmet need will require the development of novel therapeutic agents based on precise understanding of key pathophysiological events and the implementation of well designed clinical trials. To address this need, the National Institute of Diabetes and Digestive and Kidney Diseases sponsored the "Clinical Trials in Acute Kidney Injury: Current Opportunities and Barriers" workshop in December 2010. The event brought together representatives from academia, industry, the National Institutes of Health, and the US Food and Drug Administration. We report the discussions of workgroups that developed outlines of clinical trials for the prevention of AKI in two patient populations: patients undergoing elective surgery who are at risk for or who develop AKI, and patients who are at risk for contrast-induced AKI. In both of these populations, primary prevention or secondary therapy can be delivered at an optimal time relative to kidney injury. The workgroups detailed primary and secondary endpoints for studies in these groups, and explored the use of adaptive clinical trial designs for trials of novel preventive strategies to improve outcomes of patients with AKI.

Abstract

AKI remains an important clinical problem, with a high mortality rate, increasing incidence, and no Food and Drug Administration-approved therapeutics. Advances in addressing this clinical need require approaches for rapid diagnosis and stratification of injury, development of therapeutic agents based on precise understanding of key pathophysiological events, and implementation of well designed clinical trials. In the near future, AKI biomarkers may facilitate trial design. To address these issues, the National Institute of Diabetes and Digestive and Kidney Diseases sponsored a meeting, "Clinical Trials in Acute Kidney Injury: Current Opportunities and Barriers," in December of 2010 that brought together academic investigators, industry partners, and representatives from the National Institutes of Health and the Food and Drug Administration. Important issues in the design of clinical trials for interventions in AKI in patients with sepsis or AKI in the setting of critical illness after surgery or trauma were discussed. The sepsis working group discussed use of severity of illness scores and focus on patients with specific etiologies to enhance homogeneity of trial participants. The group also discussed endpoints congruent with those endpoints used in critical care studies. The second workgroup emphasized difficulties in obtaining consent before admission and collaboration among interdisciplinary healthcare groups. Despite the difficult trial design issues, these clinical situations represent a clinical opportunity because of the high event rates, severity of AKI, and poor outcomes. The groups considered trial design issues and discussed advantages and disadvantages of several short- and long-term primary endpoints in these patients.

Abstract

The US National Institutes of Health (NIH) and Centers for Medicare and Medicaid Services (CMS) sponsored a randomized clinical trial comparing six versus three times per week in-center hemodialysis (the Frequent Hemodialysis Network [FHN] Daily Trial) to test the effects of frequent hemodialysis on an array of intermediate outcomes. Herein we report challenges to enrollment and randomization into the trial. Screening and enrollment were tracked at all participating dialysis clinics, and specific reasons for dropout after baseline assessment were recorded for all enrolled subjects. Reasons for consent refusal were recorded in a subset of sites (10 of 65). The trial screened 6276 patients receiving three-times-per-week hemodialysis in 65 dialysis clinics; 3481 (55%) were considered eligible for enrollment, and 3124 (90%) were approached for consent; 378 (12%) consented and 245 were randomized (65% of those enrolled). Prospective subjects chose not to participate primarily because of the anticipated time required for three extra treatments per week and the difficulties in following the protocol. Recruitment into the FHN Daily Trial proved challenging, but the goal of 250 randomized subjects was almost met.

Abstract

Acute kidney injury (AKI) remains a complex clinical problem associated with significant short-term morbidity and mortality and lacking effective pharmacologic interventions. Patients with AKI experience longer-term risks for progressive CKD and ESRD, which diminish patients' health-related quality of life and create a larger burden on the healthcare system. Although experimental models have yielded numerous promising agents, translation into clinical practice has been unsuccessful, possibly because of issues in clinical trial design, such as delayed drug administration, masking of therapeutic benefit by adverse events, and inadequate sample size. To address issues of clinical trial design, the National Institute of Diabetes and Digestive and Kidney Diseases sponsored a workshop titled "Clinical Trials in Acute Kidney Injury: Current Opportunities and Barriers" in December 2010. Workshop participants included representatives from academia, industry, and government agencies whose areas of expertise spanned basic science, clinical nephrology, critical care medicine, biostatistics, pharmacology, and drug development. This document summarizes the discussions of collaborative workgroups that addressed issues related to patient selection, study endpoints, the role of novel biomarkers, sample size and power calculations, and adverse events and pilot/feasibility studies in prevention and treatment of AKI. Companion articles outline the discussions of workgroups for model trials related to prevention or treatment of established AKI in different clinical settings, such as in patients with sepsis.

Abstract

AKI is an important public health issue. AKI is a common hospital complication associated with increased in-hospital and long-term mortality, extensive morbidity (including prolonged hospital length of stay), and an estimated annual cost of at least $10 billion in the United States. At present, no specific therapy has been developed to prevent AKI, hasten recovery of kidney function, or abrogate the deleterious systemic effects of AKI. However, recent progress includes establishing a consensus definition of AKI and discovery of novel biomarkers that may allow early detection of AKI. Furthermore, significant insights into the pathophysiology of AKI and its deleterious systemic effects have been gleaned from animal studies. Urgently needed are large, definitive randomized clinical trials testing interventions to prevent and/or treat AKI. This review summarizes and analyzes current ongoing clinical trials registered with clinicaltrials.gov that address prevention or management of AKI. The purpose of this review is to provide a resource for people interested in potential prophylactic and therapeutic approaches to patient care and investigators hoping to plan and execute the next round of randomized clinical trials. Finally, this review discusses research needs that are not addressed by the current clinical trials portfolio and suggests key areas for future research in AKI.

Abstract

Relatively little is known about the effects of hemodialysis frequency on the disability of patients with ESRD. This study examined changes in physical performance and self-reported physical health and functioning among subjects randomized to frequent (six times per week) compared with conventional (three times per week) hemodialysis in both the Frequent Hemodialysis Network daily (n=245) and nocturnal (n=87) trials. The main outcome measures were adjusted change in scores over 12 months on the short physical performance battery (SPPB), RAND 36-item health survey physical health composite (PHC), and physical functioning subscale (PF), based on the intention-to-treat principle. Overall scores for SPPB, PHC, and PF were poor relative to population norms and in line with other studies in ESRD. In the Daily Trial, subjects randomized to frequent compared with conventional in-center hemodialysis experienced no significant change in SPPB (adjusted mean change of -0.20±0.19 versus -0.41±0.21, P=0.45) but experienced significant improvement in PHC (3.4±0.8 versus 0.4±0.8, P=0.009) and a relatively large change in PF that did not reach statistical significance. In the Nocturnal Trial, there were no significant differences among subjects randomized to frequent compared with conventional hemodialysis in SPPB (adjusted mean change of -0.92±0.44 versus -0.41±0.43, P=0.41), PHC (2.7±1.4 versus 2.1±1.5, P=0.75), or PF (-3.1±3.5 versus 1.1±3.6, P=0.40). Frequent in-center hemodialysis compared with conventional in-center hemodialysis improved self-reported physical health and functioning but had no significant effect on objective physical performance. There were no significant effects of frequent nocturnal hemodialysis on the same physical metrics.

Abstract

More frequent hemodialysis sessions and longer session lengths may offer improved phosphorus control. We analyzed data from the Frequent Hemodialysis Network Daily and Nocturnal Trials to examine the effects of treatment assignment on predialysis serum phosphorus and on prescribed dose of phosphorus binder, expressed relative to calcium carbonate on a weight basis. In the Daily Trial, with prescribed session lengths of 1.5-2.75 hours six times per week, assignment to frequent hemodialysis was associated with both a 0.46 mg/dl decrease (95% confidence interval [95% CI], 0.13-0.78 mg/dl) in mean serum phosphorus and a 1.35 g/d reduction (95% CI, 0.20-2.50 g/d) in equivalent phosphorus binder dose at month 12 compared with assignment to conventional hemodialysis. In the Nocturnal Trial, with prescribed session lengths of 6-8 hours six times per week, assignment to frequent hemodialysis was associated with a 1.24 mg/dl decrease (95% CI, 0.68-1.79 mg/dl) in mean serum phosphorus compared with assignment to conventional hemodialysis. Among patients assigned to the group receiving six sessions per week, 73% did not require phosphorus binders at month 12 compared with only 8% of patients assigned to sessions three times per week (P<0.001). At month 12, 42% of patients on nocturnal hemodialysis required the addition of phosphorus into the dialysate to prevent hypophosphatemia. Frequent hemodialysis did not have major effects on calcium or parathyroid hormone concentrations in either trial. In conclusion, frequent hemodialysis facilitates control of hyperphosphatemia, and extended session lengths could allow more liberal diets and freedom from phosphorus binders.

Abstract

Patients with secondary hyperparathyroidism experience a variety of clinical symptoms that may adversely affect physical and mental function. As part of a multicenter, open-label clinical trial, subjects completed a questionnaire that included the Medical Outcomes Study Short Form-36 and 14 kidney disease-related symptoms at multiple time points during the study. Of the 567 subjects who received at least one dose of cinacalcet, 528 to 535 (93.8-94.4%) completed all or portions of the questionnaire at baseline. The median bioactive parathyroid hormone (PTH) was 294 pg/mL (10th-90th percentile range, 172-655 pg/mL). Following treatment with cinacalcet and low-dose vitamin D sterols, subjects reported significant improvement in the frequency of pain in muscles, joints, and bones; stiff joints; dry skin; itchy skin; excessive thirst; and trouble with memory. At the end of the efficacy assessment phase (Weeks 16 to 22), the magnitude of improvement was greatest for joint pain, bone pain, dry skin, and excessive thirst (>5 on a 0-100 scale; P < 0.001). There were no clinically or statistically significant changes in any of the Short Form-36 subscales or in the physical or mental health composite scores. Among patients on hemodialysis with moderate to severe secondary hyperparathyroidism, treatment with cinacalcet and low-dose vitamin D sterols results in significant improvement in pain in the muscles, joints, and bones; joint stiffness; dry and itchy skin; excessive thirst; and trouble with memory.

Abstract

Impaired kidney function is a risk factor for upper gastrointestinal (GI) bleeding, an event associated with poor outcomes. The burden of upper GI bleeding and its effect on patients with ESRD are not well described. Using data from the US Renal Data System, we quantified the rates of occurrence of and associated 30-day mortality from acute, nonvariceal upper GI bleeding in patients undergoing dialysis; we used medical claims and previously validated algorithms where available. Overall, 948,345 patients contributed 2,296,323 patient-years for study. The occurrence rates for upper GI bleeding were 57 and 328 episodes per 1000 person-years according to stringent and lenient definitions of acute, nonvariceal upper GI bleeding, respectively. Unadjusted occurrence rates remained flat (stringent) or increased (lenient) from 1997 to 2008; after adjustment for sociodemographic characteristics and comorbid conditions, however, we found a significant decline for both definitions (linear approximation, 2.7% and 1.5% per year, respectively; P<0.001). In more recent years, patients had higher hematocrit levels before upper GI bleeding episodes and were more likely to receive blood transfusions during an episode. Overall 30-day mortality was 11.8%, which declined significantly over time (relative declines of 2.3% or 2.8% per year for the stringent and lenient definitions, respectively). In summary, despite declining trends worldwide, crude rates of acute, nonvariceal upper GI bleeding among patients undergoing dialysis have not decreased in the past 10 years. Although 30-day mortality related to upper GI bleeding declined, perhaps reflecting improvements in medical care, the burden on the ESRD population remains substantial.

Abstract

An increase in left ventricular mass (LVM) is associated with mortality and cardiovascular morbidity in patients with end-stage renal disease. The Frequent Hemodialysis Network (FHN) Daily Trial randomized 245 patients to 12 months of 6 times per week daily in-center hemodialysis or conventional hemodialysis; the FHN Nocturnal Trial randomized 87 patients to 12 months of 6 times per week nocturnal hemodialysis or conventional hemodialysis. The main cardiac secondary outcome was change in LVM. In each trial, we examined whether several predefined baseline demographic or clinical factors as well as change in volume removal, blood pressure, or solute clearance influenced the effect of frequent hemodialysis on LVM. In the Daily Trial, frequent hemodialysis resulted in a significant reduction in LVM (13.1 g; 95% CI, 5.0-21.3 g; P=0.002), LVM index (6.9 g/m(2); 95% CI, 2.4-11.3 g/m(2); P=0.003), and percent change in geometric mean of LVM (7.0%; 95% CI, 1.0%-12.6%; P=0.02). Similar trends were noted in the Nocturnal Trial but did not reach statistical significance. In the Daily Trial, a more pronounced effect of frequent hemodialysis on LVM was evident among patients with left ventricular hypertrophy at baseline. Changes in LVM were associated with changes in blood pressure (conventional hemodialysis: R=0.28, P=0.01; daily hemodialysis: R=0.54, P<0.001) and were not significantly associated with changes in other parameters. Frequent in-center hemodialysis reduces LVM. The benefit of frequent hemodialysis on LVM may be mediated by salutary effects on blood pressure. Clinical Trial Registration: URL: http://www.clinicaltrials.gov. Unique identifier: NCT00264758.

Abstract

Patients on peritoneal dialysis experience inflammation associated with advanced chronic kidney disease and the therapy itself. An important consequence of the inflammation may be acceleration of the rate of decline in residual renal function. The decline in residual renal function has been associated with an increased mortality for patients in this population. Bardoxolone methyl is a synthetic triterpenoid. To date, the effects of bardoxolone methyl on kidney function in humans have been studied in patients with type 2 diabetes mellitus. A large-scale event-driven study of bardoxolone methyl in patients with type 2 diabetes mellitus with stage 4 chronic kidney disease is underway. The safety of bardoxolone methyl has not been evaluated in patients with more advanced (stage 5) chronic kidney disease or patients on dialysis. This report describes a proposed double blind, prospective evaluation of bardoxolone methyl in patients with type 2 diabetes mellitus receiving peritoneal dialysis. In addition to assessing the safety of bardoxolone methyl in this population, the study will evaluate the effect of bardoxolone methyl on residual renal function over 6 months as compared to placebo.

Abstract

The objective of this study was to determine the incidence of acute kidney injury (AKI) and its relation with mortality among hospitalized patients. We analyzed hospital discharge and laboratory data from an urban academic medical center over a 1-year period. We included hospitalized adult patients receiving two or more serum creatinine (sCr) measurements. We excluded prisoners, psychiatry, labor and delivery, and transferred patients, 'bedded outpatients,' as well as individuals with a history of kidney transplant or chronic dialysis. We defined AKI as (a) an increase in sCr of ≥0.3 mg/dl; (b) an increase in sCr to ≥150% of baseline; or (c) the initiation of dialysis in a patient with no known history of prior dialysis. We identified factors associated with AKI as well as the relationships between AKI and in-hospital mortality. Among the 19,249 hospitalizations included in the analysis, the incidence of AKI was 22.7%. Older persons, Blacks, and patients with reduced baseline kidney function were more likely to develop AKI (all p < 0.001). Among AKI cases, the most common primary admitting diagnosis groups were circulatory diseases (25.4%) and infection (16.4%). After adjustment for age, sex, race, admitting sCr concentration, and the severity of illness index, AKI was independently associated with in-hospital mortality (adjusted odds ratio 4.43, 95% confidence interval 3.68-5.35). AKI occurred in more than 1 of 5 hospitalizations and was associated with a more than fourfold increased likelihood of death. These observations highlight the importance of AKI recognition as well as the association of AKI with mortality in hospitalized patients.
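The study's three-part AKI definition can be sketched as a simple rule over serum creatinine values. The helper below is hypothetical (the study used claims and laboratory data, not this code), but the thresholds are the ones stated above:

```python
def meets_aki_criteria(baseline_scr, current_scr, new_dialysis=False):
    """Flag AKI per the study's definition (hypothetical helper).

    AKI if (a) sCr rises by >= 0.3 mg/dl over baseline,
    (b) sCr rises to >= 150% of baseline, or
    (c) dialysis is initiated with no prior dialysis history.
    sCr values are in mg/dl.
    """
    absolute_rise = current_scr - baseline_scr >= 0.3
    relative_rise = current_scr >= 1.5 * baseline_scr
    return absolute_rise or relative_rise or new_dialysis

# A rise from 1.0 to 1.3 mg/dl meets criterion (a);
# a rise from 0.5 to 0.75 mg/dl meets criterion (b).
print(meets_aki_criteria(1.0, 1.3))   # True
print(meets_aki_criteria(0.5, 0.75))  # True
print(meets_aki_criteria(1.0, 1.2))   # False
```

Note that criterion (b) matters at low baseline creatinine, where a clinically meaningful relative rise can be smaller than the 0.3 mg/dl absolute threshold.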

Abstract

Transplanted nephron mass is an important determinant of long-term allograft survival, but accurate assessment before organ retrieval is challenging. Newer radiologic imaging techniques allow for better determination of total kidney and cortical volumes. Using volume measurements reconstructed from magnetic resonance or computed tomography imaging from living donor candidates, we characterized total kidney (n=312) and cortical volumes (n=236) according to sex, age, weight, height, body mass index (BMI), and body surface area (BSA). The mean cortical volume was 204 mL (range 105-355 mL), with no significant differences between left and right cortical volumes. The degree to which existing anthropometric surrogates predict nephron mass was quantified, and a diligent attempt was made to derive a better surrogate model for nephron mass. Cortical volumes were strongly associated with sex and BSA, but not with weight, height, or BMI. Four prediction models for cortical volume constructed using combinations of age, sex, race, weight, and height were compared with models including either BSA or BMI. Among existing surrogate measures, BSA was superior to BMI in predicting renal cortical volume. We were able to construct a statistically superior proxy for cortical volume, but whether relevant improvements in predictive accuracy could be gained needs further evaluation in a larger population.

Abstract

As research has identified a wide array of biological functions of vitamin D, the consequences of vitamin D deficiency in persons with chronic kidney disease have attracted increased attention. The objective of this study was to determine the extent of 25-hydroxyvitamin D (25-OH vitamin D) deficiency and its associations with self-reported physical activity and health-related quality of life (HRQoL) among participants of the Comprehensive Dialysis Study (CDS). The nutrition substudy of the CDS enrolled patients new to dialysis from 68 dialysis units throughout the USA. Baseline 25-OH vitamin D concentration was measured using the Direct Enzyme Immunoassay (Immunodiagnostic Systems Inc.). Physical activity was measured with the Human Activity Profile (HAP); the Medical Outcomes Study Short Form-12 (SF-12) was employed to measure HRQoL. Mean age of the participants (n = 192) was 62 years. There were 124 participants (65%) with 25-OH vitamin D concentrations < 15 ng/mL, indicating deficiency, and 64 (33%) with 25-OH vitamin D ≥ 15 to < 30 ng/mL, indicating insufficiency. After adjusting for age, sex, race/ethnicity, diabetes, season, and center, lower 25-OH vitamin D concentrations were independently associated with lower scores on the HAP and on the Mental Component Summary of the SF-12 (P < 0.05 for both), but not with the Physical Component Summary of the SF-12. In a well-characterized cohort of incident dialysis patients, lower 25-OH vitamin D concentrations were associated with lower self-reported physical activity and poorer self-reported mental health.
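The deficiency and insufficiency cutoffs used in this study can be expressed as a small classifier. The helper below is hypothetical; only the < 15 ng/mL (deficiency) and 15 to < 30 ng/mL (insufficiency) thresholds come from the study:

```python
def vitamin_d_status(conc_ng_ml):
    """Categorize a 25-OH vitamin D concentration (ng/mL) by the
    study's cutoffs (hypothetical helper):
      < 15        -> deficient
      15 to < 30  -> insufficient
      >= 30       -> sufficient
    """
    if conc_ng_ml < 15:
        return "deficient"
    if conc_ng_ml < 30:
        return "insufficient"
    return "sufficient"

print(vitamin_d_status(12))  # deficient
print(vitamin_d_status(22))  # insufficient
print(vitamin_d_status(35))  # sufficient
```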

Abstract

Patients on dialysis maintain extremely low levels of physical activity. Prior studies have demonstrated a direct correlation between nutrition and physical activity but provide conflicting data on the link between inflammation and physical activity. Using a cohort of patients new to dialysis from the Comprehensive Dialysis Study (CDS), we examined associations of self-reported physical activity with laboratory markers of nutrition and inflammation. Between June 2005 and June 2007, CDS collected data on self-reported physical activity, nutrition, and health-related quality of life from patients starting dialysis in 296 facilities located throughout the United States. Baseline serum samples were collected from participants in a nutrition sub-study of CDS. Serum albumin and prealbumin were measured as markers of nutrition, and C-reactive protein (CRP) and α-1-acid glycoprotein as markers of inflammation. Self-reported physical activity was characterized by the maximum activity score (MAS) and adjusted activity score (AAS) of the Human Activity Profile. The mean age of participants in the analytic cohort (n = 201) was 61 years. The MAS and AAS were below the 10th and first percentile, respectively, in comparison with healthy 60-year-old norms. Both activity scores were directly correlated with albumin (r(2) = 0.3, P < .0001) and prealbumin (r(2) = 0.3, P < .0001), and inversely correlated with CRP (AAS: r(2) = -0.2, P = .01; MAS: r(2) = -0.1, P = .08). In multivariate analyses adjusting for age, gender, race/ethnicity, diabetes status, and center, both activity scores were directly correlated with prealbumin and inversely correlated with CRP. Patients new to dialysis with laboratory-based evidence of malnutrition and/or inflammation are likely to report lower levels of physical activity.

Abstract

Prior small studies have shown multiple benefits of frequent nocturnal hemodialysis compared to conventional three times per week treatments. To study this further, we randomized 87 patients to three times per week conventional hemodialysis or to nocturnal hemodialysis six times per week, all with single-use high-flux dialyzers. The 45 patients in the frequent nocturnal arm had a 1.82-fold higher mean weekly stdKt/V(urea), a 1.74-fold higher average number of treatments per week, and a 2.45-fold higher average weekly treatment time than the 42 patients in the conventional arm. We did not find a significant effect of nocturnal hemodialysis for either of the two coprimary outcomes: death or left ventricular mass (measured by MRI), with a hazard ratio of 0.68; and death or RAND Physical Health Composite, with a hazard ratio of 0.91. Possible explanations for the left ventricular mass result include limited sample size and patient characteristics. Secondary outcomes included cognitive performance, self-reported depression, laboratory markers of nutrition, mineral metabolism and anemia, blood pressure, and rates of hospitalization and vascular access interventions. Patients in the nocturnal arm had improved control of hyperphosphatemia and hypertension, but no significant benefit among the other main secondary outcomes. There was a trend for increased vascular access events in the nocturnal arm. Thus, we were unable to demonstrate a definitive benefit of more frequent nocturnal hemodialysis for either coprimary outcome.

Abstract

The proportion of prospective living donors disqualified for medical reasons is unknown. The objective of this study is to delineate and quantify specific reasons for exclusion of prospective living donors from kidney donation. All adult prospective kidney donors who contacted our transplant program between October 1, 2007 and April 1, 2009 were included in our analysis (n = 484). Data were collected by review of an electronic transplant database. Of the 484 prospective donors, 39 (8%) successfully donated, 229 (47%) were excluded, 104 (22%) were actively undergoing evaluation, and 112 (23%) were withdrawn before evaluation was complete. Criteria for exclusion were medical (n = 150), psychosocial (n = 22), or histocompatibility (n = 57) reasons. Of the 150 prospective donors excluded for medical reasons, 79% were excluded because of obesity, hypertension, nephrolithiasis, and/or abnormal glucose tolerance. One hundred and forty-seven (61%) intended recipients had only one prospective living donor, of whom 63 (42%) were excluded. A significant proportion of prospective living kidney donors were excluded for medical reasons such as obesity (body mass index >30), hypertension, nephrolithiasis, and abnormal glucose tolerance. Longer-term studies are needed to characterize the risks to medically complex kidney donors and the potential risks and benefits afforded to recipients.

Abstract

Acute kidney injury (AKI) requiring dialysis is associated with high mortality. Most prognostic tools used to describe case complexity and to project patient outcome lack predictive accuracy when applied in patients with AKI. In this study, we developed an AKI-specific predictive model for 60-day mortality and compared the model to the performance of two generic scores (Sequential Organ Failure Assessment [SOFA] and Acute Physiology and Chronic Health Evaluation II [APACHE II]) and a disease-specific (Cleveland Clinic [CCF]) score. We analyzed data from 1122 subjects enrolled in the Veterans Affairs/National Institutes of Health Acute Renal Failure Trial Network study, a multicenter randomized trial of intensive versus less intensive renal support in critically ill patients with AKI conducted between November 2003 and July 2007 at 27 VA- and university-affiliated centers. The 60-day mortality was 53%. Twenty-one independent predictors of 60-day mortality were identified. The logistic regression model exhibited good discrimination, with an area under the receiver operating characteristic (ROC) curve of 0.85 (0.83 to 0.88), and a derived integer risk score yielded a value of 0.80 (0.77 to 0.83). Existing scoring systems, including APACHE II, SOFA, and CCF, when applied to our cohort, showed relatively poor discrimination, reflected by areas under the ROC curve of 0.68 (0.64 to 0.71), 0.69 (0.66 to 0.73), and 0.65 (0.62 to 0.69), respectively. Our new risk model outperformed existing generic and disease-specific scoring systems in predicting 60-day mortality in critically ill patients with AKI. The current model requires external validation before it can be applied to other patient populations.

Abstract

Little is known about trends in the timing of first nephrology consultation and associated outcomes among older patients initiating dialysis. Data from patients aged 67 years or older who initiated dialysis in the United States between January 1, 1996, and December 31, 2006, were stratified by timing of the earliest identifiable nephrology visit. Trends of earlier nephrology consultation were formally examined in light of concurrently changing case mix and juxtaposed with trends in 1-year mortality rates after initiation of dialysis. Among 323,977 older patients initiating dialysis, the proportion of patients receiving nephrology care less than 3 months before initiation of dialysis decreased from 49.6% (in 1996) to 34.7% (in 2006). Patients initiated dialysis with increasingly preserved kidney function, from a mean estimated glomerular filtration rate of 8 mL/min/1.73 m(2) in 1996 to 12 mL/min/1.73 m(2) in 2006. Patients were less anemic in later years, which was partly attributable to increased use of erythropoiesis-stimulating agents, and fewer used peritoneal dialysis as the initial modality. During the same period, crude 1-year mortality rates remained unchanged (annual change in mortality rate, +0.2%; 95% confidence interval, 0% to +0.4%). Adjustment for changes in demographic and comorbidity patterns yielded estimated annual reductions in 1-year mortality rates of 0.9% (95% confidence interval, 0.7% to 1.1%), which were explained only partly by concurrent trends toward earlier nephrology consultation (annual mortality reduction after accounting for timing of nephrology care was attenuated to 0.4% [0.2% to 0.6%]). Despite significant trends toward earlier use of nephrology consultation among older patients approaching maintenance dialysis, we observed no material improvement in 1-year survival rates after dialysis initiation during the same time period.

Abstract

Identifying potentially modifiable risk factors for vascular access thrombosis in hemodialysis could reduce considerable morbidity and health care costs. We analyzed data from a subset of 1426 HEMO study subjects to determine whether more frequent intradialytic hypotension and/or lower predialysis systolic BP were associated with higher rates of vascular access thrombosis. Our primary outcome measure was episodes of vascular access thrombosis occurring within a given 6-month period during HEMO study follow-up. There were 2005 total episodes of vascular access thrombosis during a median 3.1 years of follow-up. The relative rate of thrombosis of native arteriovenous fistulas for the highest quartile of intradialytic hypotension was approximately twice that of the lowest quartile, independent of predialysis systolic BP and other covariates. There was no significant association of intradialytic hypotension with prosthetic arteriovenous graft thrombosis after multivariable adjustment. Higher predialysis systolic BP was associated with a lower rate of fistula and graft thrombosis, independent of intradialytic hypotension and other covariates. In conclusion, more frequent episodes of intradialytic hypotension and lower predialysis systolic BP associate with increased rates of vascular access thrombosis. These results underscore the importance of including vascular access patency in future studies of BP management in hemodialysis.

Abstract

Persons with end-stage renal disease (ESRD) on hemodialysis carry an exceptionally high burden of cardiovascular disease. Angiotensin-converting enzyme inhibitors (ACEIs) are recommended for patients on dialysis, but there are few data regarding their effectiveness in ESRD. We conducted a secondary analysis of results of the HEMO study, a randomized trial of dialysis dose and membrane flux in patients on maintenance hemodialysis. We focused on the nonrandomized exposure of ACEI use, using proportional hazards regression and a propensity score analysis. The primary outcome was all-cause mortality. Secondary outcomes examined in the present analysis were cardiovascular hospitalization, heart failure hospitalization, and the composite outcomes of death or cardiovascular hospitalization and death or heart failure hospitalization. In multivariable-adjusted analyses, there were no significant associations between ACEI use and mortality (hazard ratio 0.97, 95% CI 0.82-1.14), cardiovascular hospitalization, or either composite outcome. Angiotensin-converting enzyme inhibitor use was associated with a higher risk of heart failure hospitalization (hazard ratio 1.41, 95% CI 1.11-1.80). In the propensity score-matched cohort, ACEI use was not significantly associated with any outcomes, including heart failure hospitalization. In a well-characterized cohort of patients on maintenance hemodialysis, ACEI use was not significantly associated with mortality or cardiovascular morbidity. The higher risk of heart failure hospitalization associated with ACEI use may not only reflect residual confounding but also highlight gaps in evidence when applying treatments proven effective in the general population to patients with ESRD. Our results underscore the need for definitive trials in ESRD to inform the treatment of cardiovascular disease.

Abstract

Though much is known about the prognostic influence of acute kidney injury (AKI) in left-side heart failure, much less is known about AKI in patients with pulmonary arterial hypertension (PAH). We identified consecutive patients with PAH who were hospitalized at Stanford Hospital for acute right-side heart failure. AKI was diagnosed according to the criteria of the Acute Kidney Injury Network. From June 1999 to June 2009, 105 patients with PAH were hospitalized for acute right-side heart failure (184 hospitalizations). AKI occurred in 43 hospitalizations (23%) in 34 patients (32%). The odds of developing AKI were higher among patients with chronic kidney disease (odds ratio [OR] 3.9, 95% confidence interval [CI] 1.8-8.5), high central venous pressure (OR 1.8, 95% CI 1.1-2.4, per 5 mm Hg), and tachycardia on admission (OR 4.3, 95% CI 2.1-8.8). AKI was strongly associated with 30-day mortality after acute right-side heart failure hospitalization (OR 5.3, 95% CI 2.2-13.2). AKI is relatively common in patients with PAH and associated with a short-term risk of death.

Abstract

Infection and cardiovascular disease are leading causes of hospitalization and death in patients on dialysis. The objective of this study was to determine whether an infection-related hospitalization increased the short-term risk of a cardiovascular event in older patients on dialysis. With use of the United States Renal Data System, patients aged 65 to 100 years who started dialysis between January 1, 2000, and December 31, 2002, were examined. All hospitalizations were examined from study entry until time of transplant, death, or December 31, 2004. All discharge diagnoses were examined to determine if an infection occurred during hospitalization. Only principal discharge diagnoses were examined to ascertain cardiovascular events of interest. We used the self-controlled case-series method to estimate the relative incidence of a cardiovascular event within 90 days after an infection-related hospitalization as compared with other times not within 90 days of such a hospitalization. A total of 16,874 patients had at least one cardiovascular event and were included in the self-controlled case-series analysis. The risk of a cardiovascular event was increased by 25% in the first 30 days after an infection and was overall increased 18% in the 90 days after an infection-related hospitalization relative to control periods. The first 90 days, and in particular the first 30 days, after an infection-related hospitalization is a high-risk period for cardiovascular events and may be an important timeframe for cardiovascular risk reduction, monitoring, and intervention in older patients on dialysis.

Abstract

In the Hemodialysis (HEMO) Study, observed small decreases in achieved equilibrated Kt/V(urea) were noncausally associated with markedly increased mortality. Here we examine the association of mortality with modeled volume (V(m)), the denominator of equilibrated Kt/V(urea). Parameters derived from modeled urea kinetics (including V(m)) and blood pressure (BP) were obtained monthly in 1846 patients. Case mix-adjusted time-dependent Cox regressions were used to relate the relative mortality hazard at each time point to V(m) and to the change in V(m) over the preceding 6 months. Mixed effects models were used to relate V(m) to changes in intradialytic systolic BP and to other factors at each follow-up visit. Mortality was associated with V(m) and change in V(m) over the preceding 6 months. The association between change in V(m) and mortality was independent of vascular access complications. In contrast, mortality was inversely associated with V calculated from anthropometric measurements (V(ant)). In case mix-adjusted analysis using V(m) as a time-dependent covariate, the association of mortality with V(m) strengthened after statistical adjustment for V(ant). After adjustment for V(ant), higher V(m) was associated with slightly smaller reductions in intradialytic systolic BP and with risk factors for mortality including recent hospitalization and reductions in serum albumin concentration and body weight. An increase in V(m) is a marker for illness and mortality risk in hemodialysis patients.

Abstract

Contemporary studies have not comprehensively compared waiting times and determinants of deceased donor kidney transplantation across all major racial ethnic groups in the United States. Here, we compared relative rates and determinants of waitlisting and deceased donor kidney transplantation among 503,090 nonelderly adults of different racial ethnic groups who initiated hemodialysis between 1995 and 2006 with follow-up through 2008. Annual rates of deceased donor transplantation from the time of dialysis initiation were lowest in American Indians/Alaska Natives (2.4%) and blacks (2.8%), intermediate in Pacific Islanders (3.1%) and Hispanics (3.2%), and highest in whites (5.9%) and Asians (6.4%). Lower rates of deceased donor transplantation among most racial ethnic minority groups appeared primarily to reflect differences in time from waitlisting to transplantation, but this was not the result of higher rates of waitlist inactivity or removal from the waitlist. The fraction of the reduced transplant rates attributable to measured factors (e.g., demographic, clinical, socioeconomic, linguistic, and geographic factors) varied from 14% in blacks to 43% in American Indians/Alaska Natives compared with whites. In conclusion, adjusted rates of deceased donor kidney transplantation remain significantly lower among racial ethnic minorities compared with whites; generally, differences in time to waitlisting were not as pronounced as differences in time between waitlisting and transplantation. Determinants of delays in time to transplantation differed substantially by racial ethnic group. Area-based efforts targeted to address racial- and ethnic-specific delays in transplantation may help to reduce overall disparities in deceased donor kidney transplantation in the United States.

Abstract

Studies examining dose of continuous renal replacement therapy (CRRT) and outcomes have yielded conflicting results. Most studies considered the prescribed dose as the effluent rate represented by ml/kg per hour and reported this volume as a surrogate of solute removal. Because filter fouling can reduce the efficacy of solute clearance, the actual delivered dose may be substantially lower than the observed effluent rate. Data were examined from 52 critically ill patients with acute kidney injury (AKI) requiring dialysis. All patients were treated with predilution continuous venovenous hemodiafiltration (CVVHDF) and regional citrate anticoagulation. Filter performance was monitored during the entire course of therapy by measuring blood urea nitrogen (BUN) and dialysis fluid urea nitrogen (FUN) at initiation and every 12 hours. Filter efficacy was assessed by calculating FUN/BUN ratios every 12 hours of filter use. Prescribed urea clearance (K, ml/min) was determined from the effluent rate. Actual delivered urea clearance was determined using dialysis-side measurements. Median daily treatment time was 1413 minutes (1260 to 1440) with a total effluent volume of 46.4 ± 17.4 L and urea mass removal of 13.0 ± 7.6 mg/min. Prescribed clearance overestimated the actual delivered clearance by 23.8%. This gap between prescribed and delivered clearance was related to the decrease in filter function assessed by the FUN/BUN ratio. Effluent volume significantly overestimates delivered dose of small solutes in CRRT. To assess adequacy of CRRT, solute clearance should be measured rather than estimated by the effluent volume.
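The gap between prescribed and delivered clearance described above can be sketched numerically. This is a minimal illustration, not the study's analysis code; the function names, the example flows, and the assumption that the prescribed dose treats effluent as fully saturated with urea are ours.

```python
def prescribed_urea_clearance(effluent_ml_per_hr):
    """Prescribed clearance (ml/min), assuming effluent fully saturated with urea."""
    return effluent_ml_per_hr / 60.0

def delivered_urea_clearance(effluent_ml_per_hr, fun_mg_dl, bun_mg_dl):
    """Dialysis-side delivered clearance (ml/min): effluent flow scaled by the
    FUN/BUN ratio, which falls as the filter fouls."""
    return (effluent_ml_per_hr / 60.0) * (fun_mg_dl / bun_mg_dl)

# Hypothetical example: 2 L/hr effluent, FUN 45 mg/dl, BUN 60 mg/dl
prescribed = prescribed_urea_clearance(2000)        # ~33.3 ml/min
delivered = delivered_urea_clearance(2000, 45, 60)  # 25.0 ml/min
gap = (prescribed - delivered) / prescribed         # 25% shortfall
```

With these illustrative numbers the shortfall is 25%, in the same range as the 23.8% overestimate reported in the study.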

Abstract

Sepsis commonly contributes to acute kidney injury (AKI); however, the frequency with which sepsis develops as a complication of AKI and the clinical consequences of this sepsis are unknown. This study examined the incidence of, and outcomes associated with, sepsis developing after AKI. We analyzed data from 618 critically ill patients enrolled in a multicenter observational study of AKI (PICARD). Patients were stratified according to their sepsis status and timing of incident sepsis relative to AKI diagnosis. We determined the associations among sepsis, clinical characteristics, provision of dialysis, in-hospital mortality, and length of stay (LOS), comparing outcomes among patients according to their sepsis status. Among the 611 patients with data on sepsis status, 174 (28%) had sepsis before AKI, 194 (32%) remained sepsis-free, and 243 (40%) developed sepsis a median of 5 days after AKI. Mortality rates for patients with sepsis developing after AKI were higher than in sepsis-free patients (44 vs. 21%; p < 0.0001) and similar to patients with sepsis preceding AKI (48 vs. 44%; p = 0.41). Compared with sepsis-free patients, those with sepsis developing after AKI were also more likely to be dialyzed (70 vs. 50%; p < 0.001) and had longer LOS (37 vs. 27 days; p < 0.001). Oliguria, higher fluid accumulation and severity of illness scores, non-surgical procedures after AKI, and provision of dialysis were predictors of sepsis after AKI. Sepsis frequently develops after AKI and portends a poor prognosis, with high mortality rates and relatively long LOS. Future studies should evaluate techniques to monitor for and manage this complication to improve overall prognosis.

Abstract

Cognitive impairment is common among persons with chronic kidney disease, but the extent to which nontraditional vascular risk factors mediate this association is unclear. We conducted cross-sectional analyses of baseline data collected from adults with chronic kidney disease participating in the Chronic Renal Insufficiency Cohort study. Cognitive impairment was defined as a Modified Mini-Mental State Exam score >1 SD below the mean score. Among 3591 participants, the mean age was 58.2 ± 11.0 years, and the mean estimated GFR (eGFR) was 43.4 ± 13.5 ml/min per 1.73 m2. Cognitive impairment was present in 13%. After adjustment for demographic characteristics, prevalent vascular disease (stroke, coronary artery disease, and peripheral arterial disease), and traditional vascular risk factors (diabetes, hypertension, smoking, and elevated cholesterol), an eGFR <30 ml/min per 1.73 m2 was associated with a 47% increased odds of cognitive impairment (odds ratio 1.47, 95% confidence interval 1.05, 2.05) relative to an eGFR of 45 to 59 ml/min per 1.73 m2. This association was attenuated and no longer significant after adjustment for hemoglobin concentration. While other nontraditional vascular risk factors, including C-reactive protein, homocysteine, serum albumin, and albuminuria, were correlated with cognitive impairment in unadjusted analyses, they were not significantly associated with cognitive impairment after adjustment for eGFR and other confounders. The prevalence of cognitive impairment was higher among those with lower eGFR, independent of traditional vascular risk factors. This association may be explained in part by anemia.

Abstract

Previous studies of blood pressure and mortality in haemodialysis have yielded mixed results, perhaps due to confounding by comorbid conditions. We hypothesized that, after improved accounting for confounding factors, higher systolic blood pressure (SBP) would be associated with higher all-cause mortality. We conducted a secondary analysis of data from the Haemodialysis (HEMO) Study, a randomized trial in prevalent haemodialysis patients. We used three proportional hazards models to determine the relative hazard at different levels of SBP: (1) Model-BL used baseline SBP; (2) Model-TV used SBP as a time-varying variable; and (3) Model-TV-Lag added a 3-month lag to Model-TV to de-emphasize changes in SBP associated with acute illness. In all the models, pre-dialysis SBP <120 mm Hg was associated with a higher risk of mortality compared with the referent group (140-159 mm Hg); higher pre-dialysis SBP was not associated with higher risk of mortality. In conclusion, we observed a robust association between lower pre-dialysis SBP and higher risk for all-cause and cardiovascular mortality in a well-characterized cohort of prevalent haemodialysis patients. Randomized clinical trials are needed to define optimal blood pressure targets in the haemodialysis population.

Abstract

The annual mortality rate for maintenance hemodialysis patients in the United States is unacceptably high at 15%-20%. In 2004, we initiated the Frequent Hemodialysis Network (FHN) clinical trials. This report presents baseline characteristics of FHN trial participants and compares them with hemodialysis patients tracked in US Renal Data System (USRDS) data. FHN includes 332 patients with chronic kidney disease requiring long-term dialysis therapy enrolled in 2 separate randomized clinical trials. The FHN Daily Trial (245 randomly assigned participants) was designed to compare outcomes of 6-times-weekly in-center daily hemodialysis (1.5-2.75 h/session) with conventional 3-times-weekly in-center hemodialysis. The FHN Nocturnal Trial (87 randomly assigned participants) was designed to compare outcomes of 6-times-weekly home nocturnal hemodialysis (6-8 h/session) with conventional 3-times-weekly hemodialysis. USRDS data include 338,109 incident and prevalent long-term hemodialysis patients from the calendar year 2007. Participants in both trials were on average younger than the average hemodialysis patient in the United States (Daily Trial, 50.4 years; P < 0.001; Nocturnal Trial, 52.8 years; P < 0.001). Compared with USRDS data, whites were under-represented in the Daily Trial (36% vs 55%; P < 0.001), whereas Hispanics were under-represented in the Nocturnal Trial and over-represented in the Daily Trial (0% vs 28%; P < 0.001). In addition, there were more fistulas and fewer catheters in the Daily Trial (61% and 20%, respectively; P < 0.001 for both) and fewer grafts and more catheters in the Nocturnal Trial (10% and 44%, respectively; P < 0.005 for both). Clinical trial exclusion criteria and patient willingness to participate limit comparisons with the USRDS. FHN participants were younger, and the racial composition of each study differed from that of the aggregate US dialysis population. Catheters for vascular access were more common in FHN Nocturnal Trial participants.

Abstract

Phosphate binders include calcium acetate or carbonate, sevelamer hydrochloride or carbonate, magnesium and lanthanum carbonate, and aluminum carbonate or hydroxide. Their relative phosphate-binding capacity has been assessed in human, in vivo studies that have measured phosphate recovery from stool and/or changes in urinary phosphate excretion, or that have compared pairs of different binders where the dose of binder in each group was titrated to a target level of serum phosphate. The relative phosphate-binding coefficient (RPBC) based on the weight of each binder can be estimated relative to calcium carbonate, the latter being set to 1.0. A systematic review of these studies gave the following estimated RPBCs: elemental lanthanum, 2.0; sevelamer hydrochloride or carbonate, 0.75; calcium acetate, 1.0; anhydrous magnesium carbonate, 1.7; and "heavy" (hydrated) magnesium carbonate, 1.3. Estimated RPBCs for aluminum-containing binders were 1.5 for aluminum hydroxide and 1.9 for aluminum carbonate. The phosphate-binding equivalent dose was then defined as the dose of each binder in grams × its RPBC, which is the binding ability of an equivalent weight of calcium carbonate. The phosphate-binding equivalent dose may be useful for comparing changes in phosphate binder prescription over time when multiple binders are being prescribed, for estimating an initial binder prescription, and in phosphate kinetic modeling.
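The equivalent-dose definition above (grams of binder × RPBC, summed across binders) is easy to operationalize. The coefficients below are taken from the review; the dictionary keys, function name, and example prescription are our own shorthand.

```python
# Relative phosphate-binding coefficients (RPBC) from the review above,
# with calcium carbonate set to 1.0. Key names are our shorthand.
RPBC = {
    "calcium_carbonate": 1.0,
    "calcium_acetate": 1.0,
    "elemental_lanthanum": 2.0,
    "sevelamer": 0.75,                      # hydrochloride or carbonate
    "magnesium_carbonate_anhydrous": 1.7,
    "magnesium_carbonate_hydrated": 1.3,
    "aluminum_hydroxide": 1.5,
    "aluminum_carbonate": 1.9,
}

def binding_equivalent_dose(prescription_g):
    """Phosphate-binding equivalent dose: sum of (grams of binder x RPBC),
    expressed as grams of calcium carbonate."""
    return sum(g * RPBC[binder] for binder, g in prescription_g.items())

# Hypothetical mixed prescription: 4.8 g/d sevelamer + 2.0 g/d calcium acetate
dose = binding_equivalent_dose({"sevelamer": 4.8, "calcium_acetate": 2.0})
# 4.8 * 0.75 + 2.0 * 1.0 = 5.6 g calcium carbonate equivalents per day
```

A single summed figure like this is what allows binder prescriptions to be tracked over time even as the binder mix changes.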

Abstract

The influence of body size on dialysis-related mortality among Asians and Pacific Islanders, heterogeneous ethnic groups with dissimilar body compositions, is poorly understood. Our study objective was to compare the relations of body size and mortality among patients with end-stage renal disease of different ethnicities. We examined data from a cohort of 21,492 adult Asians, Pacific Islanders and non-Hispanic Whites who initiated dialysis during 1995-2003 within California, Hawaii and the US Pacific Islands. The outcome was time to death through September 22, 2008. Among both men and women, Pacific Islanders were the heaviest and Whites the tallest of the ethnic groups examined. Annual mortality rates were highest among Whites (29.6%), intermediate among Pacific Islanders (18.8%) and lowest among Asians (17.3%). Larger body size was associated with lower mortality among Pacific Islanders, Whites and most Asians on dialysis after adjustment for patient-level sociodemographic and clinical factors, area-based socioeconomic status and geographic clustering. Filipinos were the exception to this rule and showed a trend towards higher mortality with increasing body size. These findings were consistent irrespective of how body size was measured. In conclusion, larger body size is associated with lower mortality among Pacific Islanders, Whites and most Asians on dialysis. Use of disaggregated ethnicity data may enhance our understanding of how ethnicity- or community-specific factors influence body size, body composition and dialysis-related outcomes in these diverse populations.

Abstract

Self-reported physical health and functioning and direct measures of physical performance are decreased in hemodialysis patients and are associated with mortality and hospitalization. We determined baseline cross-sectional associations of physical performance, health, and functioning with demographics, clinical characteristics, nutritional indexes, laboratory benchmarks, and measures of body composition in 375 participants in the Frequent Hemodialysis Network (FHN) trial with data for physical performance, health, and functioning. Explanatory variables were categorized into fixed factors of age, race, and comorbid conditions (diabetes mellitus, heart failure, and peripheral arterial disease) and potentially modifiable factors of dialysis dose, phosphorus level, hemoglobin level, equilibrated normalized protein catabolic rate (enPCR), body composition, body mass index, phase angle, and ratio of intracellular water volume to body weight (calculated from bioelectrical impedance). Outcomes were scores on tests of physical performance, health, and functioning: physical performance was measured using the Short Physical Performance Battery, and self-reported physical health and functioning using the 36-Item Short Form Health Survey (SF-36). Body composition (body mass index and bioimpedance analysis) and laboratory data were obtained from affiliated dialysis providers. Relative to population norms, scores on all 3 metrics were low. Poorer scores on all 3 metrics were associated with diabetes mellitus and peripheral arterial disease. Poorer scores on the SF-36 Physical Functioning subscale and Short Physical Performance Battery also were associated with age, lower ratio of intracellular water volume to body weight, and lower enPCR. Black race was associated with poorer scores on the Short Physical Performance Battery. This was a cross-sectional study of individuals agreeing to participate in the FHN study and may not be generalizable to the general dialysis population. Hemodialysis patients show markedly impaired physical performance, health, and functioning relative to population norms. Although some factors associated with these impairments are not modifiable, others may change with improvement in nutritional status or body composition.

Abstract

In this randomized clinical trial, we aimed to determine whether increasing the frequency of in-center hemodialysis would result in beneficial changes in left ventricular mass, self-reported physical health, and other intermediate outcomes among patients undergoing maintenance hemodialysis. Patients were randomly assigned to undergo hemodialysis six times per week (frequent hemodialysis, 125 patients) or three times per week (conventional hemodialysis, 120 patients) for 12 months. The two coprimary composite outcomes were death or change (from baseline to 12 months) in left ventricular mass, as assessed by cardiac magnetic resonance imaging, and death or change in the physical-health composite score of the RAND 36-item health survey. Secondary outcomes included cognitive performance; self-reported depression; laboratory markers of nutrition, mineral metabolism, and anemia; blood pressure; and rates of hospitalization and of interventions related to vascular access. Patients in the frequent-hemodialysis group averaged 5.2 sessions per week; the weekly standard Kt/V(urea) (the product of the urea clearance and the duration of the dialysis session normalized to the volume of distribution of urea) was significantly higher in the frequent-hemodialysis group than in the conventional-hemodialysis group (3.54±0.56 vs. 2.49±0.27). Frequent hemodialysis was associated with significant benefits with respect to both coprimary composite outcomes (hazard ratio for death or increase in left ventricular mass, 0.61; 95% confidence interval [CI], 0.46 to 0.82; hazard ratio for death or a decrease in the physical-health composite score, 0.70; 95% CI, 0.53 to 0.92). Patients randomly assigned to frequent hemodialysis were more likely to undergo interventions related to vascular access than were patients assigned to conventional hemodialysis (hazard ratio, 1.71; 95% CI, 1.08 to 2.73). Frequent hemodialysis was associated with improved control of hypertension and hyperphosphatemia. There were no significant effects of frequent hemodialysis on cognitive performance, self-reported depression, serum albumin concentration, or use of erythropoiesis-stimulating agents. Frequent hemodialysis, as compared with conventional hemodialysis, was associated with favorable results with respect to the composite outcomes of death or change in left ventricular mass and death or change in a physical-health composite score but prompted more frequent interventions related to vascular access. (Funded by the National Institute of Diabetes and Digestive and Kidney Diseases and others; ClinicalTrials.gov number, NCT00264758.)

Abstract

Physical inactivity contributes to frailty and the decline in function that develops over time among patients with end-stage renal disease. We assessed physical activity among 1547 ambulatory patients new to dialysis in the United States Renal Data System Comprehensive Dialysis Study, using the self-reported Human Activity Profile, which includes Maximal and Adjusted Activity Scores, and compared results with established norms by age and gender. Physical activity was extremely low: scores for all age and gender categories were below the 5th percentile of healthy individuals, and 95% of patients had scores consonant with low fitness. Older age, female gender, diabetes, atherosclerotic disease, and a low level of education were associated with lower activity scores in univariate and multivariable linear regression analyses. Higher serum albumin and creatinine and lower body mass index, but not hemoglobin levels, were associated with greater physical activity. By multivariable analysis, patients on hemodialysis using a catheter reported lower levels of physical activity than those on peritoneal dialysis or on hemodialysis using an arteriovenous fistula or graft. Lower Maximal and Adjusted Activity Scores were associated with poor physical function and mental health. Hence, physical activity is distressingly low among patients new to dialysis, and strategies to enhance activity in these patients should be explored.

Abstract

When evaluating clinical characteristics and outcomes in patients on hemodialysis, the prevalence and severity of comorbidity may change over time. Knowing whether updated assessments of comorbidity enhance predictive power will assist the design of future studies. We conducted a secondary data analysis of 1846 prevalent hemodialysis patients from 15 US clinical centers enrolled in the HEMO study. Our primary explanatory variable was the Index of Coexistent Diseases score, which aggregates comorbidities, modeled as both a time-constant and a time-varying covariate. Our outcomes of interest were all-cause mortality, time to first hospitalization, and total hospitalizations, analyzed with Cox proportional hazards regression. Accounting for updated comorbidity assessments over time yielded a more robust association with mortality than accounting for baseline comorbidity alone; time-varying comorbidity explained more of the variation in time to death than age, baseline serum albumin, diabetes, or any other covariate. The advantage of updated comorbidity assessments was less pronounced for time to hospitalization. Updated assessments of comorbidity significantly strengthen the ability to predict death in patients on hemodialysis, and future studies in dialysis should invest the necessary resources to include repeated assessments of comorbidity.

Can Rescaling Dose of Dialysis to Body Surface Area in the HEMO Study Explain the Different Responses to Dose in Women versus Men? CLINICAL JOURNAL OF THE AMERICAN SOCIETY OF NEPHROLOGY. Daugirdas, J. T., Greene, T., Chertow, G. M., Depner, T. A. 2010; 5 (9): 1628-1636

Abstract

In the Hemodialysis (HEMO) Study, the lower death rate in women but not in men assigned to the higher dose (Kt/V) could have resulted from use of "V" as the normalizing factor, since women have a lower anthropometric V per unit of surface area (V/SA) than men. The effect of Kt/V on mortality was re-examined after normalizing for surface area and expressing dose as surface area normalized standard Kt/V (SAn-stdKt/V). Both men and women in the high-dose group received approximately 16% more dialysis (when expressed as SAn-stdKt/V) than the controls. SAn-stdKt/V clustered into three levels: 2.14/wk for conventional dose women, 2.44/wk for conventional dose men or 2.46/wk for high-dose women, and 2.80/wk for high-dose men. V/SA was associated with the effect of dose assignment on the risk of death; above 20 L/m(2), the mortality hazard ratio = 1.23 (0.99 to 1.53); below 20 L/m(2), hazard ratio = 0.78 (0.65 to 0.95), P = 0.002. Within gender, V/SA did not modify the effect of dose on mortality. When normalized to body surface area rather than V, the dose of dialysis in women in the HEMO Study was substantially lower than in men. The lowest surface-area-normalized dose was received by women randomized to the conventional dose arm, possibly explaining the sex-specific response to dialysis dose. Results are consistent with the hypothesis that when dialysis dose is expressed as Kt/V, women, due to their lower V/SA ratio, require a higher amount than men.
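The V/SA ratio central to this analysis can be computed from standard anthropometrics. The sketch below uses the Watson equations for total body water as a stand-in for anthropometric V and the Du Bois formula for body surface area; these particular formulas and the example patients are our assumptions, not details taken from the paper.

```python
def watson_v_liters(male, age_yr, height_cm, weight_kg):
    """Anthropometric urea distribution volume (total body water), Watson equations."""
    if male:
        return 2.447 - 0.09516 * age_yr + 0.1074 * height_cm + 0.3362 * weight_kg
    return -2.097 + 0.1069 * height_cm + 0.2466 * weight_kg

def dubois_bsa_m2(height_cm, weight_kg):
    """Body surface area (m2), Du Bois formula."""
    return 0.007184 * (weight_kg ** 0.425) * (height_cm ** 0.725)

def v_per_sa(male, age_yr, height_cm, weight_kg):
    """V/SA in L/m2, the quantity the abstract dichotomizes at 20 L/m2."""
    return watson_v_liters(male, age_yr, height_cm, weight_kg) / dubois_bsa_m2(height_cm, weight_kg)

# Hypothetical patients: V/SA is typically lower in women, consistent
# with the abstract's argument about sex-specific dosing.
man = v_per_sa(True, 60, 175, 80)     # roughly 21.7 L/m2, above the 20 L/m2 cut point
woman = v_per_sa(False, 60, 160, 60)  # roughly 18.4 L/m2, below the cut point
```

Under these illustrative inputs the woman's V/SA falls below 20 L/m2 while the man's sits above it, mirroring the subgroup split described in the abstract.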

Abstract

Infection is an important cause of hospitalization and death in patients receiving dialysis, yet few studies have examined the full range of infections experienced by dialysis patients. The purpose of this retrospective observational cohort study was to examine types, rates, and risk factors for infection in older persons starting dialysis therapy. The cohort was assembled from the US Renal Data System and included patients aged 65-100 years who initiated dialysis therapy between January 1, 2000, and December 31, 2002. Exclusions included prior kidney transplant, unknown dialysis modality, or death, loss to follow-up, or transplant during the first 90 days of dialysis therapy. Patients were followed up until death, transplant, or study end on December 31, 2004. Predictors were baseline demographics, comorbid conditions, and serum albumin and hemoglobin levels. Infection-related hospitalizations were ascertained using discharge International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes. Hospitalization rates were calculated for each type of infection, and the Wei-Lin-Weissfeld model was used to examine risk factors for up to 4 infection-related events. 119,858 patients were included, 7,401 of whom were on peritoneal dialysis therapy. During a median follow-up of 1.9 years, infection-related diagnoses were observed in approximately 35% of all hospitalizations, and approximately 50% of patients had at least 1 infection-related hospitalization. Rates (per 100 person-years) of pulmonary, soft-tissue, and genitourinary infections ranged from 8.3-10.3 in patients on peritoneal dialysis therapy and 10.2-15.3 in patients on hemodialysis therapy. Risk factors for infection included older age, female sex, diabetes, heart failure, pulmonary disease, and low serum albumin level. Limitations include the use of ICD-9-CM codes, reliance on Medicare claims to capture hospitalizations, use of the Medical Evidence Form to ascertain comorbid conditions, and absence of data for dialysis access. In conclusion, infection-related hospitalization is frequent in older patients on dialysis therapy; a broad range of infections, many unrelated to dialysis access, result in hospitalization in this population.

Abstract

Cognitive impairment is common among persons with ESRD, but the underlying mechanisms are unknown. This study evaluated the prevalence of cognitive impairment and its association with modifiable ESRD- and dialysis-associated factors in a large group of hemodialysis patients. Cross-sectional analyses were conducted on baseline data collected from 383 subjects participating in the Frequent Hemodialysis Network trials. Global cognitive impairment was defined as a score <80 on the Modified Mini-Mental State Exam, and impaired executive function was defined as a score ≥300 seconds on the Trailmaking B test. Five main categories of explanatory variables were examined: urea clearance, nutritional markers, hemodynamic measures, anemia, and central nervous system (CNS)-active medications. Subjects had a mean age of 51.6 ± 13.3 years and a median ESRD vintage of 2.6 years. Sixty-one subjects (16%) had global cognitive impairment, and 110 subjects (29%) had impaired executive function. In addition to several nonmodifiable factors, use of H1-receptor antagonists and opioids was associated with impaired executive function. After adjustment for case-mix factors, no strong association with global cognition or executive function was found for several other potentially modifiable factors associated with ESRD and dialysis therapy, such as urea clearance, proxies of dietary protein intake and other nutritional markers, hemodynamic measures, and anemia. Cognitive impairment, especially impaired executive function, is common among hemodialysis patients but, with the exception of CNS-active medications, is not strongly associated with several ESRD- and dialysis-associated factors.

Abstract

To explore the relation between 25-hydroxyvitamin D deficiency and frailty. Frailty is a multidimensional phenotype that describes declining physical function and a vulnerability to adverse outcomes in the setting of physical stress such as illness or hospitalization. Low serum concentrations of 25-hydroxyvitamin D are known to be associated with multiple chronic diseases, such as cardiovascular disease and diabetes, in addition to all-cause mortality. Using data from the Third National Health and Nutrition Examination Survey (NHANES III), a nationally representative survey of noninstitutionalized US residents collected between 1988 and 1994, we evaluated the association between low serum 25-hydroxyvitamin D concentration and frailty, defined according to a set of criteria derived from a definition previously described and validated. 25-Hydroxyvitamin D deficiency, defined as a serum concentration <15 ng mL(-1), was associated with a 3.7-fold increase in the odds of frailty amongst whites and a fourfold increase in the odds of frailty amongst non-whites. This association persisted in sensitivity analyses adjusting for season of the year and latitude of residence, intended to reduce misclassification of persons as 25-hydroxyvitamin D deficient or insufficient. Low serum 25-hydroxyvitamin D concentrations are associated with frailty amongst older adults.

Abstract

Objective. This study aims to highlight the challenges in the diagnosis of hyperparathyroidism (HPT) in patients with advanced chronic kidney disease (CKD). Methods. In this report, we describe a middle-aged Filipino man with underlying CKD who presented with intractable nausea, vomiting, severe and medically refractory hypercalcaemia, and parathyroid hormone (PTH) concentrations in excess of 2400 pg/mL. The underlying pathophysiology as well as the aetiologies and current relevant literature are discussed. We also suggest an appropriate diagnostic approach to identify and promptly treat patients with CKD, HPT and hypercalcaemia. Results. Evaluation confirmed the presence of a large parathyroid adenoma; HPT and hypercalcaemia resolved rapidly following resection. Conclusion. This case report is remarkable for its severe hypercalcaemia requiring haemodialysis, large adenoma size, acute-on-chronic kidney injury and markedly elevated PTH concentration in association with primary HPT in CKD.

Abstract

To determine whether profit status is associated with differences in hospital days per patient, an outcome that may also be influenced by provider financial goals, we used the United States Renal Data System Standard Analysis Files and Centers for Medicare and Medicaid Services cost reports. We compared the number of hospital days per patient per year across for-profit and nonprofit dialysis facilities during 2003. To address possible referral bias in the assignment of patients to dialysis facilities, we used an instrumental variable regression method and adjusted for selected patient-specific factors, facility characteristics such as size and chain affiliation, as well as metrics of market competition. All patients who received in-center hemodialysis at any time in 2003 and for whom Medicare was the primary payer were included (N=170,130; roughly two-thirds of the U.S. hemodialysis population). Patients dialyzed at hospital-based facilities and patients with no dialysis facility within 30 miles of their residence were excluded. Overall, adjusted hospital days per patient were 17±5 percent lower in nonprofit facilities, and the difference between nonprofit and for-profit facilities persisted with the correction for referral bias. There was no association between hospital days per patient per year and chain affiliation, but larger facilities had inferior outcomes (facilities with 73 or more patients had a 14±1.7 percent increase in hospital days relative to facilities with 35 or fewer patients). The differences in outcomes between for-profit and nonprofit facilities translate to 1,600 patient-years in hospital that could be averted each year if hospital utilization rates in for-profit facilities decreased to the level of their nonprofit counterparts. Hospital days per patient-year were statistically and clinically significantly lower among nonprofit dialysis providers. These findings suggest that the indirect incentives in Medicare's current payment system may be insufficient for for-profit providers to achieve optimal patient outcomes.

Abstract

In the United States, relatively little is known about clinical outcomes of chronic kidney disease (CKD) in vulnerable populations utilizing public health systems. The primary study objectives were to describe patient characteristics, incident ESRD, and mortality in adults with nondialysis-dependent CKD receiving care in the health care safety net. Time to ESRD and time to death were examined among a cohort of 15,353 ambulatory adults with nondialysis-dependent CKD from the Community Health Network of San Francisco. The mean age of the CKD cohort was 59.0 ± 13.8 years; 50% of the cohort was younger than 60 years and 26% was younger than 50 years. Most (72%) were members of nonwhite racial-ethnic groups, 73% were indigent (annual income

Abstract

Admission to the hospital on weekends is associated with increased mortality for several acute illnesses. We investigated whether patients admitted on a weekend with acute kidney injury (AKI) were more likely to die than those admitted on a weekday. Using the Nationwide Inpatient Sample, a large database of admissions to acute care, nonfederal hospitals in the United States, we identified 963,730 admissions with a diagnosis of AKI between 2003 and 2006. Of these, 214,962 admissions (22%) designated AKI as the primary reason for admission (45,203 on a weekend and 169,759 on a weekday). We used logistic regression models to examine the adjusted odds of in-hospital mortality associated with weekend versus weekday admission. Compared with admission on a weekday, patients admitted with a primary diagnosis of AKI on a weekend had a higher odds of death [adjusted odds ratio (OR) 1.07, 95% confidence interval (CI) 1.02 to 1.12]. The risk for death with admission on a weekend for AKI was more pronounced in smaller hospitals (adjusted OR 1.17, 95% CI 1.03 to 1.33) compared with larger hospitals (adjusted OR 1.07, 95% CI 1.01 to 1.13). Increased mortality was also associated with weekend admission among patients with AKI as a secondary diagnosis across a spectrum of co-existing medical diagnoses. In conclusion, among patients hospitalized with AKI, weekend admission is associated with a higher risk for death compared with admission on a weekday.

Abstract

Standard Kt/V(urea) (stdKt/V) is a hypothetical continuous clearance in patients treated with intermittent hemodialysis based on the generation rate of urea nitrogen and the average predialysis urea nitrogen. Previous equations to estimate stdKt/V were derived using a fixed-volume model. To determine the impact of fluid removal as well as residual urea clearance on stdKt/V, we modeled 245 hemodialysis sessions (including conventional 3/week, in-center 6/week, and at-home nocturnal 6/week) in 210 patients enrolled in the Frequent Hemodialysis Network Daily and Nocturnal clinical trials. To examine the role of fluid removal, modeled stdKt/V was compared to stdKt/V estimated from a previously published simplified equation. In a subgroup of 45 sessions with residual urea clearance over 1.5 ml/min, the contribution of residual urea clearance to stdKt/V was measured. For all dialysis schedules, the fixed-volume equation predicted stdKt/V well when both fluid removal and residual urea clearance were set to zero. When fluid removal was included, modeled stdKt/V was slightly underestimated for all three modes of hemodialysis. The shortfall correlated directly with weekly fluid removal and inversely with modeled urea volume. Modeled stdKt/V compressed residual urea clearance to about 70% of its measured value and the fractional downsizing significantly correlated inversely with treatment Kt/V. Our new equation predicted modeled stdKt/V with a high level of accuracy, even when substantial fluid removal and residual urea clearance were present.
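The fixed-volume equations referenced in this abstract relate weekly stdKt/V to the per-session equilibrated Kt/V, session length, and treatment frequency. Below is a sketch of one commonly cited fixed-volume approximation (of the Leypoldt form); treat the exact expression, function name, and example schedules as our assumptions rather than the specific equation evaluated in the paper, which further corrects for fluid removal and residual clearance.

```python
import math

def std_ktv_fixed_volume(ektv, session_min, sessions_per_week, week_min=10080.0):
    """Weekly standard Kt/V from a fixed-volume urea model: weekly generation
    divided by the time-averaged predialysis urea concentration, normalized to V."""
    f = 1.0 - math.exp(-ektv)  # fractional fall in urea over one session
    numerator = week_min * f / session_min
    denominator = f / ektv + week_min / (sessions_per_week * session_min) - 1.0
    return numerator / denominator

# Conventional 3x/week (eKt/V 1.2, 240 min) vs. short-daily 6x/week (eKt/V 0.7, 150 min)
conventional = std_ktv_fixed_volume(1.2, 240, 3)  # roughly 2.2/wk
daily = std_ktv_fixed_volume(0.7, 150, 6)         # roughly 3.1/wk
```

The comparison illustrates why frequent schedules raise stdKt/V disproportionately: spreading clearance over more sessions lowers the average predialysis urea concentration even when total weekly Kt/V is similar.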

Abstract

There is an association between hemodialysis session length and mortality independent of the effects of session duration on urea clearance. However, previous studies did not consider changes in session length over time, nor did they control for the influence of time-dependent confounding. Using data from a national cohort of 8552 incident patients on thrice-weekly, in-center hemodialysis, we applied marginal structural analysis to determine the association between session length and mortality. Exposure was based on prescribed session length with the outcome being death from any cause. On the 31st day after initiating dialysis, the patients were considered at-risk and remained so until death, censoring, or completion of 1 year on dialysis. On primary marginal structural analysis, session lengths <4 h were associated with a 42% increase in mortality. Sensitivity analyses showed a dose-response relationship between session duration and mortality, and a consistency of findings across prespecified subgroups. Our study suggests that shorter hemodialysis sessions are associated with higher mortality after marginal structural analysis is used to adjust for time-dependent confounding. Further studies are needed to confirm these findings and determine causality.
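Marginal structural models are fit with inverse-probability-of-treatment weights. The toy point-exposure example below (hypothetical counts, not study data) shows the core idea: weighting each subject by 1/P(A|L) builds a pseudo-population in which treatment is independent of the confounder L, so the weighted contrast recovers the true (here, null) effect even though the crude contrast is badly confounded. The study used the time-varying generalization of this idea.

```python
# Strata: (L, A) -> (n, deaths). L = severity confounder, A = short sessions.
# Within each L stratum the death risk is identical for A=1 and A=0,
# so the true causal effect is null; only the distribution of L differs.
data = {
    (1, 1): (80, 40), (1, 0): (20, 10),   # sick: 80% get A=1, risk 0.5
    (0, 1): (20, 2),  (0, 0): (80, 8),    # well: 20% get A=1, risk 0.1
}

def crude_risk(a):
    n = sum(v[0] for (l, aa), v in data.items() if aa == a)
    d = sum(v[1] for (l, aa), v in data.items() if aa == a)
    return d / n

def ipw_risk(a):
    # Weight by 1 / P(A = a | L) to remove confounding by L.
    num = den = 0.0
    for (l, aa), (n, d) in data.items():
        if aa != a:
            continue
        n_l = sum(v[0] for (ll, _), v in data.items() if ll == l)
        p_a_given_l = n / n_l
        num += d / p_a_given_l
        den += n / p_a_given_l
    return num / den

crude_diff = crude_risk(1) - crude_risk(0)   # confounded: 0.42 - 0.18
ipw_diff = ipw_risk(1) - ipw_risk(0)         # recovers the null effect
```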

Abstract

Nephrologists care for an increasing number of elderly patients on hemodialysis. As such, an understanding of the overlap among complications of hemodialysis and geriatric syndromes is crucial. This article reviews hemodialysis management issues including vascular access, hypertension, anemia, and bone and mineral disorders, with attention to the distinct medical needs of the elderly. Key geriatric concepts, including frailty, dementia, and palliative care, are also discussed, as nephrologists frequently participate in decision-making directed toward balancing longevity, functional status, and the burden of therapy.

Abstract

The U.S. diet is high in salt, with the majority coming from processed foods. Reducing dietary salt is a potentially important target for the improvement of public health. We used the Coronary Heart Disease (CHD) Policy Model to quantify the benefits of potentially achievable, population-wide reductions in dietary salt of up to 3 g per day (1200 mg of sodium per day). We estimated the rates and costs of cardiovascular disease in subgroups defined by age, sex, and race; compared the effects of salt reduction with those of other interventions intended to reduce the risk of cardiovascular disease; and determined the cost-effectiveness of salt reduction as compared with the treatment of hypertension with medications. Reducing dietary salt by 3 g per day is projected to reduce the annual number of new cases of CHD by 60,000 to 120,000, stroke by 32,000 to 66,000, and myocardial infarction by 54,000 to 99,000 and to reduce the annual number of deaths from any cause by 44,000 to 92,000. All segments of the population would benefit, with blacks benefiting proportionately more, women benefiting particularly from stroke reduction, older adults from reductions in CHD events, and younger adults from lower mortality rates. The cardiovascular benefits of reduced salt intake are on par with the benefits of population-wide reductions in tobacco use, obesity, and cholesterol levels. A regulatory intervention designed to achieve a reduction in salt intake of 3 g per day would save 194,000 to 392,000 quality-adjusted life-years and $10 billion to $24 billion in health care costs annually. Such an intervention would be cost-saving even if only a modest reduction of 1 g per day were achieved gradually between 2010 and 2019 and would be more cost-effective than using medications to lower blood pressure in all persons with hypertension. Modest reductions in dietary salt could substantially reduce cardiovascular events and medical costs and should be a public health target.

Abstract

End-stage renal disease (ESRD) affects more than 1500 people per million population in countries with a high prevalence, such as Japan, Taiwan, and the US. Approximately two-thirds of people with ESRD receive haemodialysis, one quarter have kidney transplants, and one tenth receive peritoneal dialysis. METHODS AND OUTCOMES: We conducted a systematic review and aimed to answer the following clinical questions: What are the effects of different doses for peritoneal dialysis? What are the effects of different doses and membrane fluxes for haemodialysis? What are the effects of interventions aimed at preventing secondary complications? We searched: Medline, Embase, The Cochrane Library, and other important databases up to October 2009 (Clinical Evidence reviews are updated periodically, please check our website for the most up-to-date version of this review). We included harms alerts from relevant organisations such as the US Food and Drug Administration (FDA) and the UK Medicines and Healthcare products Regulatory Agency (MHRA). We found 26 systematic reviews, RCTs, or observational studies that met our inclusion criteria. We performed a GRADE evaluation of the quality of evidence for interventions. In this systematic review we present information relating to the effectiveness and safety of the following interventions: cinacalcet, darbepoetin, erythropoietin, haemodialysis (standard-dose, increased-dose), high membrane-flux haemodialysis, increased-dose peritoneal dialysis, low membrane-flux haemodialysis, mupirocin, sevelamer, standard-dose dialysis, and statins.

Abstract

Serum creatinine concentration (sCr) is the marker used for diagnosing and staging acute kidney injury (AKI) in the RIFLE and AKIN classification systems, but is influenced by several factors including its volume of distribution. We evaluated the effect of fluid accumulation on sCr to estimate severity of AKI. In 253 patients recruited from a prospective observational study of critically-ill patients with AKI, we calculated cumulative fluid balance and computed a fluid-adjusted sCr concentration reflecting the effect of volume of distribution during the development phase of AKI. The time to reach a relative 50% increase from the reference sCr using the crude and adjusted sCr was compared. We defined late recognition to estimate severity of AKI when this time interval to reach 50% relative increase between the crude and adjusted sCr exceeded 24 hours. The median cumulative fluid balance increased from 2.7 liters on day 2 to 6.5 liters on day 7. The difference between adjusted and crude sCr was significantly higher at each time point and progressively increased from a median difference of 0.09 mg/dL to 0.65 mg/dL after six days. Sixty-four (25%) patients met criteria for a late recognition to estimate severity progression of AKI. This group of patients had a lower urine output and a higher daily and cumulative fluid balance during the development phase of AKI. They were more likely to need dialysis but showed no difference in mortality compared to patients who did not meet the criteria for late recognition of severity progression. In critically-ill patients, the dilution of sCr by fluid accumulation may lead to underestimation of the severity of AKI and increases the time required to identify a 50% relative increase in sCr. A simple formula to correct sCr for fluid balance can improve staging of AKI and provide a better parameter for earlier recognition of severity progression.
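The dilution correction described above can be sketched as scaling sCr by the ratio of expanded to baseline total body water, with baseline total body water approximated as 0.6 × body weight. This is a simplified reading of the adjustment, not necessarily the study's exact formula:

```python
def adjusted_scr(scr_mg_dl, weight_kg, cum_fluid_balance_l, tbw_fraction=0.6):
    """Correct serum creatinine for dilution by cumulative fluid balance.

    Scales sCr by (baseline TBW + net fluid balance) / baseline TBW, where
    baseline total body water (TBW) is approximated as tbw_fraction * weight.
    A simplified sketch of a volume-of-distribution correction.
    """
    tbw = tbw_fraction * weight_kg
    return scr_mg_dl * (tbw + cum_fluid_balance_l) / tbw

# 70 kg patient, measured sCr 1.0 mg/dL, 6.5 L cumulative positive balance
# (the study's median day-7 balance):
corrected = adjusted_scr(1.0, 70.0, 6.5)
```

With a 6.5 L positive balance, a measured sCr of 1.0 mg/dL corresponds to roughly 1.15 mg/dL after correction, illustrating how fluid accumulation can mask a rising creatinine.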

Abstract

Classic urea modeling assumes that both urea generation rate (G) and residual renal urea clearance (Kru) are constant throughout the week, but this may not be true. Reductions in intradialysis G could be caused by lower plasma amino acid levels due to predialysis/intradialysis fasting and also to losses of amino acids into the dialysate. Intradialytic reductions in Kru could be due to lower intravascular volume, blood pressure, or osmotic load. To determine the possible effects of reduced G or Kru during dialysis on the calculation of the volume of distribution (V) and Kt/Vurea, we modeled 3 and 6/week nocturnal, 6/week short daily, and 3/week conventional hemodialysis. A modified 2-pool mathematical model of urea mass balance with a constant time-averaged G was used, but the model was altered to allow adjustment of the ratio of dialytic/interdialytic G (Gd/Gid) and dialytic/total Kru (Krud/Kru) to vary from 1.0 down to near zero. In patients dialyzed six times per week for 400 minutes per session, when Gd/Gid was decreased from 1.0 to 0.05, the predicted urea reduction ratio (URR) increased from 68.9% to 80.2%. To achieve an increased URR of this magnitude under conditions of constant G (Gd/Gid=1.0) required a decrease in modeled urea volume (V) of 36%. At Gd/Gid ratios of 0.8 or 0.6 (corresponding to 20% or 40% reductions in intradialysis G), the modeled URR was increased to 71.0% or 73.3%, causing a 7% or 15% factitious decrease in V. The error was intermediate for the 3/week nocturnal schedule, and was much less pronounced for the 6/week daily and 3/week conventional treatments. Reductions in intradialytic Kru had the opposite effect, lowering the predicted URR and increasing the apparent V, but here the errors were of much lesser amplitude. The results suggest that, particularly for nocturnal dialysis, the standard "constant G" urea kinetic model may need to be modified.
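The effect of reduced intradialytic generation on URR can be illustrated with a one-compartment sketch, solving V dC/dt = G − K·C in closed form over a session. The study itself used a modified two-pool model, and all parameter values below are illustrative rather than the study's; still, the direction of the effect matches: suppressing G during dialysis raises the observed URR.

```python
import math

def urr_single_pool(c0, g, k, v, t):
    """Urea reduction ratio from a one-compartment mass balance
    V dC/dt = G - K*C, solved in closed form over a session of length t.

    c0: predialysis urea (mg/L); g: intradialytic generation (mg/min);
    k: dialyzer clearance (L/min); v: urea volume (L); t: minutes.
    """
    c_eq = g / k                                    # steady-state level
    c_t = c_eq + (c0 - c_eq) * math.exp(-k * t / v)
    return 1.0 - c_t / c0

# Same 400-minute session, full vs. nearly suppressed intradialytic G
# (Gd/Gid = 1.0 vs. 0.05), with illustrative parameters:
urr_full = urr_single_pool(700.0, 7.0, 0.25, 35.0, 400)
urr_low = urr_single_pool(700.0, 0.35, 0.25, 35.0, 400)
```

If such a session were then modeled under the standard constant-G assumption, the inflated URR would be attributed to a smaller urea volume V, which is the artifact the abstract quantifies.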

Abstract

Acute kidney injury (AKI) is associated with adverse outcomes in critically ill patients. The influence of preexisting chronic kidney disease (CKD) on AKI outcomes is unclear. We analyzed data from a prospective observational cohort study of AKI in critically ill patients who received nephrology consultation: the Program to Improve Care in Acute Renal Disease. In-hospital mortality rate, length of stay, and dialysis dependence were compared in patients with and without a prior history of CKD, defined by an elevated serum creatinine, proteinuria, and/or abnormal renal ultrasound within a year before hospitalization. We hypothesized that patients with AKI and prior history of CKD would have lower mortality rates, shorter lengths of stay, and higher rates of dialysis dependence than patients without prior history of CKD. Patients with AKI and a prior history of CKD were older and underwent nephrology consultation earlier in the course of AKI. In-hospital mortality rate was lower (31 versus 40%, P = 0.04), and median intensive care unit length of stay was 4.6 d shorter (14.7 versus 19.3 d, P = 0.001) in patients with a prior history of CKD. Among dialyzed survivors, patients with prior CKD were also more likely to be dialysis dependent at hospital discharge. Differences in outcome were most evident in patients with lower severity of illness. Among critically ill patients with AKI, those with prior CKD experience a lower mortality rate but are more likely to be dialysis dependent at hospital discharge. Future studies should determine optimal strategies for managing AKI with and without a prior history of CKD.

Abstract

Patients with chronic kidney disease (CKD) are at increased risk of harm as a consequence of errors in medical care. Hug and colleagues highlight the significance of adverse drug events in hospitalized patients with CKD. Their findings demonstrate the role adverse drug events play in the safety of patients with CKD and underscore the importance of novel strategies intended to reduce such medical errors.

Abstract

Prevention of pressure ulcers is fundamental to safe care of nursing home residents, yet the role of hydration in pressure ulcer prevention has not been systematically examined. This randomized clinical trial was undertaken to determine whether administration of supplemental fluid to nursing home residents at risk for pressure ulcers would enhance collagen deposition, increase estimated total body water, and augment subcutaneous tissue oxygenation, and whether it was safe. After a baseline period, 64 subjects were randomized to receive the fluid volume prescribed or additional fluid (prescribed plus 10 mL/kg) for 5 days. Participants' potential to heal, as measured with hydroxyproline, was low at baseline and did not increase significantly during treatment when additional fluid was systematically provided. Fluid intake increased significantly during treatment. Estimates of total body water and subcutaneous oxygen did not increase, indicating hydration was not improved. Supplemental fluid did not result in overhydration as measured by clinical parameters. Further work is needed to examine the relationship between fluid intake and hydration in nursing home residents as well as the role of hydration in pressure ulcer prevention.

Abstract

It is unclear whether functional status before dialysis is maintained after the initiation of this therapy in elderly patients with end-stage renal disease (ESRD). Using a national registry of patients undergoing dialysis, which was linked to a national registry of nursing home residents, we identified all 3702 nursing home residents in the United States who were starting treatment with dialysis between June 1998 and October 2000 and for whom at least one measurement of functional status was available before the initiation of dialysis. Functional status was measured by assessing the degree of dependence in seven activities of daily living (on the Minimum Data Set-Activities of Daily Living [MDS-ADL] scale of 0 to 28 points, with higher scores indicating greater functional difficulty). The median MDS-ADL score increased from 12 during the 3 months before the initiation of dialysis to 16 during the 3 months after the initiation of dialysis. Three months after the initiation of dialysis, functional status had been maintained in 39% of nursing home residents, but by 12 months after the initiation of dialysis, 58% had died and predialysis functional status had been maintained in only 13%. In a random-effects model, the initiation of dialysis was associated with a sharp decline in functional status, indicated by an increase of 2.8 points in the MDS-ADL score (95% confidence interval [CI], 2.5 to 3.0); this decline was independent of age, sex, race, and functional-status trajectory before the initiation of dialysis. The decline in functional status associated with the initiation of dialysis remained substantial (1.7 points; 95% CI, 1.4 to 2.1), even after adjustment for the presence or absence of an accelerated functional decline during the 3-month period before the initiation of dialysis. Among nursing home residents with ESRD, the initiation of dialysis is associated with a substantial and sustained decline in functional status.

Abstract

To determine whether acute renal failure (ARF) increases the long-term risk of progressive chronic kidney disease (CKD), we studied the outcome of patients whose initial kidney function was normal or near normal but who had an episode of dialysis-requiring ARF and did not develop end-stage renal disease within 30 days following hospital discharge. The study encompassed 556,090 adult members of Kaiser Permanente of Northern California hospitalized over an 8 year period, who had pre-admission estimated glomerular filtration rates (eGFR) equal to or greater than 45 ml/min/1.73 m(2) and who survived hospitalization. After controlling for potential confounders such as baseline level of eGFR and diabetes status, dialysis-requiring ARF was independently associated with a 28-fold increase in the risk of developing stage 4 or 5 CKD and more than a twofold increased risk of death. Our study shows that in a large, community-based cohort of patients with pre-existing normal or near normal kidney function, an episode of dialysis-requiring ARF was a strong independent risk factor for a long-term risk of progressive CKD and mortality.

Abstract

Fluid accumulation is associated with adverse outcomes in critically ill patients. Here, we sought to determine if fluid accumulation is associated with mortality and non-recovery of kidney function in critically ill adults with acute kidney injury. Fluid overload was defined as more than a 10% increase in body weight relative to baseline, measured in 618 patients enrolled in a prospective multicenter observational study. Patients with fluid overload experienced significantly higher mortality within 60 days of enrollment. Among dialyzed patients, survivors had significantly lower fluid accumulation when dialysis was initiated compared to non-survivors after adjustments for dialysis modality and severity score. The adjusted odds ratio for death associated with fluid overload at dialysis initiation was 2.07. In non-dialyzed patients, survivors had significantly less fluid accumulation at the peak of their serum creatinine. Fluid overload at the time of diagnosis of acute kidney injury was not associated with recovery of kidney function. However, patients with fluid overload when their serum creatinine reached its peak were significantly less likely to recover kidney function. Our study shows that in patients with acute kidney injury, fluid overload was independently associated with mortality. Whether fluid overload results from more severe renal failure or contributes to adverse outcomes will require clinical trials in which the role of fluid administration in such patients is directly tested.
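The study's fluid-overload threshold can be expressed as percent fluid accumulation relative to baseline body weight. A minimal sketch, treating 1 L of net fluid balance as 1 kg of weight gain (the inputs below are hypothetical):

```python
def percent_fluid_overload(cum_in_l, cum_out_l, baseline_weight_kg):
    """Percent fluid accumulation relative to baseline body weight,
    treating 1 L of net positive balance as 1 kg of weight gain.
    Values above 10% meet the study's definition of fluid overload."""
    return (cum_in_l - cum_out_l) / baseline_weight_kg * 100.0

# Net +8.5 L of cumulative balance in a 70 kg patient:
fo = percent_fluid_overload(28.0, 19.5, 70.0)
overloaded = fo > 10.0
```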

Abstract

In the United States, public health insurance is available for nearly all persons with end-stage renal disease (ESRD). Little is known about the extent of health insurance coverage for persons with non-dialysis dependent chronic kidney disease (CKD). To describe patterns of health insurance coverage for adults with non-dialysis dependent CKD and to examine risk factors for progression of CKD to ESRD and management of hypertension among those lacking insurance, we performed a cross-sectional analysis of data from a nationally representative sample of 16,148 US adults aged 20 years or older who participated in the National Health and Nutrition Examination Survey 1999-2006. Outcomes were national prevalence estimates of health insurance coverage, ESRD risk factors, and treatment of hypertension. An estimated 10.0% (95% CI, 8.3%-12.0%) of US adults with non-dialysis dependent CKD were uninsured, 60.9% (95% CI, 58.2%-63.7%) had private insurance and 28.7% (95% CI, 26.4%-31.1%) had public insurance alone. Uninsured persons with non-dialysis dependent CKD were more likely to be under the age of 50 (62.8% vs. 23.0%, P < 0.001) and nonwhite (58.7% vs. 21.8%, P < 0.001) compared with their insured counterparts. Approximately two-thirds of uninsured adults with non-dialysis dependent CKD had at least one modifiable risk factor for CKD progression, including 57% with hypertension, 40% who were obese, 22% with diabetes, and 13% with overt albuminuria. In adjusted analyses, uninsured persons with non-dialysis dependent CKD were less likely to be treated for their hypertension (OR, 0.59; 95% CI, 0.40-0.85) and less likely to be receiving recommended therapy with angiotensin inhibitors (OR, 0.45; 95% CI, 0.26-0.77) compared with those with insurance coverage. Uninsured persons with non-dialysis dependent CKD are at higher risk for progression to ESRD than their insured counterparts but are less likely to receive recommended interventions to slow disease progression.
Lack of public health insurance for patients with non-dialysis dependent CKD may result in missed opportunities to slow disease progression and thereby reduce the public burden of ESRD.

Abstract

Despite data that traditional laboratory-based outcome measures in dialysis are improving over time, population-based data indicate that mortality rates are not improving in parallel. With increased focus on performance measures based on laboratory-based outcomes (e.g., hematocrit, albumin, and parathyroid hormone), less emphasis has been placed on other markers, some of which may be stronger predictors of mortality. We performed a systematic review to interpret the predictive value of laboratory-based outcome measures in dialysis. We identified studies with data regarding the predictive value of laboratory-based outcomes for mortality in dialysis. We calculated the sample size-weighted pooled relative risk of death with dichotomized "high" vs. "low" levels of each measure. We rank-ordered predictors by scaling the pooled relative risk of each measure by its pooled standard deviation. There were 5171 titles, of which 128 (representing 44 laboratory-based outcomes) were selected. Nine were significantly associated with mortality, in order of decreasing scaled effect size: (1) tumor necrosis factor-alpha, (2) hematocrit, (3) interleukin-6, (4) troponin T, (5) Kt/V(urea), (6) prealbumin, (7) urea reduction ratio, (8) serum albumin, and (9) C-reactive protein. Other oft-cited measures such as calcium phosphate product and parathyroid hormone were not significantly associated with mortality in pooled analysis. Quality improvement efforts to improve traditional laboratory-based outcomes in end-stage renal disease are necessary, but likely insufficient, to improve overall mortality in dialysis. Renewed consideration of cardiovascular, inflammatory, and nutritional markers that are especially strong predictors of mortality may have important implications for risk stratification and targeted therapeutic interventions.
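The pooling step described above can be sketched as a sample-size-weighted average of study-level relative risks on the log scale. This is one plausible reading of "sample size-weighted pooled relative risk"; the paper's exact weighting scheme may differ, and the study counts below are hypothetical:

```python
import math

def pooled_rr(rrs, ns):
    """Sample-size-weighted pooled relative risk, averaged on the
    log scale and exponentiated back to the ratio scale."""
    total = sum(ns)
    log_pooled = sum(n * math.log(rr) for rr, n in zip(rrs, ns)) / total
    return math.exp(log_pooled)

# Two hypothetical studies of one laboratory marker:
rr = pooled_rr([2.0, 1.5], [1000, 3000])
```

Averaging on the log scale keeps the pooled estimate symmetric with respect to risks and their reciprocals, which is why meta-analyses conventionally pool ratio measures this way.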

Abstract

An observational study suggests that administration of phosphorus binders dramatically improves survival rates in patients on incident hemodialysis-even in those without hyperphosphatemia. Randomized clinical trials should drive changes in the relevant clinical practice.

Abstract

Frailty is common in the elderly and in persons with chronic diseases. Few studies have examined the association of frailty with chronic kidney disease. We used data from the Third National Health and Nutrition Examination Survey to estimate the prevalence of frailty among persons with chronic kidney disease. We created a definition of frailty based on established validated criteria, modified to accommodate available data. We used logistic regression to determine whether and to what degree stages of chronic kidney disease were associated with frailty. We also examined factors that might mediate the association between frailty and chronic kidney disease. The overall prevalence of frailty was 2.8%. However, among persons with moderate to severe chronic kidney disease (estimated glomerular filtration rate < 45 mL/min/1.73 m2), 20.9% were frail. The odds of frailty were significantly increased among all stages of chronic kidney disease, even after adjustment for the residual effects of age, sex, race, and prevalent chronic diseases. The odds of frailty associated with chronic kidney disease were only marginally attenuated with additional adjustment for sarcopenia, anemia, acidosis, inflammation, vitamin D deficiency, hypertension, and cardiovascular disease. Frailty and chronic kidney disease were independently associated with mortality. Frailty is significantly associated with all stages of chronic kidney disease and particularly with moderate to severe chronic kidney disease. Potential mechanisms underlying the connection between chronic kidney disease and frailty remain elusive.

Abstract

Recent outbreaks of nephrolithiasis and acute kidney injury among children in China have been linked to ingestion of milk-based infant formula contaminated with melamine. These cases provide evidence in humans for the nephrotoxicity of melamine, which previously had been described only in animals. The consequences of this outbreak are already severe and will likely continue to worsen. Herein we summarize the global impact of the melamine milk contamination, the reemergence of melamine-tainted animal feed, and potential mechanisms of melamine nephrotoxicity. Large-scale epidemiologic studies are necessary to further characterize this disease and to assess its potential long-term sequelae. This epidemic of environmental kidney disease highlights the morbidity associated with adulterated food products available in today's global marketplace and reminds us of the unique vulnerability of the kidney to environmental insults. Melamine is the latest in a growing list of diverse potentially toxic compounds about which nephrologists and other health-care providers responsible for the diagnosis and management of kidney disease must now be aware.

Abstract

The patient was a 41-year-old Mexican American woman who presented with a decrease in visual acuity along with periorbital and peripheral edema. She was diagnosed with bilateral serous retinal detachment and diffuse proliferative lupus nephritis. She improved considerably in the hospital after treatment with corticosteroids.

Abstract

Determination of the optimal dose of renal replacement therapy in critically ill patients with acute kidney injury has been controversial. Questions have recently been raised regarding the design and execution of the US Department of Veterans Affairs/National Institutes of Health Acute Renal Failure Trial Network (ATN) Study, which demonstrated no improvement in 60-day all-cause mortality with more intensive management of renal replacement therapy. In the present article we present our rationale for these aspects of the design and conduct of the study, including our use of both intermittent and continuous modalities of renal support, our approach to initiation of study therapy and the volume management during study therapy. In addition, the article presents data on hypotension during therapy and recovery of kidney function in the perspective of other studies of renal support in acute kidney injury. Finally, we address the implications of the ATN Study results for clinical practice from the perspective of the study investigators.

Abstract

Proposals to make decisions about coverage of new technology by comparing the technology's incremental cost-effectiveness with the traditional benchmark of dialysis imply that the incremental cost-effectiveness ratio of dialysis is seen as a proxy for the value of a statistical year of life. The frequently used ratio for dialysis has, however, not been updated to reflect more recently available data on dialysis. We developed a computer simulation model for the end-stage renal disease population and compared cost, life expectancy, and quality-adjusted life expectancy of current dialysis practice relative to three less costly alternatives and to no dialysis. We estimated incremental cost-effectiveness ratios for these alternatives relative to the next least costly alternative and no dialysis and analyzed the population distribution of the ratios. Model parameters and costs were estimated using data from the Medicare population and a large integrated health-care delivery system between 1996 and 2003. The sensitivity of results to model assumptions was tested using 38 scenarios of one-way sensitivity analysis, in which parameters informing the cost, utility, mortality, morbidity, and other components of the model were perturbed by +/-50%. The incremental cost-effectiveness ratio of current dialysis practice relative to the next least costly alternative is on average $129,090 per quality-adjusted life-year (QALY) ($61,294 per year), but its distribution within the population is wide; the interquartile range is $71,890 per QALY, while the 1st and 99th percentiles are $65,496 and $488,360 per QALY, respectively. Higher incremental cost-effectiveness ratios were associated with older age and more comorbid conditions. Sensitivity to model parameters was comparatively small, with most of the scenarios leading to a change of less than 10% in the ratio. The value of a statistical year of life implied by current dialysis practice averages $129,090 per QALY ($61,294 per year), but is distributed widely within the dialysis population. The spread suggests that coverage decisions using dialysis as the benchmark may need to incorporate percentile values (which are higher than the average) to be consistent with the Rawlsian principle of preserving the rights and interests of society's most vulnerable patient groups.

Abstract

Clinical outcomes after kidney transplant have improved considerably in the United States over the past several decades. However, the degree to which this has occurred uniformly across the country is unknown. Regional variations in graft failure after kidney transplant during three different time periods were examined. These time periods were chosen to coincide with major shifts in immunosuppressant usage: Era 1, cyclosporine usage, 1988 through 1989; Era 2, introduction of tacrolimus and mycophenolate mofetil, 1994 through 1995; and Era 3, widespread use of tacrolimus and mycophenolate mofetil, 1998 through 1999. Patient data were obtained from the United States Renal Data System database. For each period, regional differences in time from transplant to graft failure (organ removal, death, or return to dialysis) were examined. For each region, differences in graft failure over time were examined. One-year graft survival rates ranged from 76% to 83% between regions in Era 1 (n = 13,669), from 84% to 89% in Era 2 (n = 17,456), and from 87.5% to 92% in Era 3 (n = 20,375). Three-year graft survival ranged from 65% to 75% between regions in Era 1, from 84% to 89% in Era 2, and from 77% to 86% in Era 3. Adjusted models for donor and recipient characteristics showed improvements in graft survival over time in all United Network for Organ Sharing regions with minimal variation across regions. Regional differences in graft survival after kidney transplant are minimal, particularly when compared with the dramatic improvements in graft survival that have occurred over time.

Abstract

The degree to which low transplant rates among Asians and Pacific Islanders in the United States are confounded by poverty and reduced access to care is unknown. We examined the relationship between neighborhood poverty and kidney transplant rates among 22,152 patients initiating dialysis during 1995-2003 within 1800 ZIP codes in California, Hawaii and the US-Pacific Islands. Asians and whites on dialysis were distributed across the spectrum of poverty, while Pacific Islanders were clustered in the poorest areas. Overall, worsening neighborhood poverty was associated with lower relative rates of transplant (adjusted HR [95% CI] for areas with ≥20% vs. <5% residents living in poverty, 0.41 [0.32-0.53], p < 0.001). At every level of poverty, Asians and Pacific Islanders experienced lower transplant rates compared with whites. The degree of disparity increased with worsening neighborhood poverty (adjusted HR [95% CI] for Asians-Pacific Islanders vs. whites, 0.64 [0.51-0.80], p < 0.001 for areas with <5% and 0.30 [0.21-0.44], p < 0.001 for areas with ≥20% residents living in poverty; race-poverty level interaction, p = 0.039). High levels of neighborhood poverty are associated with lower transplant rates among Asians and Pacific Islanders compared with whites. Our findings call for studies to identify cultural and local barriers to transplant among Asians and Pacific Islanders, particularly those residing in resource-poor neighborhoods.

Abstract

Although evidence suggests that a higher hemodialysis dose and/or frequency may be associated with improved outcomes, the cost-effectiveness of a daily hemodialysis strategy for critically ill patients with acute kidney injury (AKI) is unknown. We developed a Markov model of the cost, quality of life, survival, and incremental cost-effectiveness of daily hemodialysis, compared with alternate-day hemodialysis, for patients with AKI in the intensive care unit (ICU). We employed a societal perspective with a lifetime analytic time horizon. We modeled the efficacy of daily hemodialysis as a reduction in the relative risk of death on the basis of data reported in the 2004 clinical trial published by Schiffl et al. We performed 1- and 2-way sensitivity analyses across cost, efficacy, and utility input variables. The main outcome measure was cost per quality-adjusted life-year (QALY). In the base case for a 60-year-old man, daily hemodialysis was projected to add 2.14 QALYs and $10,924 in cost. We found that the cost-effectiveness of daily hemodialysis compared with alternate-day hemodialysis was $5084 per QALY gained. The incremental cost-effectiveness ratio became less favorable (>$50,000 per QALY gained) when the maintenance hemodialysis rate of the daily hemodialysis group was varied to more than 27% and when the difference in 14-day postdischarge mortality between the alternatives was varied to less than 0.5%. Daily hemodialysis is a cost-effective strategy compared with alternate-day hemodialysis for patients with severe AKI in the ICU.
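The headline ratio is simply incremental cost divided by incremental QALYs. Recomputing from the rounded figures in the abstract gives roughly $5105 per QALY; the reported $5084 reflects the model's unrounded outputs:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra dollars spent
    per extra quality-adjusted life-year gained."""
    return delta_cost / delta_qaly

# Rounded base-case figures from the abstract: +$10,924 and +2.14 QALYs.
ratio = icer(10924, 2.14)
```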

Abstract

All glomerular filtration rate (GFR) estimating equations have been developed from cross-sectional data. The aims of this study were to examine the concordance between use of measured GFR (mGFR) and estimated GFR (eGFR) in tracking changes in kidney function over time among patients with moderately severe chronic kidney disease. A retrospective cohort study of subjects who had been enrolled in the MDRD Study A and who had two or more contemporaneous assessments of mGFR and eGFR (n = 542; mGFR range, 25 to 55 ml/min per 1.73 m(2)) during the chronic phase (month 4 and afterwards). mGFR was based on urinary iothalamate clearance; eGFR was based on the 4-variable MDRD Study equation. Temporal changes in GFR were assessed by within-subject linear regression of time on GFR. Median follow-up time for all subjects was 2.6 yr; median number of GFR measurements was six. The eGFR slope tended to underestimate measured decrements in GFR. The absolute value of the difference in mGFR and eGFR slopes was
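For context, the 4-variable MDRD Study equation referenced above estimates GFR from serum creatinine, age, sex, and race. A minimal sketch follows, using the original coefficient of 186 (a later IDMS-traceable recalibration uses 175); it is illustrative and not the study's code.

```python
def egfr_mdrd4(scr_mg_dl, age_yr, female=False, black=False):
    """4-variable MDRD Study eGFR, ml/min per 1.73 m^2 (original 186 coefficient)."""
    gfr = 186.0 * scr_mg_dl ** -1.154 * age_yr ** -0.203
    if female:
        gfr *= 0.742  # sex adjustment
    if black:
        gfr *= 1.212  # race adjustment used in the original equation
    return gfr

# e.g., a 60-year-old non-black man with serum creatinine 2.0 mg/dl
print(round(egfr_mdrd4(2.0, 60), 1))  # ~ 36.4, within this study's 25-55 range
```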

Abstract

Published evidence suggests that frequent hemodialysis (more than three times per week) for patients with ESRD may improve health-related quality of life and has the potential to increase longevity and reduce hospitalization and other complications. Here, a Monte Carlo simulation model was used to compare varying combinations of in-center hemodialysis frequency (three to six treatments per week) and session length (2 to 4.5 h per session) with regard to unadjusted and quality-adjusted life-years and total lifetime costs for a cohort of 200,000 patients, representing the prevalent ESRD population. The incremental cost-effectiveness ratio was calculated for the various regimens relative to a conventional hemodialysis regimen (three treatments per week, 3.5 h per session). Using conservative assumptions of the potential effects of more frequent hemodialysis on outcomes, most strategies achieved a cost-effectiveness ratio of $75,000. The cost-effectiveness ratio increased with the frequency of hemodialysis. More frequent in-center hemodialysis strategies could become cost-neutral if the cost per hemodialysis session could be reduced by 32 to 43%. No other change in model assumptions achieved cost neutrality. In conclusion, given the extraordinarily high costs of the ESRD program, the viability of more frequent hemodialysis strategies depends on significant improvements in the economic model underlying the delivery of hemodialysis.

Abstract

Dialysis is measured as Kt/V, which scales the dose (Kt) to body water content (V). Scaling dialysis dose to body surface area (S(dub)) has been advocated, but the implications of such rescaling have not been examined. We developed a method of rescaling measured Kt/V to S(dub) and studied the effect of such alternative scaling on the minimum adequacy values that might then be applied in male and female patients of varying body size. We examined anthropometric estimates of V and S (Watson vs. DuBois estimates) in 1765 patients enrolled in the HEMO study after excluding patients with amputations. An S-normalized target stdKt/V was defined, and an adequacy ratio (R) was computed for each patient as R = D/N, where D = delivered stdKt/V (calculated using the Gotch-Leypoldt equation for stdKt/V) and N = the S-normalized minimum target value. In the HEMO data set, we determined the extent to which baseline (prerandomization) stdKt/V values would have exceeded such an S-based minimum target stdKt/V. The median V(wat):S(dub) ratios were significantly higher in men (21.34) than in women (18.50). The average of these (20) was used to normalize the current suggested minimally adequate value (stdKt/V ≥ 2.0/week) to the S-normalized target value (stdKt/S ≥ 40 L/m(2)), assuming that average modeled V = average anthropometric V. To achieve this S-normalized target, the required single-pool (sp) Kt/V was always higher in women than in men at any level of body size. For small patients (V(wat) = 25 L), required stdKt/V values were 2.05 and 2.21/week for men and women, respectively, corresponding to spKt/V values of 1.31 and 1.52/session. On the other hand, large (V(wat) = 50 L) male patients would need spKt/V values of only 1.0/session. Prerandomization baseline dialysis sessions in the HEMO study were found to meet such a new S-based standard in almost all (766/773) men and in 885/992 women. 
An analysis of scaling dose to anthropometrically estimated liver size (L) showed similar gender ratios for V(wat):L and V(wat):S(dub), providing a potential physiologic explanation underpinning S-based scaling. S-based scaling of the dialysis dose would require considerably higher doses in small patients and in women, and would allow somewhat lower doses in larger male patients. Current dialysis practice would largely meet such an S-based adequacy standard if the dose were normalized to a V(wat):S(dub) ratio of 20.
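The rescaling described above can be sketched as follows. With an S-normalized weekly target of stdKt/S ≥ 40 L/m², the equivalent stdKt/V target for an individual patient is 40 × S/V, so a lower V:S ratio (typical of women and smaller patients) implies a higher required stdKt/V. The Watson and DuBois formulas below are the standard published anthropometric estimates; the example patients are hypothetical and the sketch is not the authors' code.

```python
def watson_v(age_yr, height_cm, weight_kg, female=False):
    """Watson estimate of total body water (urea distribution volume V), litres."""
    if female:
        return -2.097 + 0.1069 * height_cm + 0.2466 * weight_kg
    return 2.447 - 0.09516 * age_yr + 0.1074 * height_cm + 0.3362 * weight_kg

def dubois_s(height_cm, weight_kg):
    """DuBois estimate of body surface area, m^2."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def required_std_ktv(age_yr, height_cm, weight_kg, female=False, target=40.0):
    """Weekly stdKt/V needed to meet an S-normalized target of stdKt/S >= 40 L/m^2."""
    v = watson_v(age_yr, height_cm, weight_kg, female)
    s = dubois_s(height_cm, weight_kg)
    return target * s / v

# A smaller woman needs a higher stdKt/V than a larger man under S-based scaling.
print(round(required_std_ktv(60, 160, 55, female=True), 2))  # ~ 2.19
print(round(required_std_ktv(60, 180, 90), 2))               # ~ 1.81
```

With a V:S ratio of exactly 20, the required stdKt/V is 40/20 = 2.0/week, matching the conventional minimum, which is why normalizing to that ratio leaves current practice largely adequate.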

Abstract

A number of denominators for scaling the dose of dialysis have been proposed as alternatives to the urea distribution volume (V). These include resting energy expenditure (REE), mass of high metabolic rate organs (HMRO), visceral mass, and body surface area. Metabolic rate is an unlikely denominator as it varies enormously among humans with different levels of activity and correlates poorly with the glomerular filtration rate. Similarly, scaling based on HMRO may not be optimal, as many organs with high metabolic rates such as spleen, brain, and heart are unlikely to generate unusually large amounts of uremic toxins. Visceral mass, in particular the liver and gut, has potential merit as a denominator for scaling; liver size is related to protein intake and the liver, along with the gut, is known to be responsible for the generation of suspected uremic toxins. Surface area is time-honored as a scaling method for glomerular filtration rate and scales similarly to liver size. How currently recommended dialysis doses might be affected by these alternative rescaling methods was modeled by applying anthropometric equations to a large group of dialysis patients who participated in the HEMO study. The data suggested that rescaling to REE would not be much different from scaling to V. Scaling to HMRO mass would mandate substantially higher dialysis doses for smaller patients of either gender. Rescaling to liver mass would require substantially more dialysis for women compared with men at all levels of body size. Rescaling to body surface area would require more dialysis for smaller patients of either gender and also more dialysis for women of any size. Of these proposed alternative rescaling measures, body surface area may be the best, because it reflects gender-based scaling of liver size and thereby the rate of generation of uremic toxins.

Abstract

The optimal intensity of renal-replacement therapy in critically ill patients with acute kidney injury is controversial. We randomly assigned critically ill patients with acute kidney injury and failure of at least one nonrenal organ or sepsis to receive intensive or less intensive renal-replacement therapy. The primary end point was death from any cause by day 60. In both study groups, hemodynamically stable patients underwent intermittent hemodialysis, and hemodynamically unstable patients underwent continuous venovenous hemodiafiltration or sustained low-efficiency dialysis. Patients receiving the intensive treatment strategy underwent intermittent hemodialysis and sustained low-efficiency dialysis six times per week and continuous venovenous hemodiafiltration at 35 ml per kilogram of body weight per hour; for patients receiving the less-intensive treatment strategy, the corresponding treatments were provided thrice weekly and at 20 ml per kilogram per hour. Baseline characteristics of the 1124 patients in the two groups were similar. The rate of death from any cause by day 60 was 53.6% with intensive therapy and 51.5% with less-intensive therapy (odds ratio, 1.09; 95% confidence interval, 0.86 to 1.40; P=0.47). There was no significant difference between the two groups in the duration of renal-replacement therapy or the rate of recovery of kidney function or nonrenal organ failure. 
Hypotension during intermittent dialysis occurred in more patients randomly assigned to receive intensive therapy, although the frequency of hemodialysis sessions complicated by hypotension was similar in the two groups. Intensive renal support in critically ill patients with acute kidney injury did not decrease mortality, improve recovery of kidney function, or reduce the rate of nonrenal organ failure as compared with less-intensive therapy involving a defined dose of intermittent hemodialysis three times per week and continuous renal-replacement therapy at 20 ml per kilogram per hour. (ClinicalTrials.gov number, NCT00076219.)

Abstract

Few studies have defined how the risk of hospital-acquired acute renal failure varies with the level of estimated glomerular filtration rate (GFR). It is also not clear whether common factors such as diabetes mellitus, hypertension and proteinuria increase the risk of nosocomial acute renal failure independent of GFR. To determine this we compared 1,746 hospitalized adult members of Kaiser Permanente Northern California who developed dialysis-requiring acute renal failure with 600,820 hospitalized members who did not. Patient GFR was estimated from the most recent outpatient serum creatinine measurement prior to admission. The adjusted odds ratios were significantly and progressively elevated from 1.95 to 40.07 for stage 3 through stage 5 patients (not yet on maintenance dialysis) compared to patients with estimated GFR in the stage 1 and 2 range. Similar associations were seen after controlling for inpatient risk factors. Pre-admission baseline diabetes mellitus, diagnosed hypertension and known proteinuria were also independent risk factors for acute kidney failure. Our study shows that the propensity to develop in-hospital acute kidney failure is another complication of chronic kidney disease whose risk markedly increases even in the upper half of stage 3 estimated GFR. Several common risk factors for chronic kidney disease also increase the peril of nosocomial acute kidney failure.

Abstract

Serum albumin concentrations are associated with mortality, and respond to nutritional and inflammatory states. To explore whether changing demographics and practice patterns in dialysis have influenced serum albumin concentrations, we analyzed trends in serum albumin among incident patients on dialysis from 1995 through 2004. Mean serum albumin concentrations declined significantly over time, even after accounting for changes in age, diabetes, body size, and other factors. Although laboratory assays were not uniform within or across years, serum albumin declined over time, regardless of the reported laboratory lower limit of normal. Moreover, serum albumin retained its potent association with mortality over time. Lower serum albumin was especially hazardous among younger patients and blacks, and was less hazardous among persons with diabetes as a primary cause of kidney disease. Despite higher body weights and the initiation of dialysis earlier in the course of progressive chronic kidney disease, hypoalbuminemia remains common and hazardous to persons starting dialysis.

Abstract

Acute kidney injury is an increasingly common and potentially catastrophic complication in hospitalized patients. Early observational studies from the 1980s and 1990s established the general epidemiologic features of acute kidney injury: the incidence, prognostic significance, and predisposing medical and surgical conditions. Recent multicenter observational cohorts and administrative databases have enhanced our understanding of the overall disease burden of acute kidney injury and trends in its epidemiology. An increasing number of clinical studies focusing on specific types of acute kidney injury (e.g., in the setting of intravenous contrast, sepsis, and major surgery) have provided further details into this heterogeneous syndrome. Despite our sophisticated understanding of the epidemiology and pathobiology of acute kidney injury, current prevention strategies are inadequate and current treatment options outside of renal replacement therapy are nonexistent. This failure to innovate may be due in part to a diagnostic approach that has stagnated for decades and continues to rely on markers of glomerular filtration (blood urea nitrogen and creatinine) that are neither sensitive nor specific. There has been increasing interest in the identification and validation of novel biomarkers of acute kidney injury that may permit earlier and more accurate diagnosis. This review summarizes the major epidemiologic studies of acute kidney injury and efforts to modernize the approach to its diagnosis.

Abstract

To identify biological and clinical predictors of acute kidney injury in subjects with acute lung injury. Secondary data analysis from a multicenter, randomized clinical trial. Intensive care units in ten university medical centers. A total of 876 patients enrolled in the first National Heart, Lung, and Blood Institute Acute Respiratory Distress Syndrome Clinical Network trial. Study subjects were randomized to receive a low tidal volume ventilation strategy and pharmacologic therapy with ketoconazole or lisofylline in a factorial design. We tested the association of baseline levels of interleukin-6, interleukin-8, interleukin-10, von Willebrand factor, tumor necrosis factor-α, type I and II soluble tumor necrosis factor receptors (sTNFR-I and -II), protein C, plasminogen activator inhibitor-1 (PAI-1), surfactant protein-A, surfactant protein-D, and intercellular adhesion molecule-1 with subsequent acute kidney injury. Of 876 study participants who did not have end-stage renal disease, 209 (24%) developed acute kidney injury, defined as a rise in serum creatinine of >50% from baseline over the first four study days. The 180-day mortality rate for subjects with acute kidney injury was 58%, compared with 28% in those without acute kidney injury (p < .001). Interleukin-6, sTNFR-I, sTNFR-II, and PAI-1 levels were independently associated with acute kidney injury after adjustment for demographics, interventions, and severity of illness. A combination of clinical and biological predictors had the best area under the receiver operating characteristic curve, and the contribution of sTNFR-I and PAI-1 to this model was highly significant (p = .0003). Elevations in PAI-1, interleukin-6, and the sTNFRs in subjects with acute kidney injury suggest that disordered coagulation, inflammation, and neutrophil-endothelial interactions play important roles in the pathogenesis of acute kidney injury. 
The combination of these biological and clinical risk factors may have important and additive value in predictive models for acute kidney injury.

Abstract

To assess the association between kidney function and change in body composition in older individuals. Prospective cohort study. Two sites, Pittsburgh, Pennsylvania, and Memphis, Tennessee. Three thousand twenty-six well-functioning participants aged 70 to 79 in the Health, Aging and Body Composition Study. Body composition (bone-free lean mass and fat mass) was measured using dual x-ray absorptiometry annually for 4 years. Kidney function was measured at baseline according to serum creatinine (SCr). Comorbidity and inflammatory markers were evaluated as covariates in mixed-model, repeated-measures analysis. High SCr was associated with loss of lean mass in men but not women, with a stronger relationship in black men (P=.02 for difference between slopes for white and black men). In white men, after adjustment for age and comorbidity, higher SCr remained associated with loss of lean mass (-0.07+/-0.03 kg/y greater loss per 0.4 mg/dL (1 standard deviation (SD)), P=.009) but was attenuated after adjustment for inflammatory factors (-0.05+/-0.03 kg/y greater loss per SD, P=.10). In black men, the relationship between SCr and loss of lean mass (-0.19+/-0.04 kg/y per SD, P

Abstract

Black patients receiving dialysis for end-stage renal disease in the United States have lower mortality rates than white patients. Whether racial differences exist in mortality after acute renal failure is not known. We studied acute renal failure in patients hospitalized between 2000 and 2003 using the Nationwide Inpatient Sample and found that black patients had an 18% (95% confidence interval [CI] 16 to 21%) lower odds of death than white patients after adjusting for age, sex, comorbidity, and the need for mechanical ventilation. Similarly, among those with acute renal failure requiring dialysis, black patients had a 16% (95% CI 10 to 22%) lower odds of death than white patients. In stratified analyses of patients with acute renal failure, black patients had significantly lower adjusted odds of death than white patients in settings of coronary artery bypass grafting, cardiac catheterization, acute myocardial infarction, congestive heart failure, pneumonia, sepsis, and gastrointestinal hemorrhage. Black patients were more likely than white patients to be treated in hospitals that care for a larger number of patients with acute renal failure, and black patients had lower in-hospital mortality than white patients in all four quartiles of hospital volume. In conclusion, in-hospital mortality is lower for black patients with acute renal failure than white patients. Future studies should assess the reasons for this difference.

Rise of pay for performance: Implications for care of people with chronic kidney disease. CLINICAL JOURNAL OF THE AMERICAN SOCIETY OF NEPHROLOGY. Desai, A. A., Garber, A. M., Chertow, G. M. 2007; 2 (5): 1087-1095

Abstract

Many health care providers and policy makers believe that health care financing systems fail to reward high-quality care. In recent years, federal and private payers have begun to promote pay for performance, or value-based purchasing, initiatives to raise the quality of care. This report describes conceptual issues in the design and implementation of pay for performance for chronic kidney disease and ESRD care. It also considers the implications of recent ESRD payment policy changes on the broader goals of pay for performance. Congressionally mandated bundle payment demonstration for dialysis, newly implemented case-mix adjustment of the composite rate, and G codes for the monthly capitation payment are important opportunities to understand facility and provider behavior with particular attention to patient selection and treatment practices. Well-designed payment systems will reward quality care for patients while maintaining appropriate accountability and fairness for health care providers.

Abstract

The dramatically high rates of mortality and cardiovascular morbidity observed among dialysis patients highlight the importance of identifying and implementing strategies to lower cardiovascular risk in this population. Clinical trials undertaken thus far, including trials of lipid reduction, normalization of hematocrit, and increased dialysis dosage, have been unsuccessful. Available data indicate that abnormalities in calcium and phosphorus metabolism, as a result of either secondary hyperparathyroidism alone or the therapeutic measures used to manage secondary hyperparathyroidism, are associated with an increased risk for death and cardiovascular events. However, no prospective trials have evaluated whether interventions that modify these laboratory parameters result in a reduction in adverse cardiovascular outcomes. Evaluation of Cinacalcet Therapy to Lower Cardiovascular Events is a global, phase 3, double-blind, randomized, placebo-controlled trial evaluating the effects of cinacalcet on mortality and cardiovascular events in hemodialysis patients with secondary hyperparathyroidism. Approximately 3800 patients from 22 countries will be randomly assigned to cinacalcet or placebo. Flexible use of traditional therapies will be permitted. The primary end point is the composite of time to all-cause mortality or first nonfatal cardiovascular event (myocardial infarction, hospitalization for unstable angina, heart failure, or peripheral vascular disease, including lower extremity revascularization and nontraumatic amputation). The study will be event driven (terminated at 1882 events) with an anticipated duration of approximately 4 yr. Evaluation of Cinacalcet Therapy to Lower Cardiovascular Events will determine whether management of secondary hyperparathyroidism with cinacalcet reduces the risk for mortality and cardiovascular events in hemodialysis patients.

Abstract

Our objective was to determine the extent to which chronic kidney disease mineral bone disorder (CKD-MBD) is associated with health-related quality of life among incident dialysis patients. This study's design was a cross-sectional analysis. This was part of the United States Renal Data System Dialysis Morbidity and Mortality Study (DMMS), Wave 2. The patients comprised 2590 adult participants in DMMS Wave 2, for whom quality of life and laboratory data were available. We stratified patients according to their serum concentrations of phosphorus, calcium, and parathyroid hormone (PTH), and compared health-related quality of life as a function of these indicators in analyses adjusted for demographic, clinical, and other laboratory variables. Main outcome measures included Physical Component Summary (PCS) and Mental Component Summary (MCS) scores, and the Symptom score of the Kidney Disease Quality of Life. Both high and low serum phosphorus concentrations were associated with lower PCS scores (-1.25 to -1.48 points compared with the reference category), as was low PTH (-1.49 points). Low serum phosphorus was associated with more severe symptoms of kidney disease (-3.88 points), but there were no associations between high phosphorus or either extreme of PTH and the Symptom score. Serum calcium concentration and the calcium × phosphorus product were unassociated with PCS or Symptom scores. There were no associations among phosphorus, calcium, or PTH and MCS. Analyses simultaneously controlling for serum phosphorus, calcium, and PTH showed similar results. High and low serum phosphorus and low PTH are associated with slightly poorer self-reported physical functioning. Clinical trials will be necessary to determine whether and to what extent improvement in health status may occur with the correction of selected disorders of mineral metabolism.

Abstract

There is limited information about the true incidence of acute renal failure (ARF). Most studies could not quantify disease frequency in the general population as they are hospital-based and confounded by variations in threshold and the rate of hospitalization. Earlier studies relied on diagnostic codes to identify non-dialysis requiring ARF. These underestimated disease incidence since the codes have low sensitivity. Here we quantified the incidence of non-dialysis and dialysis-requiring ARF among members of a large integrated health care delivery system - Kaiser Permanente of Northern California. Non-dialysis requiring ARF was identified using changes in inpatient serum creatinine values. Between 1996 and 2003, the incidence of non-dialysis requiring ARF increased from 322.7 to 522.4 whereas that of dialysis-requiring ARF increased from 19.5 to 29.5 per 100,000 person-years. ARF was more common in men and among the elderly, although those aged 80 years or more were less likely to receive acute dialysis treatment. We conclude that the use of serum creatinine measurements to identify cases of non-dialysis requiring ARF resulted in much higher estimates of disease incidence compared with previous studies. Both dialysis-requiring and non-dialysis requiring ARFs are becoming more common. Our data underscore the public health importance of ARF.

Abstract

Fetuin-A is a multifunctional hepatic secretory protein that inhibits dystrophic vascular and valvular calcification. Lower serum fetuin-A concentrations are associated with valvular calcification in persons with end-stage renal disease. Whether fetuin-A is associated with valvular calcification in other patient populations is unknown. We evaluated the associations among serum fetuin-A concentrations, mitral annular calcification, and aortic stenosis in 970 ambulatory persons with coronary heart disease and without severe kidney disease. The presence or absence of mitral annular calcification and aortic stenosis was determined by transthoracic echocardiography. The subjects' mean age was 66 years; 81% were men; 189 (20%) had mitral annular calcification; and 79 (8%) had aortic stenosis. Participants were categorized by tertiles of fetuin-A concentrations. Those within the highest fetuin-A tertile had significantly lower odds of mitral annular calcification compared with the lowest tertile (adjusted odds ratio, 0.47; 95% confidence interval, 0.29 to 0.77; P=0.002); this association was similar regardless of diabetes status (P for interaction=0.34). In contrast, the association of fetuin-A with aortic stenosis was modified by the presence or absence of diabetes mellitus (P for interaction=0.03). Among participants without diabetes, the highest fetuin-A tertile had a significantly lower odds of aortic stenosis compared with the lowest tertile (adjusted odds ratio, 0.37; 95% confidence interval, 0.15 to 0.92; P=0.03), whereas among participants with diabetes, no statistically significant association was observed between fetuin-A and aortic stenosis (adjusted odds ratio, 1.49; 95% confidence interval, 0.48 to 4.63; P=0.49). Among persons with coronary heart disease, we observed an inverse association of fetuin-A and mitral annular calcification. An inverse association also was observed between fetuin-A and aortic stenosis among participants without diabetes mellitus. 
Fetuin-A may represent an important inhibitor of dystrophic calcification in persons with coronary heart disease.

Abstract

Acute kidney injury is an increasingly common and potentially catastrophic complication in hospitalized patients. This review summarizes the major epidemiologic studies that have informed our understanding of the incidence and prognostic significance of acute kidney injury. Early observational studies from the 1980s and 1990s established the general epidemiologic features of acute kidney injury, including the incidence, prognostic significance and predisposing medical and surgical conditions. Recent multicenter observational cohorts and administrative databases have enhanced our understanding of the overall disease burden of acute kidney injury and trends in its epidemiology. An increasing number of clinical studies focusing on specific types of acute kidney injury (e.g. following exposure to intravenous contrast, sepsis and major surgery) have provided further details into this heterogeneous syndrome. In light of the increasing incidence and prognostic significance of acute kidney injury, new strategies for prevention and treatment are desperately needed.

Abstract

The elderly constitute the fastest-growing segment of the end-stage renal disease (ESRD) population, but the epidemiology and outcomes of dialysis among the very elderly, that is, those 80 years of age and older, have not been previously examined at a national level. To describe recent trends in the incidence and outcomes of octogenarians and nonagenarians starting dialysis. Observational study. U.S. Renal Data System, a comprehensive, national registry of patients with ESRD. Octogenarians and nonagenarians initiating dialysis between 1996 and 2003. Rates of dialysis initiation and survival. The number of octogenarians and nonagenarians starting dialysis increased from 7054 persons in 1996 to 13,577 persons in 2003, corresponding to an average annual increase in dialysis initiation of 9.8%. After we accounted for population growth, the rate of dialysis initiation increased by 57% (rate ratio, 1.57 [95% CI, 1.53 to 1.62]) between 1996 and 2003. One-year mortality for octogenarians and nonagenarians after dialysis initiation was 46%. Compared with octogenarians and nonagenarians initiating dialysis in 1996, those starting dialysis in 2003 had a higher glomerular filtration rate and less morbidity related to chronic kidney disease but no difference in 1-year survival. Clinical characteristics strongly associated with death were older age, nonambulatory status, and more comorbid conditions. Survival of patients with incident ESRD who did not begin dialysis could not be assessed. The number of octogenarians and nonagenarians initiating dialysis has increased considerably over the past decade, while overall survival for patients on dialysis remains modest. Estimates of prognosis based on patient characteristics, when considered in conjunction with individual values and preferences, may aid in dialysis decision making for the very elderly.

Abstract

Observational studies suggest improvements with frequent hemodialysis (HD), but its true efficacy and safety remain uncertain. The Frequent Hemodialysis Network Trials Group is conducting two multicenter randomized trials of 250 subjects each, comparing conventional three times weekly HD with (1) in-center daily HD and (2) home nocturnal HD. Daily HD will be delivered for 1.5-2.75 h, 6 days/week, with target eK(t)/V(n) ≥ 0.9/session, whereas nocturnal HD will be delivered for ≥ 6 h, 6 nights/week, with target stdK(t)/V of ≥ 4.0/week. Subjects will be followed for 1 year. The composite of mortality with the 12-month change in (i) left ventricular mass index (LVMI) by magnetic resonance imaging, and (ii) SF-36 RAND Physical Health Composite (PHC) are specified as co-primary outcomes. The seven main secondary outcomes are between-group comparisons of: change in LVMI, change in PHC, change in Beck Depression Inventory score, change in Trail Making Test B score, change in pre-HD serum albumin, change in pre-HD serum phosphorus, and rates of non-access hospitalization or death. Changes in blood pressure and erythropoiesis will also be assessed. Safety outcomes will focus on vascular access complications and burden of treatment. Data will be obtained on the cost of delivering frequent HD compared to conventional HD. Efforts will be made to reduce bias, including blinding assessment of subjective outcomes. Because no large-scale randomized trials of frequent HD have been previously conducted, the first year has been designated a Vanguard Phase, during which feasibility of randomization, ability to deliver the interventions, and adherence will be evaluated.

Abstract

End stage renal disease (ESRD) affects over 1500 people per million population in countries with a high prevalence, such as the USA and Japan. Approximately two thirds of people with ESRD receive haemodialysis, a quarter have kidney transplants, and a tenth receive peritoneal dialysis. METHODS AND OUTCOMES: We conducted a systematic review and aimed to answer the following clinical questions: What are the effects of different doses and osmotic agents for peritoneal dialysis? What are the effects of different doses and membrane fluxes for haemodialysis? What are the effects of interventions aimed at preventing secondary complications? We searched: Medline, Embase, The Cochrane Library and other important databases up to April 2007 (Clinical Evidence reviews are updated periodically, please check our website for the most up-to-date version of this review). We included harms alerts from relevant organisations such as the US Food and Drug Administration (FDA) and the UK Medicines and Healthcare products Regulatory Agency (MHRA). We found 20 systematic reviews, RCTs, or observational studies that met our inclusion criteria. We performed a GRADE evaluation of the quality of evidence for interventions. In this systematic review we present information relating to the effectiveness and safety of the following interventions: cinacalcet, darbepoetin, dextrose solutions, erythropoietin, haemodialysis (standard-dose, increased-dose), high-membrane-flux haemodialysis, icodextrin, increased-dose peritoneal dialysis, low-membrane-flux haemodialysis, mupirocin, sevelamer, and standard-dose dialysis.

Abstract

The optimal surgical approach for tertiary hyperparathyroidism (HPT) after kidney transplantation is unknown. Existing studies are limited by small sample size, lack of adjustment for kidney function, and no long-term follow-up. We retrospectively analyzed 74 patients with tertiary HPT who underwent parathyroidectomy at two centers since 1978. Persistent HPT was defined as parathyroid hormone (PTH) concentrations in excess of the K/DOQI target range for the corresponding estimated creatinine clearance (eCrCl). Seventy-four patients had 83 operations (72 subtotal and 11 less-than-subtotal parathyroidectomies). Mean follow-up time was 5.4 ± 4.7 years. Calcium concentrations decreased significantly after parathyroidectomy (2.83 vs 2.28 mmol/L, P < 0.001), as did eCrCl (54.5 vs 44.9 mL/min, P < 0.001) and PTH (382 vs 132 pg/mL, P < 0.001). In the multivariable regression analysis, only the type of operation and postoperative eCrCl were significantly correlated with PTH at follow-up. A limited parathyroidectomy was associated with a fivefold increase in risk of persistent or recurrent hyperparathyroidism. The use of limited parathyroidectomy for tertiary HPT after kidney transplantation carries a higher risk of persistent/recurrent HPT. Subtotal parathyroidectomy is recommended for patients with tertiary HPT.

Abstract

Rates of ESRD are rising faster in Hispanic than non-Hispanic white individuals, but reasons for this are unclear. Whether rates of cardiovascular events and mortality differ among Hispanic and non-Hispanic white patients with chronic kidney disease (CKD) also is not well understood. Therefore, this study examined the associations between Hispanic ethnicity and risks for ESRD, cardiovascular events, and death in patients with CKD. A total of 39,550 patients with stages 3 to 4 CKD from Kaiser Permanente of Northern California were included. Hispanic ethnicity was obtained from self-report supplemented by surname matching. GFR was estimated from the abbreviated Modification of Diet in Renal Disease equation, and clinical outcomes, patient characteristics, and longitudinal medication use were ascertained from health plan databases and state mortality files. After adjustment for sociodemographic characteristics, Hispanic ethnicity was associated with an increased risk for ESRD (hazard ratio [HR] 1.93; 95% confidence interval [CI] 1.72 to 2.17) when compared with non-Hispanic white patients, which was attenuated after controlling for diabetes and insulin use (HR 1.50; 95% CI 1.33 to 1.69). After further adjustment for potential confounders, Hispanic ethnicity remained independently associated with an increased risk for ESRD (HR 1.33; 95% CI 1.17 to 1.52) as well as a lower risk for cardiovascular events (HR 0.82; 95% CI 0.76 to 0.88) and death (HR 0.72; 95% CI 0.66 to 0.79). Among a large cohort of patients with CKD, Hispanic ethnicity was associated with lower rates of death and cardiovascular events and a higher rate of progression to ESRD. The higher prevalence of diabetes among Hispanic patients only partially explained the increased risk for ESRD. Further studies are required to elucidate the cause(s) of ethnic disparities in CKD-associated outcomes.

Abstract

To retrospectively determine the long-term outcome (>6 months) of placement of tunneled hemodialysis catheters. The HIPAA-compliant study protocol was approved by the Committee on Human Research, which waived the requirement for informed consent. The records of patients who underwent hemodialysis with the Tesio system (Medcomp, Harleysville, Pa) at a single outpatient dialysis unit between March 1994 and March 2004 were reviewed. The length of catheter access and the requirements for percutaneous revision were recorded, and unassisted- and assisted-access survival times were computed by using the Kaplan-Meier method. Three hundred three primary Tesio accesses were created in 200 patients (mean age, 62.3 years ± 16.3 [standard deviation]; 102 women [51.0%]). Fifty-nine of 303 accesses (19.5%) were percutaneously revised with catheter exchange. During follow-up, 200 of 303 accesses (66.0%) were terminated (117 because they were no longer needed and 83 because of catheter malfunction), and 103 (34.0%) accesses were functioning at the time of last follow-up. The mean duration of catheter access was 247 days (range, 3-2016 days). One hundred twenty-six (41.6%) accesses remained in use for more than 6 months; 50 (16.5%), for more than 1 year; 20 (6.6%), for more than 2 years; 14 (4.6%), for more than 3 years; and five (1.7%), for more than 4 years. Assisted-access survival was 78.1%, 60.0%, 51.5%, 51.5%, and 46.8% at 6 months and 1, 2, 3, and 4 years, respectively. Tesio catheters frequently function for periods longer than 6 months and, when necessary, they can function for many years.
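The access survival figures above are Kaplan-Meier product-limit estimates: at each observed failure time, the survival probability is multiplied by the fraction of accesses still at risk that did not fail. A minimal sketch of that calculation, using hypothetical follow-up data rather than the study's records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.

    times  -- follow-up duration (e.g., days) for each access
    events -- 1 if the access failed at that time, 0 if censored
              (still functioning, or removed because no longer needed)
    Returns a list of (time, survival probability) pairs.
    """
    surv = 1.0
    curve = []
    for t in sorted(set(times)):  # walk distinct times in order
        failures = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        at_risk = sum(1 for ti in times if ti >= t)
        if failures:
            surv *= 1 - failures / at_risk
        curve.append((t, surv))
    return curve

# hypothetical data: 3 failures and 2 censored observations
curve = kaplan_meier([100, 200, 200, 300, 400], [1, 0, 1, 1, 0])
```

Censored accesses (those electively removed or still working at last follow-up) contribute to the at-risk denominator without counting as failures, which is why the method suits access-survival data like these.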

Abstract

Patients with end-stage renal disease (ESRD) require dialysis to maintain survival. The optimal timing of dialysis initiation in terms of cost-effectiveness has not been established. We developed a simulation model of individuals progressing towards ESRD and requiring dialysis. It can be used to analyze dialysis strategies and scenarios, and it was embedded in an optimization framework to derive improved strategies. Actual (historical) and simulated survival curves and hospitalization rates were virtually indistinguishable. The model overestimated transplantation costs (by 10%), but this was related to confounding by Medicare coverage. To assess the model's robustness, we examined several dialysis strategies while input parameters were perturbed. Under all 38 scenarios, relative rankings remained unchanged. An improved policy for a hypothetical patient was derived using an optimization algorithm. The model produces reliable results and is robust, and it enables the cost-effectiveness analysis of dialysis strategies.

Abstract

Among critically ill patients, acute kidney injury (AKI) is a relatively common complication that is associated with an increased risk for death and other complications. To date, no treatment has been developed to prevent or attenuate established AKI. Dialysis often is required, but the optimal timing of initiation of dialysis is unknown. Data from the Program to Improve Care in Acute Renal Disease (PICARD), a multicenter observational study of AKI, were analyzed. Among 243 patients who did not have chronic kidney disease and who required dialysis for severe AKI, we examined the risk for death within 60 d from the diagnosis of AKI by the blood urea nitrogen (BUN) concentration at the start of dialysis (BUN ≤ 76 mg/dl in the low degree of azotemia group [n = 122] versus BUN > 76 mg/dl in the high degree of azotemia group [n = 121]). Standard Kaplan-Meier product-limit estimates, proportional hazards (Cox) regression methods, and a propensity score approach were used to account for selection effects. Crude survival rates were slightly lower for patients who started dialysis at higher BUN concentrations, despite a lesser burden of organ system failure. Adjusted for age, hepatic failure, sepsis, thrombocytopenia, and serum creatinine and stratified by site and initial dialysis modality, the relative risk for death associated with initiation of dialysis at a higher BUN was 1.85 (95% confidence interval 1.16 to 2.96). Further adjustment for the propensity score did not materially alter the association (relative risk 1.97; 95% confidence interval 1.21 to 3.20). Among critically ill patients with AKI, initiation of dialysis at higher BUN concentrations was associated with an increased risk for death. Although the results could reflect residual confounding by severity of illness, they provide a rationale for prospective testing of alternative dialysis initiation strategies in critically ill patients with severe AKI.

Abstract

To adjust adequately for comorbidity and severity of illness in quality improvement efforts and prospective clinical trials, predictors of death after acute renal failure (ARF) must be accurately identified. Most epidemiological studies of ARF in the critically ill have been based at single centers, or have examined exposures at single time points using discrete outcomes (e.g., in-hospital mortality). We analyzed data from the Program to Improve Care in Acute Renal Disease (PICARD), a multi-center observational study of ARF. We determined correlates of mortality in 618 patients with ARF in intensive care units using three distinct analytic approaches. The predictive power of models using information obtained on the day of ARF diagnosis was extremely low. At the time of consultation, advanced age, oliguria, hepatic failure, respiratory failure, sepsis, and thrombocytopenia were associated with mortality. Upon initiation of dialysis for ARF, advanced age, hepatic failure, respiratory failure, sepsis, and thrombocytopenia were associated with mortality; higher blood urea nitrogen and lower serum creatinine were also associated with mortality in logistic regression models. Models incorporating time-varying covariates enhanced predictive power by reducing misclassification and incorporating day-to-day changes in extra-renal organ system failure and the provision of dialysis during the course of ARF. Using data from the PICARD multi-center cohort study of ARF in critically ill patients, we developed several predictive models for prognostic stratification and risk-adjustment. By incorporating exposures over time, the discriminatory power of predictive models in ARF can be significantly improved.

Abstract

Recent studies suggest a high prevalence of cognitive impairment and dementia in persons with end-stage renal disease (ESRD), yet risk factors for dementia and its prognostic significance in persons with ESRD remain unclear. The goals of this study were to determine the prevalence, correlates, and dialysis-related outcomes of dementia in an international sample of haemodialysis patients. We analysed data collected from a cohort of 16,694 patients in the Dialysis Outcomes and Practice Patterns Study. Dementia was defined as a diagnosis of dementia documented in the medical record. We used logistic regression to determine the baseline correlates of dementia and Cox proportional hazards models to determine the relative risk (RR) of death and dialysis withdrawal for patients with dementia, while adjusting for a number of confounding factors. Overall, 4% of the cohort had a recorded diagnosis of dementia. In the cross-sectional analyses, risk factors for dementia in the general population, including age, black race, low educational attainment, cerebrovascular disease, and diabetes, as well as modifiable uraemia-related factors, including markers of malnutrition and anaemia, were independently associated with dementia. After adjustment for a number of confounding factors, dementia was associated with an increased risk of death [RR 1.48, 95% confidence interval (CI) 1.32-1.66] and dialysis withdrawal (RR 2.01, 95% CI 1.57-2.57). Dementia is associated with adverse outcomes among ESRD patients. Dialysis providers should consider instituting routine screening for cognitive impairment among elderly patients in order to identify those at risk for associated adverse outcomes.

Abstract

Chronic kidney disease is a risk factor for heart failure, an association that may be particularly important in blacks, who are disproportionately affected by both processes. Our objective was to determine whether the association of chronic kidney disease with incident heart failure differs between blacks and whites. The study population comprised participants in the Health, Aging, and Body Composition Study without a diagnosis of heart failure (1124 black and 1676 white community-dwelling older persons). The main predictors were quintiles of cystatin C and creatinine concentrations and estimated glomerular filtration rate. The main outcome measure was incident heart failure. Over a mean of 5.7 years, 200 participants developed heart failure. High concentrations of cystatin C and low estimated glomerular filtration rate were each associated with heart failure, but the magnitude was greater for blacks than for whites (cystatin C concentration: adjusted hazard ratio for quintile 5 [≥1.18 mg/dL] vs quintile 1 [<0.84 mg/dL] was 3.0 [95% confidence interval, 1.4-6.5] in blacks and 1.4 [95% confidence interval, 0.8-2.5] in whites; estimated glomerular filtration rate: adjusted hazard ratio for quintile 5 [<59.2 mL/min] vs quintile 1 [>86.7 mL/min] was 2.7 [95% confidence interval, 1.4-4.9] in blacks and 1.8 [95% confidence interval, 0.9-3.6] in whites). For cystatin C, this association was observed at more modest decrements in kidney function among blacks as well. The population attributable risk of heart failure was 47% for blacks with moderate or high concentrations of cystatin C (≥0.94 mg/dL) (56% prevalence) but only 5% among whites (64% prevalence). The association of kidney dysfunction with heart failure appears stronger in blacks than in whites, particularly when cystatin C is used to measure kidney function.
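Population attributable risk figures of the kind quoted above are conventionally computed with Levin's formula, which combines exposure prevalence with the relative risk: PAR = p(RR−1)/(1 + p(RR−1)). A minimal sketch; the inputs below are hypothetical round numbers chosen for illustration, not values taken from the study's models:

```python
def attributable_risk_pct(prevalence, relative_risk):
    """Levin's population attributable risk, as a percentage:
    PAR = p*(RR - 1) / (1 + p*(RR - 1)),
    where p is exposure prevalence and RR the relative risk."""
    excess = prevalence * (relative_risk - 1)
    return 100 * excess / (1 + excess)

# hypothetical illustration: 56% exposure prevalence, RR of 2.6
par = attributable_risk_pct(0.56, 2.6)
```

The formula shows why a common exposure with a strong relative risk (as for elevated cystatin C in blacks here) yields a large attributable fraction, while the same exposure with a weaker relative risk yields a small one.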

Abstract

The metabolic syndrome is a constellation of physical and laboratory abnormalities including hypertension, hyperglycemia, hyperlipidemia, and abdominal obesity. Over the past decade, the metabolic syndrome has emerged as a critically important risk factor for cardiovascular disease. A large population-based cross-sectional analysis (the National Health and Nutrition Examination Survey III) found that the presence of the metabolic syndrome was associated with chronic kidney disease, defined as an estimated glomerular filtration rate of less than 60 ml/min per 1.73 m2, and was also associated with proteinuria. More recently, a prospective cohort study found that the presence of the metabolic syndrome was associated with incident chronic kidney disease by the same definition, even when excluding individuals with diabetes mellitus and hypertension. More studies are required to determine whether the relationship between the metabolic syndrome and chronic kidney disease is mainly mediated by hyperglycemia (with insulin resistance) and hypertension, or by other metabolic or hemodynamic factors. The metabolic syndrome is associated with chronic kidney disease. Efforts aimed at determining the mechanisms underlying this association, and strategies for the prevention of chronic kidney disease (or slowing its progression) in affected patients, should be research priorities in the future.

Abstract

The importance of hemodialysis session length relative to small solute (e.g., urea) clearance has been debated for many years. Longer session length augments clearance of larger molecules and may facilitate ultrafiltration; however, the independent effects of session length on survival and other outcomes are unknown. In this report, we review two recently published observational studies examining the association between hemodialysis session length and survival. Prospective clinical trials will be required to resolve the debate.

Abstract

Disturbances of mineral metabolism are associated with significant morbidity and mortality in patients with chronic kidney disease. Unfortunately, some of the treatments for these disturbances also have been found to be associated with morbidity. More recently, there is increasing evidence in the form of prospective, randomized trials that the use of calcium-based phosphate binders contributes to progressive coronary artery and aorta calcification compared with the non-calcium-containing binder sevelamer. Moreover, there is compelling biologic plausibility that hyperphosphatemia and excess exogenous calcium administration can accelerate vascular calcification. Unfortunately, there is no bedside test that can determine whether there is a dose of calcium salts (either as maintenance or as cumulative dose) that can be administered safely, and, unfortunately, the serum calcium concentration does not reflect calcium balance. Therefore, calcium-based phosphate binders should be avoided in many, if not most, patients who are undergoing dialysis.

Abstract

This brief article is a response to the article by Monge et al. on page 326 entitled Reappraisal of 2003 NKF-K/DOQI guidelines for management of hyperparathyroidism in chronic kidney disease patients. We contend that there is insufficient evidence to support the changes to clinical practice and clinical practice guidelines proposed by Monge and colleagues. We recommend that clinical trials be conducted to resolve these points of contention and other critical issues in the management of disorders of mineral metabolism in chronic kidney disease, including secondary hyperparathyroidism. The focus should be on evaluating the effects of alternative strategies on survival, as well as clinical manifestations of cardiovascular and bone disease.

Abstract

Administrative and claims databases may be useful for the study of acute renal failure (ARF) and ARF that requires dialysis (ARF-D), but the validity of the corresponding diagnosis and procedure codes is unknown. The performance characteristics of International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes for ARF were assessed against serum creatinine-based definitions of ARF in 97,705 adult discharges from three Boston hospitals in 2004. For ARF-D, ICD-9-CM codes were compared with review of medical records in 150 patients with ARF-D and 150 control patients. As compared with a diagnostic standard of a 100% change in serum creatinine, ICD-9-CM codes for ARF had a sensitivity of 35.4%, specificity of 97.7%, positive predictive value of 47.9%, and negative predictive value of 96.1%. As compared with review of medical records, ICD-9-CM codes for ARF-D had positive predictive value of 94.0% and negative predictive value of 90.0%. It is concluded that administrative databases may be a powerful tool for the study of ARF, although the low sensitivity of ARF codes is an important caveat. The excellent performance characteristics of ICD-9-CM codes for ARF-D suggest that administrative data sets may be particularly well suited for research endeavors that involve patients with ARF-D.
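Validation metrics of this kind come from a standard 2×2 comparison of the billing code against the reference standard (here, the creatinine-based definition or chart review). A minimal sketch, using hypothetical counts chosen only to mirror the proportions reported above, not the study's actual cell counts:

```python
def code_performance(tp, fp, fn, tn):
    """Performance of a diagnosis code against a reference standard,
    from the 2x2 table: tp/fp/fn/tn are the four cell counts."""
    return {
        "sensitivity": tp / (tp + fn),  # code-positive among true cases
        "specificity": tn / (tn + fp),  # code-negative among non-cases
        "ppv": tp / (tp + fp),          # true cases among code-positives
        "npv": tn / (tn + fn),          # non-cases among code-negatives
    }

# hypothetical counts for illustration
m = code_performance(tp=354, fp=385, fn=646, tn=16_000)
```

Note how low sensitivity (many true cases never receive the code) can coexist with high specificity and a reasonable negative predictive value, which is exactly the caveat the abstract raises for ARF codes.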

Abstract

Despite improvements in intensive care and dialysis, some experts have concluded that outcomes associated with acute renal failure (ARF) have not improved significantly over time. ARF was studied in hospitalized patients between 1988 and 2002 using the Nationwide Inpatient Sample, a nationally representative sample of discharges from acute-care, nonfederal hospitals. During a 15-yr period, 5,563,381 discharges with ARF and 598,768 with ARF that required dialysis (ARF-D) were identified. Between 1988 and 2002, the incidence of ARF rose from 61 to 288 per 100,000 population; the incidence of ARF-D increased from 4 to 27 per 100,000 population. Between 1988 and 2002, in-hospital mortality declined steadily in patients with ARF (40.4 to 20.3%; P < 0.001) and in those with ARF-D (41.3 to 28.1%; P < 0.001). Compared with 1988 to 1992, the multivariable-adjusted odds ratio (OR) of death was lower in 1993 to 1997 (ARF: OR 0.62, 95% confidence interval [CI] 0.61 to 0.64; ARF-D: OR 0.63, 95% CI 0.59 to 0.66) and 1998 to 2002 (ARF: OR 0.40, 95% CI 0.39 to 0.41; ARF-D: OR 0.47, 95% CI 0.45 to 0.50). The percentage of patients who had ARF with a Deyo-Charlson comorbidity index of 3 or more increased from 16.4% in 1988 to 26.6% in 2002 (P < 0.001). This study provides evidence from an administrative database that the incidence of ARF and ARF-D is rising. Despite an increase in the degree of comorbidity, in-hospital mortality has declined.

Abstract

Greater weight-for-height has been associated with prolonged survival in patients with end-stage renal disease (ESRD) but not in the general population. The association between body size and health status has not been carefully evaluated. We compared the self-reported health status of 2467 participants in the Dialysis Morbidity and Mortality Study Wave 2 by using body mass index (BMI; in kg/m2) to approximate body size and composition. BMI was categorized into 4 groups (<19, 19 to <25, 25 to <30, and ≥30) corresponding to World Health Organization criteria for underweight, normal-weight, overweight, and obese status. We adjusted for demographic, clinical, and laboratory factors that may have confounded the association between body size and health status. Scores on the physical component summary and the physical functioning scale were significantly lower for obese subjects than for those with normal weight or moderately high BMI after adjustment for demographic factors, comorbidity, and laboratory markers of nutritional status. Mental component summary and symptom scores were unrelated to BMI. The underweight group scored lower on many Medical Outcomes Study 36-Item Short Form scales than did the normal-weight group. Whereas higher BMI has consistently been associated with enhanced dialysis-related survival, health status, particularly physical function, may be impaired by obesity. Additional longitudinal studies of body weight and composition are needed for a better understanding of the complex effects of obesity and undernutrition in persons with ESRD and advanced chronic kidney disease.

Abstract

Writing grants that are subsequently funded is an integral part of the process of patient-oriented research. A catalogue of common deficiencies identified in the grant review process can yield valuable insights into the process of grant writing. This article provides the authors' opinion on common pitfalls in current patient-oriented research applications that, if identified before submission, can lead to a stronger application. The authors participated in the review of clinical research grants to the National Kidney Foundation and catalogued the weaknesses of the grants that were reviewed and discussed. The five most common problems identified were: problems with study design (76%); statistical issues (34%); general issues, such as ownership of the work, mentor, and environment (29%); a weak hypothesis (24%); and problems with the research question, such as lack of novelty or of new data (24%). Patient-oriented research grants that have strong mentoring, are hypothesis driven, and have a strong study design that addresses sample size, analysis, and confounding factors have an increased chance of yielding high-quality research and, therefore, successful funding.

Abstract

Endovascular aneurysm repair (EVAR) is an increasingly used alternative to open surgical repair of unruptured abdominal aortic aneurysms (AAAs). The effect of EVAR on postprocedure acute renal failure has not been determined. We hypothesized that EVAR would be associated with a lower risk of acute renal failure and of acute renal failure requiring hemodialysis. A retrospective cohort study was conducted of the 2002 Nationwide Inpatient Sample, the largest all-payer inpatient care database in the United States, reflecting discharges from a representative sample of United States hospitals. We identified 6614 discharges with a primary diagnosis of unruptured AAA and a primary procedure code for open AAA repair or EVAR. We excluded 56 patients with end-stage renal disease and 42 patients who underwent concomitant aortorenal bypass. We compared EVAR vs open repair in this cohort. The main outcome measures were acute renal failure and acute renal failure requiring dialysis. A total of 6516 patient discharges met the inclusion criteria for the study, and postprocedure acute renal failure developed in 439 (6.7%). EVAR was associated with lower odds of acute renal failure (adjusted odds ratio, 0.42; 95% confidence interval, 0.33 to 0.53) and of acute renal failure requiring dialysis (adjusted odds ratio, 0.30; 95% confidence interval, 0.15 to 0.63). Results were similar when EVAR and open AAA repair were compared within quintiles of the propensity score for the receipt of EVAR. Compared with open AAA repair, EVAR is associated with a lower risk of postprocedure acute renal failure.

Abstract

We previously compared the safety profiles of three formulations of intravenous iron used during 1998-2000 and found higher rates of adverse drug events (ADEs) associated with the use of higher molecular weight iron dextran and sodium ferric gluconate complex compared with lower molecular weight iron dextran. Since that time, iron sucrose has become widely available and clinicians have gained additional experience with sodium ferric gluconate complex. We obtained data from the United States Food and Drug Administration (FDA) on ADEs attributed to the provision of four formulations of intravenous iron during 2001-2003, including higher and lower molecular weight iron dextran, sodium ferric gluconate complex, and iron sucrose. We estimated the odds of intravenous iron-related ADEs using 2 × 2 tables and the chi-square test. The total number of reported parenteral iron-related ADEs was 1141 among approximately 30,063,800 doses administered, yielding a rate of 3.8 per 100,000 doses, or roughly 38 per million. Eleven individuals died in association with an ADE. Relative to lower molecular weight iron dextran, total and life-threatening ADEs were significantly more frequent among recipients of higher molecular weight iron dextran and significantly less frequent among recipients of sodium ferric gluconate complex and iron sucrose. The absolute rates of life-threatening ADEs were 0.6, 0.9, 3.3, and 11.3 per million doses for iron sucrose, sodium ferric gluconate complex, lower molecular weight iron dextran, and higher molecular weight iron dextran, respectively. Based on differences in the average wholesale prices of iron sucrose and lower molecular weight iron dextran in the US, the cost to prevent one life-threatening ADE related to the use of lower molecular weight iron dextran was estimated to be $5.0-7.8 million.
The cost to prevent one lower molecular weight iron dextran-related death was estimated to be $33 million. The frequency of intravenous iron-related ADEs reported to the FDA has decreased, and overall, the rates are extremely low. This is the fourth report suggesting increased risks associated with the provision of higher molecular weight iron dextran. Life-threatening and other ADEs appear to be lower with non-dextran iron formulations, although the cost per ADE prevented is extremely high.
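The reported event rate is simple arithmetic (events per dose administered), and a cost-to-prevent-one-event estimate divides the incremental drug cost by the absolute risk reduction per dose. A sketch using the counts and per-million rates reported above; the per-dose price premium is a hypothetical placeholder, not a figure from the paper:

```python
# overall reported ADE rate: events per dose administered
ade_rate = 1141 / 30_063_800  # ~3.8 per 100,000 doses, i.e. ~38 per million

# life-threatening ADE rates per dose, from the reported per-million figures
rate_lmw_dextran = 3.3e-6   # lower molecular weight iron dextran
rate_iron_sucrose = 0.6e-6  # iron sucrose
risk_reduction = rate_lmw_dextran - rate_iron_sucrose  # absolute, per dose

# hypothetical price premium per dose for the safer formulation (placeholder)
price_premium_per_dose = 15.0
cost_per_ade_prevented = price_premium_per_dose / risk_reduction
```

Because the absolute risk reduction is on the order of a few events per million doses, even a modest per-dose price premium translates into millions of dollars per event prevented, which is the abstract's central cost-effectiveness point.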

Abstract

With use of recombinant erythropoietin (EPO) and intravenous iron, the majority of hemodialysis patients can achieve target hemoglobin concentrations. EPO resistance arises as a consequence of inflammation and other processes that can adversely affect survival. We hypothesized that the EPO dose-hematocrit (EPO/Hct) ratio, also known as the EPO index, may be a surrogate for inflammation and that greater EPO/Hct ratios would be associated with decreased survival. We used proportional hazards regression models and time-varying logistic models to analyze the association between the EPO index and survival in US patients initiating hemodialysis therapy between January 1, 1999, and December 31, 2000, who were followed for up to 3 years, until December 31, 2001. We found an unexpected and consistent association between a greater EPO index and survival in all models. The associations of the EPO/Hct ratio were most prominent at intermediate Hct values and with longer dialysis vintage. Iron administration was associated with a lower risk for death independent of Hct. Conversely, a greater average prior EPO dose was associated with a greater risk for death. EPO resistance may be better reflected by total cumulative EPO dose than by the EPO/Hct ratio. The mechanism(s) responsible for the association between a greater EPO/Hct ratio and survival remains to be established, but may be a result of nonerythrogenic effects of EPO.

Abstract

Kidney dysfunction is known to decrease life expectancy in the elderly. Cystatin C is a novel biomarker of kidney function that may have prognostic utility in older adults. The association of cystatin C with mortality was evaluated in a biracial cohort of black and white ambulatory elderly and compared with that of serum creatinine concentrations. The Health, Aging and Body Composition study is a cohort of well-functioning elderly that was designed to evaluate longitudinal changes in weight, body composition, and function. A total of 3075 participants who were aged 70 to 79 yr and had no disability were recruited at sites in Memphis, TN, and Pittsburgh, PA, between April 1997 and June 1998, with follow-up of 6 yr. At entry, the mean cystatin C was 1.05 mg/L and the mean creatinine was 1.06 mg/dl. After 6 yr of follow-up, 557 participants had died. The mortality rates in ascending cystatin C quintiles were 1.7, 2.7, 2.9, 3.1, and 5.4%/yr. After adjustment for demographic risk factors, comorbid health conditions, and inflammatory biomarkers (C-reactive protein, IL-6, and TNF-alpha), each quintile of cystatin C was significantly associated with increased mortality risk compared with the lowest; the hazard ratios (HR; 95% confidence intervals) were: quintile 1, 1.0 (referent); quintile 2, 1.74 (1.21 to 2.50); quintile 3, 1.51 (1.05 to 2.18); quintile 4, 1.49 (1.04 to 2.13); and quintile 5, 2.18 (1.53 to 3.10). These associations did not differ by gender or race. Results were consistent for cardiovascular and other-cause mortality, but not cancer mortality. Creatinine quintiles were not associated with mortality after multivariate adjustment (HR: 1.0 [referent], 1.00 [0.72 to 1.39], 0.95 [0.68 to 1.32], 1.11 [0.79 to 1.57], 1.16 [0.86 to 1.58]). Cystatin C is a strong, independent risk factor for mortality in the elderly. Future studies should investigate whether cystatin C has a role in clinical medicine.

Abstract

Few investigations have described fracture risk and its relation to disorders of calcium (Ca), phosphorus (P), and parathyroid hormone (PTH) metabolism in the end-stage renal disease population. Laboratory values for Ca, P, and PTH were obtained from Dialysis Morbidity and Mortality Study (DMMS) Waves 1 to 4. Additional data available from the US Renal Data System were used to determine the incidence and associated costs of hip, vertebral, and pelvic fractures in 9,007 patients with nonmissing laboratory values and Medicare as primary payor. Cox proportional hazards and Poisson models were used to analyze time to first fracture and numbers of fractures, respectively. There was no association between Ca or P values and risk for fracture; the associations between risks for vertebral and hip fractures and PTH concentrations were U-shaped and weakly significant using Poisson regression (P = 0.03). The age- and sex-adjusted mortality rate after fracture was 2.7 times greater (580/1,000 person-years) than for general dialysis patients from the DMMS (217/1,000 person-years). Mean total episodic costs of hip, vertebral, and pelvic fractures were $20,810 ± $16,743 (SD), $17,063 ± $26,201, and $14,475 ± $19,209, respectively. Using data from the DMMS, there were no associations between Ca and P concentrations and risk for fracture. Risks for hip and vertebral fracture were weakly associated with PTH concentration, with the lowest risk observed around a PTH concentration of 300 pg/mL (ng/L). Fractures were associated with high subsequent mortality and costs. Prospective studies are needed to determine whether therapies that maintain PTH concentrations within or near the National Kidney Foundation-Kidney Disease Outcomes Quality Initiative range will result in fewer complications of disordered mineral metabolism.

Abstract

Medicare beneficiaries without prescription drug coverage consistently fill fewer prescriptions than beneficiaries with some form of drug coverage because of cost. ESRD patients, who are disproportionately poor and typically use multiple oral medications, would likely benefit substantially from any form of prescription drug coverage. Because most hemodialysis patients are Medicare-eligible, they as well as their providers would be expected to be well informed of changes in Medicare prescription drug coverage. By examining the level of understanding and use of the temporary Medicare Prescription Drug Discount Card Program in the hemodialysis population, we can gain a better understanding of the potential long-term utilization of Medicare Part D. We surveyed English-speaking adult hemodialysis patients with Medicare coverage from two urban hemodialysis centers affiliated with the University of California San Francisco (UCSF) during July and August 2005 (n = 70). We also surveyed University- and community-based nephrologists and non-physician dialysis health care professionals over the same time frame (n = 70). Fifty-nine percent of patients received prescription drug coverage through Medi-Cal, 20% through another insurance program, and 21% had no prescription drug coverage. Forty percent of patients with no prescription drug coverage reported "sometimes" or "rarely" being able to obtain medications vs. 22% of patients with some form of drug coverage. None of the patients surveyed actually had a Medicare-approved prescription drug card, and of those who intended to apply, only 10% reported knowing how to do so. Only 11% of health care professionals knew the eligibility requirements of the drug discount cards. Despite a significant need, hemodialysis patients and providers were poorly educated about the Medicare Prescription Drug Discount Cards. This has broad implications for the dissemination of information about Medicare Part D.

Abstract

Health-related quality of life and estimates of utility have been carefully evaluated in persons with end-stage renal disease. Fewer studies have examined these parameters in persons with chronic kidney disease (CKD). To determine the relations among kidney function, health-related quality of life, and estimates of utility, we administered the Kidney Disease Quality of Life Short Form 36 (KDQOL-36), Health Utilities Index (HUI)-3, and Time Trade-off (TTO) questionnaires to 205 persons with CKD. Persons with CKD stages 4 and 5 (estimated GFR <30 mL/min/1.73 m2, N = 115) were tested two to eight times over the subsequent two years. The relations among estimated glomerular filtration rate (eGFR) and changes in health-related quality of life and utility over time were estimated using mixed effect regression models. Models were adjusted for age, sex, race, and diabetes. Mean scores on the KDQOL-36 generic components, HUI-3, and TTO suggested considerable loss of function and well-being in CKD relative to population norms. On cross-sectional analysis, lower levels of kidney function were associated with significantly lower scores on the SF-12 Physical Health Composite (P = 0.002), the Burden of Kidney Disease subscale (P < 0.0001), and the Effects of Kidney Disease subscale (P < 0.0001) of the KDQOL-36. Kidney function was significantly associated with the TTO (P = 0.008) and global HUI-3 utility (P = 0.016), although these associations were attenuated after adjustment for diabetes. A decline in eGFR was associated with a significant increase in the reported Burden of Kidney Disease (5.0-point change per year per mL/min/1.73 m2 decline in eGFR) and with marginally significant changes in the Dexterity and Pain attributes of the HUI-3. 
Mean HUI-3 scores for persons with CKD stages 4 and 5, absent dialysis, were in the range previously reported for persons with stroke and severe peripheral vascular disease. Health-related quality of life and estimates of utility are distressingly low in persons with CKD. Self-reported outcomes should be considered when evaluating health policy decisions that affect this population.

Abstract

Prealbumin (transthyretin) is a hepatic secretory protein thought to be important in the evaluation of nutritional deficiency and nutrition support. Prior studies have suggested that the serum prealbumin concentration is independently associated with mortality in hemodialysis patients, even with adjustment for serum albumin and other nutritional parameters. To determine whether prealbumin was independently associated with mortality and morbidity (cause-specific hospitalization) in hemodialysis patients, we analyzed data on 7815 hemodialysis patients with at least one determination of serum prealbumin during the last three months of 1997. Unadjusted, case mix-adjusted, and multivariable-adjusted relative risks of death were calculated for categories of serum prealbumin using proportional hazards regression. We also determined whether the prealbumin concentration was associated with all-cause, cardiovascular, infection-related, and vascular access-related hospitalization. The relative risk (RR) of death was inversely related to the serum prealbumin concentration. Relative to prealbumin ≥40 mg/dL, the adjusted RRs of death were 2.41, 1.85, 1.49, and 1.23 for prealbumin <15, 15-20, 20-25, and 25-30 mg/dL, respectively. The adjusted RRs of hospitalization due to infection were 2.97, 1.95, 1.81, and 1.61 for prealbumin <15, 15-20, 20-25, and 25-30 mg/dL, respectively. The adjusted RRs of vascular access-related hospitalization were 0.48, 0.52, 0.58, and 0.71 for prealbumin <15, 15-20, 20-25, and 25-30 mg/dL, respectively. While serum albumin was strongly associated with mortality and all-cause hospitalization, it was not associated with hospitalization due to infection, and lower levels were associated with higher rather than lower rates of vascular access-related hospitalization. In hemodialysis patients, lower prealbumin concentrations were associated with mortality and hospitalization due to infection, independent of serum albumin and other clinical characteristics. 
Higher prealbumin concentrations were associated with vascular access-related hospitalization. In light of these findings, more intensive study into the determinants and biological actions of prealbumin (transthyretin) in end-stage renal disease is warranted.

Abstract

Anorexia is an important cause of protein-energy malnutrition (PEM) in haemodialysis patients. We investigated whether self-reported appetite was associated with death and hospitalization in subjects enrolled in the Hemodialysis (HEMO) Study. The HEMO Study was a 7-year, multicentre, randomized trial (N = 1846), which examined the effects of dialysis dose and membrane flux on mortality and morbidity. Three questions from the Appetite and Diet Assessment Tool (ADAT) were used to determine whether appetite had changed over time in the randomized treatment groups. The relations among ADAT scores, dietary protein and energy intakes, biochemical and anthropometric measures, and quality of life were assessed. We used Cox proportional hazards models to evaluate the relative risks of death and hospitalization associated with static and dynamic ADAT scores, adjusted for demographic factors, dose and flux assignments, and co-morbidity. The average length of follow-up was 2.84 years. After adjusting for demographic factors and randomized treatment assignments, there was a significant association between poorer self-reported appetite and death (RR 1.52, 95% CI 1.16-1.98); however, the association became non-significant with further adjustment for co-morbidity (RR 1.23, 95% CI 0.94-1.62). Poorer appetite remained associated with increased hospitalization rates even after full adjustment (multivariable RR 1.35, 95% CI 1.13-1.61). The longitudinal effect of worsening appetite from baseline to 1 year was not associated with mortality or hospitalization rate after adjusting for co-morbidity. The association between appetite and death was confounded by co-morbidity. Self-reported appetite was associated with hospitalization rate in haemodialysis patients and, thus, it may be a useful screening tool for this outcome. Patients who report poor or very poor appetites should be monitored, and they should receive more comprehensive nutritional assessments.

Abstract

Few studies in patients with ESRD have examined outcomes in Asian or Pacific Islander subgroups compared with white individuals. The objective of this study was to assess ethnic disparities in mortality and kidney transplantation among a multiethnic cohort of incident dialysis patients. A total of 24,963 patients who initiated dialysis within the TransPacific Renal Network (Network 17) between April 1, 1995, and September 30, 2001, were studied to ascertain death and kidney transplantation through September 30, 2002. Overall, 12,902 deaths and 2258 kidney transplantations were observed during 59,075 person-years of follow-up. Mortality on dialysis among Asians and Pacific Islanders (except Chamorros) was lower than that of white individuals after controlling for differences in sociodemographic characteristics, comorbid conditions, and other risk factors for death (adjusted hazard ratio [95% confidence interval] versus white individuals: Japanese 0.64 [0.57 to 0.72], Chinese 0.64 [0.52 to 0.78], Filipino 0.64 [0.57 to 0.72], Native Hawaiian 0.84 [0.72 to 0.96], Samoan 0.62 [0.48 to 0.82], and Chamorro 0.96 [0.84 to 1.20]). In contrast, Asians and Pacific Islanders were much less likely to undergo kidney transplantation (adjusted rate ratio [95% confidence interval] versus white individuals: Japanese 0.34 [0.24 to 0.46], Chinese 0.54 [0.30 to 0.88], Filipino 0.32 [0.26 to 0.47], Native Hawaiian 0.17 [0.10 to 0.30], Samoan 0.17 [0.07 to 0.38], and Chamorro 0.04 [0.01 to 0.14]). Despite wide variations in primary cause of ESRD, clinical characteristics, and body size at dialysis initiation, Asians and Pacific Islanders experience better survival but substantially lower transplantation rates compared with white individuals. Strategies that are aimed at improving access to transplantation in Asian and Pacific Islander communities may further enhance survival among Asians and Pacific Islanders with ESRD.

Abstract

The marginal effects of acute kidney injury on in-hospital mortality, length of stay (LOS), and costs have not been well described. A consecutive sample of 19,982 adults who were admitted to an urban academic medical center, including 9210 who had two or more serum creatinine (SCr) determinations, was evaluated. The presence and degree of acute kidney injury were assessed using absolute and relative increases from baseline to peak SCr concentration during hospitalization. Large increases in SCr concentration were relatively rare (e.g., ≥2.0 mg/dl in 105 [1%] patients), whereas more modest increases in SCr were common (e.g., ≥0.5 mg/dl in 1237 [13%] patients). Modest changes in SCr were significantly associated with mortality, LOS, and costs, even after adjustment for age, gender, admission International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis, severity of illness (diagnosis-related group weight), and chronic kidney disease. For example, an increase in SCr ≥0.5 mg/dl was associated with a 6.5-fold (95% confidence interval 5.0 to 8.5) increase in the odds of death, a 3.5-day increase in LOS, and nearly $7,500 in excess hospital costs. Acute kidney injury is associated with significantly increased mortality, LOS, and costs across a broad spectrum of conditions. Moreover, outcomes are related directly to the severity of acute kidney injury, whether characterized by nominal or percentage changes in serum creatinine.
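The absolute-versus-relative creatinine criteria described above can be made concrete with a short sketch. The cutoffs here (0.5 mg/dl absolute, 50% relative) are illustrative choices only, not the study's full grading scheme:

```python
# Characterizing acute kidney injury (AKI) by the change from baseline to
# peak serum creatinine (SCr), using both an absolute rise in mg/dl and a
# relative (fractional) rise. Cutoff values are illustrative examples.

def scr_rise(baseline_mg_dl, peak_mg_dl):
    """Return (absolute rise in mg/dl, relative rise as a fraction of baseline)."""
    absolute = peak_mg_dl - baseline_mg_dl
    relative = absolute / baseline_mg_dl
    return absolute, relative

def has_aki(baseline_mg_dl, peak_mg_dl, abs_cut=0.5, rel_cut=0.5):
    """Flag AKI if either the absolute or the relative rise meets its cutoff."""
    absolute, relative = scr_rise(baseline_mg_dl, peak_mg_dl)
    return absolute >= abs_cut or relative >= rel_cut

print(has_aki(1.0, 1.6))  # 0.6 mg/dl rise and 60% rise -> True
print(has_aki(3.0, 3.3))  # 0.3 mg/dl rise and 10% rise -> False
```

Using both criteria matters because a fixed absolute cutoff is relatively easier to reach from a high baseline, whereas a percentage cutoff scales with baseline kidney function.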

Abstract

The purpose of this study was to determine if indicators of nutritional status were associated with subsequent mortality in hemodialysis patients. Twelve selected nutrition indicators were measured prior to randomization in the Mortality and Morbidity in Hemodialysis (HEMO) Study. Relative risks (RR) of mortality were assessed at <6 months and >6 months of follow-up using Cox regression after controlling for case mix, comorbidity, and treatment assignment (high vs. standard Kt/V and high vs. low membrane flux). Low values of most nutritional status indicators were associated with increased RR of mortality. RRs were greatest over the short term (<6 months) and diminished with increasing follow-up (>6 months). Increases in body mass index (BMI) at lower levels (e.g., ≤25 kg/m2) and increases in serum albumin at any level were associated with reduced short-term RR, even after adjusting for case mix, treatment assignment, and the joint effects of equilibrated normalized protein catabolic rate, total cholesterol, and serum creatinine. For >6 months' follow-up, increases in values among those with lower levels of BMI and serum albumin (≤3.635 g/dL) and increases at all serum creatinine levels were associated with lower RR. Nutrition indicators are associated with subsequent mortality in a time-dependent manner, with the greatest effects at <6 months of follow-up. The RR for these indicators may also vary within different ranges of values.

Abstract

Previous studies suggest a link between chronic kidney disease (CKD) and cognitive impairment. Whether the longitudinal course of cognitive impairment differs among people with or without CKD is unknown. Data collected in 3034 elderly individuals who participated in the Health, Aging, and Body Composition study were analyzed. Cognitive function was assessed with the Modified Mini-Mental State Exam (3MS) at baseline and then 2 and 4 yr after baseline. Cognitive impairment was defined as a 3MS score <80 or a decline in 3MS >5 points after 2 or 4 yr of follow-up among participants with baseline 3MS scores ≥80. Participants with CKD, defined as an estimated GFR (eGFR) <60 ml/min per 1.73 m2, were further divided into two eGFR strata. Unadjusted mean baseline 3MS scores and mean declines in 3MS scores over 4 yr were significantly more pronounced for participants with lower baseline eGFR. More advanced stages of CKD were associated with an increased risk for cognitive impairment: odds ratio (OR) 1.32 (95% confidence interval [CI] 1.03 to 1.69) and OR 2.43 (95% CI, 1.38 to 4.29) for eGFR 45 to 59 ml/min per 1.73 m2 and <45 ml/min per 1.73 m2, respectively, adjusted for case mix, baseline 3MS scores, and other potential confounders. CKD is associated with an increased risk for cognitive impairment in the elderly that cannot be fully explained by other well-established risk factors. Studies aimed at understanding the mechanism(s) responsible for cognitive impairment in CKD and efforts to interrupt this decline are warranted.

Abstract

The metabolic syndrome is a risk factor for the development of diabetes and cardiovascular disease; however, no prospective studies have examined the metabolic syndrome as a risk factor for chronic kidney disease (CKD). A total of 10,096 nondiabetic participants who were in the Atherosclerosis Risk in Communities study and had normal baseline kidney function composed the study cohort. The metabolic syndrome was defined according to recent guidelines from the National Cholesterol Education Program. Incident CKD was defined as an estimated GFR (eGFR) <60 ml/min per 1.73 m2 at study year 9 among those with an eGFR ≥60 ml/min per 1.73 m2 at baseline. After 9 yr of follow-up, 691 (7%) participants developed CKD. The multivariable adjusted odds ratio (OR) of developing CKD in participants with the metabolic syndrome was 1.43 (95% confidence interval [CI], 1.18 to 1.73). Compared with participants with no traits of the metabolic syndrome, those with one, two, three, four, or five traits of the metabolic syndrome had ORs of CKD of 1.13 (95% CI, 0.89 to 1.45), 1.53 (95% CI, 1.18 to 1.98), 1.75 (95% CI, 1.32 to 2.33), 1.84 (95% CI, 1.27 to 2.67), and 2.45 (95% CI, 1.32 to 4.54), respectively. After adjusting for the subsequent development of diabetes and hypertension during the 9 yr of follow-up, the OR of incident CKD among participants with the metabolic syndrome was 1.24 (95% CI, 1.01 to 1.51). The metabolic syndrome is independently associated with an increased risk for incident CKD in nondiabetic adults.

Abstract

We hypothesized that elevated blood urea nitrogen (BUN) would be associated with adverse outcomes independent of serum creatinine (sCr)-based estimates of kidney function in patients with acute coronary syndromes (ACS). Although lower glomerular filtration rates (GFR) have prognostic significance among patients with ACS, estimates of GFR based on sCr may perform less accurately among patients with milder kidney dysfunction. In this population in particular, BUN, which can reflect increased proximal tubular reabsorption in addition to decreased GFR, may have independent prognostic value. Data were drawn from 9,420 patients with unstable coronary syndromes from Orbofiban in Patients With Unstable Coronary Syndromes-Thrombolysis In Myocardial Infarction (OPUS-TIMI)-16, a trial that excluded patients with sCr >1.6 mg/dl or estimated creatinine clearance <40 ml/min. Patients with elevated BUN were older, had a higher prevalence of comorbidities, and had higher heart rates, lower systolic blood pressures, and an abnormal Killip class more often on admission. In univariate analyses, as well as in stratified and multivariable analyses including sCr-based estimates of GFR as a covariate, a stepwise increase in mortality occurred with increasing BUN (multivariable hazard ratio with BUN 20 to 25 mg/dl 1.9, 95% confidence interval 1.3 to 2.6; with BUN ≥25 mg/dl 3.2 [95% confidence interval 2.2 to 4.7]) compared with BUN ≤20 mg/dl. A higher BUN was also associated with increased mortality among strata of troponin-I, B-type natriuretic peptide, and C-reactive protein concentrations. Among patients with unstable coronary syndromes and predominantly normal or mildly reduced GFR, an elevated BUN is associated with increased mortality, independent of sCr-based estimates of GFR and other biomarkers.

Abstract

Although improved control of hypertension is known to attenuate progression of chronic kidney disease (CKD), little is known about the adequacy of hypertension treatment in adults with CKD in the United States. Using data from the Fourth National Health and Nutrition Examination Survey, we assessed adherence to national hypertension guideline targets for patients with CKD (blood pressure <130/80 mm Hg), we assessed control of systolic (<130 mm Hg) and diastolic (<80 mm Hg) blood pressure, and we evaluated determinants of adequate blood pressure control. Presence of CKD was defined as glomerular filtration rate <60 mL/min per 1.73 m2 or presence of albuminuria (albumin:creatinine ratio >30 µg/mg). Multivariable logistic regression with appropriate weights was used to determine predictors of inadequate hypertension control and related outcomes. Among 3213 participants with CKD, 37% had blood pressure <130/80 mm Hg (95% confidence interval [CI], 34.5% to 41.8%). Of those with inadequate blood pressure control, 59% (95% CI, 54% to 64%) had systolic pressure >130 mm Hg with diastolic pressure ≤80 mm Hg, whereas only 7% (95% CI, 3.9% to 9.8%) had diastolic pressure >80 mm Hg with systolic pressure ≤130 mm Hg. Non-Hispanic black race (odds ratio [OR], 2.4; 95% CI, 1.5 to 3.9), age >75 years (OR, 4.7; 95% CI, 2.7 to 8.2), and albuminuria (OR, 2.4; 95% CI, 1.4 to 4.1) were independently associated with inadequate blood pressure control. We conclude that control of hypertension is poor in participants with CKD and that lack of control is primarily attributable to systolic hypertension. Future guidelines and antihypertensive therapies for patients with CKD should target isolated systolic hypertension.

Abstract

In patients with chronic kidney disease, hyperphosphatemia is associated with osteodystrophy, vascular and soft tissue calcification, and mortality. Calcium-based phosphate binders are commonly prescribed to reduce intestinal phosphate absorption and to attenuate secondary hyperparathyroidism. Clinicians and investigators have presumed that, in hemodialysis patients, calcium exerts beneficial effects on bone. We performed a post hoc analysis of a 52-week randomized trial conducted in adult hemodialysis patients that compared the effects of calcium-based phosphate binders and sevelamer, a nonabsorbable polymer, on parameters of mineral metabolism and vascular calcification by electron beam tomography. In this analysis, we evaluated the relative effects of calcium and sevelamer on thoracic vertebral attenuation by CT and markers of bone turnover. The average serum phosphorus and calcium × phosphorus products were similar for both groups, although the average serum calcium concentration was significantly higher in the calcium-treated group. 
Compared with sevelamer-treated subjects, calcium-treated subjects showed a decrease in thoracic vertebral trabecular bone attenuation (p = 0.01) and a trend toward decreased cortical bone attenuation. More than 30% of calcium-treated subjects experienced a 10% or more decrease in trabecular and cortical bone attenuation. On study, sevelamer-treated subjects had higher concentrations of total and bone-specific alkaline phosphatase, osteocalcin, and PTH (p < 0.001). When used to correct hyperphosphatemia, calcium salts lead to a reduction in thoracic trabecular and cortical bone attenuation. Calcium salts may paradoxically decrease BMD in hemodialysis patients.

Abstract

Studies suggest that more frequent hemodialysis (HD; short daily and long nocturnal dialysis) may be associated with a variety of clinical benefits, including improvement in blood pressure, anemia, and hyperphosphatemia, regression of left ventricular hypertrophy, and reduced rates of hospitalization. Whether these clinical benefits are paralleled by improvements in health-related quality of life (HRQOL) has been unclear. In addition, the psychosocial burden of more intensive HD schedules has not been critically evaluated. Recent reports have suggested beneficial effects of frequent HD on global HRQOL, dialysis-related and uremic symptoms, patient satisfaction, and psychosocial burden. However, the interpretation of many of these studies is restricted by limitations in study design, follow-up, and generalizability. This article reviews the current literature focusing on psychosocial and HRQOL effects of frequent HD and suggests future directions for research in this important area.

Abstract

Although depression and dialysis withdrawal are relatively common among individuals with ESRD, there have been few systematic studies of suicide in this population. The goals of this study were to compare the incidence of suicide with national rates and to contrast the factors associated with suicide with those associated with withdrawal in persons with ESRD. All individuals who were aged 15 yr and older and initiated dialysis between April 1, 1995, and November 30, 2000, composed the analytic cohort. Patients were censored at the time of death, transplantation, or October 31, 2001. Death as a result of suicide in the ESRD population and the general US population was ascertained from the Death Notification Form and the Centers for Disease Control and Prevention, respectively. Standardized incidence ratios for suicide among patient subgroups were computed using national data from the year 2000 as the reference population. The crude suicide rate from 1995 to 2001 was 24.2 suicides per 100,000 patient-years, and the overall standardized incidence ratio for suicide was 1.84 (95% confidence interval, 1.50 to 2.27). In multivariable models, age ≥75 yr, male gender, white or Asian race, geographic region, alcohol or drug dependence, and recent hospitalization with mental illness were significant independent predictors of death as a result of suicide. Persons with ESRD are significantly more likely to commit suicide than persons in the general population. Although relatively rare, risk assessment can be used to identify patients for whom counseling and other interventions might be beneficial.
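The standardized incidence ratio (SIR) reported above is observed events divided by the events expected if reference-population rates applied to the cohort's person-years, summed over strata. A minimal sketch of the calculation; the stratum rates and person-years below are invented for illustration and are not study data:

```python
# Standardized incidence ratio: observed events / expected events, where
# expected counts come from applying stratum-specific reference-population
# rates to the cohort's person-years in each stratum.

def standardized_incidence_ratio(strata):
    """strata: iterable of (observed_events, person_years, reference_rate),
    with reference_rate expressed as events per person-year."""
    observed = sum(obs for obs, _, _ in strata)
    expected = sum(py * rate for _, py, rate in strata)
    return observed / expected

# Hypothetical age strata (observed suicides, person-years, reference rate)
strata = [
    (4, 20_000, 0.00010),
    (9, 25_000, 0.00015),
    (7, 15_000, 0.00020),
]
sir = standardized_incidence_ratio(strata)
print(round(sir, 2))  # observed 20 / expected 8.75 ≈ 2.29
```

An SIR above 1 indicates more events than expected under reference rates; confidence intervals are typically derived by treating the observed count as Poisson-distributed.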

Abstract

Several studies have shown an association between the hemodialysis session length (the t of Kt or Kt/V) and favorable outcomes for patients on maintenance hemodialysis. In a single randomized controlled trial that systematically varied hemodialysis session length, shorter session length was associated with an increased risk for morbidity and mortality, independent of the time-averaged concentration of urea. Observational studies of dialysis session length have yielded conflicting results, although virtually all studies have confounded hemodialysis session length with hemodialysis efficiency or dose. Limited observational data from nocturnal hemodialysis programs more strongly suggest an independent beneficial effect of longer session length. In aggregate, data on the effects of hemodialysis session length are inconclusive. Future studies should evaluate hemodialysis session length independent of efficiency, and should consider the evaluation of dose by using other clearance parameters and the adequacy of ultrafiltration in addition to solute kinetics.

Abstract

Previous studies have suggested a higher prevalence of thyroid abnormalities in persons with end-stage renal disease. However, little is known regarding the epidemiology of thyroid disorders in persons with less severe kidney dysfunction. We used data from the Third National Health and Nutrition Examination Survey to examine the prevalence of hypothyroidism (clinical and subclinical) at different levels of estimated glomerular filtration rate (GFR). We used multivariable logistic regression to evaluate the association between GFR and prevalent hypothyroidism. Among 14,623 adult participants with serum creatinine and thyroid function test results, the mean age was 48.7 years, and 52.6% were women. The prevalence of hypothyroidism increased with lower levels of GFR (in units of mL/min/1.73 m2), occurring in 5.4% of subjects with GFR ≥90, 10.9% with GFR 60-89, 20.4% with GFR 45-59, 23.0% with GFR 30-44, and 23.1% with GFR <30 (P < 0.001 for trend). Overall, 56% of hypothyroidism cases were considered subclinical. Compared with GFR ≥90 mL/min/1.73 m2, reduced GFR was associated with an increased risk of hypothyroidism, after adjusting for age, gender, and race/ethnicity: adjusted odds ratio 1.07 (95% confidence interval: 0.86-1.32) for GFR 60-89, 1.57 (1.11-2.22) for GFR 45-59, 1.81 (1.04-3.16) for GFR 30-44, and 1.97 (0.69-5.61) for GFR <30 mL/min/1.73 m2 (P = 0.008 for trend). Among a nationally representative sample of adults, reduced glomerular filtration rate was associated with a higher prevalence of hypothyroidism, with many subclinical cases. Future studies are needed to determine the potential adverse effects of subclinical and clinical hypothyroidism in persons with chronic kidney disease.

Abstract

Although sleep complaints are commonly reported in persons with end stage renal disease (ESRD), little is known about the prevalence of sleep complaints in chronic kidney disease (CKD) and the relation of sleep quality to the severity of kidney disease. We administered the Kidney Disease Quality of Life (KDQOL) sleep scale to 156 subjects, 78 with ESRD and 78 with CKD. Glomerular filtration rate (GFR) was estimated using the six-variable Modification of Diet in Renal Disease (MDRD) equation and used to stratify subjects with CKD as mild-moderate (GFR >25 ml/min/1.73 m2) and advanced (GFR <25 ml/min/1.73 m2). We used multivariable linear regression to determine independent predictors of KDQOL sleep scale scores; higher scores indicate higher self-reported quality of sleep. Median scores on the KDQOL sleep scale were 59 (interquartile range 40-80) in subjects with ESRD and 69 (interquartile range 53-80) in subjects with CKD (P = 0.04). Thirty-four percent of subjects with ESRD, 27% of subjects with advanced CKD, and 14% of subjects with mild to moderate CKD had sleep maintenance disturbances (P = 0.05). Thirteen percent of subjects with ESRD, 11% of subjects with advanced CKD, and no subjects with mild-moderate CKD had complaints of daytime somnolence (P = 0.03). There was no significant difference in the prevalence of sleep adequacy complaints in persons with ESRD versus CKD. In multivariable analyses, only age and ESRD status (vs. CKD) were significant predictors of lower KDQOL sleep scores. Among subjects with CKD, there was a significant direct association between estimated GFR and scores on the KDQOL sleep scale in non-African American subjects (P = 0.01). Sleep complaints are common in persons with CKD and ESRD and may be associated with the severity of kidney disease.
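For context, the MDRD study equation estimates GFR from serum creatinine and demographics. The sketch below implements the widely used abbreviated four-variable version; the study above used the six-variable form, which additionally incorporates serum urea nitrogen and albumin, so treat this as an illustration rather than the exact equation used:

```python
# Abbreviated (four-variable) MDRD study equation, shown for illustration.
# eGFR (mL/min/1.73 m2) = 186 × SCr^-1.154 × age^-0.203
#                         × 0.742 (if female) × 1.212 (if Black)

def egfr_mdrd4(scr_mg_dl, age_years, female=False, black=False):
    """Estimated GFR in mL/min/1.73 m2 from serum creatinine in mg/dL."""
    egfr = 186.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# e.g., a 60-year-old non-Black woman with SCr 1.4 mg/dL
print(round(egfr_mdrd4(1.4, 60, female=True), 1))
```

Note that later recalibrations of the equation use a 175 coefficient for IDMS-traceable creatinine assays; the stratification threshold in the study above (25 ml/min/1.73 m2) would be applied to the resulting estimate.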

Abstract

Inferior outcomes after kidney transplantation among African Americans are poorly understood. It was hypothesized that unequal access to medical care among transplant recipients might contribute to worse posttransplantation outcomes among African Americans and that racial disparities in kidney transplant outcomes would be less pronounced among patients who receive health care within versus outside the Department of Veterans Affairs (VA), because eligible veterans who receive care within the VA are entitled to receive universal access to care, including coverage of prescription drugs. A study cohort of 79,361 patients who were undergoing their first kidney transplant in the United States between October 1, 1991, and October 31, 2000, was assembled, with follow-up data on graft survival obtained through October 31, 2001. After multivariable proportional hazards adjustment for a wide range of recipient and donor characteristics, African-American patients were at increased risk for graft failure compared with non-African-American patients (relative risk [RR] 1.31; 95% confidence interval [CI] 1.26 to 1.36). African-American race was associated with a similarly increased risk for graft failure among patients who were VA users (RR 1.31; 95% CI 1.11 to 1.54) and non-VA users (RR 1.31; 95% CI 1.26 to 1.36). In conclusion, racial disparities in kidney transplant outcomes seem to persist even in a universal access-to-care system such as the VA. Reasons for worse outcomes among African Americans require further investigation.

Abstract

Almost 20 million people in the US have chronic kidney disease (CKD). Cardiovascular disease and arterial wall abnormalities are common in this population. Because angiotensin II may have adverse effects on the arterial wall, we hypothesized that an angiotensin receptor blocker (ARB) would improve arterial compliance as compared with placebo in subjects with CKD. We performed a double-blinded, placebo-controlled pilot study in which 25 subjects with stage 2 or 3 CKD and proteinuria <1 g were randomized to either the ARB eprosartan or placebo and titrated to achieve a goal blood pressure (BP) <130/85 mm Hg. Arterial compliance was measured at baseline and at 8 weeks. Baseline characteristics were similar between the groups and included mean estimated glomerular filtration rate 63 +/- 14 ml/min/1.73 m2, heart rate 76 +/- 10 beats/min, BP 142 +/- 12/81 +/- 8 mm Hg, 64% diabetic, 44% male, and 40% white, though subjects in the eprosartan group were younger (60 +/- 12 vs. 70 +/- 6 years, p = 0.01). There were no significant differences between the groups in large or small artery compliance measurements either at baseline or at 8 weeks, but there was a statistically significant improvement from baseline in small artery compliance in the eprosartan group (from a median of 2.5 ml/mm Hg × 100 [90% CI 1.1, 4.7] to 4.0 ml/mm Hg × 100 [90% CI 1.9, 6.7]; p = 0.01) that was not seen in the placebo group. Use of an ARB to achieve recommended BP is associated with improved small artery compliance in people with CKD, though larger studies are needed to confirm these findings.

Abstract

Gout affects a large fraction of persons with advanced chronic kidney disease, and hyperuricemia may increase the risk of cardiovascular disease. Several hypouricemic agents are contraindicated in patients with end-stage renal disease. Sevelamer is a nonabsorbed hydrogel that binds phosphorus and bile acids in the intestinal tract. Results of short-term and open-label studies suggest that sevelamer might lower the concentration of uric acid, another organic anion. We undertook this study to test our hypothesis that the reduction in serum uric acid concentration induced by sevelamer would be confirmed in a long-term, randomized, clinical trial comparing sevelamer with calcium-based phosphate binders. Two hundred subjects undergoing maintenance hemodialysis were randomly assigned to receive either sevelamer or calcium-based phosphorus binders in an international, multicenter, clinical trial. Data on baseline and end-of-study uric acid concentrations were available in 169 subjects (85%); the change in uric acid concentration from baseline to the end of the study was the outcome of interest. Baseline clinical characteristics, including mean uric acid concentrations, were similar in subjects randomly assigned to receive sevelamer and calcium-based phosphate binders. The mean change in uric acid concentration (from baseline to the end of the study) was significantly larger in sevelamer-treated subjects (-0.64 mg/dl versus -0.26 mg/dl; P = 0.03). The adjusted mean change in uric acid concentration was more pronounced when the effects of age, sex, diabetes, vintage (time since initiation of dialysis), dialysis dose, and changes in blood urea nitrogen and bicarbonate concentrations were considered (-0.72 mg/dl versus -0.15 mg/dl; P = 0.001). 
Twenty-three percent of sevelamer-treated subjects experienced a study-related reduction in the concentration of uric acid equal to -1.5 mg/dl or more, compared with 10% of calcium-treated subjects (P = 0.02).In a randomized clinical trial comparing sevelamer and calcium-based phosphate binders, treatment with sevelamer was associated with a significant reduction in serum uric acid concentrations.

Abstract

Although end-stage renal disease has been associated with cognitive impairment, the relation between lesser degrees of chronic kidney disease (CKD) and cognitive impairment is less well understood. Data for 1,015 women enrolled at 10 of the 20 Heart Estrogen/Progestin Replacement Study clinical sites were analyzed. All participants were younger than 80 years and had established coronary artery disease at study entry. Participants underwent 6 standard tests of cognitive function evaluating various domains. Unadjusted, residual age- and race-adjusted, and multivariable-adjusted linear and logistic regression models were used. Glomerular filtration rate (GFR) was estimated using the Modification of Diet in Renal Disease regression equation. In addition to analyses across the spectrum of GFRs, CKD was categorized as mild (estimated GFR [eGFR], 45 to 60 mL/min/1.73 m2), moderate (eGFR, 30 to 44 mL/min/1.73 m2), and severe (eGFR, <30 mL/min/1.73 m2) according to a modification of recently established classification guidelines. Mean eGFR was 57 +/- 14 mL/min/1.73 m2. In multivariable analyses, eGFR was associated significantly with impairment in global cognition, executive function, language, and memory (approximately 15% to 25% increase in risk for dysfunction/10-mL/min/1.73 m2 decrement in eGFR). Associations among eGFR and cognitive function were independent of residual effects of age and race (2 key determinants of GFR) and the contributions of education, lifestyle factors, stroke, diabetes, and other laboratory variables. CKD is associated with cognitive impairment in menopausal women with coronary artery disease.

Abstract

Persons with end-stage renal disease and those with lesser degrees of chronic kidney disease (CKD) have an increased risk of death after myocardial infarction (MI) that is not fully explained by associated comorbidities. Future cardiovascular event rates and the relative response to therapy in persons with mild to moderate CKD are not well characterized. We calculated the estimated glomerular filtration rate (eGFR) using the 4-variable Modification of Diet in Renal Disease method in 2183 Survival And Ventricular Enlargement (SAVE) trial subjects. SAVE randomized post-MI subjects (3 to 16 days after MI) with left ventricular ejection fraction < or =40% and serum creatinine <2.5 mg/dL to captopril or placebo. Cox proportional hazards models were used to evaluate the relative hazard rates for death and cardiovascular events associated with reduced eGFR. Subjects with reduced eGFR were older and had more extensive comorbidities. The multivariable adjusted risk ratio for total mortality associated with reduced eGFR from 60 to 74, 45 to 59, and <45 mL x min(-1) x 1.73 m(-2) (compared with eGFR > or =75 mL x min(-1) x 1.73 m(-2)) was 1.11 (0.86 to 1.42), 1.24 (0.96 to 1.60) and 1.81 (1.32 to 2.48), respectively (P for trend =0.001). Similar adjusted trends were present for CV mortality (P=0.001), recurrent MI (P=0.017), and the combined CV mortality and morbidity outcome (P=0.002). The absolute benefit of captopril tended to be greater in subjects with CKD: 12.4 versus 5.5 CV events prevented per 100 subjects with (n=719) and without (n=1464) CKD, respectively. CKD was associated with a heightened risk for all major CV events after MI, particularly among subjects with an estimated glomerular filtration rate <45 mL x min(-1) x 1.73 m(-2). Randomization to captopril resulted in a reduction of CV events irrespective of baseline kidney function.
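Several of these studies estimate GFR with the 4-variable Modification of Diet in Renal Disease (MDRD) equation. The sketch below is illustrative, not the studies' actual analysis code; it uses the original coefficient of 186, which applies to non-IDMS-calibrated creatinine assays (IDMS-traceable assays use 175 instead).

```python
def egfr_mdrd4(scr_mg_dl, age_years, female, black):
    """4-variable MDRD estimate of GFR in mL/min/1.73 m^2.

    scr_mg_dl: serum creatinine (mg/dL); coefficient 186 assumes a
    non-IDMS-calibrated assay.
    """
    egfr = 186.0 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742   # published sex adjustment
    if black:
        egfr *= 1.212   # published race adjustment
    return egfr
```

For example, a 60-year-old non-Black man with a serum creatinine of 1.0 mg/dL has an estimated GFR of roughly 81 mL/min/1.73 m^2, which would fall in the "> or =75" referent stratum above.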

Abstract

Formal cognitive function testing is cumbersome, and no self-administered instruments for estimating cognitive function in persons with chronic kidney disease (CKD) and end-stage renal disease (ESRD) have been validated. The goal of this study was to determine the validity of the Kidney Disease Quality of Life Cognitive Function scale (KDQOL-CF) for the assessment of cognitive impairment in persons with kidney disease. We administered the KDQOL-CF to 157 subjects, 79 with ESRD and 78 with CKD participating in a cross-sectional study of cognitive function. Scores on the Modified Mini-Mental State Exam (3MS) were considered the gold standard measure of global cognitive function. Performance characteristics of the KDQOL-CF were assessed using correlation coefficients, Bland-Altman plots, and receiver operating characteristic curves. Median scores on the KDQOL-CF were 73 (interquartile range 60-87) for subjects with ESRD and 87 (interquartile range 73-100) for subjects with CKD (P < 0.0001). Scores on the KDQOL-CF were directly correlated with scores on the 3MS (r = 0.31, P = 0.0001). Defining global cognitive impairment as a 3MS score < 80, a cut-point of 60 on the KDQOL-CF accurately classified 76% of subjects, with 52% sensitivity and 81% specificity. On multivariable analysis, cerebral and peripheral vascular disease, benzodiazepine use, and higher serum phosphorus concentrations were associated with lower KDQOL-CF scores, while beta blocker use, education, and higher serum albumin concentrations were associated with higher KDQOL-CF scores. The KDQOL-CF is a valid instrument for estimating cognitive function in patients with CKD and ESRD. KDQOL-CF screening followed by 3MS testing in selected individuals may prove to be an effective and efficient strategy for identifying cognitive impairment in patients with kidney disease.

Abstract

To assess the prevalence of cognitive impairment in persons with chronic kidney disease (CKD) and its relation to the severity of CKD. Cross-sectional study. University-affiliated ambulatory nephrology and dialysis practices. Eighty subjects with CKD Stages III and IV not requiring dialysis (CKD) and 80 subjects with CKD Stage V on hemodialysis (end-stage renal disease (ESRD)) with a mean age +/- standard deviation of 62.5 +/- 14.3. Three standardized cognitive tests, the Modified Mini-Mental State Examination (3MS), Trailmaking Test B (Trails B), and California Verbal Learning Trial (CVLT). Glomerular filtration rate was estimated in subjects with CKD using the six-variable Modification of Diet in Renal Disease equation. There was a graded relation between cognitive function and severity of CKD. Mean scores on the 3MS, Trails B, and CVLT immediate and delayed recall were significantly worse for subjects with ESRD than for subjects with CKD or published norms (P

Abstract

Radiocontrast nephropathy is a common cause of acute renal failure in hospitalized patients. Several studies have examined the capacity of theophylline or aminophylline to prevent radiocontrast nephropathy, with conflicting results. We conducted a meta-analysis of published randomized controlled trials to determine if the pre-procedural administration of theophylline or aminophylline prevents radiocontrast-induced declines in kidney function. We searched MEDLINE, EMBASE, the Cochrane Collaboration Database, bibliographies of retrieved articles, and consulted with experts to identify relevant studies. Randomized controlled trials of theophylline or aminophylline in hospitalized patients receiving radiocontrast were included. Studies were excluded if they did not report changes in serum creatinine or creatinine clearance within 48 h after radiocontrast exposure. Seven randomized controlled trials satisfied all inclusion criteria and were included in the analysis (pooled sample size n = 480). The difference in mean change in serum creatinine was 11.5 micromol/l (95% confidence intervals 5.3-19.4 micromol/l, P = 0.004) lower in the theophylline- or aminophylline-treated groups than controls. One participant (0.6%) required dialysis. Prophylactic administration of theophylline or aminophylline appears to protect against radiocontrast-induced declines in kidney function. Whether these agents reduce the proportion of patients who experience large decrements in serum creatinine concentration, or require dialysis, is unknown.
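Pooling a mean difference across trials, as in the meta-analysis above, is commonly done with fixed-effect inverse-variance weighting. This sketch is a generic illustration under that assumption (the published analysis may have used a different model); the example trial values are hypothetical.

```python
def pooled_mean_difference(diffs, ses, z=1.96):
    """Fixed-effect inverse-variance pooling of per-trial mean differences.

    diffs: per-trial mean differences; ses: their standard errors.
    Returns the pooled estimate and its z-based confidence interval.
    """
    weights = [1.0 / se ** 2 for se in ses]          # inverse-variance weights
    pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    return pooled, (pooled - z * se_pooled, pooled + z * se_pooled)

# Hypothetical trial-level mean changes in creatinine (micromol/l) and SEs:
estimate, ci = pooled_mean_difference([10.0, 12.0, 14.0], [2.0, 3.0, 4.0])
```

The pooled estimate is pulled toward the most precise (smallest-SE) trials, and its standard error is smaller than that of any single trial.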

Abstract

Acute renal failure (ARF) in the critically ill is associated with extremely high mortality rates. Understanding the changing spectrum of ARF will be necessary to facilitate quality improvement efforts and to design successful interventional trials. We conducted an observational cohort study of 618 patients with ARF in intensive care units at five academic medical centers in the United States. Participants were required to sign (or have a proxy sign) informed consent for data collection. A comprehensive data collection instrument captured more than 800 variables, most on a daily basis, throughout the course of ARF. Patient characteristics, dialysis status, and major outcomes were determined and stratified by clinical site. The mean age was 59.5 years, 41% were women, and 20% were of minority race or ethnicity. There was extensive comorbidity; 30% had chronic kidney disease, 37% had coronary artery disease, 29% had diabetes mellitus, and 21% had chronic liver disease. Acute renal failure was accompanied by extrarenal organ system failure in most patients, even those who did not require dialysis. Three hundred and ninety-eight (64%) patients required dialysis. The in-hospital mortality rate was 37%, and the rate of mortality or nonrecovery of renal function was 50%. The median hospital length of stay was 25 days (26 days, excluding patients who died). There is a changing spectrum of ARF in the critically ill, characterized by a large burden of comorbid disease and extensive extrarenal complications, obligating the need for dialysis in the majority of patients. There is wide variation across institutions in patient characteristics and practice patterns. These differences highlight the need for additional multicenter observational and interventional studies in ARF.

Abstract

End-stage renal disease substantially increases the risks of death, cardiovascular disease, and use of specialized health care, but the effects of less severe kidney dysfunction on these outcomes are less well defined. We estimated the longitudinal glomerular filtration rate (GFR) among 1,120,295 adults within a large, integrated system of health care delivery in whom serum creatinine had been measured between 1996 and 2000 and who had not undergone dialysis or kidney transplantation. We examined the multivariable association between the estimated GFR and the risks of death, cardiovascular events, and hospitalization. The median follow-up was 2.84 years, the mean age was 52 years, and 55 percent of the group were women. After adjustment, the risk of death increased as the GFR decreased below 60 ml per minute per 1.73 m2 of body-surface area: the adjusted hazard ratio for death was 1.2 with an estimated GFR of 45 to 59 ml per minute per 1.73 m2 (95 percent confidence interval, 1.1 to 1.2), 1.8 with an estimated GFR of 30 to 44 ml per minute per 1.73 m2 (95 percent confidence interval, 1.7 to 1.9), 3.2 with an estimated GFR of 15 to 29 ml per minute per 1.73 m2 (95 percent confidence interval, 3.1 to 3.4), and 5.9 with an estimated GFR of less than 15 ml per minute per 1.73 m2 (95 percent confidence interval, 5.4 to 6.5). The adjusted hazard ratio for cardiovascular events also increased inversely with the estimated GFR: 1.4 (95 percent confidence interval, 1.4 to 1.5), 2.0 (95 percent confidence interval, 1.9 to 2.1), 2.8 (95 percent confidence interval, 2.6 to 2.9), and 3.4 (95 percent confidence interval, 3.1 to 3.8), respectively. The adjusted risk of hospitalization with a reduced estimated GFR followed a similar pattern. An independent, graded association was observed between a reduced estimated GFR and the risk of death, cardiovascular events, and hospitalization in a large, community-based population.
These findings highlight the clinical and public health importance of chronic renal insufficiency.

Abstract

Relatively few U.S.-based studies in chronic kidney disease have focused on Asian/Pacific Islanders. Clinical reports suggest that Asian/Pacific Islanders are more likely to be affected by IgA nephropathy (IgAN), and that the severity of disease is increased in these populations. To explore whether these observations are borne out in a multi-ethnic, tertiary care renal pathology practice, we examined clinical and pathologic data on 298 patients with primary glomerular lesions (IgAN, focal segmental glomerulosclerosis, membranous nephropathy and minimal change disease) at the University of California San Francisco Medical Center from November 1994 through May 2001. Pathologic assessment of native kidney biopsies with IgAN was conducted using Haas' classification system. Among individuals with IgAN (N = 149), 89 (60%) were male, 57 (38%) white, 53 (36%) Asian/Pacific Islander, 29 (19%) Hispanic, 4 (3%) African American and 6 (4%) were of other or unknown ethnicity. The mean age was 37 +/- 14 years and median serum creatinine 1.7 mg/dL. Sixty-six patients (44%) exhibited nephrotic range proteinuria at the time of kidney biopsy. The distributions of age, gender, mean serum creatinine, and presence or absence of nephrotic proteinuria and/or hypertension at the time of kidney biopsy were not significantly different among white, Hispanic, and Asian/Pacific Islander groups. Of the 124 native kidney biopsies with IgAN, 10 (8%) cases were classified into Haas subclass I, 12 (10%) subclass II, 23 (18%) subclass III, 30 (25%) subclass IV, and 49 (40%) subclass V. The distribution of Haas subclass did not differ significantly by race/ethnicity.
In comparison, among the random sample of patients with non-IgAN glomerular lesions (N = 149), 77 (52%) patients were male, 51 (34%) white, 42 (28%) Asian/Pacific Islander, 25 (17%) Hispanic, and 30 (20%) were African American. With the caveats of referral and biopsy biases, the race/ethnicity distribution of IgAN differs significantly from that of other major glomerulonephritides. However, among individuals undergoing native kidney biopsy, we see no evidence of a race/ethnicity association with severity of disease in IgAN by clinical and IgAN-specific histopathologic criteria. Further studies are needed to identify populations at higher risk for progressive disease in IgAN.

Abstract

To determine trends in the significance of HLA matching and other risk factors in kidney transplantation, we analyzed data on graft survival in a consecutive sample of 33 443 transplant recipients who received deceased donor kidneys from December 1994 to December 1998 with a mean follow-up time of 2.2 years. HLA matching and other risk factors (peak panel reactive antibody, donor age, sex and cause of death, cold ischemia time, donor and recipient body size) were examined. Mean likelihood ratios of models, fit with and without each variable of interest, were calculated by generating bootstrapped samples from each single year cohort. Pooled censored and uncensored graft survival rates were 90.6% and 89.9% at 1 year, 85.8% and 84.5% at 2 years, and 80.7% and 78.6% at 3 years. HLA matching declined in significance while other factors retained similar levels of statistical significance over the four yearly cohorts. With evolving clinical practice, including the provision of safer and more potent immunosuppressive therapy, the significance of HLA matching has diminished. Non-immunologic factors continue to impede more marked improvements in long-term graft survival. Recognizing these trends, organ allocation algorithms may need to be revised.

Abstract

Evaluation of dialysis adequacy has focused on parameters of solute (principally urea) clearance. Relatively little attention has been paid to the adequacy of ultrafiltration. At a given phase angle, the bioimpedance vector length reflects the degree of tissue hydration, as the vector lengthens with ultrafiltration. We determined the relative risk of death associated with different bioimpedance vector lengths in a 3009 patient hemodialysis cohort using proportional hazards regression. The mean phase angle was 4.8 degrees, and the mean vector length 300 +/- 70 ohm/m (range 140 to 630 ohm/m). Vector length was much longer in women than men (mean 340 vs. 270 ohm/m) and significantly longer in African Americans and patients without diabetes. Adjusted for the effects of age, gender, race, diabetes, vintage, weight, albumin, prealbumin, creatinine, hemoglobin, ferritin, and dialysis dose, the relative risk (RR) of death was 0.75 (95% CI 0.57 to 0.88) per 100 ohm/m decrease in vector length. The effect of vector length on RR was somewhat more pronounced among men (vector length x gender interaction, P= 0.07). Considering vector length of 300 to 350 ohm/m as the referent category, the RRs of death were 1.54 (95% CI 1.08 to 2.21) and 2.83 (95% CI 1.55 to 5.14) for patients with vector length 200 to 250 and <200 ohm/m, respectively. Shorter predialysis bioimpedance vectors, indicating greater soft tissue hydration, were associated with diminished survival in hemodialysis patients. These findings validate clinical observations linking longevity to maintenance of dry body weight.

Abstract

Higher risk patients (including the elderly) receive more conservative therapy for cardiovascular diseases, even though the relative benefits of therapy tend to be greater. The perceived risk of radiocontrast-associated nephrotoxicity may influence the provision of coronary angiography and subsequent revascularization, especially among individuals with chronic kidney disease (CKD). The aim of this study was to determine whether there is excessive variation in the provision of coronary angiography after acute myocardial infarction on the basis of the presence of CKD and whether there is an association between angiography and mortality. Elderly (age 65 to 89 yr) individuals with acute myocardial infarction from the Cooperative Cardiovascular Project were classified by the presence or absence of CKD (defined as a baseline serum creatinine of 1.5 to 5.0 mg/dl). In CKD patients, the propensity to undergo coronary angiography was determined and the effect of coronary angiography on mortality was estimated using multivariable logistic regression and stratification. Mortality was significantly higher with CKD (52.6 versus 26.4%). Fewer patients with CKD underwent coronary angiography (25.2 versus 46.8%) despite the observation that a similar proportion of patients were deemed appropriate for angiography by standard, published criteria. When limiting the analysis to CKD patients who are considered appropriate, the multivariable estimate of the odds of death associated with coronary angiography was 0.58 (95% confidence interval, 0.50 to 0.67). With adjustment using propensity scores, the odds ratio averaged across propensity score quintiles was 0.62 (95% confidence interval, 0.54 to 0.70). Results were qualitatively similar when patients were stratified by CKD stage IV (estimated GFR <30 ml/min per 1.73 m(2)). There is a large relative decrease in utilization of coronary angiography among patients with CKD. 
Alteration in practice because of an aversion to the risk of radiocontrast-associated nephrotoxicity ("renalism") is inappropriate, even if the true relative benefit of invasive strategies is a fraction of what is estimated here.
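The propensity-score adjustment described above ("the odds ratio averaged across propensity score quintiles") can be sketched generically as follows. This is an illustration, not the study's code: it assumes per-quintile 2x2 counts are already in hand and averages log odds ratios with equal weight per stratum, which may differ from the weighting actually used.

```python
import math

def stratified_odds_ratio(strata):
    """Average odds ratio across strata (e.g. propensity-score quintiles).

    strata: list of (a, b, c, d) counts per stratum --
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event.
    Averages on the log scale with equal weight per stratum (an assumption).
    """
    log_ors = [math.log((a * d) / (b * c)) for a, b, c, d in strata]
    return math.exp(sum(log_ors) / len(log_ors))
```

Averaging on the log scale keeps the estimate symmetric: a stratum with OR 2 and one with OR 0.5 average to 1.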

Abstract

Mortality rates in ESRD are unacceptably high. Disorders of mineral metabolism (hyperphosphatemia, hypercalcemia, and secondary hyperparathyroidism) are potentially modifiable. For determining associations among disorders of mineral metabolism, mortality, and morbidity in hemodialysis patients, data on 40,538 hemodialysis patients with at least one determination of serum phosphorus and calcium during the last 3 mo of 1997 were analyzed. Unadjusted, case mix-adjusted, and multivariable-adjusted relative risks of death were calculated for categories of serum phosphorus, calcium, calcium x phosphorus product, and intact parathyroid hormone (PTH) using proportional hazards regression. Also determined was whether disorders of mineral metabolism were associated with all-cause, cardiovascular, infection-related, fracture-related, and vascular access-related hospitalization. After adjustment for case mix and laboratory variables, serum phosphorus concentrations >5.0 mg/dl were associated with an increased relative risk of death (1.07, 1.25, 1.43, 1.67, and 2.02 for serum phosphorus 5.0 to 6.0, 6.0 to 7.0, 7.0 to 8.0, 8.0 to 9.0, and >/=9.0 mg/dl). Higher adjusted serum calcium concentrations were also associated with an increased risk of death, even when examined within narrow ranges of serum phosphorus. Moderate to severe hyperparathyroidism (PTH concentrations >/=600 pg/ml) was associated with an increase in the relative risk of death, whereas more modest increases in PTH were not. When examined collectively, the population attributable risk percentage for disorders of mineral metabolism was 17.5%, owing largely to the high prevalence of hyperphosphatemia. Hyperphosphatemia and hyperparathyroidism were significantly associated with all-cause, cardiovascular, and fracture-related hospitalization. Disorders of mineral metabolism are independently associated with mortality and morbidity associated with cardiovascular disease and fracture in hemodialysis patients.
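The population attributable risk percentage reported above (17.5% for disorders of mineral metabolism) is conventionally computed with Levin's formula from exposure prevalence and relative risk. A minimal sketch, with hypothetical inputs:

```python
def par_percent(prevalence, rr):
    """Levin's population attributable risk percent:
    PAR% = 100 * p(RR - 1) / (1 + p(RR - 1)),
    where p is exposure prevalence and RR the relative risk.
    """
    excess = prevalence * (rr - 1.0)
    return 100.0 * excess / (1.0 + excess)

# Hypothetical: an exposure present in half the population that doubles risk
# accounts for a third of events.
example = par_percent(0.5, 2.0)
```

The formula makes the dependence on prevalence explicit, which is why a modestly elevated but highly prevalent exposure such as hyperphosphatemia can dominate the attributable risk.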

Abstract

Although obesity confers an increased risk of mortality in the general population, observational reports on the dialysis population have suggested that obesity is associated with improved survival. These reports have generally not examined extremely high values of body mass index (BMI; in kg/m(2)), survival >1 y, or alternative measures of adiposity. We sought to clarify the relation between body size and outcomes among a large cohort of patients beginning dialysis. Data on 418 055 patients beginning dialysis between 1 April 1995 and 1 November 2000 were analyzed by using US Renal Data System data. BMI was divided into 8 categories in increments of 3 units, ranging from < 19 to > or =37, and the relation between survival and BMI was examined by using proportional hazards regression with adjustment for demographic, laboratory, and comorbidity data. High BMI was associated with increased survival in this cohort, even at extremely high BMI, after adjustment, and over a 2-y average follow-up time. This was true for whites, African Americans, and Hispanics but not for Asians. High BMI was also associated with a reduced risk of hospitalization and a lower rate of mortality in all mortality categories. Alternative estimates of adiposity, including the Benn index and estimated fat mass, yielded similar results, and adjustments for lean body mass did not substantially alter the findings. High BMI is not associated with increased mortality among patients beginning dialysis. This finding does not appear to be a function of lean body mass and, although modified by certain patient characteristics, it is a robust finding.
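The 8-category BMI scheme described above (below 19, then 3-unit increments up to 37 and above) can be expressed as a simple binning function. This is an illustrative reconstruction of the stated scheme, not the study's code; the category labels are assumptions.

```python
def bmi_category(bmi):
    """Assign BMI (kg/m^2) to 8 categories: <19, then 3-unit bins, then >=37."""
    if bmi < 19:
        return "<19"
    if bmi >= 37:
        return ">=37"
    lower = 19 + 3 * int((bmi - 19) // 3)   # 19-22, 22-25, ..., 34-37
    return f"{lower}-{lower + 3}"
```

Between the two open-ended categories this yields six closed 3-unit bins, for 8 categories in total.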

Abstract

Vascular calcification has been associated with all cause and cardiovascular mortality in patients with end-stage kidney disease (ESRD). Whether vascular calcification is present in persons with advanced chronic kidney disease starting dialysis or develops in patients on dialysis is unknown. The purpose of this study was to examine the prevalence of vascular and coronary calcification in patients new to hemodialysis. A total of 129 subjects new to dialysis were evaluated using electron beam computed tomography. The primary outcome was the presence and extent of coronary artery, aortic, and valvular calcification. Forty-three percent of subjects had no significant coronary artery calcification (total score = 30) and 27% had no detectable aortic calcification. Thirty-four percent had coronary artery scores that placed them above the 90th percentile for age and sex. Coronary artery calcification was significantly associated with a history of coronary artery disease and atherosclerotic vascular disease (ASVD), whereas aortic calcification was significantly associated with ASVD. Age (p < 0.0001), pulse pressure (p = 0.004), diabetes mellitus (p = 0.009), and a history of smoking (p = 0.026) were independently associated with the extent of coronary artery calcification. Age (p < 0.0001) and pulse pressure (p = 0.0003) were independently associated with the extent of aortic calcification. A large fraction of patients new to hemodialysis had no evidence of coronary artery or aortic calcification. Coupled with the extensive vascular calcification reported by others in prevalent dialysis patients, these findings suggest that dialysis-specific factors contribute to calcific vascular disease in ESRD.

Abstract

Despite the acute shortage of cadaveric organs for kidney transplantation, more than 10% of cadaveric kidneys are discarded each year because of marginal quality. Transplant recipients' access to these kidneys and to information about their quality is limited. A Monte Carlo model was developed to simulate the operations of an organ procurement organization over a 10-yr period. Donor and recipient characteristics were generated from the United States Renal Data System. Kidneys were assigned one of five possible grades, which were determined by calculating the relative risk of graft failure associated with donor characteristics and HLA matching for every donor-candidate pair. Modeled were recipient decisions to accept or reject a kidney on the basis of the relative change in quality-adjusted life years (QALY). Compared were the United Network of Organ Sharing (UNOS) policy, the UNOS expanded donor criteria policy, two benchmark policies (one equity driven and the other efficiency driven), and a hybrid policy that incorporated recipient choice into the UNOS algorithm. Sensitivity analyses for major input variables were performed. Compared with UNOS, an algorithm that incorporated recipient choice predicted a 6% increase in QALY, a 12% decrease in median waiting time, a 39% increase in the likelihood of transplantation, and a 56% reduction in the number of discarded kidneys. Benefits were observed across categories of age, gender, and race. Incorporating recipient choice in kidney transplantation would improve equity, efficiency, and QALY of the end-stage renal disease population.
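The recipient-choice rule modeled in the Monte Carlo simulation above (accept a graded kidney when its expected quality-adjusted life years exceed the expected value of waiting) can be sketched in toy form. Everything here is a simplified illustration: the grade labels, QALY values, and the single "value of waiting" parameter are hypothetical stand-ins for the model's donor-candidate-specific calculations.

```python
import random

def simulate_acceptances(n_offers, qaly_if_accept, qaly_if_wait, seed=0):
    """Toy Monte Carlo sketch of a recipient-choice acceptance rule.

    qaly_if_accept: dict mapping kidney grade -> expected QALYs if accepted
                    (hypothetical values).
    qaly_if_wait:   expected QALYs from declining and waiting (assumed known).
    Returns the number of simulated offers accepted.
    """
    rng = random.Random(seed)
    grades = list(qaly_if_accept)
    accepted = 0
    for _ in range(n_offers):
        grade = rng.choice(grades)              # random offer from the pool
        if qaly_if_accept[grade] > qaly_if_wait:
            accepted += 1                       # accept only if it beats waiting
    return accepted
```

The policy comparison in the study then amounts to running such simulations under each allocation algorithm and comparing aggregate QALYs, waiting times, and discard rates.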

Abstract

Intravenous iron is usually required to optimize the correction of anaemia in persons with advanced chronic kidney disease and end-stage renal disease. Randomized clinical trials may have insufficient power to detect differences in the safety profiles of specific formulations. We obtained data from the US Food and Drug Administration on reported adverse drug events (ADEs) related to the provision of three formulations of intravenous iron during 1998-2000. We estimated the relative risks [odds ratios (OR)] of ADEs associated with the use of higher molecular weight iron dextran and sodium ferric gluconate complex compared with lower molecular weight iron dextran using 2 x 2 tables. The total number of reported parenteral iron-related ADEs was 1981 among approximately 21,060,000 doses administered, yielding a rate of 9.4 x 10(-5), or approximately 94 per million. Total major ADEs were significantly increased among recipients of higher molecular weight iron dextran (OR 5.5, 95% CI 4.9-6.0) and sodium ferric gluconate complex (OR 6.2, 95% CI 5.4-7.2) compared with recipients of lower molecular weight iron dextran. We observed significantly higher rates of life-threatening ADEs, including death, anaphylactoid reaction, cardiac arrest and respiratory depression among users of higher molecular weight compared with lower molecular weight iron dextran. There was insufficient power to detect differences in life-threatening ADEs when comparing lower molecular weight iron dextran with sodium ferric gluconate complex. Parenteral iron-related ADEs are rare. Using observational data, overall and most specific ADE rates were significantly higher among recipients of higher molecular weight iron dextran and sodium ferric gluconate complex than among recipients of lower molecular weight iron dextran. These data may help to guide clinical practice, as head-to-head clinical trials comparing different formulations of intravenous iron have not been conducted.
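An odds ratio estimated from a 2x2 table, as in the analysis above, has a standard log-scale (Woolf) confidence interval. This sketch is generic; the counts in the example are hypothetical, not the FDA data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-based) confidence interval from a 2x2 table.

    a, b: events / non-events with formulation X;
    c, d: events / non-events with formulation Y (comparator).
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts: 10 ADEs in 100 exposures vs. 5 ADEs in 100 exposures.
or_, (lo, hi) = odds_ratio_ci(10, 90, 5, 95)
```

Because the interval is constructed on the log scale, it is asymmetric around the point estimate, which matters for rare events like these.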

Abstract

We determined recently that targeted treatment with calcium-based phosphate binders (calcium acetate and carbonate) led to progressive coronary artery and aortic calcification by electron beam tomography (EBT), while treatment with the non-calcium-containing phosphate binder, sevelamer, did not. Aside from the provision of calcium, we hypothesized that other factors might be related to the likelihood of progressive calcification in both or either treatment groups. We explored potential determinants of progressive vascular calcification in 150 randomized study subjects who underwent EBT at baseline and at least once during follow-up (week 26 or 52). Among calcium-treated subjects, higher time-averaged concentrations of calcium, phosphorus and the calcium-phosphorus product were associated with more pronounced increases in EBT scores; no such associations were demonstrated in sevelamer-treated subjects. The relation between parathyroid hormone (PTH) and the progression of calcification was more complex. Lower PTH was associated with more extensive calcification in calcium-treated subjects, whereas higher PTH was associated with calcification in sevelamer-treated subjects. Serum albumin was inversely correlated with progression in aortic calcification. Sevelamer was associated with favourable effects on lipids, although the link between these effects and the observed attenuation in vascular calcification remains to be elucidated. Calcium-based phosphate binders are associated with progressive coronary artery and aortic calcification, especially when mineral metabolism is not well controlled. Calcium may directly or indirectly (via PTH) adversely influence the balance of skeletal and extraskeletal calcification in haemodialysis patients.

Abstract

Cross-sectional studies suggest an association between functional status and chronic kidney disease (CKD). Whether physical function deteriorates with progression of CKD is unknown. To determine associations among CKD, physical function, and sexual function in women, we conducted cross-sectional and longitudinal analyses of 2,761 women enrolled in the Heart and Estrogen/Progestin Replacement Study. Physical and sexual function were evaluated using the Duke Activity Status Index (DASI) and the Sexual Problems Scale of the Medical Outcomes Study, respectively. Glomerular filtration rate (GFR) was estimated using the Modification of Diet in Renal Disease regression equation. In addition to analyses across the spectrum of GFR, CKD was categorized as mild (estimated GFR, 45 to 60 mL/min/1.73 m2), moderate (estimated GFR, 30 to 44 mL/min/1.73 m2), and severe (estimated GFR, <30 mL/min/1.73 m2) according to a modification of recently established classification guidelines. Mean age of study participants was 67 +/- 7 years, and mean estimated GFR was 61 +/- 14 mL/min/1.73 m2. In unadjusted analyses, mean baseline DASI score was 10 points lower in women with an estimated GFR less than 30 mL/min/1.73 m2 than in women with an estimated GFR of 60 mL/min/1.73 m2 or greater (P < 0.0001). Estimated GFR remained significantly associated with DASI score after multivariable adjustment. In longitudinal analyses, a decline in estimated GFR was associated with a significant decline in DASI score independent of baseline estimated GFR and other factors. There were no significant associations between estimated GFR and psychosocial aspects of sexual function. CKD is associated with impaired physical function, and a decline in estimated GFR is associated with a decline in physical function.

Abstract

Parameters of nutritional status, including serum albumin, serum creatinine, and body mass index (BMI), are powerful predictors of mortality and hospitalization in patients with end-stage renal disease (ESRD). Patient-specific characteristics and facility-related practice patterns modify certain parameters of nutritional status. We aimed to determine whether patient and facility characteristics modify the risk profiles associated with malnutrition in hemodialysis patients.

We analyzed data on 5,234 prevalent hemodialysis patients from the Dialysis Morbidity and Mortality Study (DMMS) Wave 1 for whom information on demographic, clinical, nutritional, and facility-related characteristics was available. We evaluated the associations among facility characteristics and serum albumin, serum creatinine, and BMI, adjusting for the effects of age, sex, race/ethnicity, diabetes, and dialysis vintage. We determined correlates of mortality and hospitalization, focusing on nutritional parameters, facility effects, and the interactions among patient-specific and facility-specific characteristics, albumin, creatinine, and BMI.

Serum albumin was lower with older age, diabetes, nonblack race, and hemodialysis using a catheter. Serum albumin was higher with annual vascular access surveillance, with higher BMI among women, with higher urea reduction ratio, among patients in whom dialyzers were reprocessed (particularly with bleach), among dialysis units in which water purification was used, and when vascular access blood flow rates were ≥350 mL/min. Overall survival was decreased with lower albumin, creatinine, and BMI. There were interactions among albumin, age, and vintage. Whereas lower serum albumin concentrations consistently were associated with an increased risk of death, the differences were attenuated among older patients and accentuated among patients of longer vintage.

Some facility-specific factors are associated with nutritional parameters including serum albumin, serum creatinine, and BMI. The associations of nutritional parameters with mortality and hospitalization vary by age, sex, and vintage but not by facility-specific factors, including those associated with the nutritional parameters themselves.

Abstract

Critically ill patients with acute renal failure (ARF) experience a high mortality rate. Animal and human studies suggest that proinflammatory cytokines lead to the development of a systemic inflammatory response syndrome (SIRS), which is temporally followed by a counter anti-inflammatory response syndrome (CARS). This process has not been specifically described in critically ill patients with ARF.The Program to Improve Care in Acute Renal Disease (PICARD) is a prospective, multicenter cohort study designed to examine the natural history, practice patterns, and outcomes of treatment in critically ill patients with ARF. In a subset of 98 patients with ARF, we measured plasma proinflammatory cytokines [interleukin (IL)-1beta, IL-6, IL-8, tumor necrosis factor-alpha (TNF-alpha)], the acute-phase reactant C-reactive protein (CRP), and the anti-inflammatory cytokine IL-10 at study enrollment and over the course of illness.When compared with healthy subjects and end-stage renal disease patients on maintenance hemodialysis, patients with ARF had significantly higher plasma levels of all measured cytokines. Additionally, the proinflammatory cytokines IL-6 and IL-8 were significantly higher in nonsurvivors versus survivors [median 234.7 (interdecile range 64.8 to 1775.9) pg/mL vs. 113.5 (46.1 to 419.3) pg/mL, P= 0.02 for IL-6; 35.5 (14.1 to 237.9) pg/mL vs. 21.2 (8.5 to 87.1) pg/mL, P= 0.03 for IL-8]. The anti-inflammatory cytokine IL-10 was also significantly higher in nonsurvivors [3.1 (0.5 to 41.9) pg/mL vs. 2.4 (0.5 to 16.9) pg/mL, P= 0.04]. For each natural log unit increase in the levels of IL-6, IL-8, and IL-10, the odds of death increased by 65%, 54%, and 34%, respectively, corresponding to increases in relative risk of approximately 30%, 25%, and 15%. 
The presence or absence of SIRS or sepsis was not a major determinant of plasma cytokine concentration in this group of patients.There is evidence of ongoing SIRS with concomitant CARS in critically ill patients with ARF, with higher levels of plasma IL-6, IL-8, and IL-10 in patients with ARF who die during hospitalization. Strategies to modulate inflammation must take into account the complex cytokine biology in patients with established ARF.
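The per-log-unit odds ratios above can be made concrete with a short arithmetic sketch. This is purely illustrative: a 65% increase in odds corresponds to an odds ratio of about 1.65 per natural-log unit of IL-6, and the baseline mortality used below is a hypothetical placeholder, not a value from the study.

```python
import math

def odds_ratio_per_logunit(pct_increase: float) -> float:
    """Convert 'odds of death increased by X%' into an odds ratio."""
    return 1.0 + pct_increase / 100.0

def apply_or(baseline_risk: float, odds_ratio: float) -> float:
    """Multiply baseline odds by an odds ratio, return the new risk."""
    odds = baseline_risk / (1.0 - baseline_risk) * odds_ratio
    return odds / (1.0 + odds)

or_il6 = odds_ratio_per_logunit(65.0)   # 1.65 per natural-log unit of IL-6

# A doubling of IL-6 is ln(2) ~ 0.69 log units, so the odds scale by
# 1.65 ** ln(2) (about 1.41), not by the full 1.65.
or_doubling = or_il6 ** math.log(2.0)

# With a hypothetical 50% baseline mortality, a 65% rise in odds lifts
# risk to about 62%, a relative risk near 1.25 -- illustrating why the
# odds increases (65%, 54%, 34%) correspond to the smaller relative-risk
# increases reported (~30%, 25%, 15%).
risk_after = apply_or(0.5, or_il6)
```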

Abstract

Valvular calcification is common in patients with end-stage renal disease, and is associated with an unfavorable prognosis. It was hypothesized that sevelamer, a non-calcium-based phosphorus binder, might attenuate the progression of valvular calcification.

Two hundred subjects on maintenance hemodialysis received either sevelamer or calcium-based phosphorus binders. To assess the extent of calcification, 186 subjects underwent baseline electron beam tomography (EBT) of the coronary arteries, aorta and mitral and aortic valves, and 132 had follow-up EBT scans at week 52. Changes in valvular calcification and combined valvular/vascular calcification were monitored and compared.

At baseline, mitral valve calcification was seen in 46% of subjects and aortic valve calcification in 33%. Most subjects with zero values at baseline failed to progress over one year. Aortic valve calcification was significantly increased in calcium-treated subjects. Changes in mitral valve calcification and in combined mitral + aortic valve calcification were smaller in sevelamer-treated than in calcium-treated subjects, but not significantly so. When combining valvular and vascular calcification, the median (10%, 90%) change in sevelamer-treated subjects was significantly lower than in calcium-treated subjects (6, -5084 to 1180 versus 81, -1150 to 2944, p = 0.04). The effect of sevelamer remained significant after adjustment for baseline calcification and the time-averaged calcium-phosphorus product, and was independent of the calcium preparation (acetate versus carbonate), geographic region (US versus Europe), LDL- or HDL-cholesterol, C-reactive protein and statin use. Significantly more sevelamer-treated subjects experienced an arrest (45 versus 28%, p = 0.047) or regression (26 versus 10%, p = 0.02) in total valvular and vascular calcification.

Sevelamer arrested the progression of valvular and vascular calcification in almost 50% of hemodialysis subjects. Sevelamer treatment, plus intensive control of calcium and phosphorus levels, may attenuate progression of, or achieve regression in, cardiac valvular calcification.

Abstract

The modeled volume of urea distribution (Vm) in intermittently hemodialyzed patients is often compared with total body water (TBW) volume predicted from population studies of patient anthropometrics (Vant).

Using data from the HEMO Study, we compared Vm determined by both blood-side and dialysate-side urea kinetic models with Vant as calculated by the Watson, Hume-Weyers, and Chertow anthropometric equations.

Median levels of dialysate-based Vm and blood-based Vm agreed (43% and 44% of body weight, respectively). These volumes were lower than anthropometric estimates of TBW, which had median values of 52% to 55% of body weight for the three formulas evaluated. The difference between the Watson equation for TBW and modeled urea volume was greater in Caucasians (19%) than in African Americans (13%). Correlations between Vm and Vant determined by each of the three anthropometric estimation equations were similar, but Vant derived from the Watson formula had a slightly higher correlation with Vm. The difference between Vm and the anthropometric formulas was greatest with the Chertow equation, less with the Hume-Weyers formula, and least with the Watson estimate. The age term in the Watson equation for men, which adjusts Vant downward with increasing age, reduced an age effect on the difference between Vant and Vm in men.

The findings show that kinetically derived values for V from blood-side and dialysate-side modeling are similar, and that these modeled urea volumes are substantially lower than anthropometric estimates of TBW. The higher values for anthropometry-derived TBW in hemodialyzed patients could be due to measurement errors. However, the possibility exists that TBW space is contracted in patients with end-stage renal disease (ESRD) or that the TBW space and the urea distribution space are not identical.
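The Watson anthropometric estimate discussed above can be sketched as follows. This is an illustrative sketch using the commonly published Watson (1980) coefficients, which are assumptions here rather than values taken from the HEMO analysis; the example patient is hypothetical.

```python
def watson_tbw(age_years: float, height_cm: float,
               weight_kg: float, male: bool) -> float:
    """Watson anthropometric estimate of total body water (liters).

    Coefficients are the commonly published Watson (1980) values.
    Note the age term appears only in the men's equation, which is why
    Vant is adjusted downward with increasing age in men but not women.
    """
    if male:
        return (2.447 - 0.09516 * age_years
                + 0.1074 * height_cm + 0.3362 * weight_kg)
    return -2.097 + 0.1069 * height_cm + 0.2466 * weight_kg


# Hypothetical 60-year-old man, 175 cm, 80 kg
tbw = watson_tbw(60, 175, 80, male=True)
fraction_of_weight = tbw / 80  # roughly 0.53, in line with the
                               # 52%-55% medians reported above
```

The fraction-of-body-weight check mirrors how the abstract compares Vant (52% to 55% of body weight) with the lower kinetically modeled volumes (43% to 44%).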

Abstract

Acute renal failure (ARF) is associated strongly with in-hospital mortality and morbidity. Previous clinical trials of ARF have been hampered by the heterogeneous population affected, difficulty defining ARF, delays in identification of ARF, and significant comorbid conditions, among other factors.

The Program to Improve Care in Acute Renal Disease (PICARD) phase I was a multicenter cohort study that aimed to identify clinical characteristics and practice patterns associated with adverse and favorable outcomes in patients with ARF in intensive care units. Although PICARD used no interventions, signed informed consent was required of all study subjects or their proxies.

Signed informed consent was obtained in 645 of 1,243 ARF episodes (52%). The fraction of patients not enrolled and the reasons for non-enrollment varied widely across the 5 PICARD centers. Refusal by potential study subjects was infrequent, although the absence of family or proxy (15%) and refusal by family or proxy (18%) accounted for large fractions of non-enrolled subjects. Death (23%) and discharge (11%) before study personnel could evaluate patients were additional important reasons for non-enrollment.

Understanding reasons for non-enrollment may help rationalize mortality and other outcome differences seen in clinical trials and cohort studies that require informed consent compared with historic reports of "all comers" with ARF.

Slowing the progression of vascular calcification in hemodialysis. JOURNAL OF THE AMERICAN SOCIETY OF NEPHROLOGY. Chertow, G. M. 2003; 14 (9): S310-S314

Abstract

Hyperphosphatemia and secondary hyperparathyroidism are common complications of ESRD (chronic kidney disease stage 5) that, when untreated, may result in increased morbidity and mortality. Hyperphosphatemia and hypercalcemia have been associated with increased coronary artery calcification. Achieving control of serum phosphorus without increasing serum calcium is an important goal for patients with ESRD. Although calcium-based phosphate binders effectively reduce serum phosphorus and parathyroid hormone concentrations, these agents can lead to hypercalcemia and have been associated with increased vascular calcification. The phosphorus binder sevelamer was developed to overcome the limitations associated with the usual management of hyperphosphatemia and secondary hyperparathyroidism (i.e., mineral salts). Sevelamer, a nonabsorbable hydrogel, is as efficacious as calcium-based phosphate binders for reducing serum phosphorus but does not cause hypercalcemia or other adverse metabolic effects. Sevelamer also exhibits beneficial effects on lipids, consistently and significantly decreasing LDL cholesterol and increasing HDL cholesterol in most studies. In a head-to-head randomized clinical trial, sevelamer and calcium-based binders achieved similarly excellent phosphorus control, but the use of calcium-based binders led to significantly higher serum calcium concentrations and an increased incidence of hypercalcemia and unintended suppression of parathyroid hormone. Treatment with calcium-based binders also led to the progression of coronary artery and aortic calcification, whereas sevelamer attenuated or arrested progression. Strategies that use oral calcium and vitamin D in patients with ESRD should be reexamined, and the potential advantages of sevelamer should be considered when selecting a primary agent to reduce serum phosphorus in hemodialysis patients.

Abstract

Moderate to severe pain frequently accompanies chronic diseases in general and end-stage renal disease (ESRD) in particular. Several analgesic agents and associated metabolites show altered pharmacokinetics in the presence of reduced glomerular filtration rate. Drug-related side effects may exacerbate symptoms frequently observed in persons with chronic kidney disease (CKD; eg, fatigue, nausea, vomiting, and constipation) or those often attributed to hemodialysis therapy (eg, orthostatic hypotension and impaired cognition). Persons with advanced CKD and ESRD are at increased risk for adverse effects of analgesic agents because of enhanced drug sensitivity, comorbid conditions, and concurrent medication use. Dose adjustment and avoidance of certain analgesics may be required in patients with advanced CKD and ESRD. We review the available evidence on pharmacokinetics and adverse drug effects of various analgesic agents commonly used in patients with advanced CKD and ESRD. Determining an optimal approach to the control of pain in patients with advanced CKD and ESRD will require additional research.

Abstract

To determine the associations among dietary intake and inflammatory cytokines with physical activity, function, and performance in maintenance dialysis patients.

Cross-sectional analysis of a cohort study, conducted in university-affiliated dialysis units and a general clinical research center in a multiethnic cohort of maintenance hemodialysis patients. Physical activity was measured by accelerometry; physical performance by gait speed, stair climbing, and chair raising; and physical functioning by the Medical Outcomes Study Short Form 36-item questionnaire subscale scores and the maximal and adjusted activity scores of the human activity profile.

Levels of inflammatory cytokines were uniformly high. Tumor necrosis factor-alpha was directly correlated with dietary protein and energy intake; no other cytokines were directly or inversely correlated with intake. Dietary intake was associated with physical activity, as expected, and not significantly associated with performance or function (with the exception of gait speed). There were no significant associations among inflammatory cytokines and physical activity, performance, or function.

Although dietary intake and inflammation may independently influence traditional proxies of nutritional status, this analysis provides no evidence for a link between cytokines and physical activity, performance, or function in hemodialysis patients. More research is required to understand the role of cytokines in protein energy malnutrition and the mechanisms of wasting and functional decline in the dialysis population.

Abstract

Insights into end-stage renal disease have emerged from many investigations but less is known about the epidemiology of chronic renal insufficiency (CRI) and its relationship to cardiovascular disease (CVD). The Chronic Renal Insufficiency Cohort (CRIC) Study was established to examine risk factors for progression of CRI and CVD among CRI patients and develop models to identify high-risk subgroups, informing future treatment trials, and increasing application of preventive therapies. CRIC will enroll approximately 3000 individuals at seven sites and follow participants for up to 5 yr. CRIC will include a racially and ethnically diverse group of adults aged 21 to 74 yr with a broad spectrum of renal disease severity, half of whom have diagnosed diabetes mellitus. CRIC will exclude subjects with polycystic kidney disease and those on active immunosuppression for glomerulonephritis. Subjects will undergo extensive clinical evaluation at baseline and at annual clinic visits and via telephone at 6 mo intervals. Data on quality of life, dietary assessment, physical activity, health behaviors, depression, cognitive function, health care resource utilization, as well as blood and urine specimens will be collected annually. (125)I-iothalamate clearances and CVD evaluations including a 12-lead surface electrocardiogram, an echocardiogram, and coronary electron beam or spiral CT will be performed serially. Analyses planned in CRIC will provide important information on potential risk factors for progressive CRI and CVD. Insights from CRIC should lead to the formulation of hypotheses regarding therapy that will serve as the basis for targeted interventional trials focused on reducing the burden of CRI and CVD.

The decline in residual renal function in hemodialysis is slow and age dependent. Hemodialysis International. International Symposium on Home Hemodialysis. Hung, A. M., Young, B. S., Chertow, G. M. 2003; 7 (1): 17-22

Abstract

Persons on peritoneal dialysis and hemodialysis with preserved residual renal function experience lower mortality rates than those without. Previous studies have shown slower rates of decline of residual renal function for peritoneal dialysis (PD) (2 to 3% decrease/month) compared with hemodialysis (HD) (6 to 7% decrease/month). However, our clinical observations suggested a lower rate of decline in hemodialysis patients.

We evaluated data on 174 hemodialysis patients cared for from January 2000 through October 2001. Eighty-seven (50%) patients had at least two timed quarterly urine collections to estimate the rate of change of residual renal function over time (urea clearance, or KrU). All patients underwent thrice-weekly hemodialysis using polysulfone dialyzers with formaldehyde reprocessing. The rate of decline of residual renal function and the effect of KrU on laboratory variables were estimated using a random effects (MIXED) model, adjusting for the effects of age, sex, race, diabetes, and dialysis vintage.

The mean KrU at baseline was 3.5 mL/min. Men (P < 0.001) and persons of shorter vintage (P < 0.0001) had more residual renal function at baseline. The estimated rate of decline of residual renal function was -0.07 mL/min/month (-1.9% decrease/month). The rate of decline in residual renal function was unaffected by sex, race, diabetes, or vintage, although it was significantly attenuated among older individuals (age x time interaction, P = 0.01). Serum phosphorus (P = 0.03) and the calcium x phosphorus product (P = 0.009) increased over time and were influenced by the level of residual renal function (P = 0.06 and P = 0.006, respectively). Residual renal function did not influence the rate of change of other laboratory variables.

In an ethnically diverse cohort of hemodialysis patients, the rate of decline of residual renal function was relatively slow and age dependent, and consistent with values others have reported for patients on peritoneal dialysis. Universal use of biocompatible dialyzers and bicarbonate dialysate may have contributed to differences from prior reports. Residual renal function attenuated the increase in the calcium-phosphorus product over time. A better understanding of the determinants of the rate of decline in residual renal function, and of the specific benefits afforded to patients via maintenance of residual renal function, would help to inform the debates on timing of initiation and various dosing strategies in hemodialysis.

Abstract

Acute renal failure is associated with high mortality and morbidity. Diuretic agents continue to be used in this setting despite a lack of evidence supporting their benefit.

To determine whether the use of diuretics is associated with adverse or favorable outcomes in critically ill patients with acute renal failure, we conducted a cohort study from October 1989 to September 1995 of 552 patients with acute renal failure in intensive care units at 4 academic medical centers affiliated with the University of California. Patients were categorized by the use of diuretics on the day of nephrology consultation and, in companion analyses, by diuretic use at any time during the first week following consultation. Outcomes were all-cause hospital mortality, nonrecovery of renal function, and the combined outcome of death or nonrecovery.

Diuretics were used in 326 patients (59%) at the time of nephrology consultation. Patients treated with diuretics on or before the day of consultation were older and more likely to have a history of congestive heart failure, nephrotoxic (rather than ischemic or multifactorial) origin of acute renal failure, acute respiratory failure, and lower serum urea nitrogen concentrations. With adjustment for relevant covariates and propensity scores, diuretic use was associated with a significant increase in the risk of death or nonrecovery of renal function (odds ratio, 1.77; 95% confidence interval, 1.14-2.76). The risk was magnified (odds ratio, 3.12; 95% confidence interval, 1.73-5.62) when patients who died within the first week following consultation were excluded. The increased risk was borne largely by patients who were relatively unresponsive to diuretics.

The use of diuretics in critically ill patients with acute renal failure was associated with an increased risk of death and nonrecovery of renal function. Although observational data prohibit causal inference, it is unlikely that diuretics afford any material benefit in this clinical setting. In the absence of compelling contradictory data from a randomized, blinded clinical trial, the widespread use of diuretics in critically ill patients with acute renal failure should be discouraged.

Abstract

Secondary hyperparathyroidism (SHPT) is an important complication of end-stage renal disease. However, SHPT begins during earlier stages of chronic renal insufficiency (CRI), and little is known about risk factors for SHPT in this population. This study evaluated 218 patients in an ethnically diverse ambulatory nephrology practice at the University of California San Francisco during calendar years 1999 and 2000. Demographic data, comorbid diseases, medications, and laboratory parameters were collected, and independent correlates of intact parathyroid hormone (PTH) were identified by using multiple linear regression. The mean estimated GFR was 34 ml/min per 1.73 m(2) (10%-90% range, 13 to 61 ml/min per 1.73 m(2)); PTH was inversely related to GFR (P < 0.0001). The adjusted mean PTH was higher among African Americans and lower among Asian/Pacific Islanders compared with white patients (233 versus 95 versus 139 pg/ml; P < 0.0001). Moreover, among the 196 patients with GFR <60 ml/min per 1.73 m(2), the slope of GFR versus PTH was significantly steeper among African Americans than among white patients (10.6 versus 3.9 pg/ml per ml per min per 1.73 m(2); P = 0.01). After adjusting for age and diabetes, PTH was associated with a history of myocardial infarction (OR, 1.6; 95% CI, 1.1 to 2.3 per unit natural log PTH) and congestive heart failure (OR, 2.0; 95% CI, 1.3 to 2.9 per unit natural log PTH) and not associated with other co-morbid conditions. These factors should be considered when screening and managing SHPT in CRI.

Abstract

Patients with end-stage renal disease are known to have decreased survival after myocardial infarction, but the association of less severe renal dysfunction with survival after myocardial infarction is unknown.

To determine how patients with renal insufficiency are treated during hospitalization for myocardial infarction, and to determine the association of renal insufficiency with survival after myocardial infarction, we conducted a cohort study of 130,099 elderly patients with myocardial infarction hospitalized between April 1994 and July 1995 at all nongovernment hospitals in the United States. Patients were categorized according to initial serum creatinine level: no renal insufficiency (creatinine level < 1.5 mg/dL [<132 micromol/L]; n = 82,455), mild renal insufficiency (creatinine level, 1.5 to 2.4 mg/dL [132 to 212 micromol/L]; n = 36,756), or moderate renal insufficiency (creatinine level, 2.5 to 3.9 mg/dL [221 to 345 micromol/L]; n = 10,888). Vital status up to 1 year after discharge was obtained from Social Security records.

Compared with patients with no renal insufficiency, patients with moderate renal insufficiency were less likely to receive aspirin, beta-blockers, thrombolytic therapy, angiography, and angioplasty during hospitalization. One-year mortality was 24% in patients with no renal insufficiency, 46% in patients with mild renal insufficiency, and 66% in patients with moderate renal insufficiency (P < 0.001). After adjustment for patient and treatment characteristics, mild (hazard ratio, 1.68 [95% CI, 1.63 to 1.73]) and moderate (hazard ratio, 2.35 [CI, 2.26 to 2.45]) renal insufficiency were associated with substantially elevated risk for death during the first month of follow-up. This increased mortality risk continued until 6 months after myocardial infarction.

Renal insufficiency was an independent risk factor for death in elderly patients after myocardial infarction. Targeted interventions may be needed to improve treatment for this high-risk population.

Abstract

Reduced renal function is associated with a variety of biochemical abnormalities. However, the extent of these changes and their magnitude in relation to renal function are not well defined, especially among individuals with mild to moderate chronic renal insufficiency (CRI).

We analysed data from the Third National Health and Nutrition Examination Survey (NHANES III; 1988-1994) for 14,722 adults aged ≥17 years with measurements of serum creatinine and all electrolytes including ionized calcium. General linear models were used to determine the relationship between mean concentrations of electrolytes and different levels of Cockcroft-Gault creatinine clearance (CrCl). Sample weights were used to produce weighted regression parameters.

Changes in mean serum phosphorus and potassium concentration were evident at relatively modest reductions in CrCl (around 50 to 60 ml/min). Changes in the anion gap and mean levels of ionized calcium and bicarbonate were not apparent until CRI was advanced (CrCl ≤20 ml/min). For example, compared with women with CrCl >80 ml/min, those with CrCl 60-50, 50-40, 40-30, 30-20 and ≤20 ml/min had mean serum phosphorus concentrations that were higher by 0.1, 0.1, 0.2, 0.3 and 0.8 mg/dl (all P<0.05), and mean serum potassium concentrations that were higher by 0.1, 0.1, 0.1, 0.2 and 0.4 mmol/l (all P<0.05), respectively. These changes were independent of dietary intake and the use of angiotensin-converting enzyme (ACE) inhibitors or non-steroidal anti-inflammatory drugs (NSAIDs).

Increases in serum phosphorus and potassium levels are apparent even among people with mild to moderate CRI. These findings should be broadly generalizable to the larger CRI population in the United States. Subtle elevations in serum phosphorus might contribute to the initiation and maintenance of secondary hyperparathyroidism, which is known to occur in mild to moderate CRI.
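The Cockcroft-Gault clearance used to stratify participants above can be sketched as follows. This is an illustrative sketch of the standard published formula; the example inputs are hypothetical, not NHANES III data.

```python
def cockcroft_gault_crcl(age_years: float, weight_kg: float,
                         scr_mg_dl: float, female: bool = False) -> float:
    """Cockcroft-Gault creatinine clearance (mL/min).

    Standard published formula; the result is multiplied by 0.85
    for women.
    """
    crcl = (140.0 - age_years) * weight_kg / (72.0 * scr_mg_dl)
    return crcl * 0.85 if female else crcl


# Hypothetical 60-year-old woman, 70 kg, serum creatinine 1.4 mg/dL:
# CrCl of roughly 47 mL/min, i.e. within the 50-40 band analyzed above
crcl = cockcroft_gault_crcl(60, 70, 1.4, female=True)
```

Note that the formula depends on weight and age as well as serum creatinine, which is why the abstract can detect electrolyte shifts at clearances where serum creatinine alone may still look unremarkable.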

Abstract

There is increasing interest in studying the epidemiology of subjects with mild to moderate chronic renal insufficiency (CRI), defined as reduced glomerular filtration rate (GFR) not requiring renal replacement therapy. This review discusses some of the methodological challenges presented by the epidemiological study of mild to moderate CRI that have not been adequately addressed in the literature. Issues that relate to defining the prevalence of CRI include between-laboratory differences in serum creatinine (SCr) assays, within-person measurement errors in SCr, and differences in SCr in different demographic groups that are independent of GFR. Issues that relate to examining CRI as an outcome include the choice between a "slope" or "threshold" analysis. Issues that relate to examining CRI as an exposure include the choice of renal function measure (for example, SCr vs. estimated GFR) in multivariable analysis, whether to normalize renal function to body surface area or other body size parameters, potential effect modification of the association between CRI and the outcome and the complex relation between CRI, adverse outcomes, potential confounders and intermediary variables. As we enter an era of more intensive study of mild to moderate CRI, recognition of these potential pitfalls should guide researchers toward improving the quality of epidemiological research in this field.

Abstract

Mortality rates in acute renal failure remain extremely high, and risk-adjustment tools are needed for quality improvement initiatives and design (stratification) and analysis of clinical trials. A total of 605 patients with acute renal failure in the intensive care unit during 1989-1995 were evaluated, and demographic, historical, laboratory, and physiologic variables were linked with in-hospital death rates using multivariable logistic regression. Three hundred and fourteen (51.9%) patients died in-hospital. The following variables were significantly associated with in-hospital death: age (odds ratio [OR], 1.02 per yr), male gender (OR, 2.36), respiratory (OR, 2.62), liver (OR, 3.06), and hematologic failure (OR, 3.40), creatinine (OR, 0.71 per mg/dl), blood urea nitrogen (OR, 1.02 per mg/dl), log urine output (OR, 0.64 per log ml/d), and heart rate (OR, 1.01 per beat/min). The area under the receiver operating characteristic curve was 0.83, indicating good model discrimination. The model was superior in all performance metrics to six generic and four acute renal failure-specific predictive models. A disease-specific severity of illness equation was developed using routinely available and specific clinical variables. Cross-validation of the model and additional bedside experience will be needed before it can be effectively applied across centers, particularly in the context of clinical trials.
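A multivariable logistic model like the one described above combines predictors multiplicatively on the odds scale. The sketch below illustrates only that mechanics: the baseline odds is a made-up placeholder, only a subset of the reported odds ratios is used, and the age term is applied relative to a hypothetical reference age, so the numbers do not reproduce the published model.

```python
# Selected per-unit odds ratios reported in the abstract above.
ODDS_RATIOS = {
    "age_per_year": 1.02,         # per year above a reference age
    "male": 2.36,
    "respiratory_failure": 2.62,
    "liver_failure": 3.06,
}

def predicted_risk(covariates: dict, baseline_odds: float) -> float:
    """Scale baseline odds by OR ** value for each covariate, then
    convert the resulting odds back to a probability.

    baseline_odds is a hypothetical placeholder, not a fitted intercept.
    """
    odds = baseline_odds
    for name, value in covariates.items():
        odds *= ODDS_RATIOS[name] ** value
    return odds / (1.0 + odds)


# Hypothetical patient: 10 years above the reference age, male,
# with respiratory failure
risk = predicted_risk(
    {"age_per_year": 10, "male": 1, "respiratory_failure": 1},
    baseline_odds=0.05,  # placeholder, not from the study
)
```

Working on the odds scale is what makes the reported per-variable odds ratios composable; converting back to probability at the end is the standard logistic transform.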

Abstract

Persons with end-stage renal disease are at higher risk for osteopenia and hip fracture relative to the age-matched general population. Persons with mild to moderate chronic renal insufficiency (CRI) may have reduced bone mineral density (BMD) as a result of abnormalities in acid-base and vitamin D-parathyroid hormone homeostasis.

We analyzed data on 13,848 adults aged 20 and above from the Third National Health and Nutrition Examination Survey (NHANES III; 1988-1994). Regression models were used to determine the relationship between femoral BMD and renal function, the latter assessed using serum creatinine, blood urea nitrogen or Cockcroft-Gault creatinine clearance. To control for confounding, we fit sex-stratified models that adjusted for age, weight, height, race-ethnicity, menopausal status, estrogen use, activity level, family history of osteoporosis, diuretic use, and dietary intake of calcium and alcohol.

Although subjects with reduced renal function had significantly lower femoral BMD in unadjusted analysis, the association between CRI and bone mineral density was extinguished after adjustment in the multivariate models. In fact, controlling for only sex, age and weight was sufficient to extinguish any negative association between decreased renal function and decreased bone mineral density.

Although subjects with worse renal function have significantly lower femoral BMD, this association can be explained by confounding, principally by sex, age and weight. After taking into account the facts that women, older individuals and smaller individuals have less renal function and lower BMD, renal function itself is not independently associated with BMD.

Abstract

We sought to determine clinical and laboratory correlates of calcification of the coronary arteries (CAs), aorta and mitral and aortic valves in adult subjects with end-stage renal disease (ESRD) receiving hemodialysis. Vascular calcification is known to be a risk factor for ischemic heart disease in non-uremic individuals. Patients with ESRD experience accelerated vascular calcification, due at least in part to dysregulation of mineral metabolism. Clinical correlates of the extent of calcification in ESRD have not been identified. Moreover, the clinical relevance of calcification as measured by electron-beam tomography (EBT) has not been determined in the ESRD population. We conducted a cross-sectional analysis of 205 maintenance hemodialysis patients who received baseline EBT for evaluation of vascular and valvular calcification. We compared subjects with and without clinical evidence of atherosclerotic vascular disease and determined correlates of the extent of vascular and valvular calcification using multivariable linear regression and proportional odds logistic regression analyses. The median coronary artery calcium score was 595 (interquartile range, 76 to 1,600), values consistent with a high risk of obstructive coronary artery disease in the general population. The CA calcium scores were directly related to the prevalence of myocardial infarction (p < 0.0001) and angina (p < 0.0001), and the aortic calcium scores were directly related to the prevalence of claudication (p = 0.001) and aortic aneurysm (p = 0.02). The extent of coronary calcification was more pronounced with older age, male gender, white race, diabetes, longer dialysis vintage and higher serum concentrations of calcium and phosphorus. Total cholesterol (and high-density lipoprotein and low-density lipoprotein subfractions), triglycerides, hemoglobin and albumin were not significantly related to the extent of CA calcification. Only dialysis vintage was significantly associated with the prevalence of valvular calcification. Coronary artery calcification is common, severe and significantly associated with ischemic cardiovascular disease in adult ESRD patients. The dysregulation of mineral metabolism in ESRD may influence vascular calcification risk.

Abstract

Usual drug-prescribing practices may not consider the effects of renal insufficiency on the disposition of certain drugs. Decision aids may help optimize prescribing behavior and reduce medical error. This study was conducted to determine whether an application for adjusting drug dose and frequency in patients with renal insufficiency, when merged with a computerized order entry system, improves drug prescribing and patient outcomes. The design comprised four consecutive 2-month intervals of control (usual computerized order entry) alternating with intervention (computerized order entry plus decision support system), conducted in September 1997-April 1998, with outcomes assessed among a consecutive sample of 17 828 adults admitted to an urban tertiary care teaching hospital. The intervention was a real-time computerized decision support system for prescribing drugs in patients with renal insufficiency. During intervention periods, the adjusted dose list, default dose amount, and default frequency were displayed to the order-entry user, with a notation that adjustments had been made based on renal insufficiency; during control periods, these recommended adjustments were not revealed to the order-entry user, and the unadjusted parameters were displayed. Outcome measures were rates of appropriate prescription by dose and frequency, length of stay, hospital and pharmacy costs, and changes in renal function, compared among patients with renal insufficiency who were hospitalized during the intervention vs control periods. A total of 7490 patients were found to have some degree of renal insufficiency. In this group, 97 151 orders were written on renally cleared or nephrotoxic medications, of which 14 440 (15%) had at least 1 dosing parameter modified by the computer based on renal function. The fraction of prescriptions deemed appropriate during the intervention vs control periods by dose was 67% vs 54% (P

Abstract

Physical performance measures, particularly gait speed, have been useful as predictors of loss of independence, institutionalization, and mortality in older nonuremic individuals. Gait speed has not been evaluated as a predictor of these important outcomes in patients on hemodialysis, nor have the determinants of gait speed in the dialysis population been studied. We performed a cross-sectional analysis to determine whether demographic, clinical, or nutritional status variables were related to physical performance in a group of 46 hemodialysis patients treated at three University of California San Francisco-affiliated dialysis units. Three physical performance measures were examined, including gait speed, time to climb stairs, and time to rise from a chair five times in succession. Forward stepwise linear-regression analysis was performed with each physical performance measure as the dependent variable and the following candidate predictor variables: age, gender, body mass index, dialysis vintage, Kt/V, albumin, blood urea nitrogen, creatinine, hematocrit, lean body mass, phase angle, ferritin, and the following comorbidities: hypertension, diabetes mellitus, coronary artery disease, peripheral vascular disease, and cerebrovascular disease. Subjects included 31 men and 15 women aged 22 to 87 years (mean +/- SD, 52 +/- 17). The mean gait speed for the group was 113.1 +/- 34.5 cm/s (low compared with norms established for persons of similar age). Results of multivariable regression showed that age, albumin, and Kt/V were important determinants of gait speed in this population. Overall, the model explained 52% of the variability in gait speed (r = 0.72, P < 0.0001). Qualitatively similar results were obtained using stair-climbing time or chair-rising time as the dependent variables, except that comorbidity was more important than age for stair climbing. The addition of physical activity level to the models did not eliminate the associations of albumin or Kt/V with physical performance. Physical performance is significantly impaired in ambulatory hemodialysis patients and is related to age, serum albumin, and dialysis dose. Prospective studies are needed to determine whether modification of dialysis dose or nutritional interventions can improve physical performance in patients on hemodialysis.

Abstract

While parenteral amphotericin B is an effective therapy for serious fungal infections, it frequently causes acute renal failure (ARF). This study identified correlates of ARF in amphotericin B therapy and used them to develop clinical prediction rules. All 643 inpatients receiving parenteral amphotericin B therapy at one tertiary care hospital were included. Data regarding correlates were obtained both electronically and from manual chart review in a subsample of 231 patients. ARF was defined as a 50% increase in the baseline creatinine with a peak ≥2.0 mg/dL. Among 643 episodes, ARF developed in 175 (27%). In the larger group, the only independent correlate of ARF was male gender (OR = 2.2, 95% CI, 1.5 to 3.3). In the subsample (N = 231), independent correlates of ARF were maximum daily amphotericin dosage, location at the time of initiation of amphotericin therapy, and concomitant use of cyclosporine. These data were used to develop two clinical prediction rules. A rule using only data available at initiation of therapy stratified patients into groups with probability of ARF ranging from 15 to 54%, while a rule including data available during therapy (maximum daily dose) stratified patients into groups with probability of ARF ranging from 4 to 80%. Acute renal failure occurred in a quarter of the patients. Correlates of ARF at the beginning and during the course of amphotericin therapy were identified and then combined to allow stratification according to ARF risk. These data also provide evidence for guidelines for the selection of patients for alternative therapies.
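A logistic model like the one underlying these prediction rules converts a patient's risk-factor profile into a probability of ARF via the logistic function. The intercept and coefficients below are illustrative placeholders on the log-odds scale, not the published weights:

```python
import math

def arf_probability(intercept, coefs, covariates):
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    z = intercept + sum(b * x for b, x in zip(coefs, covariates))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for illustration only; the published
# rule's actual weights are not reproduced here.
b0 = -2.0
betas = [0.8, 0.6, 1.1]  # e.g., high max daily dose, ICU start, cyclosporine
x = [1, 0, 1]            # a patient with two of the three risk factors
p = arf_probability(b0, betas, x)
print(round(p, 3))  # ≈0.475
```

Sorting patients by such predicted probabilities, then cutting at chosen thresholds, yields the kind of risk strata (e.g., 4% to 80%) the abstract describes.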

Abstract

For patients with end-stage renal disease and their providers, dialysis unit-based cardiac arrest is the most feared complication of hemodialysis. However, relatively little is known regarding its frequency or epidemiology, or whether a fraction of these events could be prevented. To explore clinical correlates of dialysis unit-based cardiac arrest, 400 reported arrests over a nine-month period from October 1998 through June 1999 were reviewed in detail. Clinical characteristics of patients who suffered cardiac arrest were compared with a nationally representative cohort of >77,000 hemodialysis patients dialyzed at Fresenius Medical Care North America-affiliated facilities. The 400 cardiac arrests occurred over 5,744,708 treatments, corresponding to a rate of 7 per 100,000 hemodialysis sessions. Cardiac arrest was more frequent during Monday dialysis sessions than on other days of the week. Case patients were nearly twice as likely to have been dialyzed against a 0 or 1.0 mEq/L potassium dialysate on the day of cardiac arrest (17.1 vs. 8.8%). Patients who suffered a cardiac arrest were on average older (66.3 +/- 12.9 vs. 60.2 +/- 15.4 years), more likely to have diabetes (61.8 vs. 46.8%), and more likely to use a catheter for vascular access (34.1 vs. 27.8%) than the general hemodialysis population. Sixteen percent of patients experienced a drop in systolic pressure of 30 mm Hg or more prior to the arrest. Thirty-seven percent of patients who suffered cardiac arrest had been hospitalized within the past 30 days. Sixty percent of patients died within 48 hours of the arrest, including 13% while in the dialysis unit. Cardiac arrest is a relatively infrequent but devastating complication of hemodialysis. To reduce the risk of adverse cardiac events on hemodialysis, the dialysate prescription should be evaluated and modified on an ongoing basis, especially following hospitalization in high-risk patients.

Abstract

To assess the mortality and resource utilization that results from acute renal failure associated with amphotericin B therapy, 707 adult admissions in which parenteral amphotericin B therapy was given were studied at a tertiary-care hospital. Main outcome measures were mortality, length of stay, and costs; we controlled for potential confounders, including age, sex, insurance status, baseline creatinine level, length of stay before beginning amphotericin B therapy, and severity of illness. Among 707 admissions, there were 212 episodes (30%) of acute renal failure. When renal failure developed, the mortality rate was much higher: 54% versus 16% (adjusted odds of death, 6.6). When acute renal failure occurred, the mean adjusted increase in length of stay was 8.2 days, and the adjusted total cost was $29,823. Although residual confounding exists despite adjustment, the increases in resource utilization that we found are large and the associated mortality is high when acute renal failure occurs following amphotericin B therapy.

Abstract

A local inflammatory reaction to beta(2)-microglobulin (beta(2)m) amyloid deposits by monocytes/macrophages is a characteristic histologic feature of dialysis-related amyloidosis (DRA). Since beta(2)m modified with advanced glycation end products (AGE-beta(2)m) is a major constituent of amyloid in DRA, we tested the hypothesis that AGE-beta(2)m affects apoptosis and phenotype of human monocytes. Human peripheral blood monocytes were incubated with or without in vitro-derived AGE-beta(2)m, and their viability, extent of apoptosis, morphology, and function were examined over the subsequent four days. AGE-modified but not unmodified beta(2)m significantly delayed spontaneous apoptosis of human peripheral blood monocytes in adherent and nonadherent cultures. The effect of AGE-beta(2)m on monocyte apoptosis was time- and dose-dependent and was attenuated by a blocking antibody directed against the human AGE receptor (RAGE). There was no difference between the effects of AGE-beta(2)m and AGE-modified human serum albumin. Culture of monocytes with AGE-beta(2)m did not alter membrane expression of Fas or Fas ligand. Monocytes cultured with AGE-beta(2)m underwent substantial changes in morphology similar to those observed when monocytes differentiate into macrophages. The cultured cells increased in size and vacuolization, and their content of beta-glucuronidase and acid phosphatase increased by 5- to 10-fold at day 4. Expression of the monocyte-macrophage membrane antigens HLA-DR, CD11b, and CD11c also increased at day 4. Although exhibiting phenotypic characteristics of macrophages, monocytes cultured with AGE-beta(2)m functioned differently than macrophages cultured with serum. Superoxide production in response to phorbol myristic acetate was maintained in monocytes cultured with AGE-beta(2)m, but declined with time in cells cultured with serum. Constitutive synthesis of tumor necrosis factor-alpha (TNF-alpha), interleukin-1 beta (IL-1 beta) and prostaglandin E2 (PGE2) increased in monocytes cultured for four to six days with AGE-beta(2)m. These findings support a novel role for AGE-modified proteins such as AGE-beta(2)m that may contribute to the development of a local inflammatory response, with predominant accumulation of monocytes/macrophages, in DRA.

Abstract

Patients on dialysis are less physically active than sedentary persons with normal kidney function. To assess the consequences of inactivity and the results of efforts to increase activity in the end-stage renal disease (ESRD) population, valid instruments to measure physical activity and physical functioning in this group are needed. We performed a cross-sectional study to establish the validity in ESRD of several questionnaires designed to measure physical activity or physical functioning in the general population. Questionnaires studied included the Stanford 7-day Physical Activity Recall questionnaire (PAR), the Physical Activity Scale for the Elderly (PASE), the Human Activity Profile (HAP), and the Medical Outcomes Study Short Form 36-item questionnaire (SF-36). Physical activity was measured using three-dimensional activity monitors (accelerometers) over a seven-day period (the "gold standard"). Patients also underwent physical performance tests, including measurement of gait speed, stair climbing time, and chair rising time. Study questionnaires were administered, and questionnaire results were compared with each other and with activity monitor and physical performance test results. Thirty-nine maintenance hemodialysis patients participated in the study. Dialysis patients scored worse than previously published healthy norms on all tests. All questionnaires correlated with seven-day accelerometry and with at least one measure of physical performance. The HAP correlated best with accelerometry (r = 0.78, P < 0.0001). Seventy-five percent of the variability in physical activity measured by accelerometry could be explained by a model that combined information from the HAP and the PASE. The HAP and the physical functioning scale of the SF-36 were about equally well correlated with physical performance measures. These questionnaires are valid in patients on hemodialysis and should be used to study the physical activity and rehabilitation efforts in this population further.

Abstract

Although serum prealbumin is considered a valid indicator of nutritional status in hemodialysis patients, there is relatively little evidence that its determination is of major prognostic significance. In this study, we aimed to determine the independent association of serum prealbumin with survival in hemodialysis patients, after adjusting for serum albumin and other indicators of protein energy nutritional status. Serum prealbumin was measured in more than 1600 maintenance hemodialysis patients. We determined the correlations among prealbumin and other indicators of nutritional status, including serum albumin, and bioimpedance-derived indicators of body composition. The relationship between serum prealbumin and survival was determined using proportional hazards regression. The serum albumin was directly correlated with the serum prealbumin (r = 0.47, P < 0.0001), but still explained <25% of the variability in prealbumin. Prealbumin was inversely related to mortality, with a relative risk reduction of 6% per 1 mg/dL increase in prealbumin, even after adjusting for case mix, serum albumin, and other nutritional indicators. The increase in risk with lower serum prealbumin concentrations was observed whether the serum albumin was high or low. In hemodialysis patients, the serum prealbumin provides prognostic value independent of the serum albumin and other established predictors of mortality in this population.
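In a proportional hazards model, a 6% relative risk reduction per 1 mg/dL corresponds to a hazard ratio of roughly 0.94 per unit, and the effect compounds multiplicatively across units. A brief sketch of that arithmetic (assuming a Cox-type log-linear model):

```python
import math

# Hazard ratio per 1 mg/dL increase in serum prealbumin, taken from the
# reported 6% relative risk reduction per unit
hr_per_unit = 0.94
beta = math.log(hr_per_unit)  # the corresponding Cox regression coefficient

def hazard_ratio(delta_mg_dl):
    """HR for a difference of delta_mg_dl in serum prealbumin,
    under the log-linear (proportional hazards) assumption."""
    return math.exp(beta * delta_mg_dl)

print(round(hazard_ratio(3), 3))  # 0.94**3 ≈ 0.831: risk compounds per unit
```

So a patient whose prealbumin is 3 mg/dL higher would, under this model, carry about 17% lower mortality risk, not 3 × 6% = 18%, because the per-unit ratios multiply rather than add.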

Abstract

Secondary hyperparathyroidism and its effects on bone and viscera are among the most important complications of end-stage renal disease. Despite its ubiquity, little is known about the treated natural history of the disorder. We assembled a cohort of 310 patients with end-stage renal disease on hemodialysis who were participants in one of four clinical trials of the phosphate binder sevelamer. Baseline parathyroid hormone levels were collected, and the relations of dialysis vintage and other clinical variables with parathyroid hormone were described. There was a direct relation between dialysis vintage and the severity of hyperparathyroidism. Other variables that were significantly associated with PTH on univariate analysis included age, African American race, Kt/V, and the serum concentrations of calcium, phosphate, and bicarbonate. Multivariable linear regression analysis yielded three significant predictors of PTH: calcium, phosphorus, and vintage (5.8% [4.0-7.5%] expected increase in PTH per year of vintage). The model R2 was 0.22. Dialysis vintage is a key determinant of the severity of secondary hyperparathyroidism. Vintage and certain laboratory variables should be considered in the evaluation of therapies aimed at modifying the treated natural history of this disorder.

Abstract

Atrial natriuretic peptide (ANP), an endogenous hormone synthesized by the cardiac atria, has been shown to improve renal function in multiple animal models of acute renal failure. In a recent multicenter clinical trial of 504 patients with acute tubular necrosis (oliguric and nonoliguric), ANP decreased the need for dialysis only in the oliguric patients. In the present study, 222 patients with oliguric acute renal failure were enrolled into a multicenter, randomized, double-blind, placebo-controlled trial designed to assess prospectively the safety and efficacy of ANP compared with placebo. Subjects were randomized to treatment with a 24-hour infusion of ANP (anaritide, 0.2 microgram/kg/min; synthetic form of human ANP) or placebo. Dialysis and mortality status were followed up for 60 days. The primary efficacy end point was dialysis-free survival through day 21. Dialysis-free survival rates were 21% in the ANP group and 15% in the placebo group (P = 0.22). By day 14 of the study, 64% and 77% of the ANP and placebo groups had undergone dialysis, respectively (P = 0.054), and 9 additional patients (7 patients, ANP group; 2 patients, placebo group) needed dialysis but did not receive it. Although a trend was present, there was no statistically significant beneficial effect of ANP in dialysis-free survival or reduction in dialysis in these subjects with oliguric acute renal failure. Mortality rates through day 60 were 60% versus 56% in the ANP and placebo groups, respectively (P = 0.541). One hundred two of 108 (95%) versus 63 of 114 (55%) patients in the ANP and placebo groups had systolic blood pressures less than 90 mm Hg during the study-drug infusion (P < 0.001). The maximal absolute decrease in systolic blood pressure was significantly greater in the anaritide group than placebo group (33.6 versus 23.9 mm Hg; P < 0.001). This well-characterized population with oliguric acute renal failure had an overall high morbidity and mortality.

Abstract

The optimal composition of fluid for volume resuscitation in critically ill patients has been the subject of controversy for decades. Clinicians are faced with several options, including crystalloid solutions of varying tonicity, several colloid preparations (albumin and others), and blood products. Some of these solutions may be differentially distributed between the intra- and extravascular, and intra- and extracellular compartments, accounting for a variety of physiological effects. Two recently published meta-analyses concluded that colloids afford no survival benefit in critically ill patients compared with crystalloids. Albumin infusion may be of more value in patients with cirrhosis, or in those at high risk of acute renal failure. Additional randomized trials will be needed to establish the optimal composition and volume of colloid or crystalloid solutions for resuscitation in shock.

Abstract

The terms routinely used to describe states of reduced glomerular filtration rate (GFR) not requiring renal replacement therapy are poorly defined. With increasing interest in the epidemiology of chronic renal insufficiency and the timing of initiation of dialysis, terms such as "pre-ESRD" and "pre-dialysis" have been popularized, again without clear definition. Unambiguous terminology should be adopted. The authors favor using the term chronic renal insufficiency to describe states of reduced GFR not severe enough to require dialysis or transplantation. The authors propose classifying patients with GFR of 60 to 41 mL/min, 40 to 21 mL/min, and 20 mL/min or below as having mild, moderate, and advanced degrees of chronic renal insufficiency, respectively. The use of this terminology will facilitate communication among nephrologists and other physicians and provide a framework for comparison of populations across cohort studies and clinical trials.
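The proposed cut points translate directly into a classification rule. A minimal sketch (GFR in mL/min, following the thresholds proposed above):

```python
def classify_cri(gfr_ml_min):
    """Classify chronic renal insufficiency by GFR (mL/min) using the
    proposed cut points: 60-41 mild, 40-21 moderate, <=20 advanced."""
    if gfr_ml_min > 60:
        return "not CRI by this definition"
    if gfr_ml_min >= 41:
        return "mild"
    if gfr_ml_min >= 21:
        return "moderate"
    return "advanced"

print(classify_cri(50))  # mild
print(classify_cri(30))  # moderate
print(classify_cri(15))  # advanced
```

The function name and the label for GFR above 60 mL/min are illustrative; the three severity bands themselves come directly from the proposal in the abstract.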

Abstract

To determine the effects of sevelamer hydrochloride on serum phosphorus, calcium, calcium x phosphate product, and parathyroid hormone (PTH) in patients treated with and without vitamin D metabolites and calcium supplementation, we conducted a long-term, open-label clinical trial in hemodialysis units among 192 adult patients with end-stage renal disease on hemodialysis. Patients received an extended treatment period of sevelamer hydrochloride, preceded and followed by phosphate binder washout periods; outcome measures were treatment-related changes in serum phosphorus, calcium, calcium x phosphate product, and PTH. Subjects treated with sevelamer alone, sevelamer with vitamin D metabolites (with or without calcium), and sevelamer with calcium without vitamin D experienced significant reductions in mean serum phosphorus (range, -2.1 to -2.9 mg/dL) and the calcium x phosphate product (range, -16.3 to -23.4 mg2/dL2). The mean serum calcium concentration increased in all subgroups except those treated with sevelamer alone (range, +0.3 to +0.5 mg/dL). In contrast, only subjects treated concurrently with vitamin D metabolites experienced a reduction in PTH; subjects treated with sevelamer alone or sevelamer with calcium without vitamin D experienced an increase in PTH with treatment. Sevelamer hydrochloride is a safe and effective phosphate binder in hemodialysis patients. Sevelamer should be used in combination with vitamin D metabolites to jointly control hyperphosphatemia and hyperparathyroidism. Randomized clinical trials will be required to determine the optimal management strategies for metabolic bone disease in end-stage renal disease, as well as less advanced stages of chronic renal insufficiency.

Abstract

Although newer tunneled dialysis catheters offer improved capacity for blood flow and efficiency of dialysis, catheter-associated bacteremia remains an extremely important complication of this access strategy. This is a report of a case of catheter-associated bacteremia with Ochrobactrum anthropi, a water-borne gram-negative rod with an unusual pattern of antibiotic resistance. Given the organism's hydrophilic property and the frequency of catheter use in debilitated individuals with end-stage renal disease, Ochrobactrum anthropi infection should be considered in the differential diagnosis of a hemodialysis patient with unexplained fever.

Abstract

Acute renal failure (ARF) requiring dialysis after coronary artery bypass grafting (CABG) occurs in 1 to 5% of patients and is independently associated with postoperative mortality, even after case-mix adjustment. A risk-stratification algorithm that could reliably identify patients at increased risk of ARF could help improve outcomes. To assess the validity and generalizability of a previously published preoperative renal risk-stratification algorithm, we analyzed data from the Quality Measurement and Management Initiative (QMMI) patient cohort. The QMMI includes all adult patients (N = 9498) who underwent CABG at 1 of 12 academic tertiary care hospitals from August 1993 to October 1995. ARF requiring dialysis was the outcome of interest. Cross-validation of a recursive partitioning algorithm developed from the VA Continuous Improvement in Cardiac Surgery Program (CICSP) was performed on the QMMI. An additive severity score derived from logistic regression was also cross-validated on the QMMI. The CICSP recursive partitioning algorithm discriminated well (ARF vs. no ARF) in QMMI patients, even though the QMMI cohort was more diverse. Rates of ARF were similar among risk subgroups in the CICSP tree, as was the overall ranking of subgroups by risk. Using logistic regression, independent predictors of ARF in the QMMI cohort were similar to those found in the CICSP. The CICSP additive severity score performed well in the QMMI cohort, successfully stratifying patients into low-, medium-, high-, and very high-risk groups. The CICSP preoperative renal-risk algorithms are valid and generalizable across diverse populations.
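An additive severity score of the kind cross-validated here sums integer weights for each risk factor present and bins the total into ordered strata. The cutoffs below are illustrative placeholders, not the published CICSP thresholds:

```python
def risk_group(score, cutoffs=(2, 4, 6)):
    """Map an additive preoperative severity score to one of four ordered
    risk strata. The cutoffs are hypothetical, for illustration only."""
    low, high, very_high = cutoffs
    if score < low:
        return "low"
    if score < high:
        return "medium"
    if score < very_high:
        return "high"
    return "very high"

# A patient accrues points for each risk factor present (weights derived
# from logistic regression coefficients, rounded to integers); the total
# determines the stratum.
print(risk_group(1), risk_group(3), risk_group(5), risk_group(7))
```

The appeal of the additive form over the full logistic model is that it can be computed at the bedside with no exponentiation, at some cost in granularity.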

Abstract

Patients on dialysis have reduced exercise tolerance compared with age-matched sedentary controls. The reasons for this debility have not been fully elucidated, but physical inactivity could be a contributing factor. The purpose of the current study was to determine whether patients on hemodialysis are less active than healthy sedentary controls and to explore clinical correlates of physical activity level in a group of hemodialysis patients. Thirty-four hemodialysis patients and 80 healthy sedentary individuals participated in the study. Physical activity was measured for seven days with a three-dimensional accelerometer and with an activity questionnaire. Vector magnitude values from the accelerometer for the dialysis and control subjects were 104,718 +/- 9631 and 161,255 +/- 6792 arbitrary units per day, respectively (P < 0.0001, mean +/- SEM). The estimated energy expenditure values derived from the questionnaire were 33.6 +/- 0.5 kcal/kg/day and 36.2 +/- 0.5 kcal/kg/day (P = 0.002). The difference between patients on dialysis and controls increased with advancing age. Among the dialysis subjects, some measures of nutritional status correlated with physical activity level, including serum albumin concentration (r = 0.58, P = 0.003), serum creatinine concentration (r = 0.37, P = 0.03), and phase angle derived from bioelectrical impedance analysis (r = 0.40, P = 0.02). Patients on hemodialysis are less active than healthy sedentary controls, and this difference is more pronounced among older individuals. There is an association between the level of physical activity and nutritional status among patients on dialysis. These findings are of great concern, given the trend toward increasing age in incident dialysis patients and the well-known association between inactivity and increased mortality in the general population.

Abstract

Cardiovascular disease (CVD) is the most common cause of death in patients with end-stage renal disease (ESRD). The optimal management strategy in this population is unknown. We studied 640 patients with ESRD and acute myocardial infarction during 1994 to 1995 as part of the Health Care Financing Administration's Cooperative Cardiovascular Project. The majority of patients were treated with medical therapy alone, 46 patients (7%) were treated with percutaneous transluminal coronary angioplasty (PTCA), and 29 patients (5%) underwent coronary artery bypass grafting (CABG). Patient characteristics and comorbid conditions were similar among the three groups. The overall 1-year mortality rate was 53%. Advanced age, low or high body mass index, history of peripheral vascular disease or stroke, the inability to walk independently, and several indicators of cardiac dysfunction were associated with an increased relative risk (RR) for death. Survival curves differed significantly by treatment modality, with 1-year survival rates of 45%, 54%, and 69% in the medical therapy alone, PTCA, and CABG groups, respectively (P = 0.03). After adjustment for confounding variables, the RR for death was less (but not significantly so) in the CABG group (RR, 0.6; 95% confidence interval, 0.3 to 1.1). There are no randomized clinical trial data to guide therapy of CVD in patients with ESRD. On the basis of these and other available data, CABG may be the optimal therapy for CVD in ESRD. In light of the exceptionally poor outcomes observed for patients treated with medical therapy alone, it may be premature to dismiss PTCA as a therapeutic option in this population.

Abstract

Dietary practices differ greatly among individuals by race and ethnicity. The importance of these differences is accentuated in patients with end-stage renal disease, a population for whom dietary restrictions are often prescribed. In addition to the known variation in dietary practices among US-born whites and African-Americans, persons of other ethnicities often present new and unique challenges to the dialysis-nutrition care team. The UCSF-Mt. Zion Dialysis Unit (San Francisco, CA) is a university-affiliated dialysis unit that serves an ethnically diverse population in San Francisco's Western Addition neighborhood. Ten percent to 15% of patients are recent immigrants from the former Soviet Union. This report highlights the dietary practices of this immigrant community and the need for ethnicity-specific renal nutrition recommendations in modern dialysis practice.

Abstract

The link between dialysis "vintage" (length of time on dialysis in months to years) and survival has been difficult to define, largely because of selection effects. End-stage renal disease (ESRD) is thought to be a wasting illness, but there are no published reports describing the associations between vintage and body composition in hemodialysis patients. We explored the relationships among vintage, nutritional status, and survival in a 3009-patient cohort of prevalent hemodialysis patients. Body weight, total body water, body cell mass, and phase angle by bioelectrical impedance analysis were the body composition parameters of interest. We examined vintage as an explanatory variable in multiple linear regression analyses (adjusted for age, gender, race, and diabetes) using body composition parameters and biochemical indicators of nutritional status as dependent variables. Proportional hazards regression was used to evaluate the association of vintage and survival with and without adjustment for case mix and laboratory variables. Dialysis vintage was 3.8 +/- 3.7 (median 2.6) years. Body composition parameters tended to be lower after dialysis year 2. Linear estimates per year of vintage beyond year 2 include -0.66 kg body wt (P < 0.0001), -0.17 kg total body water (P = 0.0003), -0.14 kg body cell mass (P < 0.0001), and -0.07 degrees phase angle (P < 0.0001). In unadjusted analyses, vintage was not associated with survival, either as a linear or higher order term. Adjustment for case mix yielded a vintage term associated with an increased relative risk (RR) of death (RR 1.04; 95% CI, 1.01 to 1.07 per year). A further adjustment for laboratory data yielded a RR of 1.06 (95% CI, 1.03 to 1.09 per year). Dialysis vintage is related to nutritional status in hemodialysis patients, with vintage of more than two years associated with a significant decline in all measured nutritional parameters. Cross-sectional analyses probably underestimate these effects.
A year accrued on dialysis is associated with a 6% increase in the risk of death, all else equal. Longitudinal assessments of nutritional status, including body composition, are required to better understand the natural history of wasting with ESRD and its implications for long-term survival.

Abstract

Although accepted worldwide as valid measures of dialysis adequacy, neither the Kt/V (urea clearance determined by kinetic modeling) nor the urea reduction ratio (URR) has unambiguously predicted survival in hemodialysis patients. Because the ratio Kt/V can be high with either high Kt (clearance x time) or low V (urea volume of distribution), and V may be a proxy for skeletal muscle mass and nutritional health, we hypothesized that the increase in the relative risk of death observed among individuals dialyzed in the top 10 to 20% of URR or Kt/V values might reflect a competing risk of malnutrition. A total of 3,009 patients who underwent bioelectrical impedance analysis were stratified into quintiles of URR. Laboratory indicators of nutritional status and two bioimpedance-derived parameters, phase angle and estimated total body water, were compared across quintiles. The relationship between dialysis dose and mortality was explored, with a focus on how V influenced the structure of the dose-mortality relationship. There were statistically significant differences in all nutritional parameters across quintiles of URR or Kt/V, indicating that patients in the fifth quintile (mean URR, 74.4 +/- 3.1%) were more severely malnourished on average than patients in all or some of the other quintiles. The relationship between URR and mortality was decidedly curvilinear, resembling a reverse J shape that was confirmed by statistical analysis. An adjustment for the influence of V on URR or Kt/V was performed by ev