Between 2008 and 2010, more than 50,000 patients in the United States discontinued dialysis, a right affirmed by several practice guidelines. A large proportion of these patients enroll in hospice, and this care could be enhanced by a detailed understanding of the factors that determine post‑dialysis survival time. The authors of this study investigated survival among end‑stage renal disease (ESRD) patients admitted to hospice after discontinuation of dialysis and defined independent predictors of survival time. Patient data were obtained from electronic medical records of 10 hospices in the Coalition Hospices Organized to Investigate Comparative Effectiveness (CHOICE) network. For the 1,947 evaluated ESRD patients, mean survival after hospice admission was 7.4 days, as compared to 54.4 days for nonrenal control patients (n = 124,673). Additionally, a Cox proportional hazards model identified 7 independent predictors of early mortality, including male gender, referral from a hospital, lower functional status, and presence of peripheral edema. The authors expect that discontinuation of dialysis will become more common with the increasing prevalence of ESRD, and they conclude by recommending additional investigations to define survival trajectories, the role of advance directives, and family experience of the process. Read more…

Patency Rates of the Arteriovenous Fistula for Hemodialysis: A Systematic Review and Meta-analysis

Arteriovenous fistulas (AVFs) are endorsed as the preferred form of vascular access for hemodialysis. Knowledge of AVF performance informs patient consent and quality improvement initiatives and guides patient and clinician decision making. The authors conducted a systematic review and meta‑analysis of AVF primary failure as well as primary and secondary patency rates at 1 and 2 years. Estimates were obtained from English‑language studies (2000‑2012) of 100 or more AVFs; estimates were pooled using a random‑effects model, and sources of heterogeneity were explored using meta‑regression. From 46 articles (62 cohorts; N = 12,383), the AVF primary failure rate was 23% (95% confidence interval [CI], 18%‑28%). The primary patency rate (including primary failures) was 60% (95% CI, 56%‑64%) at year 1 and 51% (95% CI, 44%‑58%) at year 2. Additionally, the authors found that approximately one‑quarter to one‑third of created AVFs were never used (with even higher rates in the elderly and in patients with a lower‑arm AVF), and by 1 year, 40% of all AVFs had failed or required at least 1 intervention. The authors concluded that there is a substantial decrease in AVF performance over time and that current data highlight a higher risk of primary failure and low to moderate primary and secondary patency rates. Read more…

Dialysis Dose and Intradialytic Hypotension: Results from the HEMO Study

Intradialytic hypotension (IDH) is an abrupt decline in blood pressure during a hemodialysis (HD) treatment session that, by definition, results in symptoms and/or requires clinical intervention. IDH has been associated with numerous adverse events and has been attributed to transient intradialytic osmotic gradients resulting from rapid removal of urea, sodium, and other substances from the intravascular compartment during dialysis. In the current study, the authors tested the hypothesis that a higher dialysis dose is associated with greater risk of IDH. The authors performed a post hoc analysis of the HEMO study, a multicenter, randomized clinical study in maintenance HD that randomized patients to higher versus standard Kt/V and higher versus lower membrane flux. In the study population (N = 1,825 individuals; 62,095 unique HD sessions), IDH events occurred more frequently in the higher‑Kt/V group (18.3% vs. 16.8%; p < 0.001). Participants randomized to higher‑target Kt/V had a greater adjusted risk of IDH than those randomized to standard Kt/V (odds ratio [OR] 1.12; 95% confidence interval [CI] 1.01–1.25). Higher vs. lower dialyzer mass transfer‑area coefficient for urea and rate of urea removal were associated with greater adjusted odds of IDH (OR 1.15; 95% CI 1.04–1.27 and OR 1.05; 95% CI 1.04–1.06 per mg/dl/h, respectively). The authors concluded that a higher dialysis dose appeared to increase the risk for IDH, supporting the idea that the rapidity of intradialytic plasma osmolality reductions may mediate hemodynamic instability. Further, the authors advocated for targeted strategies to mitigate the rapidity of plasma osmolality changes. Read more…
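As a point of reference for the dose metric above, single‑pool Kt/V is commonly estimated at the bedside from pre‑ and post‑dialysis urea using the second‑generation Daugirdas formula. This sketch is illustrative only (the HEMO study itself used formal urea kinetic modeling and an equilibrated dose target), and the example values are hypothetical:

```python
import math

def sp_ktv(bun_pre, bun_post, session_hours, uf_liters, post_weight_kg):
    """Single-pool Kt/V via the second-generation Daugirdas formula:
    spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W
    where R = post/pre BUN ratio, t = session length (h),
    UF = ultrafiltrate volume (L), W = post-dialysis weight (kg)."""
    r = bun_post / bun_pre
    return -math.log(r - 0.008 * session_hours) + (4 - 3.5 * r) * uf_liters / post_weight_kg

# Hypothetical session: pre-BUN 70 mg/dL, post-BUN 20 mg/dL, 4-hour
# treatment, 3 L ultrafiltration, 70 kg post-dialysis weight
print(round(sp_ktv(70, 20, 4.0, 3.0, 70.0), 2))
```

A faster rate of urea removal (a larger drop from pre‑ to post‑BUN over the same session) raises Kt/V and, per the study above, the steeper osmotic gradient it implies may raise IDH risk.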

Given the advances in peritoneal dialysis (PD) over the last 2 decades, the authors assert that this improved therapy should no longer be an underutilized form of renal replacement therapy as compared to in‑center hemodialysis (HD). The authors describe that there is no clinical, quality‑of‑life, cost, or other acceptable reason for the discrepancy in utilization of PD versus HD. For example, they note that over the last 20 years advances in PD therapy have made short‑ and long‑term clinical outcomes indistinguishable from those of in‑center HD. Additionally, the authors describe that PD costs approximately 25% less per patient per year than in‑center HD. Taken together, this evidence led the authors to advocate a “PD first” approach for renal replacement therapy, noting advantages for patients, physicians, and health care systems. In spite of these advantages, the authors discuss biases against PD (eg, ease of HD initiation, physician experience and training, misinformation about PD, inadequate patient education about PD, lack of PD financial incentives, and lack of PD infrastructure) that must be confronted and overcome in order to move the dialysis marketplace toward “PD first” practices. In conclusion, the authors note that it is time for a cultural shift in the United States dialysis marketplace, accommodating both “patient‑centered,” cost‑efficient delivery of care and patient choice in selecting the renal replacement therapy that best suits lifestyle and needs. Read more…

CKD–Mineral and Bone Disorder and Risk of Death and Cardiovascular Hospitalization in Patients on Hemodialysis

Parathyroid hormone (PTH), calcium, and phosphate have each been independently associated with cardiovascular event risk. In hemodialysis (HD) patients, who experience increased cardiovascular morbidity, analyses suggest that chronic kidney disease–mineral and bone disorder (CKD‑MBD) may contribute to this increased risk. The authors suggest that previous analyses may have serious limitations because PTH, calcium, and phosphate parameters were each treated independently, with the other parameters as covariates. Thus, the current study was conducted to determine the prevalence of CKD‑MBD phenotypes (ie, naturally occurring CKD‑MBD groups identified from average PTH, calcium, and phosphorus values during a 4‑month baseline period) in HD patients and to estimate their associations with mortality as well as a composite end point (cardiovascular hospitalization or death). The 2‑stage model first estimated 16‑month probabilities of death and the composite end point; then, patients were categorized into 1 of 36 phenotypes, and phenotypes were tested for associations with the outcomes. Of the 26,221 patients included in the study, 98.5% were in the 22 most common phenotypes, and within these phenotypes, 20% and 54% of patients were in groups with statistically significantly higher risk of death and the composite end point, respectively. More than 40% of all patients were in the 3 largest groups with elevated composite end point risk. The authors concluded that this novel method of grouping HD patients by naturally occurring CKD‑MBD phenotypes may aid in developing clinically meaningful metrics for CKD‑MBD care in HD patients. Read more…

For hemodialysis (HD) therapy, the preferred vascular access is an arteriovenous fistula (AVF) due to its greater longevity and lower complication rates; however, establishing a usable AVF may require several months and multiple surgical procedures. If an AVF is created too long before HD is indicated, there may be an increased risk of complications, and part of the AVF's limited functional lifetime may be consumed before it is used. If the AVF is created too late, it may not mature in time, and a central venous catheter (CVC) may be needed, increasing the risk of morbidity and mortality. To estimate the timing of AVF creation, some guidelines recommend a “preparation window” method based on predicted time to HD initiation, whereas other guidelines recommend an “estimated glomerular filtration rate (eGFR) threshold” method. In this study, the authors report results of a Monte Carlo simulation model designed to determine optimal timing of AVF placement in CKD patients. Model inputs included CKD progression, patient survival, surgical wait times, probability of AVF failure, and eGFR calculation timing. Model outputs included expected remaining patient lifetime, patients with a CVC at HD initiation, and patients with unnecessary AVF creation, with results stratified by cohort and age group. Overall, results suggested that the optimal timing for AVF creation was when the estimated time to HD initiation was within approximately 12 months or eGFR decreased to < 15‑20 mL/min/1.73 m2. The authors added that the choice of strategy also should be guided by assessment of the individual's rate of CKD progression to avoid excessively early or late AVF creation. Because elderly patients with CKD had a greater risk of unnecessary AVF creation due to the competing risk of death, later referral seemed appropriate in this group. Read more
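To make the timing trade‑off concrete, here is a deliberately simplified Monte Carlo sketch of an eGFR‑threshold strategy. This is not the authors' model: the starting eGFR, decline slopes, monthly death probability, and maturation time below are all invented for illustration.

```python
import random

random.seed(0)

def simulate(threshold_egfr, n=10_000,
             hd_start_egfr=10.0,       # assumed eGFR at which HD begins
             maturation_months=6,      # assumed surgery-plus-maturation time
             monthly_death_prob=0.01): # assumed competing risk of death
    """Toy Monte Carlo of an eGFR-threshold AVF-timing strategy.

    Each simulated patient starts at eGFR 25 and declines linearly with a
    random slope; an AVF is created when eGFR falls below `threshold_egfr`
    and becomes usable `maturation_months` later. Returns the fraction of
    patients starting HD with a catheter and the fraction who received an
    AVF but died before ever starting HD."""
    catheter_starts = unnecessary_avf = 0
    for _ in range(n):
        egfr, month, avf_month = 25.0, 0, None
        slope = random.uniform(0.2, 1.0)  # mL/min/1.73 m2 lost per month
        while egfr > hd_start_egfr:
            if random.random() < monthly_death_prob:
                if avf_month is not None:
                    unnecessary_avf += 1  # AVF created but never used
                break
            if avf_month is None and egfr < threshold_egfr:
                avf_month = month  # threshold crossed: create the AVF
            egfr -= slope
            month += 1
        else:  # reached HD without dying first
            if avf_month is None or month - avf_month < maturation_months:
                catheter_starts += 1  # AVF absent or not yet mature
    return catheter_starts / n, unnecessary_avf / n

for thr in (15.0, 20.0):
    cath, wasted = simulate(thr)
    print(f"threshold {thr}: catheter at HD start {cath:.1%}, unused AVF {wasted:.1%}")
```

Even this toy version reproduces the tension the study quantifies: a lower threshold produces more catheter starts (AVFs not yet mature), while a higher threshold leaves more AVFs exposed to the competing risk of death before use.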

Individual differences in peritoneal membrane function (eg, peritoneal solute transport rate [PSTR]) have been shown to influence clinical outcomes in patients undergoing peritoneal dialysis (PD). For example, high PSTR has been associated with decreased survival, and recently, PSTR has been associated with increased dialysate IL‑6 concentrations, implying local production of IL‑6. For reference, individuals with genetic polymorphisms associated with increased systemic and local IL‑6 production have increased PSTR and worse survival. Because the authors found no current studies linking dialysate cytokine profiles to survival, and because only small studies have linked dialysate IL‑6 with activation of the local cytokine network, the authors analyzed data from the Global Fluid Study, a multinational, multicenter, prospective, combined incident and prevalent cohort study with up to 8 years of follow‑up (N = 959), to test the hypotheses (1) that intraperitoneal and systemic inflammation are distinct entities, (2) that local, not systemic, inflammation is associated with membrane function (ie, PSTR), (3) that different clinical factors are associated with local and systemic inflammation, and (4) that systemic but not local inflammation predicts patient survival. The analysis demonstrated that the systemic and local peritoneal inflammatory cytokine networks were largely independent and had different consequences for patient survival. Local, subclinical peritoneal inflammation, represented by IL‑6 concentrations, was the most significant known predictor of PSTR but did not determine patient survival. Independent of inflammation, higher PSTR may still be associated with worse survival in prevalent patients. The authors concluded that while the relevance of membrane inflammation is as yet unknown, greater attention should be given to membrane function in PD rather than to generalized control of systemic inflammation. Read more

Racial and ethnic disparities in end‑stage renal disease (ESRD) incidence have been observed for decades in the United States. For example, Blacks, Native Americans, and Asians display higher rates of ESRD than Whites, and similarly, Hispanics display higher rates than non‑Hispanics. Whether this ESRD disparity is explained by racial or ethnic differences in kidney function decline or mortality has been unclear. In the study reported here, the authors investigated the roles of kidney function decline and survival as epidemiologic driving forces in the disparity of ESRD incidence across a range of kidney function by conducting a retrospective cohort study of approximately 1 million adult members (White, Hispanic, Black, Asian) of Kaiser Permanente Southern California receiving care between 2003 and 2009. Patients were included if they had more than 2 serum creatinine tests prior to ESRD onset. Racial differences in kidney failure were projected using linear regression of estimated glomerular filtration rate (eGFR) decline, and survival was estimated in the subgroups. Investigators found that the steepest rates of eGFR decline occurred in Blacks, followed by Hispanics, Whites, and Asians. More Whites (n = 25,065) were projected to develop kidney failure during the study period than Hispanics (n = 11,368), Blacks (n = 6,785), and Asians (n = 3,176). The odds ratio (OR) for projected kidney failure versus Whites was greater for Blacks (1.54; 95% confidence interval [CI], 1.46‑1.62) than for Hispanics (1.49; 95% CI, 1.42‑1.56) or Asians (1.41; 95% CI, 1.32‑1.51). For patients with projected kidney failure, the hazard ratio (HR) of death versus Whites was greater for Blacks (0.82; 95% CI, 0.77‑0.88) than for Hispanics (0.67; 95% CI, 0.63‑0.72) or Asians (0.58; 95% CI, 0.52‑0.65). Given these results, the authors concluded that differences in eGFR decline and mortality contributed to racial disparities in ESRD incidence. Read more

Previous studies have demonstrated that 25‑hydroxy (25‑OH) vitamin D deficiency or insufficiency is common among maintenance dialysis patients, even though many of them also receive 1,25‑(OH)2 vitamin D derivatives as treatment for hyperparathyroidism. Some data suggest that 25‑OH vitamin D deficiency in dialysis patients is associated with negative health outcomes such as higher risk of 90‑day mortality, sudden cardiac death, cardiovascular mortality, arterial stiffness, and vascular calcification. The authors of the current study tested the hypothesis that low serum 25‑OH vitamin D concentrations (< 20 ng/mL) were independently associated with higher mortality and risk of hospitalization in incident hemodialysis and peritoneal dialysis patients enrolled in the Comprehensive Dialysis Study (CDS). The study population (N = 256) for this prospective cohort study was recruited from 56 dialysis centers across the United States. The adjusted analysis demonstrated that incident dialysis patients with 25‑OH vitamin D concentrations < 10.6 ng/mL (lowest tertile) were 75% more likely to die or to be hospitalized. Patients in the lower 2 tertiles (< 15.5 ng/mL) were hospitalized at a 65%‑70% higher rate in the first year of dialysis. The authors concluded that incident dialysis patients with severe deficiency of 25‑OH vitamin D have poor survival, consistent with the literature; however, the authors were unable to speculate if supplementation in deficient patients would improve survival or reduce hospitalization. Read more

Modeling of Oxidized PTH (oxPTH) and Non-oxidized PTH (n‑oxPTH) Receptor Binding and Relationship of Oxidized to Non‑Oxidized PTH in Children with Chronic Renal Failure, Adult Patients on Hemodialysis and Kidney Transplant Recipients

Biological properties of oxidized parathyroid hormone (oxPTH) and non‑oxidized PTH (n‑oxPTH) differ; namely, n‑oxPTH is the biologically active agonist of the PTH receptor, whereas oxPTH is inactive. The authors hypothesized that previous and current assays (eg, RIA, ELISA) for intact PTH (iPTH) do not correctly measure circulating concentrations of biologically active n‑oxPTH but rather measure total PTH, which primarily consists of inactive oxPTH. Having developed an antibody‑based assay that differentiates between oxPTH and n‑oxPTH, the authors report findings of a study comparing concentrations of oxPTH and n‑oxPTH across populations with differing renal function (healthy volunteers [n = 89], children with stage 2‑4 renal failure [n = 620], adult dialysis patients [n = 342], and kidney transplant recipients [n = 602]). Concentrations of oxPTH and n‑oxPTH demonstrated high intrasubject variability, and a large proportion of circulating PTH was in the oxidized form. For example, in the pediatric subgroup, the baseline mean ratios (± standard deviation) of iPTH:n‑oxPTH and oxPTH:n‑oxPTH were 8.75 ± 3.49 and 7.75 ± 3.49, respectively. Additionally, patients with renal failure displayed n‑oxPTH concentrations 1.5‑ to 2.25‑fold greater than those of healthy volunteers. Structural modeling of the oxPTH and n‑oxPTH molecules suggested that the differential activities are likely related to PTH refolding upon oxidation rather than to steric or electrostatic changes. The authors concluded that n‑oxPTH measurements may more precisely reflect hormone status, whereas iPTH measurements likely describe the oxidative stress of renal failure patients. Read more

Previous studies of hemodialysis (HD) patients in the United States have described that, within weeks of death, certain characteristics change in a predictable manner (eg, weight loss, increased inflammation, worsening cardiovascular status). From additional observations that decreased systolic blood pressure (SBP), body weight, and serum albumin concentrations are associated with increased mortality risk, researchers have hypothesized that fundamental biological processes may be operative in chronic HD patients weeks or months before death, independent of race and gender. In the study reported here by Usvyat et al, the temporal evolution of dynamic parameters, namely interdialytic weight gain (IDWG), SBP, serum albumin, and C‑reactive protein (CRP), was analyzed from patient‑level data in a multinational consortium of HD databases. The study population included approximately 52,000 chronic HD patients from Asia, Europe, Argentina, and the United States. In surviving patients, IDWG, SBP, and serum albumin concentrations were stable, whereas in patients who died, these indicators began declining, irrespective of gender and race, more than a year prior to death. In European patients who died, CRP increased sharply before death. The authors propose that longitudinal monitoring of chronic HD patients may allow for early warning alerts, triggering clinical interventions for patient care and improved outcomes. Read more

Differences in prescribed Kt/V and delivered haemodialysis dose—why obesity makes a difference to survival for haemodialysis patients when using a ‘one size fits all’ Kt/V target

Historically, dialysis dose has been prescribed to achieve a target Kt/V corrected for total body water (TBW). However, because adipose tissue contains less water than muscle, obese hemodialysis (HD) patients may have lower TBW than anthropometric equations predict; consequently, for these patients the delivered Kt/V may be greater than the prescribed Kt/V. This inadvertently increased dialysis dose may explain reports of increased survival in obese HD patients. The study reported here was designed to determine whether obese HD patients receive a proportionally higher dialysis dose when the same Kt/V target is used. TBW was determined using both anthropometric equations and multi‑frequency bioelectric impedance assessments (MF‑BIA). In the 363 adult patients observed, MF‑BIA demonstrated that as body mass index (BMI) increased, the proportion of skeletal muscle decreased and the proportion of fat mass increased. At higher BMIs, TBW was lower when measured by MF‑BIA than when predicted by an anthropometric equation. As a result, the delivered Kt/V based on MF‑BIA was greater for the obese HD patients. The authors concluded that, due to the underlying body composition of obese patients, anthropometric equation‑based dialysis dosing underestimates the delivered dose, consequently providing greater dialysis for obese patients. Read more
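The arithmetic behind this effect can be illustrated with the widely used Watson anthropometric TBW equations (assumed here for illustration; the paper's specific equations are not stated, and the patient and the 15% water discrepancy below are hypothetical):

```python
def watson_tbw(sex, age_years, height_cm, weight_kg):
    """Watson anthropometric estimate of total body water (liters)."""
    if sex == "male":
        return 2.447 - 0.09156 * age_years + 0.1074 * height_cm + 0.3362 * weight_kg
    return -2.097 + 0.1069 * height_cm + 0.2466 * weight_kg  # female

def delivered_ktv(prescribed_ktv, v_anthropometric, v_measured):
    """The prescription fixes the clearance K*t = prescribed_ktv * V_anthropometric;
    the dose actually delivered is that same clearance divided by the true V."""
    return prescribed_ktv * v_anthropometric / v_measured

# Hypothetical obese male patient: 60 y, 175 cm, 120 kg, prescribed spKt/V 1.4.
v_watson = watson_tbw("male", 60, 175, 120)
# Suppose MF-BIA measures 15% less water than Watson predicts (illustrative only).
v_bia = 0.85 * v_watson
print(f"Watson TBW {v_watson:.1f} L, delivered Kt/V {delivered_ktv(1.4, v_watson, v_bia):.2f}")
```

Because K·t is set from the overestimated V, dividing the same clearance by the smaller measured V yields a delivered dose above target, which is the mechanism the study describes.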

Current guidance recommends that an arteriovenous fistula (AVF) be the first‑used form of access for incident hemodialysis (HD) patients. However, evaluating the first‑placed form of access may better reflect the actual effect of the fistula‑first initiative. In the elderly population, predialysis access placement is accompanied by a greater risk of death prior to dialysis, and in this population AVFs have twice the failure rate seen in younger patients. Consequently, incident catheter use, the least preferred method of HD access, has increased to nearly 82% in the elderly population. The study reported here was designed to investigate the mortality rate associated with the first type of vascular access placed, rather than the type of access in use at HD initiation, in the elderly population. The study population included 21,436 patients with an AVF as the first access placed; 3,472 had arteriovenous grafts (AVGs); and 90,517 had catheters. For the total study population, AVFs as first‑placed access showed favorable but statistically nonsignificant survival outcomes compared to AVGs. When stratified by age, the 67‑ to 79‑year‑old cohort demonstrated better survival outcomes with AVF placement, but octogenarians and nonagenarians showed equivalent survival outcomes with AVF and AVG placement. In all comparisons, a catheter placed as first access was inferior to both AVFs and AVGs. The authors conclude that fistulas and grafts may be equally acceptable in octogenarians and that catheters should remain the last‑option strategy. Read more

Hypertension is a leading cause of target organ injury and, untreated, is associated with increased risk of cerebrovascular accident, ischemic heart disease, congestive heart failure, and renal disease. In fact, hypertension and diabetes together account for most cases of chronic kidney disease (CKD) worldwide. Poorly controlled blood pressure is an independent risk factor for end‑stage renal disease (ESRD), and the severity of hypertension correlates with the risk of ESRD. Therefore, blood pressure control using antihypertensive agents is critical for improved renal outcomes, and the study reported here assessed the impact of adherence to antihypertensive agents on the prevention of ESRD. This was a case‑cohort study analyzing data from 185,476 patients in the databases of the Canadian public health insurance program RAMQ who were aged 45‑85 years and newly diagnosed with and treated for hypertension between 1999 and 2007. Follow‑up time was 5.1 years. Results demonstrated that adherence to antihypertensive treatment of ≥ 80% was associated with a 33% decrease in risk of ESRD onset. Sensitivity analysis revealed that the effect was significant only among patients without CKD. Increased risk of ESRD onset was associated with male gender, diabetes, peripheral vascular disease, chronic heart failure, gout, urologic intervention, CKD, and the number of antihypertensive drugs. A subgroup analysis demonstrated that the risk reduction for ESRD was similar for patients > 65 years and those < 65 years. The authors concluded that assessment of antihypertensive medication adherence should be adopted as routine clinical practice and that long‑term adherence must be promoted within the discipline. Read More

From the annual meeting of the Heart Failure Association of the European Society of Cardiology (Heart Failure 2013), Professor Svend Aage Mortensen (Copenhagen, Denmark) reported results of a multicenter, randomized, double‑blind clinical study, Q‑SYMBIO, in which coenzyme Q10 (CoQ10) administration reduced all‑cause mortality versus placebo. In this study, patients with severe heart failure (New York Heart Association [NYHA] Class III or IV) were treated with either CoQ10 or placebo and followed for 2 years. For the primary endpoint of time to first major adverse cardiovascular event (MACE), CoQ10 halved the risk: 29 (14%) patients in the CoQ10 group reached the MACE endpoint compared to 55 (25%) patients in the placebo group (hazard ratio ≈ 0.5; p = 0.003). Additionally, CoQ10 halved the risk of death from any cause, which occurred in 18 (9%) patients in the CoQ10 group compared to 36 (17%) patients in the placebo group (hazard ratio ≈ 0.5; p = 0.01). There were fewer adverse events in the CoQ10 group than in the placebo group (p = 0.073). Mortensen concluded that CoQ10 is the first medication in a decade to improve survival in chronic heart failure and that CoQ10 supplementation should be added to the standard heart failure therapy regimen. Read More

Here the authors remind us that dialysis patients often require an average of 9‑10 oral medications and 2‑3 parenteral medications per day for clinical management of comorbid conditions. The high pill burden of such complex regimens is associated with widespread nonadherence. To address nonadherence in the dialysis population, a large US dialysis organization, DaVita, created an integrated pharmacy program called DaVita Rx, which included individualized prescription reviews for dosage, allergies, and interactions; prior‑authorization assistance; no‑cost medication delivery to dialysis facilities or homes; refill management with periodic patient reminders; and clinical pharmacists available for telephone consultation. In the retrospective observational study reported here, the authors used linked data from DaVita records and the US Renal Data System (USRDS) to assess relative rates of mortality and hospitalization between DaVita Rx enrollees (n = 8,864) and matched control patients (n = 43,013) who received maintenance hemodialysis therapy between 2006 and 2008. The authors reported that receipt of integrated pharmacy services was associated with lower rates of death and hospitalization in hemodialysis patients dually eligible for Medicare and Medicaid. In intention‑to‑treat and as‑treated analyses, mortality hazard ratios for patients receiving integrated pharmacy services versus matched controls were 0.92 (95% confidence interval [CI], 0.86‑0.97) and 0.79 (95% CI, 0.74‑0.84), respectively. Corresponding relative rates of hospital admissions were 0.98 (95% CI, 0.95‑1.01) and 0.93 (95% CI, 0.90‑0.96), respectively, and of hospital days, 0.94 (95% CI, 0.90‑0.98) and 0.86 (95% CI, 0.82‑0.90), respectively. The authors concluded that the results of this study provide a foundation for further clinical and economic outcomes research regarding coordination of medication‑related services for dialysis patients in the current and changing reimbursement environment. Read More

Impaired Kidney Function at Hospital Discharge and Long-Term Renal and Overall Survival in Patients Who Received CRRT

Foundational to this study, the authors cite recent research suggesting that acute kidney injury (AKI) is a significant risk factor for subsequent development of chronic kidney disease (CKD) and associated mortality. While numerous risk factors for increased mortality after AKI have been identified, including the need for dialysis during the intensive care unit (ICU) stay, the long‑term effects of dialysis‑requiring AKI in critically ill patients after hospital discharge are not well understood. The study reported here was conducted to evaluate the degree of renal function impairment at hospital discharge as an independent risk factor for long‑term renal survival and overall long‑term mortality after AKI necessitating dialysis in the ICU. Results demonstrated that overall mortality over 8.5 years was 75%, and only 35% of patients were originally discharged with an estimated glomerular filtration rate (eGFR) > 60 mL/min per 1.73 m2. Long‑term survival and renal survival were both strongly associated with the degree of kidney function impairment at discharge. Specifically, an eGFR < 30 mL/min per 1.73 m2 was a predictor of death and worse renal survival at long‑term follow‑up. The authors conclude that these data are clinically relevant and consistent with other investigational observations that most critically ill patients surviving AKI necessitating dialysis have impaired kidney function at discharge. Read More

No Independent Association of Serum Phosphorus With Risk for Death or Progression to End-Stage Renal Disease in a Large Screen for Chronic Kidney Disease

Chronic kidney disease (CKD) is consistently associated with higher risk of cardiovascular disease; however, evidence suggests that the pathogenesis of this vascular disease in CKD may be complex, involving such nontraditional risk factors as abnormal phosphorus homeostasis. The study presented here tested the hypothesis that the association between serum phosphorus and both all‑cause mortality and progression to end‑stage renal disease (ESRD) in earlier‑stage CKD is confounded by access and barriers to health care. Data from the nationwide CKD screening program Kidney Early Evaluation Program (KEEP) were analyzed for a cohort of 10,672 participants with estimated glomerular filtration rate (eGFR) < 60 mL/min per 1.73 m2 who were screened between November 2005 and December 2010. The authors reported that serum phosphorus was associated with self‑reported history of cardiovascular disease; however, after adjustment for potential confounders, the associations with all‑cause mortality (median follow‑up, 2.3 years), progression to ESRD, and the composite outcome of death or progression to ESRD were robustly null in this cohort. There was no appreciable change in hazard ratios when variables related to access and barriers to care were included. The authors concluded that serum phosphorus should be used cautiously as a biomarker for disordered phosphorus homeostasis in early‑stage CKD patients. Read More

Studies of chronic kidney disease (CKD) patients have demonstrated positive clinical outcomes (eg, slower disease progression, improved quality of life) associated with timely and consistent care from specialists such as nephrologists and dietitians. However, according to 2011 data from the United States Renal Data System (USRDS), as many as 50% of maintenance dialysis patients had received no pre‑end‑stage renal disease (ESRD) nephrologist care. The authors here report a national population analysis conducted to assess whether pre‑ESRD care differs across 4 urban/rural geographic categories (ie, large metropolitan, medium/small metropolitan, suburban, rural) and whether certain geographic categories demonstrate large black‑white racial differences in receipt of pre‑ESRD care. The study included 404,622 non‑Hispanic white and black adults (≥ 18 years old) who began dialysis between 2005 and 2010. Pre‑ESRD care was obtained from USRDS data, and geographic data were obtained from Area Resource Files (United States Department of Agriculture). The following 5 pre‑ESRD care indicators were examined: nephrologist care at least 6 months before ESRD, nephrologist care at least 12 months before ESRD, pre‑ESRD dietitian care, use of erythropoiesis‑stimulating agents for dialysis patients with hemoglobin concentrations < 10 g/dL, and use of an arteriovenous fistula for the first outpatient dialysis session. Results demonstrated that lower proportions of pre‑ESRD patients received nephrologist care for at least 12 months in large metropolitan and rural areas (25.7% and 26.9%, respectively) as compared to medium/small metropolitan areas (31.6%). Suburban and rural patients had worse access to dietitian care. In all 4 geographic categories, black patients received less care than white patients. The authors concluded that improving pre‑ESRD care will require more rigorous characterization of regional healthcare needs and coordinated efforts with renal and dietitian organizations. Read More

Recognizing that the 2 primary mechanisms of hypertension in end‑stage renal disease (ESRD) are chronic volume overload and sodium loading, the authors tested the hypothesis that sodium may exert a volume‑independent effect on blood pressure (BP). In this prospective clinical study, 16 Chinese hemodialysis (HD) patients were enrolled and received 1 month of thrice‑weekly HD with a standard dialysate sodium concentration (138 mmol/L) before receiving 4 months of dialysis with a dialysate sodium concentration of 136 mmol/L. No other aspects of the HD prescription were changed, and no changes in dietary sodium instructions were given to the patients. Pre‑ and post‑dialytic volume status was held constant using bioimpedance spectroscopy, and BP was measured using thrice‑daily home monitoring and 44‑hour ambulatory BP monitoring (ABPM) at baseline and Month 4. The authors reported that the 2 mmol/L negative gradient between physiologic and dialysate sodium concentrations resulted in statistically significant reductions in daytime systolic and diastolic BP (‑10 mmHg and ‑6 mmHg, respectively); similar changes were observed in mean ABPM BPs. No change was observed in dialysis‑related symptoms (eg, intradialytic hypotension or cramping). Because these BP changes were observed alongside small post‑dialytic volume changes, the authors suggested that increased diffusive sodium removal may improve BP control via a volume‑independent mechanism. Read More

Given that fluid management and optimization remain a challenging part of hemodialysis (HD) therapy and long-term fluid overload is associated with increased hospitalizations and mortality, clinicians have suggested different methods for assessing fluid status. Here, the authors tested blood volume monitoring (BVM), which is commonly used to assess intradialytic vascular depletion, as a marker for fluid status; bioimpedance spectroscopy served as the predialysis fluid status reference measurement. Fifty-five HD patients in Spain were each observed for 7 HD treatments. The primary BVM marker was the slope of the intravascular volume decrease over time, normalized by the ultrafiltration rate. The authors reported that BVM was well suited as a qualitative fluid status assessment in patient populations, but high individual variability presented a limitation. Additionally, BVM was most sensitive as a fluid status marker in patients with high fluid overload (> 3-4 L; area under the curve [AUC] 0.85), less sensitive at low fluid overload (< 1 L; AUC 0.65), and least sensitive in the middle range (1-3 L; AUC 0.60-0.65). The authors concluded that BVM may be a reasonable fluid status marker for states of excess and depleted volume. Read more
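The slope-based BVM marker described above can be sketched numerically: fit a straight line to the relative blood volume (RBV) curve over the treatment and divide the slope by the ultrafiltration rate. The following Python sketch is purely illustrative; the function name, sampling scheme, and numbers are hypothetical and not taken from the study.

```python
def bvm_slope_per_uf(times_h, rbv_percent, uf_rate_l_per_h):
    """Least-squares slope of relative blood volume (RBV, %) over time,
    normalized by the ultrafiltration rate -- a schematic version of the
    BVM marker described above (names and units are illustrative)."""
    n = len(times_h)
    mean_t = sum(times_h) / n
    mean_v = sum(rbv_percent) / n
    cov = sum((t - mean_t) * (v - mean_v) for t, v in zip(times_h, rbv_percent))
    var = sum((t - mean_t) ** 2 for t in times_h)
    slope = cov / var                # % RBV per hour
    return slope / uf_rate_l_per_h  # % RBV per liter removed per hour of UF

# Hypothetical 4-hour treatment sampled hourly: RBV falls from 100% to 92%
print(bvm_slope_per_uf([0, 1, 2, 3, 4], [100, 98, 96, 94, 92], 0.75))
```

Broadly (and consistent with the rationale for BVM), a flatter normalized slope suggests better vascular refilling from the interstitium, which is expected in fluid-overloaded patients, whereas a steep decline suggests volume depletion.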

With the goal of understanding and improving cardiovascular outcomes in hemodialysis (HD) patients, the authors undertook a longitudinal investigation of left ventricular (LV) structure and function across chronic kidney disease (CKD) and end-stage renal disease (ESRD) in a subset of patients enrolled in the Chronic Renal Insufficiency Cohort (CRIC) study. The authors hypothesized that abnormal LV structure and function at the onset of ESRD would already be present at advanced CKD. Dialysis patients were included in the analysis if they had serial echocardiograms performed at advanced CKD (GFR < 20 mL/min/1.73 m2) and again after ESRD (defined as need for renal replacement therapy). From the CRIC study, 190 dialysis patients (160 HD patients, 30 peritoneal dialysis patients) were analyzed with data across advanced CKD and ESRD; of those 190 patients, 89 had serial echocardiogram data across moderate CKD, advanced CKD, and ESRD. Results demonstrated no change in mean LV mass index between CKD and ESRD (62.3 to 59.5 g/m2.7; P = 0.10); however, ejection fraction was significantly decreased (53% to 50%; P = 0.002). The authors suggested that the declining ejection fraction across CKD and ESRD may substantially contribute to post-dialysis cardiovascular disease and mortality. Read more

Encapsulating peritoneal sclerosis (EPS) is a rare complication of peritoneal dialysis (PD), and it can have devastating clinical and quality-of-life effects on the PD population. Generally, EPS develops from increased exudation of fibrin and inflammatory cells on the peritoneal membrane, resulting in adhesions and development of a fibrous encapsulation of the intestines. As such, clinical signs include abdominal pain, bowel obstruction, and weight loss. The authors retrospectively analyzed data from 24 patients who were diagnosed with EPS between 1998 and 2011 and underwent surgery because of symptomatic late-stage EPS. The authors identified 3 different macroscopic phenotypes of EPS, categorized as Types I, II, and III. Type I EPS displayed an active pattern of inflammation, exudations, and ascites, along with adhesions (which were similar across all 3 types); Type III demonstrated cocooning of the intestines but did not show significant fibrin deposition as hypothesized; and Type II displayed characteristics of both Types I and III. The authors correlated histologic findings and outcomes (eg, postoperative, long-term) with the macroscopic types, finding a difference across groups in duration of PD. Regarding baseline characteristics, there were no statistical differences except time to operation (ie, longer for Type I versus Type III), nor were there differences in onset of complaints prior to surgery. Regarding outcomes after a follow-up period of at least 3 years, differences across groups were not evident, nor were there differences in major or minor surgical complications. In conclusion, the authors described 3 EPS phenotypic categories but recommended that these categories be further investigated to identify how the phenotypes are related to pathophysiological processes. Because no differences in outcomes were observed, the authors suggested that EPS postoperative treatment should not be influenced by the macroscopic phenotypes. Read more

The European Renal Association-European Dialysis and Transplant Association (ERA-EDTA) created the EUropean DIALysis (EUDIAL) Working Group with the objective of enhancing the quality of dialysis therapies in Europe, and the working group has begun the initiative by focusing on haemodiafiltration (HDF), a combined diffusive and convective therapy for treating end-stage renal disease (ESRD). In this review article the authors seek to establish a common terminology for HDF, summarize pre-existing guidelines for HDF application, and suggest areas for future investigation. The authors define HDF as a blood purification therapy in which fluid is removed by ultrafiltration and fluid balance is maintained by external solution infusion, providing greater large-solute removal than hemodialysis; ultrafiltration characteristics include a pre-specified convection volume, middle-molecule clearance, and sieving coefficients for “high-flux” membranes. Various modes of HDF differ by the site of replacement fluid infusion (before, during, or after ultrafiltration), and the authors discuss conditions under which each modality can be optimally applied. The authors describe how HDF can be quantified in terms of small-solute removal, using approaches similar to hemodialysis (eg, Kt/V), and in terms of large-solute removal, using the normalized effective convection volume; clearance can be calculated in HDF using the same method as for conventional hemodialysis. The EUDIAL group recommends development of harmonized norms and regulations that would include site-level safety and quality control monitoring. To complement equipment and fluid standards established by international organizations, the EUDIAL group encourages HDF instrument manufacturers to report risk analyses and protocols for disinfecting, testing, and replacing sterilizing filters, allowing for implementation of appropriate safety measures.
In conclusion, the authors acknowledge that implementation of HDF at dialysis centers will require site-level risk analysis and quality control management as well as guidance for establishing safely administered HDF programs. Furthermore, the EUDIAL group recommends development of a checklist of the basic prerequisites and protocols covering technical and clinical aspects for reducing HDF-associated risks. Read more
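Small-solute dosing in HDF, as in conventional hemodialysis, is commonly expressed as Kt/V. As a purely illustrative sketch (not the EUDIAL group's own formulation), the widely used Daugirdas second-generation single-pool estimate can be computed from pre- and post-treatment blood urea nitrogen (BUN) values:

```python
import math

def sp_kt_v(pre_bun, post_bun, session_hours, uf_liters, post_weight_kg):
    """Daugirdas second-generation single-pool Kt/V estimate.

    R is the post-/pre-dialysis urea ratio; the -0.008*t term corrects
    for intradialytic urea generation, and the (4 - 3.5*R)*UF/W term
    accounts for convective removal via ultrafiltration.
    """
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * session_hours) + (4 - 3.5 * r) * uf_liters / post_weight_kg

# Hypothetical session: pre-BUN 70 mg/dL, post-BUN 25 mg/dL, 4-hour
# treatment, 3 L ultrafiltration, 70 kg post-dialysis weight
print(round(sp_kt_v(70, 25, 4.0, 3.0, 70.0), 2))  # ≈ 1.24
```

For context, clinical adequacy guidelines for thrice-weekly hemodialysis commonly target a minimum single-pool Kt/V of 1.2 per session; HDF adds the normalized effective convection volume as a separate, large-solute dosing measure.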

The authors describe a retrospective outcomes analysis of the Frequent Hemodialysis Network’s (FHN) home-based Nocturnal Trial (NCT00271999; Rocco MV et al, Kidney Int. 2011;80:1080-1091) and in-center Daily Trial (NCT00264758; Chertow GM et al, N Engl J Med. 2010;363:2287-2300), 2 trials that evaluated the safety and efficacy of 6-times-per-week hemodialysis versus the more conventional 3-times-per-week hemodialysis therapy. Specifically, in this publication, the authors investigated the effects of hemodialysis frequency on the trajectory of residual kidney function (RKF), hypothesizing that frequent hemodialysis would result in a more rapid decline of RKF as measured by an array of parameters (eg, 24-hour urine volume [UVol], kidney urea clearance [Kru], creatinine clearance [Krcreat]). In the FHN Nocturnal Trial, frequent hemodialysis was associated with a more rapid RKF decline demonstrated by all parameters, whereas in the FHN Daily Trial, frequent hemodialysis did not significantly influence the RKF change. The authors suggested that differences in study design and entry criteria may have made the Daily Trial less conducive to answering the question at hand. Additionally, the authors noted that while frequent nocturnal hemodialysis may hasten the decrease in RKF, the same therapy has demonstrated the positive effects of partially correcting left ventricular hypertrophy and lowering plasma concentrations of beta-2 microglobulin, protein-bound toxins, and phosphorus. In summary, the authors recognized that, given the strong associations among RKF, mortality, and morbidity in peritoneal and hemodialysis populations, the potential for more rapid loss of RKF should be a risk-benefit consideration for any individual undergoing frequent hemodialysis. Read more

In this perspective piece, the authors, citing the annual report on global risks by the World Economic Forum, emphasize the dual points that 1) antibiotic-resistant bacteria are “arguably the greatest risk…to human health” and 2) resistance plus the current collapse of antibiotic research and development will require both innovative approaches and traditional therapies to manage future infection control. The authors remind us that resistance results from bacterial adaptations to eons of “prokaryotic-invented” antibiotics, carrying the threefold implication that use of modern antibiotic therapies naturally selects for resistant bacterial populations, that “inappropriate” antibiotic use is not the sole culprit for resistance, and that after billions of years of evolution, microbes have most likely invented antibiotics (and accompanying resistance) against every available biochemical target. Thus, the authors introduce and discuss 5 categories of promising future strategies to combat antibiotic resistance: infection prevention, new economic models that spur investment in anti-infective treatments, slowing the spread of resistance, new therapies that attack microbes without creating resistance, and altering host-microbe interactions to modify disease progression. According to the authors, infection prevention will be driven largely by new technologies that produce clean environments and reduce implantation of foreign materials (ie, plastic, metal) as therapies. Noting that economic and regulatory approaches must be aligned to revitalize the antibiotic drug pipeline, the authors discuss the Limited Population Antibiotic Drug proposal from the Infectious Diseases Society of America. Slowing the spread of resistance may be achievable by narrowing therapeutic antibiotic use and lowering environmental contamination with antibiotics. Finally, the authors discuss promising interventions that treat infection by attacking host rather than microbial targets.
In conclusion, the authors suggest that long-term future solutions to antibiotic resistance will not be incremental tweaks of 75 years’ worth of policies and processes; rather, solutions will be novel approaches resulting from reconceptualizing the nature of resistance, disease, and prevention. Read more

Peginesatide is a synthetic peptide-based erythropoiesis-stimulating agent (ESA) that is being investigated as a potential therapy for anemia in patients with advanced chronic kidney disease. Data from 2 randomized, controlled, open-label studies in hemodialysis patients (EMERALD 1 [NCT00597753] and EMERALD 2 [NCT00597584]) were pooled. In the EMERALD studies, patients received either peginesatide once monthly or epoetin 1–3 times per week, with doses adjusted to maintain hemoglobin concentration between 10.0 and 12.0 g/dL for ≥ 52 weeks. The primary efficacy end point was the mean change from baseline in hemoglobin concentration to the mean hemoglobin concentration during the evaluation period, and this end point was evaluated for noninferiority to epoetin. The safety end point was an adjudicated composite cardiovascular end point (all-cause mortality, stroke, myocardial infarction, or serious adverse events of congestive heart failure, unstable angina, or arrhythmia) that included pooled data from the EMERALD studies and 2 studies (PEARL 1 [NCT00598273] and PEARL 2 [NCT00598442]) that included non-hemodialysis patients; this safety end point was evaluated to exclude a hazard ratio of > 1.3 (peginesatide relative to epoetin). In an analysis of 1418 pooled EMERALD patients, peginesatide was noninferior to epoetin in maintaining hemoglobin concentrations (mean between-group difference: EMERALD 1, −0.15 g/dL [95% CI, −0.30 to −0.01]; EMERALD 2, 0.10 g/dL [95% CI, −0.05 to 0.26]). The hazard ratio for the composite safety end point (peginesatide relative to epoetin) was 1.06 (95% CI, 0.89 to 1.26) in the 4 pooled studies (2591 patients) and 0.95 (95% CI, 0.77 to 1.17) in the 2 pooled EMERALD studies. The proportions of patients with adverse and serious adverse events were similar between the treatment groups in the EMERALD studies, and the cardiovascular safety of peginesatide was similar to that of epoetin in the pooled cohort.
In summary, the authors concluded that peginesatide (administered monthly) was as effective as epoetin (administered 1–3 times per week) in maintaining hemoglobin concentrations in hemodialysis patients. Read more
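The safety criterion above, excluding a hazard ratio of > 1.3, amounts to checking that the upper bound of the hazard ratio's confidence interval, computed on the log scale, stays below the noninferiority margin. A schematic Python check follows; the standard error here is back-calculated to roughly match the reported interval and is an assumption, not a value reported by the study.

```python
import math

def hr_ci(hr, se_log_hr, z=1.96):
    """95% confidence interval for a hazard ratio, computed on the log
    scale (log-HR assumed approximately normal)."""
    lo = math.exp(math.log(hr) - z * se_log_hr)
    hi = math.exp(math.log(hr) + z * se_log_hr)
    return lo, hi

# Reported pooled estimate: HR 1.06 (95% CI, 0.89-1.26); a log-scale SE
# of ~0.089 (assumed) reproduces an interval of roughly that width
lo, hi = hr_ci(1.06, 0.089)
print(hi < 1.3)  # noninferiority criterion: CI upper bound excludes 1.3
```

Because the reported upper bounds (1.26 for the 4 pooled studies, 1.17 for the pooled EMERALD studies) fall below 1.3, the prespecified margin was excluded in both pooled analyses.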


About this blog

Mahesh Krishnan, MD, VP of Research for DaVita Clinical Research, helps you navigate the articles, publications, and resources available online, providing a collection of the most timely, relevant, and important resources for nephrologists.