Research & Scholarship

Current Research and Scholarly Interests

My research relates to clinical kidney transplantation. We have ongoing studies on the following topics:

1. Renal senescence in kidney transplantation and chronic allograft nephropathy.
2. Living donor safety and the response to uninephrectomy.
3. Biomarkers for post-transplant monitoring.

Clinical Trials

Immunosuppression Impact on the Metabolic Control of Kidney Transplant With Pre-Existing Type 2 Diabetes (DM)

Not Recruiting

Protocol Title: Randomized open-label study comparing the metabolic control of first kidney
transplant recipients with Type 2 Diabetes Mellitus (DM) receiving either Prograf or Neoral
as part of an ATG induction, prednisone-free, and blood-monitored Cellcept immunosuppressive
regimen.
PURPOSE: This is a single-center medical research study to analyze post-transplant kidney
recipients with pre-existing type 2 diabetes managed according to the recommended American
Diabetes Association (ADA) guidelines. Prograf (Tac) and Neoral (CsA) are the two main
medications used to prevent rejection after transplantation; however, they may contribute to
poorer diabetes control. The purpose of the study is to compare the effects of Prograf and
Neoral on the control of diabetes after kidney transplantation. In addition, all participants
in this study will receive Thymoglobulin (anti-lymphocyte globulin) at the time of
transplantation instead of long-term prednisone (steroids).

Stanford is currently not accepting patients for this trial. For more information, please contact Stephan Busque, MD, 650-498-6189.

Abstract

Simultaneous liver-kidney (SLK) transplantation plays an important role in treating kidney failure in patients with end-stage liver disease; it used 5% of deceased donor kidneys transplanted in 2015. We evaluated the utility, defined as posttransplant kidney allograft lifespan, of this practice.

Using data from the Scientific Registry of Transplant Recipients, we compared outcomes for all SLK transplants between January 1, 1995, and December 3, 2014, with their donor-matched kidneys used in kidney-alone (Ki) or simultaneous pancreas-kidney (SPK) transplants. The primary outcome was kidney allograft lifespan, defined as the time free from death or allograft failure. Secondary outcomes included death and death-censored allograft failure. We adjusted all analyses for donor, transplant, and recipient factors.

The adjusted 10-year mean kidney allograft lifespan was higher in Ki/SPK than in SLK transplants: by 0.99 years in the Model for End-stage Liver Disease era and 1.71 years in the pre-Model for End-stage Liver Disease era. Death was higher in SLK recipients than in Ki/SPK recipients: 10-year cumulative incidences 0.36 (95% confidence interval 0.33-0.38) versus 0.19 (95% confidence interval 0.17-0.21).

SLK transplantation exemplifies the trade-off between the principles of utility and medical urgency. With each SLK transplantation, about 1 year of allograft lifespan is traded so that sicker patients, that is, SLK transplant recipients, are afforded access to the organ. These data provide a basis against which benefits derived from urgency-based allocation can be measured.

Abstract

Renal failure is a late consequence of end-stage liver disease (ESLD). Even with liver transplantation, pretransplant renal impairment remains a strong predictor of posttransplant mortality. This review seeks to summarize and critically appraise common therapies used in this setting, including pharmacologic agents, procedures (transjugular intrahepatic portosystemic shunt, renal replacement therapy), and simultaneous liver-kidney transplantation. More experimental extracorporeal modalities, e.g., albumin dialysis or bioartificial livers, will not be discussed. A brief discussion of the definition and pathophysiologic underpinnings of renal failure in ESLD opens the review to lay the groundwork for the main section. Liver Transplantation 22:1710-1719, 2016 AASLD.

Abstract

One third of the kidney transplants performed in the USA come from living kidney donors. The long-term outcome of healthy individuals who donate kidneys is mostly excellent, although recent studies have suggested that living donation is associated with a small absolute increase in the risk of end-stage renal failure. Much of our understanding of the progression of kidney disease comes from experimental models of nephron loss. For this reason, living kidney donation has long been of great interest to renal physiologists. This review summarizes the determinants of glomerular filtration and the physiology that underlies post-donation hyperfiltration. We describe the 'remnant kidney' model of kidney disease and the reasons why such progressive kidney disease very rarely ensues in healthy humans following uninephrectomy. We also review some of the methods used to determine glomerular number and size and outline their associations.

Abstract

The Best Practice in Live Kidney Donation Consensus Conference held in June of 2014 included the Best Practices in Living Donor Education Workgroup, whose charge was to identify best practice strategies in education of living donors, community outreach initiatives, commercial media, solicitation, and state registries. The workgroup's goal was to identify critical content to include in living kidney donor education and best methods to deliver educational content. A detailed summary of considerations regarding educational content issues for potential living kidney donors is presented, including the consensus that was reached. Educational topics that may require updating on the basis of emerging studies on living kidney donor health outcomes are also presented. Enhancing the educational process is important for increasing living donor comprehension to optimize informed decision-making.

Abstract

The effect of preexisting hypertension on living donor nephron number has not been established. In this study, we determined the association between preexisting donor hypertension and glomerular number and volume and assessed the effect of predonation hypertension on postdonation BP, adaptive hyperfiltration, and compensatory glomerular hypertrophy. We enrolled 51 living donors to undergo physiologic, morphometric, and radiologic evaluations before and after kidney donation. To estimate the number of functioning glomeruli (NFG), we divided the whole-kidney ultrafiltration coefficient (Kf) by the single-nephron ultrafiltration coefficient (SNKf). Ten donors were hypertensive before donation. We found that, in donors aged >50 years, preexisting hypertension was associated with a reduction in NFG. In a comparison of 10 age- and sex-matched hypertensive and normotensive donors, we observed more marked glomerulopenia in hypertensive donors (NFG per kidney, 359,499±128,929 versus 558,239±205,152; P=0.02). Glomerulopenia was associated with a nonsignificant reduction in GFR in the hypertensive group (89±12 versus 95±16 ml/min per 1.73 m²). We observed no difference in the corresponding magnitude of postdonation BP, hyperfiltration capacity, or compensatory renocortical hypertrophy between hypertensive and normotensive donors. Nevertheless, we propose that the greater magnitude of glomerulopenia in living kidney donors with preexisting hypertension justifies the need for long-term follow-up studies.
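The NFG estimate described above is a simple ratio of two measured coefficients; a minimal sketch (the input values below are hypothetical, since per-donor measurements are not given here):

```python
# Estimate the number of functioning glomeruli (NFG) by dividing the
# whole-kidney ultrafiltration coefficient (Kf) by the single-nephron
# ultrafiltration coefficient (SNKf), as described in the study.

def estimate_nfg(whole_kidney_kf, single_nephron_kf):
    """Return the estimated number of functioning glomeruli per kidney."""
    return whole_kidney_kf / single_nephron_kf

# Hypothetical example; the units cancel, so only the ratio matters.
kf = 4.5        # whole-kidney Kf (ml/min per mmHg), hypothetical
snkf = 1.0e-5   # single-nephron Kf (ml/min per mmHg), hypothetical
print(f"Estimated NFG per kidney: {estimate_nfg(kf, snkf):,.0f}")  # 450,000
```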

Abstract

Live donor kidney transplantation is the best treatment option for most patients with late-stage chronic kidney disease; however, the rate of living kidney donation has declined in the United States. A consensus conference was held June 5-6, 2014 to identify best practices and knowledge gaps pertaining to live donor kidney transplantation and living kidney donation. Transplant professionals, patients, and other key stakeholders discussed processes for educating transplant candidates and potential living donors about living kidney donation; efficiencies in the living donor evaluation process; disparities in living donation; and financial and systemic barriers to living donation. We summarize the consensus recommendations for best practices in these educational and clinical domains, future research priorities, and possible public policy initiatives to remove barriers to living kidney donation.

Abstract

Over 5,000 living kidney donor nephrectomies are performed annually in the US. While the physiological changes that occur early after nephrectomy are well documented, less is known about the long-term glomerular dynamics in living donors.

We enrolled 21 adult living kidney donors to undergo detailed long-term clinical, physiological, and radiological evaluation pre-, early post- (median, 0.8 years), and late post- (median, 6.3 years) donation. A morphometric analysis of glomeruli obtained during nephrectomy was performed in 19 subjects.

Donors showed parallel increases in single-kidney renal plasma flow (RPF), renocortical volume, and glomerular filtration rate (GFR) early after the procedure, and these changes were sustained through to the late post-donation period. We used mathematical modeling to estimate the glomerular ultrafiltration coefficient (Kf), which also increased early and then remained constant through the late post-donation study. Assuming that the filtration surface area (and hence, Kf) increased in proportion to renocortical volume after donation, we calculated that the 40% elevation in the single-kidney GFR observed after donation could be attributed exclusively to an increase in the Kf. The prevalence of hypertension in donors increased from 14% in the early post-donation period to 57% in the late post-donation period. No subjects exhibited elevated levels of albuminuria.

Adaptive hyperfiltration after donor nephrectomy is attributable to hyperperfusion and hypertrophy of the remaining glomeruli. Our findings point away from the development of glomerular hypertension following kidney donation.

Trial registration: not applicable. Funding: NIH (R01DK064697 and K23DK087937); Astellas Pharma US; the John M. Sobrato Foundation; the Satellite Extramural Grant Foundation; and the American Society of Nephrology.

Abstract

Acute transplant glomerulopathy refers to alloimmune-mediated endothelial injury and glomerular inflammation that typically occurs early post-kidney transplantation. We report a case of a 48-year-old woman with end-stage renal disease from lupus nephritis who developed an unexplained rise in serum creatinine 2 months after renal transplant. As immunosuppression, she received alemtuzumab induction followed by a tacrolimus, mycophenolate mofetil, and prednisone maintenance regimen. Her biopsy revealed severe glomerular endothelial injury associated with a monocyte/macrophage-rich infiltrate in addition to mild acute tubulointerstitial cellular rejection. We briefly discuss acute transplant glomerulitis, its pathology and association with chronic/overt transplant glomerulopathy, C4d-negative antibody-mediated rejection, and the significance of monocytes in rejection. We also postulate that alemtuzumab induction may have contributed to the unusual pattern of monocyte-rich transplant glomerulitis.

Abstract

The objectives of this study were to evaluate and compare the performance of the deceased donor registries of the 50 states and the District of Columbia and to identify possible predictive factors of donor designation. Data were collected retrospectively by Donate Life America using a questionnaire sent to Donor Designation Collaborative state teams between 2007 and 2010. By the end of 2010, there were 94,669,081 designated donors nationwide. This accounted for 39.8 per cent of the U.S. population aged 18 years and over. The number of designated organ donors and registry-authorized recovered donors increased each year; however, the total number of recovered donors in 2010 was the lowest since 2004. Donor designation rate was significantly higher when license applicants were verbally questioned at the Department of Motor Vehicles (DMV) regarding their willingness to register as a donor and when DMV applicants were not given an option on DMV application forms to contribute money to support organ donation, compared with not being questioned verbally, and being offered an option to contribute money. State registries continue to increase the total number of designated organ donors; however, the current availability of organs remains insufficient to meet the demand. These data suggest that DMV applicants who are approached verbally regarding their willingness to register as a donor and not given an option on DMV application forms to contribute money to support organ donation might be more likely to designate themselves to be a donor.

Abstract

The elderly are the fastest growing subpopulation with end-stage renal disease. The goal of our study was to define characteristics of elderly patients who were considered ineligible for transplantation compared to those who were listed.

A total of 984 patients were referred for evaluation during a 2-year period. Records of patients ≥65 years of age (n = 123) were reviewed. Patients who were listed versus not listed were characterized. Factors associated with waitlisting were determined using standard statistical tools.

Half of elderly transplant candidates were accepted for listing compared to 75.4% of those aged <65 years. In multivariable logistic regression, older age (OR 1.29 per year ≥65, 95% CI 1.14-1.45), coronary artery disease (OR 8.57, 95% CI 2.41-30.53), and poor mobility (OR 13.97, 95% CI 4.76-41.00) were independently associated with denial of listing. The receiver operating characteristic curve showed good discrimination for denial of listing (area under the curve 0.88).

Elderly candidates carry a heavy burden of comorbidities, and over half of those evaluated are deemed unsuitable for waitlisting. Better delineation of characteristics associated with suitability for transplant candidacy in the elderly is warranted to facilitate appropriate referrals by physicians and management of expectations in potential candidates.

Abstract

The lack of reliable human proxies for minor (i.e., non-HLA) histocompatibility loci hampers the ability to leverage these factors toward improving transplant outcomes. Despite conflicting reports of the effect of donor-recipient sex mismatch on renal allografts, the association between acute rejection of renal allografts and the development of human alloantibodies to the male H-Y antigen suggested to us that donor-recipient sex mismatch deserved re-evaluation. We sought to evaluate whether the relationships between donor sex and allograft failure differed by recipient sex.

We studied recipients of deceased-donor (n = 125,369) and living-donor (n = 63,139) transplants in the United States Renal Data System. Using Cox proportional hazards models stratified by donor type, we estimated the association between donor-recipient sex mismatch and death-censored allograft failure with adjustment for known risk factors, with and without the use of multiple imputation methods to account for potential bias and/or loss of efficiency due to missing data.

The advantage afforded by male donor kidneys was more pronounced among male than among female recipients (8% vs 2% relative risk reduction; interaction P < 0.01). This difference is of the order of magnitude of several other risk factors affecting donor selection decisions.

Donor-recipient sex mismatch affects renal allograft survival in a direction consistent with immune responses to sexually determined minor histocompatibility antigens. Our study provides a paradigm for clinical detection of markers for minor histocompatibility loci.

Abstract

An increasing number of patients older than 65 years are referred for and have access to organ transplantation, and an increasing number of older adults are donating organs. Although short-term outcomes are similar in older versus younger transplant recipients, older donor or recipient age is associated with inferior long-term outcomes. However, age is often a proxy for other factors that might predict poor outcomes more strongly and better identify patients at risk for adverse events. Approaches to transplantation in older adults vary across programs, but despite recent gains in access and the increased use of marginal organs, older patients remain less likely than other groups to receive a transplant, and those who do are highly selected. Moreover, few studies have addressed geriatric issues in transplant patient selection or management, or the implications on health span and disability when patients age to late life with a transplanted organ. This paper summarizes a recent trans-disciplinary workshop held by ASP, in collaboration with NHLBI, NIA, NIAID, NIDDK and AGS, to address issues related to kidney, liver, lung, or heart transplantation in older adults and to propose a research agenda in these areas.

Abstract

It is often difficult to synthesize information about the risks and benefits of recommended management strategies in older patients with end-stage renal disease since they may have more comorbidity and lower life expectancy than patients described in clinical trials or practice guidelines. In this review, we outline a framework for individualizing end-stage renal disease management decisions in older patients. The framework considers three factors: life expectancy, the risks and benefits of competing treatment strategies, and patient preferences. We illustrate the use of this framework by applying it to three key end-stage renal disease decisions in older patients with varying life expectancy: choice of dialysis modality, choice of vascular access for hemodialysis, and referral for kidney transplantation. In several instances, this approach might provide support for treatment decisions that directly contradict available practice guidelines, illustrating circumstances when strict application of guidelines may be inappropriate for certain patients. By combining quantitative estimates of benefits and harms with qualitative assessments of patient preferences, clinicians may be better able to tailor treatment recommendations to individual older patients, thereby improving the overall quality of end-stage renal disease care.

Abstract

The presence of kidney stones has been a relative contraindication for living donation. With the widespread use of more sensitive imaging techniques as part of the routine living donor workup, kidney stones are more frequently detected, and their clinical significance in this setting is largely unknown. Records from 325 potential kidney donors who underwent MRA or CT angiography were reviewed; 294 proceeded to donation. The prevalence of kidney stones found incidentally during donor evaluation was 7.4% (24 of 325). Sixteen donors with stones proceeded with kidney donation. All incidental calculi were nonobstructing and small (median 2 mm; range 1-9 mm). Eleven recipients were transplanted with allografts containing stones. One recipient developed symptomatic nephrolithiasis after transplantation. This recipient was found to have newly formed stones secondary to hyperoxaluria, suggesting a recipient-driven propensity for stone formation. The remaining ten recipients have stable graft function, postoperative ultrasound negative for nephrolithiasis, and no sequelae from stones. No donor developed symptomatic nephrolithiasis following donation. Judicious use of allografts with small stones in donors with normal metabolic studies may be acceptable, and careful follow-up in recipients of such allografts is warranted.

Abstract

Transplanted nephron mass is an important determinant of long-term allograft survival, but accurate assessment before organ retrieval is challenging. Newer radiologic imaging techniques allow for better determination of total kidney and cortical volumes.

Using volume measurements reconstructed from magnetic resonance or computed tomography imaging from living donor candidates, we characterized total kidney (n=312) and cortical volumes (n=236) according to sex, age, weight, height, body mass index (BMI), and body surface area (BSA).

The mean cortical volume was 204 mL (range 105-355 mL) with no significant differences between left and right cortical volumes. The degree to which existing anthropomorphic surrogates predict nephron mass was quantified, and a diligent attempt was made to derive a better surrogate model for nephron mass. Cortical volumes were strongly associated with sex and BSA, but not with weight, height, or BMI. Four prediction models for cortical volume constructed using combinations of age, sex, race, weight, and height were compared with models including either BSA or BMI.

Among existing surrogate measures, BSA was superior to BMI in predicting renal cortical volume. We were able to construct a statistically superior proxy for cortical volume, but whether relevant improvements in predictive accuracy could be gained needs further evaluation in a larger population.
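The two surrogates compared above are both simple functions of height and weight. The abstract does not state which BSA formula the study used; the Mosteller formula below is one common choice, shown purely for illustration:

```python
import math

# BMI and BSA as anthropomorphic surrogates for cortical volume.
# Which BSA formula the study used is not stated; Mosteller is assumed
# here for illustration: BSA = sqrt(weight_kg * height_cm / 3600).

def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def bsa_mosteller(weight_kg, height_cm):
    """Body surface area in m^2 (Mosteller formula, an assumed choice)."""
    return math.sqrt(weight_kg * height_cm / 3600.0)

# Hypothetical donor candidate: 70 kg, 175 cm.
print(f"BMI: {bmi(70, 1.75):.1f} kg/m^2")        # 22.9
print(f"BSA: {bsa_mosteller(70, 175):.2f} m^2")  # 1.84
```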

Abstract

The proportion of prospective living donors disqualified for medical reasons is unknown. The objective of this study is to delineate and quantify specific reasons for exclusion of prospective living donors from kidney donation.

All adult prospective kidney donors who contacted our transplant program between October 1, 2007 and April 1, 2009 were included in our analysis (n = 484). Data were collected by review of an electronic transplant database.

Of the 484 prospective donors, 39 (8%) successfully donated, 229 (47%) were excluded, 104 (22%) were actively undergoing evaluation, and 112 (23%) were withdrawn before evaluation was complete. Criteria for exclusion were medical (n = 150), psychosocial (n = 22), or histocompatibility (n = 57) reasons. Of the 150 prospective donors excluded for medical reasons, 79% were excluded because of obesity, hypertension, nephrolithiasis, and/or abnormal glucose tolerance. One hundred and forty-seven (61%) intended recipients had only one prospective living donor, of whom 63 (42%) were excluded.

A significant proportion of prospective living kidney donors were excluded for medical reasons such as obesity (body mass index >30), hypertension, nephrolithiasis, and abnormal glucose tolerance. Longer-term studies are needed to characterize the risks to medically complex kidney donors and the potential risks and benefits afforded to recipients.

Abstract

Pre-transplant screening of a woman with end-stage renal disease (ESRD) showed no anti-human leukocyte antigen (HLA) alloantibodies by anti-human globulin-complement-dependent cytotoxicity (AHG-CDC; class I) or enzyme-linked immunosorbent assay (class II). Following a negative AHG-CDC crossmatch, an HLA-A*01:01+ deceased donor (DD) kidney was transplanted in September 2005. Subsequent screening of pre-transplant serum by LABScreen Single Antigen (SA) array showed strong reactivity versus A*01:01. Despite that reactivity, at 5 years post-transplant, the patient has a serum creatinine of 1.6 mg/dl and has never experienced humoral or cellular rejection. Retrospective flow-cytometric crossmatch (FXM) of pre- and post-transplant sera versus DD cells was negative. Rescreening of multiple pre- and post-transplant sera revealed anti-A1 reactivity persisting from the first through the last samples tested. The patient's anti-A1 was almost twofold more reactive with A*01:01 FlowPRA SA beads after denaturation with acid treatment (pH 2.7) than with untreated beads. Parallel results were observed with pH 2.7-treated versus untreated A1+ T cells in FXM. These data highlight the difficulty in interpreting screening results obtained using bead arrays, because of antibodies that appear to recognize denatured but not native class I HLA antigens. We suggest that such bead-positive, flow-cytometric crossmatch-negative antibodies are not associated with humoral rejection, may not necessarily be detrimental to a graft, and deserve further evaluation before becoming a barrier to transplantation.

Abstract

To elucidate the pathophysiologic changes in the kidney due to aging, we used physiological, morphometric, and imaging techniques to quantify GFR and its determinants in a group of 24 older (≥55 years) compared to 33 younger (≤45 years) living donors. Mathematical modeling was used to estimate the glomerular filtration coefficients for the whole kidney (Kf) and for single nephrons (SNKf), as well as the number of filtering glomeruli (NFG). Compared to younger donors, older donors had a modest (15%) but significant depression of pre-donation GFR. Mean whole-kidney Kf, renocortical volume, and derived NFG were also significantly decreased in older donors. In contrast, glomerular structure and SNKf were not different in older and younger donors. Derived NFG in the bottom quartile of older donors was less than 27% of median derived NFG in the two kidneys of younger donors. Nevertheless, the remaining kidney of older donors exhibited adaptive hyperfiltration and renocortical hypertrophy post-donation, comparable to that of younger donors. Thus, our study found the decline of GFR in older donors is due to a reduction in Kf attributable to glomerulopenia. We recommend careful monitoring for and control of post-donation comorbidities that could exacerbate glomerular loss.

Abstract

To ensure long-term safety of living kidney donors, it is now recommended that they be followed for at least 2 years after donation and that serum creatinine levels be monitored. Such levels are often subjected by clinical laboratories to estimating equations and are reported as estimated GFR (eGFR). The accuracy of such equations in uninephric living donors has yet to be validated. This is especially important in older living donors, who often have senescence-related depression of GFR.

We compared urinary creatinine clearance, the four-variable Modification of Diet in Renal Disease (MDRD) estimating equation (eGFR), and the recently reported CKD-EPI GFR estimating equation with true GFR measured by the urinary iothalamate clearance (iGFR) in 64 subjects after kidney donation.

Creatinine clearance overestimated iGFR. Both creatinine-based estimating equations were poorly correlated with and underestimated iGFR. More than half of kidney donors had eGFR <60 ml/min per 1.73 m² after donation, a level that categorized them as having stage 3 chronic kidney disease by our current laboratory reporting, whereas only 25% had iGFR <60 ml/min per 1.73 m². This misclassification disproportionately affected older donors aged ≥55 years, of whom 80% had eGFR <60 ml/min per 1.73 m². Neither significant albuminuria nor hypertension was observed.

The current practice of reporting eGFR after donation commonly leads to a misclassification of chronic kidney disease, particularly in older donors. To ensure long-term well-being of living kidney donors, more precise estimates of GFR are required, particularly among older potential donors.
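For reference, the four-variable MDRD equation discussed above can be sketched as follows. This is the IDMS-traceable form (coefficient 175), which may differ from the calibration used by the study's laboratories; treat it as an illustration rather than the study's exact implementation:

```python
# Four-variable MDRD study equation (IDMS-traceable form):
# eGFR = 175 * Scr^-1.154 * age^-0.203 * 0.742 (if female) * 1.212 (if black)
# Shown for illustration; confirm the coefficient set against the
# reporting laboratory's version before relying on it.

def egfr_mdrd(scr_mg_dl, age_years, female=False, black=False):
    """Estimated GFR in ml/min per 1.73 m^2 (four-variable MDRD)."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Hypothetical 60-year-old male donor with serum creatinine 1.0 mg/dl:
print(f"eGFR: {egfr_mdrd(1.0, 60):.0f} ml/min per 1.73 m^2")  # about 76
```

Note how a measured iGFR well above 60 can still be reported as eGFR <60, which is exactly the misclassification the study describes.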

Abstract

The 5-yr survival rate of renal allografts is significantly lower for grafts from older deceased donors than from younger deceased donors. For evaluation of the potential contribution of renal senescence to this shortened graft survival, glomerular function and structure were analyzed in allografts from deceased donors older than 55 yr ("aging") or younger than 40 yr ("youthful"). Aging donors had a significantly higher prevalence of sclerotic glomeruli (P < 0.002), and their nonsclerotic glomeruli tended to be larger, had a larger filtration surface area (P = 0.02), and had a higher single-nephron ultrafiltration coefficient (Kf; P = 0.07), suggesting a compensatory response to functional loss of glomeruli. After serum creatinine reached a stable nadir in the transplant recipients, GFR and its hemodynamic determinants were evaluated and the whole-allograft Kf was computed. Compared with the allografts from youthful donors, allografts from aging donors exhibited a 32% lower GFR, which was exclusively attributable to a 45% reduction in allograft Kf (both P < 0.001). In addition, the number of functioning glomeruli per allograft was profoundly lower in grafts from aging donors than from youthful donors (3.6 ± 2.1 × 10^5 versus 8.5 ± 3.4 × 10^5; P < 0.01), and this could not be explained by the relatively modest 17% prevalence of global glomerulosclerosis in the aging group. The marked reduction in overall glomerular number in many aging donors may lead to a "remnant kidney" phenomenon, potentially explaining the shorter mean survival of these allografts.

Abstract

Human minor histocompatibility antigens (mHA) and clinically relevant immune responses to them have not been well defined in organ transplantation. We hypothesized that women with male kidney transplants would develop antibodies against H-Y, the mHA encoded on the Y-chromosome, in association with graft rejection.

We tested sera from 118 consecutive transplant recipients with kidney biopsies. Antibodies that specifically recognized the recombinant H-Y antigens RPS4Y1 or DDX3Y were detected by IgG enzyme-linked immunosorbent assay and western blotting. Immunogenic epitopes were further identified using overlapping H-Y antigen peptides for both H-Y proteins.

In the 26 female recipients of male kidneys, H-Y antibody development posttransplant (1) was more frequent (46%) than in other gender combinations (P<0.001), (2) showed strong correlation with acute rejection (P=0.00048), (3) correlated with plasma cell infiltrates in biopsied kidneys (P=0.04), and (4) did not correlate with C4d deposition or donor-specific anti-human leukocyte antigen (HLA) antibodies. Of the two H-Y antigens, RPS4Y1 was more frequently recognized (P=0.005).

This first demonstration of a strong association between H-Y antibody development and acute rejection in kidney transplant recipients shows that in solid organ allografts, humoral immune responses against well-defined mHA have clear clinical correlates, can be easily monitored, and warrant study for possible effects on long-term graft function.

Abstract

We examined the magnitude of adaptive hyperfiltration in the remaining kidney of 16 aging (>57 yr) and 16 youthful (<55 yr) individuals who had undergone a contralateral nephrectomy. Healthy volunteers who were youthful (n = 143) or aging (n = 37) provided control values for the binephric condition. One-kidney glomerular filtration rate (GFR; +42%), renal plasma flow (+38%), plasma oncotic pressure (+2.8 mmHg), and mean arterial pressure (+7.0 mmHg) were all higher in youthful uninephric vs. binephric subjects. Corresponding excesses in aging uninephric vs. binephric subjects were 38% and 36% and 1.4 and 14.0 mmHg, respectively. Modeling of these data revealed that an isolated increase in either the glomerular ultrafiltration coefficient (Kf) by 110% or the transcapillary hydraulic pressure gradient (ΔP) by 7 mmHg could account for the observed level of hyperfiltration in youthful uninephric subjects. Corresponding increases for aging uninephric subjects were 61% for Kf and 5 mmHg for ΔP. We conclude that the magnitude of adaptive hyperfiltration in aging uninephric subjects is similar to that in youthful uninephric subjects, albeit at a lower absolute GFR level. Isolated increases in either Kf or ΔP, or a combination of smaller increases in both, can account for the hyperfiltration. Greater adaptive arterial hypertension in aging than youthful uninephric subjects raises the possibility of a disproportionate role for glomerular hypertension and ΔP elevation in aging compared with youthful uninephric subjects. Glomerular hypertension could exacerbate the sclerosing glomerulopathy of senescence and lead to renal insufficiency. We recommend caution in accepting living kidney donors in or beyond the seventh decade of life.

Abstract

Resistance to growth hormone (GH) in end-stage renal disease (ESRD) causes growth retardation and muscle wasting. In humans, circulating GH binding protein (GHBP), the extracellular domain of the GH receptor that is shed into the circulation and is believed to reflect tissue GH receptor levels, is reduced in uremia, suggesting that cellular GH receptor levels are correspondingly reduced. If true, this could be a cause of GH resistance. We set out to establish whether serum GHBP levels reflect cellular GH receptor levels and whether changes in serum GHBP levels are related to nutritional or inflammatory status.

GH receptor protein expression in peripheral blood mononuclear cells (PBMC) from 21 ESRD and 14 normal subjects was analyzed by fluorochrome flow cytometry.

The GH receptor density and percent total PBMCs expressing the GH receptor were similar in the 2 groups, and there was no difference in percent GH receptor-positive T or B cells or monocytes. In contrast, serum GHBP levels were 80% lower in ESRD. GHBP levels did not correlate with serum albumin, body mass index, or muscle mass but seemed to be partly related to the log serum C-reactive protein levels.

Serum GHBP levels are markedly reduced in ESRD; this seems to occur independent of nutritional status and may in part be caused by inflammation. Because GH receptor expression on PBMC of ESRD and control subjects was similar, our findings argue against a reduction in GH receptor as a cause of GH resistance and against the use of serum GHBP levels as a reliable marker of specific tissue GH receptor levels.

Abstract

Certain clinical risk factors are associated with significant coronary artery disease in kidney transplant candidates with diabetes mellitus. We sought to validate the use of a clinical algorithm in predicting post-transplantation mortality in patients with type 1 diabetes. We also examined the prevalence of significant coronary lesions in high-risk transplant candidates. All patients with type 1 diabetes evaluated between 1991 and 2001 for kidney with/without pancreas transplantation were classified as high risk based on the presence of any of the following risk factors: age ≥45 yr, smoking history ≥5 pack-years, diabetes duration ≥25 yr, or any ST-T segment abnormalities on electrocardiogram. The remaining patients were considered low risk. All high-risk candidates were advised to undergo coronary angiography. The primary outcome of interest was all-cause mortality post-transplantation. Eighty-four high-risk and 42 low-risk patients were identified. Significant coronary artery stenosis was detected in 31 high-risk candidates. Mean arterial pressure was a significant predictor of coronary stenosis (odds ratio 1.68; 95% confidence interval 1.14-2.46), adjusted for age, sex and duration of diabetes. In 75 candidates who underwent transplantation with a median follow-up of 47 months, the use of clinical risk factors predicted all eight deaths. No deaths occurred in low-risk patients. A significant mortality difference was noted between the two risk groups (p = 0.03). This clinical algorithm can identify patients with type 1 diabetes at risk for mortality after kidney with/without pancreas transplant. Patients without clinical risk factors can safely undergo transplantation without further cardiac evaluation.
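The screening rule described above is a simple disjunction of four criteria. As an illustrative sketch (the function and parameter names are our own, not from the study), it can be written as:

```python
def is_high_risk(age_yr, smoking_pack_yr, diabetes_duration_yr, st_t_abnormal):
    """Classify a type 1 diabetic transplant candidate as high risk if ANY of
    the study's four clinical risk factors is present."""
    return (
        age_yr >= 45                     # age >= 45 years
        or smoking_pack_yr >= 5          # smoking history >= 5 pack-years
        or diabetes_duration_yr >= 25    # diabetes duration >= 25 years
        or st_t_abnormal                 # any ST-T segment abnormality on ECG
    )
```

A candidate is low risk only when all four criteria are absent; per the study, only high-risk candidates were referred for coronary angiography.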

Abstract

Cytokines consist of a large family of secreted proteins, including pro-inflammatory agents, growth hormone and erythropoietin, that utilize the Janus kinase (JAK) signal transducer and activator of transcription (STAT) signal transduction pathway to mediate many of their key physiologic and pathologic actions. These actions include cytokine-mediated inflammation, immunoregulation, hematopoiesis and growth. The JAK-STAT pathway is regulated by several processes, among which negative feedback regulation by the suppressors of cytokine signaling (SOCS), members of a family of eight proteins, is particularly important. Each cytokine induces one or more specific SOCS proteins that in turn down-regulate the signal initiated by the cytokine. Through their impact on the cytokine-activated JAK-STAT pathway, the SOCS proteins are involved in many diseases that come to the attention of the pediatric nephrologist. For example, an increase in the expression of SOCS-2 and -3 may be a cause of growth hormone resistance and thus may contribute to the growth retardation that affects children with chronic renal failure. Because of their obvious biologic importance, the SOCS proteins have been the subject of intense research that includes the development of strategies to utilize these proteins to control cytokine-induced JAK/STAT signal transduction for therapeutic purposes.

Growth hormone resistance in uremia: a role for impaired JAK/STAT signaling. 7th Symposium on Growth and Development in Children with Chronic Kidney Disease. Rabkin, R., Sun, D. F., Chen, Y., Tan, J., Schaefer, F. Springer, 2005: 313–18.

Abstract

Resistance to growth hormone (GH) is a significant complication of advanced chronic renal failure. Thus, while circulating GH levels are normal or even elevated in uremia, resistance to the hormone leads to stunting of body growth in children and contributes to muscle wasting in adults. Insensitivity to GH is the consequence of multiple defects in the GH/insulin-like growth factor-1 (IGF-1) system. Expression of the GH receptor may be reduced, although this is not a consistent finding; GH activation of the Janus kinase 2 (JAK2)-signal transducer and activator of transcription (STAT) signal transduction pathway is depressed, leading to reduced IGF-1 expression; and finally, there is resistance to IGF-1, a major mediator of GH action. We review these various defects with an emphasis on the GH-activated JAK2-STAT5 pathway, since this pathway is essential for normal body growth and there has been recent progress in our understanding of the perturbations that occur in uremia.

Abstract

Nephron underdosing and donor kidney-recipient body size mismatch can lead to poor allograft function. The purpose of this study is to examine the relationship between donor kidney volume and posttransplantation graft function by using magnetic resonance imaging (MRI) to obtain renal volumes. Previous investigators used donor body surface area as a surrogate for kidney size or measured renal volume by using ultrasonography; both these techniques are inaccurate measures of renal volume. Intraoperative weights are more accurate, but provide information only after the transplantation is underway. More recently, MRI has been used preoperatively to screen living donors; these novel MRI techniques also provide information regarding renal size. We performed a retrospective analysis of 54 patients who underwent living donor transplantation at our institution from 2000 to 2002. All living donors underwent preoperative renovascular imaging using MRI, and renal volumes were obtained for each donor. A transplant kidney volume-recipient body weight (Vol/Wt) ratio was determined for each donor-recipient pair, and patients were divided into tertiles corresponding to 3 groups: high (>2.7), medium (2 to 2.7), and low (<2) "nephron dose" ratios. Glomerular filtration rate (GFR) correlated with Vol/Wt ratio at 6 and 12 months (r = 0.46; P = 0.0005 and r = 0.41; P = 0.003). At 6 months, mean GFRs in the low, medium, and high groups were 52.4 +/- 2.8 (SEM), 64.5 +/- 6.2, and 82.0 +/- 4.4 mL/min, respectively (P < 0.0005). At 12 months, GFRs in the low, medium, and high groups were 51.6 +/- 3.6, 63.3 +/- 3.8, and 83.9 +/- 5.4 mL/min, respectively (P < 0.0001). Transplantation of donor-recipient pairs with a Vol/Wt ratio less than 2 cm³/kg was associated with significantly worse graft function. Donor kidney volumes measured by means of preoperative MRI can be used to calculate Vol/Wt ratios before transplantation and identify patients at risk for a low GFR posttransplantation.
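The Vol/Wt classification above is a straightforward ratio with two tertile cut points. A minimal sketch of the computation (function and argument names are illustrative, not from the study):

```python
def nephron_dose_group(donor_kidney_volume_cm3, recipient_weight_kg):
    """Classify a donor-recipient pair by the transplant kidney volume-to-
    recipient body weight (Vol/Wt) ratio, using the study's tertile cut points
    (low < 2, medium 2-2.7, high > 2.7 cm3/kg)."""
    ratio = donor_kidney_volume_cm3 / recipient_weight_kg
    if ratio < 2.0:
        return "low"
    elif ratio <= 2.7:
        return "medium"
    else:
        return "high"
```

For example, a 140 cm³ donor kidney going to an 80 kg recipient gives a ratio of 1.75 cm³/kg, placing the pair in the low "nephron dose" group associated with worse graft function.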

Abstract

Since 1995, dual-kidney transplantation using organs from marginal donors has been used at our center to expand the organ donor pool and decrease the waiting time for deceased donor kidney transplantation. This approach has allowed for a shorter waiting period without compromising outcome in the early posttransplant period. We now have 8-year follow-up in the first recipients. Older individuals were offered this option preferentially, because we reasoned that they would stand to benefit most from the shorter waiting period. Patients aged 55 years or more who underwent either dual-kidney transplantation with expanded criteria donors or single-kidney transplantation with standard donors were included in this study. All expanded criteria donor organs were those that were refused by all other local transplant centers. The primary endpoints were recipient death and graft failure. Waiting time for dual-kidney transplantation was 440 +/- 38 days versus 664 +/- 51 days for single-kidney transplantation (P<0.01). The 8-year actuarial patient survivals for the single- and dual-kidney transplants were 74.1% and 82.1%, respectively. The 8-year actuarial graft survivals for the single- and dual-kidney transplants were 59.4% and 69.7%, respectively. Eight-year actuarial patient and graft survivals in older individuals who underwent dual-kidney transplantation are equivalent to those who underwent standard single-kidney transplantation. With the continuing organ shortage and increasing waiting times for cadaver kidney transplantation, dual-kidney transplantation using organs that would otherwise be discarded offers a good option for older individuals who may not withstand a long waiting period.

Abstract

The purpose of the present study was to confirm the extent to which glomerular filtration rate (GFR) is depressed in healthy, aging subjects and to elucidate the mechanism of such hypofiltration. Healthy volunteers aged 18 to 88 years (N = 159) underwent a determination of GFR, renal plasma flow (RPF), afferent oncotic pressure, and arterial pressure. Glomeruli in renal biopsies of healthy kidney transplant donors aged 23 to 69 years (N = 33) were subjected to a morphometric analysis, so as to determine glomerular hydraulic permeability and filtration surface area. The aforementioned GFR determinants were then subjected to mathematical modeling to compute the glomerular ultrafiltration coefficient (Kf) for two kidneys and individual glomeruli. GFR was significantly depressed (P < 0.0001) by 22% in aging (≥55 years old) compared to youthful (<55 years old) subjects.

Abstract

Rodents and dogs conditioned with total-lymphoid irradiation (TLI), with or without antithymocyte globulin (ATG), have been shown to develop mixed chimerism and immune tolerance without graft-versus-host disease (GVHD) after the infusion of major histocompatibility complex (MHC)-mismatched donor bone marrow cells given alone or in combination with an organ allograft. Four human leukocyte antigen (HLA)-mismatched recipients of living donor kidney transplants were conditioned with TLI and ATG posttransplantation and infused with cryopreserved donor granulocyte colony-stimulating factor (G-CSF) "mobilized" hematopoietic progenitor (CD34+) cells (3-5x10(6) cells/kg) thereafter. Maintenance prednisone and cyclosporine dosages were tapered, and recipients were monitored for chimerism, GVHD, graft function, T-cell subsets in the blood, and antidonor reactivity in the mixed leukocyte reaction (MLR). Three of the four patients achieved multilineage macrochimerism, with up to 16% of donor-type cells among blood mononuclear cells without evidence of GVHD. Prolonged depletion of CD4+ T cells was observed in all four patients. Rejection episodes were not observed in the three macrochimeric recipients, and immunosuppressive drugs were withdrawn in the first patient by 12 months. Prednisone was withdrawn from a second patient at 9 months, and cyclosporine was tapered thereafter. Multilineage macrochimerism can be achieved without GVHD in HLA-mismatched recipients of combined kidney and hematopoietic progenitor transplants. Conditioning of the host with posttransplant TLI and ATG was nonmyeloablative and was not associated with severe infections. Recipients continue to be studied for the development of immune tolerance.

Abstract

Norepinephrine and the beta-adrenergic receptor agonist, isoproterenol, have been shown to potentiate the amplitude of GABAA receptor-mediated whole-cell current responses in Purkinje cells acutely dissociated from the rat cerebellum. However, the steps leading from the activation of beta-adrenergic receptors to the modulation of the GABAA receptor remain to be delineated. This study tested the hypothesis that a sequence of intracellular intermediaries involving the cyclic AMP second messenger system serves as the subcellular link to promote this heteroreceptor interaction. Exposure to cholera toxin, but not to pertussis toxin, increased the amplitude of GABA-activated current responses in acutely dissociated Purkinje cells. Intracellular dialysis with guanosine 5'-O-(3-thiotriphosphate) also resulted in a time- and dose-dependent augmentation of the response to GABA, while guanosine 5'-O-(2-thiodiphosphate) blocked the norepinephrine-mediated facilitation. A positive modulation of the current response to GABA was observed following intracellular delivery of cyclic AMP or the catalytic subunit of the cyclic AMP-dependent protein kinase. Furthermore, the norepinephrine-induced potentiation of the GABA-activated current response was prevented in the presence of the Rp isomer of cyclic AMP, the regulatory subunit of cyclic AMP-dependent protein kinase and an inhibitor of cyclic AMP-dependent protein kinase. These findings led to the formulation of a working model in which activation of the beta-adrenergic receptor triggers a Gs-protein-mediated transduction cascade in cerebellar Purkinje cells which activates adenylate cyclase, resulting in a rise in intracellular levels of cyclic AMP, increased phosphorylating activity by cyclic AMP-dependent protein kinase and, ultimately, a potentiation of GABAA receptor function.

Abstract

Previous studies employing extracellular single-unit recording in the intact cerebellum have demonstrated that norepinephrine can potentiate GABA-induced suppression of Purkinje cell spike activity. However, many issues related to the nature of this modulatory phenomenon remain to be resolved. Using whole-cell patch clamp recording, the present study investigated the effect of norepinephrine on GABA-activated membrane currents (IGABA) in solitary Purkinje cells isolated from neonatal rat cerebella following acute dissociation. Exposure of Purkinje cells to norepinephrine at a concentration which, by itself, had no obvious effect on Purkinje cell membrane conductance, consistently augmented IGABA. The catecholamine also potentiated GABA-gated chloride currents as well as muscimol-induced currents in Purkinje cells. Thus, the facilitating effect of norepinephrine on IGABA was attributed to an interaction between norepinephrine and the GABAA receptor-mediated chloride conductance. The effect of norepinephrine could be mimicked by isoproterenol as well as by 8-bromo cAMP, suggesting that a beta-receptor-mediated, cAMP-dependent cascade may underlie the observed heteroreceptor interaction. Our results establish the existence of a postsynaptic mechanism by which norepinephrine, through activation of the beta-adrenoceptor, may modulate GABAA receptor function in cerebellar Purkinje cells. This study provides the groundwork for a detailed investigation into the cascade of membrane and intracellular events underlying such a synergistic modulatory interaction at the cellular and subcellular levels.

Abstract

This study examined the morphology and the development of inward currents in the course of differentiation of a stem cell toward a neuronal phenotype. Using the P19 embryonal cell line, whole-cell current profiles of P19 cells before, during and after retinoic acid-induced differentiation were matched with their morphology as well as with the expression of neuron-specific enolase-like immunoreactivity. Prior to and during the initial 48 hr of retinoic acid treatment, P19 cells either lacked detectable currents or expressed a voltage-dependent outward potassium current, did not display neuron-like morphology and did not express neuron-specific enolase-like immunoreactivity. Upon completion of retinoic acid treatment, the current profile of fully differentiated P19 cells was hallmarked by a large voltage-dependent inward current which consisted of a sodium current and a smaller cobalt-sensitive calcium component, in addition to the potassium current observed earlier. Such cells invariably emitted neurites and displayed neuron-specific enolase-like immunoreactivity. Interestingly, coupling was prevalent among P19 cells in the undifferentiated state but was absent in the fully differentiated cultures. In studying cells undergoing neuronal differentiation, these results underscore the importance of taking into account both electrical properties and morphological considerations in determining the degree of differentiation.

Abstract

A spectrum of studies has been conducted on a single aspect of NE function in which, through beta-1 receptor activation, NE appears to mediate a degree of physiological control over the gain of GABA-mediated inhibition. It is significant that this single effect has been observed in numerous interrelated preparations ranging from single isolated Purkinje cells from young rats to adult Purkinje cells in awake locomoting rats. With respect to the functional consequences of these effects, our best current speculation as to "what NE does" is that NE acts to regulate the strength of these tuned gating mechanisms in both cerebral and cerebellar cortices. There are numerous unanswered questions raised by the past work. One pressing issue is: when and for what reason in normal function does the modulation take place? When does NE release normally occur (is it phasic or tonic), and which of the demonstrated actions appears, and for how long, in relation to the period of receptor activation? Does NE release cause the circuit to "react" to conditions which need "improved neurocomputation" or does NE stabilize the circuit to react predictably in the face of stress? Finally, what is the molecular sequence of events between receptor activation and an alteration of GABA receptor channel opening? What additional molecular control mechanisms exist, and how can the diverse inhibitory and modulatory phenomena be reconciled, both short and long term? Issues are defined which need to be clarified at all levels of the current skeleton of basic understanding. Our prediction is that pursuit of these issues will benefit from an exchange of insight gained from investigations at all levels.

Abstract

A series of studies has been conducted to determine the mode of action on the cerebellar cortical circuitry of the norepinephrine (NE)-containing afferents from the locus coeruleus. NE has been known to exert an "inhibitory" action on the background firing observed in Purkinje cells, due presumably to a shift in conductances favoring hyperpolarization. An additional independent action at low threshold appears to be an enhancement of GABA, the inhibitory transmitter of cerebellar interneurons. Recent whole-cell patch-clamp studies on isolated Purkinje cells indicate that exposure to NE increases the chloride current caused by transient pulses of GABA applied iontophoretically. NE applied to Purkinje cells in the parafloccular lobule during stimulation by moving visual patterns revealed the capacity either to "gate" signals initially not expressed, or to amplify the gain of phasic excitations. The control of emergent circuit functions may be the functional consequence of the multiple modulatory functions of NE.

Abstract

This paper describes experiments on GABA-activated whole-cell membrane currents in bipolar cells freshly isolated from the adult rat retina. The main goal was to determine whether bipolar cell responses to GABA could be resolved in terms of mediation by the GABAA receptor, the GABAB receptor, or both. Bipolar cells were isolated by gentle enzymatic dissociation and identified by their distinct morphology. GABA agonists and antagonists were applied focally by pressure, and the resultant currents were recorded under whole-cell voltage clamp. In all bipolar cells tested, GABA (0.1-100 microM) induced a monophasic response associated with a conductance increase (IGABA). The shift in reversal potential for IGABA as a function of pipette [Cl-] paralleled that predicted based on the Nernst equation for Cl-. IGABA was mimicked by muscimol (5-20 microM) and antagonized by bicuculline (20-100 microM). Baclofen (0.1-1.0 mM) produced no apparent conductance change. "Hot spots" of sensitivity to GABA which might be associated with regions of synaptic contact were not found; both the soma and processes of all bipolar cells were responsive to focally applied GABA. Furthermore, all bipolar cells tested responded to glycine. In conclusion, we have established the presence of GABAA receptors on rat retinal bipolar cells. Our data suggest further that these cells lack GABAB receptors. Finally, our observation that bipolar cells in the rat retina are relatively homogeneous in terms of their sensitivity to GABA and glycine leads us to postulate that the functional significance of the presence of receptors and their distribution on a neuron may be dictated more by the topography of the presynaptic inputs than by its inherent chemosensitivity.
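The reversal-potential check described above rests on the Nernst equation, E = (RT/zF)·ln([ion]out/[ion]in). A minimal sketch of that prediction for Cl- (the concentrations below are illustrative, not the paper's actual recording solutions):

```python
import math

def nernst_potential_mV(conc_out_mM, conc_in_mM, z, temp_K=310.0):
    """Nernst equilibrium potential E = (RT/zF) * ln([out]/[in]), returned in mV.

    R is the gas constant, F the Faraday constant, z the ionic valence
    (z = -1 for Cl-). Default temperature is 310 K (~37 degrees C).
    """
    R = 8.314      # J / (mol*K)
    F = 96485.0    # C / mol
    return 1000.0 * (R * temp_K / (z * F)) * math.log(conc_out_mM / conc_in_mM)

# Hypothetical example: 120 mM external Cl- vs. 12 mM pipette Cl-
# gives a chloride reversal potential near -61.5 mV.
e_cl = nernst_potential_mV(120.0, 12.0, z=-1)
```

Because the reversal potential of IGABA tracked this prediction as the pipette [Cl-] was varied, the current could be attributed to a GABAA receptor-gated chloride conductance.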