Results 1 - 20 of 55
1.
Int J Radiat Biol ; 100(3): 385-398, 2024.
Article in English | MEDLINE | ID: mdl-37976378

ABSTRACT

PURPOSE: Total body irradiation (TBI) followed by bone marrow transplantation (BMT) is used in pre-clinical research to generate mouse chimeras that allow the function of a protein to be studied specifically in immune cells. Adverse consequences of irradiation on the juvenile body and brain are well described and include general fatigue, neuroinflammation, neurodegeneration and cognitive impairment. Yet the long-term consequences of TBI/BMT performed on healthy adult mice have been poorly investigated. MATERIALS AND METHODS: We developed a robust protocol to achieve near-complete bone marrow replacement in mice using 2 × 550 cGy TBI and evaluated the impact of the procedure on their general health, mood disturbances, memory, brain atrophy, neurogenesis, neuroinflammation and blood-brain barrier (BBB) permeability 2 and/or 16 months post-BMT. RESULTS: We found a persistent decrease in weight along with a long-term impact on locomotion after TBI and BMT. Although the TBI/BMT procedure did not lead to anxiety- or depressive-like behavior 2 or 16 months post-BMT, long-term spatial memory of the irradiated mice was impaired. We also observed radiation-induced impairment of neurogenesis and cortical microglia activation 2 months post-BMT. Moreover, higher levels of hippocampal IgG in aged BMT mice suggest an enhanced age-related increase in BBB permeability that could contribute to the observed memory deficit. CONCLUSIONS: The overall health of the mice did not appear to be majorly affected by TBI followed by BMT during adulthood. Yet TBI-induced alterations in the brain and behavior could lead to erroneous conclusions about the function of a protein in immune cells when comparing mouse chimeras with different genetic backgrounds that might differ in susceptibility to radiation-induced damage. Ultimately, the BMT model presented here could also be used to study the related long-term consequences of TBI and BMT seen in patients.


Subject(s)
Bone Marrow Transplantation , Whole-Body Irradiation , Humans , Adult , Mice , Animals , Aged , Whole-Body Irradiation/adverse effects , Neuroinflammatory Diseases , Mice, Inbred C57BL , Brain
2.
Pediatr Blood Cancer ; 70(1): e30081, 2023 01.
Article in English | MEDLINE | ID: mdl-36377714

ABSTRACT

BACKGROUND: Childhood cancer causes significant physical and emotional stress. Patients and families benefit from palliative care (PC) to reduce symptom burden, improve quality of life, and enhance family-centered care. We evaluated palliative opportunities across leukemia/lymphoma (LL), solid tumor (ST), and central nervous system (CNS) tumor groups. PROCEDURE: A priori, nine palliative opportunities were defined: disease progression/relapse, hematopoietic stem cell transplant, phase 1 trial enrollment, admission for severe symptoms, social concerns or end-of-life (EOL) care, intensive care admission, do-not-resuscitate (DNR) status, and hospice enrollment. A single-center retrospective review was completed on patients aged 0-18 years with cancer who died from January 1, 2012 to November 30, 2017. Demographic, disease, and treatment data were collected. Descriptive statistics were performed. Opportunities were evaluated from diagnosis to death and across disease groups. RESULTS: Included patients (n = 296) had LL (n = 87), ST (n = 114), or CNS tumors (n = 95). Palliative opportunities were more frequent in patients with ST (median 8) and CNS tumors (median 7) than in those with LL (median 5, p = .0005). While patients with ST had more progression/relapse opportunities (p < .0001), patients with CNS tumors had more EOL opportunities (p < .0001), earlier PC consultation, DNR status, and hospice enrollment. Palliative opportunities increased toward the EOL in all disease groups (p < .0001). PC was consulted for 108 (36%) patients: LL (48%), ST (30%), and CNS (34%, p = .02). CONCLUSIONS: All children with cancer incur many events warranting PC support. Patients with ST and CNS tumors had more palliative opportunities than those with LL, yet received less subspecialty PC. Understanding the palliative opportunities within each disease group can guide PC utilization to ease patient and family stress.


Subject(s)
Leukemia , Lymphoma , Neoplasms , Terminal Care , Child , Humans , Neoplasms/therapy , Palliative Care , Quality of Life , Recurrence , Retrospective Studies , Terminal Care/psychology , Infant, Newborn , Infant , Child, Preschool , Adolescent , Clinical Trials, Phase I as Topic
3.
J Adolesc Young Adult Oncol ; 11(4): 402-409, 2022 08.
Article in English | MEDLINE | ID: mdl-34582272

ABSTRACT

Purpose: Adolescent patients with cancer experience unique stressors due to their developmental stage, with increased physical, emotional, and social distress. Palliative care (PC) serves an important role in pediatric cancer care. We examined "palliative opportunities," events during a patient's cancer course at which subspecialty PC would be warranted, and compared opportunities between adolescents and younger patients. Methods: Patients from a single center, aged 0-18 years at cancer diagnosis, who died from January 1, 2012, to November 30, 2017, were included. In this secondary analysis, patients were divided into cohorts based on age at diagnosis: 0-12 and 13-18 years. Demographic, disease, and treatment data were collected. Descriptive statistics and modeling were performed. The number, type, and timing of palliative opportunities and the timing of and reason for PC consultation were evaluated across cohorts. Results: Of the 296 patients included for analysis, 27.7% (82/296) were 13-18 years old at diagnosis. The frequency of palliative opportunities did not differ by age (median 7.0 [interquartile range 4.0-10.0] in both cohorts). PC consultation occurred in 36.5% (108/296), with neither rate nor timing differing by age group. PC consultations in adolescents were more often for symptom management (p = 0.0001). Adolescent patients were less likely to have a do-not-resuscitate order placed before death (61.0%, 50/82) than younger patients (73.8%, 158/214, p = 0.03). Conclusion: Adolescent patients with cancer did not experience more palliative opportunities than younger patients in this cohort, although they often had challenging psychological, family, and social stressors that went unidentified. Incorporating additional palliative opportunities could enhance the identification of stress and symptoms in adolescents with cancer so that PC can be timed to meet their needs.


Subject(s)
Neoplasms , Palliative Care , Adolescent , Child , Cohort Studies , Humans , Neoplasms/therapy , Referral and Consultation , Retrospective Studies
4.
Neurooncol Pract ; 8(4): 451-459, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34277023

ABSTRACT

BACKGROUND: Children with brain and central nervous system (CNS) tumors experience substantial challenges to their quality of life during their disease course. These challenges are opportunities for increased subspecialty palliative care (PC) involvement. Palliative opportunities have been defined in the pediatric oncology population, but the frequency, timing, and factors associated with palliative opportunities in pediatric patients with CNS tumors are unknown. METHODS: A single-institution retrospective review was performed on children ages 0-18 diagnosed with a CNS tumor who died between January 1, 2012 and November 30, 2017. Nine palliative opportunities were defined prior to data collection (progression, relapse, admission for severe symptoms, intensive care admission, bone marrow transplant, phase 1 trial, hospice, do-not-resuscitate (DNR) order). Demographic, disease, treatment, palliative opportunity, and end-of-life data were collected. Opportunities were evaluated over quartiles from diagnosis to death. RESULTS: Among 101 patients with a median age at death of eight years (interquartile range [IQR] = 8.0, range 0-22), there was a median of seven (IQR = 6) palliative opportunities per patient, and opportunities increased closer to death. PC consultation occurred in 34 (33.7%) patients, at a median of 2.2 months before death, and was associated with having a DNR order (P = .0028). Hospice was involved for 72 (71.3%) patients. CONCLUSION: Children with CNS tumors suffered repeated events warranting PC yet received subspecialty PC support only one-third of the time. Mapping palliative opportunities over the cancer course promotes earlier PC consultation, which can decrease suffering and resuscitation attempts at the end of life.

5.
Eur J Public Health ; 28(4): 657-661, 2018 08 01.
Article in English | MEDLINE | ID: mdl-29596591

ABSTRACT

Background: Female genital mutilation (FGM) is most commonly encountered in Africa and the Middle East, and migration from FGM-practicing countries means it is increasingly seen in Europe. Addressing FGM requires accurate information on who is affected, but ascertainment is notoriously difficult. This study estimated FGM prevalence among women presenting for maternity care in the Lothian region of Scotland and compared it with the prevalence expected by extrapolation of survey data from the women's countries of birth. Methods: Electronic clinical records were linked to birth registration data to estimate FGM among obstetric patients in Lothian from 2010 to 2013. Results: In total, 107 women affected by FGM were detected, a rate of 2.8/1000 pregnancies. Of 487 women from UNICEF-recognized FGM-practicing countries who accessed care, 87 (18%) had documented evidence of FGM (three quarters of whom came from Nigeria, Sudan or The Gambia). The detected prevalence was 54% of the level expected from the extrapolation method. Country of birth had a sensitivity of 81% for FGM. Conclusion: Women from FGM-practicing countries commonly access maternity care in Lothian, confirming the need for ongoing training and investment in identifying and managing FGM. Matching electronic clinical records with birth registration data proved a useful method for estimating the level of FGM in the maternity population. In a European country like Scotland, with modest migrant numbers, asking country of birth during pregnancy and making sensitive enquiries could detect 81% of women with FGM. Extrapolation from surveys of maternal country of birth grossly overestimates the true prevalence.


Subject(s)
Circumcision, Female/statistics & numerical data , Electronic Health Records/statistics & numerical data , Emigrants and Immigrants/statistics & numerical data , Genitalia, Female/surgery , Adult , Africa/epidemiology , Female , Humans , Prevalence , Scotland/epidemiology , Surveys and Questionnaires
6.
JDR Clin Trans Res ; 3(3): 279-287, 2018 07.
Article in English | MEDLINE | ID: mdl-30938601

ABSTRACT

Few studies have examined the relation between food consumption and related attitudes and dental pain among children. The objective of this study was to examine the associations of healthy and unhealthy food items, attitudes toward healthy food, and self-efficacy for eating healthy with dental pain among children. A cross-sectional analysis was performed using child survey data from the Texas Childhood Obesity Research Demonstration (TX CORD) project. Fifth-grade students (n = 1,020) attending 33 elementary schools in Austin and Houston, Texas, completed the TX CORD Child Survey, a reliable and valid survey instrument focused on nutrition and physical activity behaviors. All nutrition questions asked about the number of times food and beverage items were consumed on the previous day. Dental pain was reported as mouth or tooth pain in the past 2 wk that hurt so much that the child could not sleep at night. Mixed-effects logistic regression models were used to test the associations of 10 unhealthy food items, 9 healthy food items, 2 health attitudes, and self-efficacy with dental pain. All models controlled for sociodemographic variables. In total, 99 (9.7%) students reported dental pain. Dental pain was associated with intake of the following unhealthy items: soda, fruit juice, diet soda, frozen desserts, sweet rolls, candy, white rice/pasta, starchy vegetables, French fries/chips, and cereal (adjusted odds ratio [AOR], 1.27-1.81; P < 0.01). The intake of other vegetables (AOR, 1.56; P < 0.01), a healthy item, and the attitude that healthy food tastes good (AOR, 1.59; P = 0.04) were also positively associated with dental pain. The attitude that eating healthier leads to fewer health problems (AOR, 0.50) and self-efficacy for healthy eating (AOR, 0.44) were negatively associated with dental pain (P < 0.01). Interventions should focus on improving oral health by reducing intake of unhealthy foods and educating children and families on the importance of diet as a means of reducing dental caries. Knowledge Transfer Statement: The results of this study can be used to inform researchers on potential food items and psychosocial measures to examine in low-income, minority populations for longitudinal research. These results would also be useful to educators who could incorporate oral health care and nutrition education into school curricula.


Subject(s)
Dental Caries , Adolescent , Attitude , Child , Cross-Sectional Studies , Humans , Pain , Texas
7.
Am J Transplant ; 17(12): 3123-3130, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28613436

ABSTRACT

Incompatible living donor kidney transplantation (ILDKT) has been established as an effective option for end-stage renal disease patients with willing but HLA-incompatible living donors, reducing mortality and improving quality of life. Depending on antibody titer, ILDKT can require highly resource-intensive procedures, including intravenous immunoglobulin, plasma exchange, and/or cell-depleting antibody treatment, as well as protocol biopsies and donor-specific antibody testing. This study compared the cost and Medicare reimbursement, exclusive of organ acquisition payment, for ILDKT (n = 926) across antibody titers with matched compatible transplants (n = 2762) performed between 2002 and 2011. Data were assembled from a national cohort study of ILDKT and a unique data set linking hospital cost accounting data and Medicare claims. ILDKT was more expensive than matched compatible transplantation: adjusted costs were 20% higher for recipients positive on Luminex assay but negative on flow cytometric crossmatch, 26% higher for those positive on flow cytometric crossmatch but negative on cytotoxic crossmatch, and 39% higher for those positive on cytotoxic crossmatch (p < 0.0001 for all). ILDKT was associated with longer median length of stay (12.9 vs. 7.8 days), higher Medicare payments ($91,330 vs. $63,782; p < 0.0001), and greater outlier payments. In conclusion, ILDKT increases the cost of, and payments for, kidney transplantation.


Subject(s)
Blood Group Incompatibility/economics , Graft Rejection/economics , Histocompatibility Testing/economics , Kidney Failure, Chronic/surgery , Kidney Transplantation/economics , Living Donors , Postoperative Complications/economics , Case-Control Studies , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/epidemiology , Graft Survival , Humans , Kidney Function Tests , Male , Middle Aged , Prognosis , Quality of Life , Retrospective Studies , Risk Factors
8.
Am J Transplant ; 17(12): 3131-3140, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28510355

ABSTRACT

In 2013, the Organ Procurement and Transplantation Network (OPTN)/United Network for Organ Sharing (UNOS) mandated that transplant centers collect data on living kidney donors (LKDs) at 6 months, 1 year, and 2 years postdonation, with policy-defined thresholds for the proportion of complete living donor follow-up (LDF) data submitted in a timely manner (60 days before or after the expected visit date). While mandated, it was unclear how centers across the country would perform in meeting thresholds, given potential donor- and center-level challenges of LDF. To better understand the impact of this policy, we studied Scientific Registry of Transplant Recipients data for 31,615 LKDs between January 2010 and June 2015, comparing proportions of complete and timely LDF form submissions before and after policy implementation. We also used multilevel logistic regression to assess donor- and center-level characteristics associated with complete and timely LDF submissions. Complete and timely 2-year LDF increased from 33% prepolicy (January 2010 through January 2013) to 54% postpolicy (February 2013 through June 2015) (p < 0.001). In an adjusted model, the odds of 2-year LDF increased by 22% per year prepolicy (p < 0.001) and 23% per year postpolicy (p < 0.001). Despite these annual increases in LDF, only 43% (87/202) of centers met the OPTN/UNOS-required 6-month, 1-year, and 2-year LDF thresholds for LKDs who donated in 2013. These findings motivate further evaluation of LDF barriers and the optimal approaches to capturing outcomes after living donation.


Subject(s)
Continuity of Patient Care/standards , Delivery of Health Care/standards , Guideline Adherence , Kidney Transplantation , Living Donors , Registries , Tissue and Organ Procurement , Adolescent , Adult , Aged , Female , Follow-Up Studies , Humans , Male , Middle Aged , Prognosis , Risk Factors , United States , Young Adult
9.
Am J Transplant ; 17(12): 3040-3048, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28520316

ABSTRACT

In the setting of an overall decline in living organ donation and new questions about long-term safety, a better understanding of outcomes after living donation has become imperative. Adequate information on outcomes important to donors may take many years to ascertain and may be evident only by comparing large numbers of donors with suitable controls. Previous studies have been unable to fully answer critical questions, primarily due to lack of appropriate controls, inadequate sample size, and/or follow-up duration that is too short to allow detection of important risks attributable to donation. The Organ Procurement and Transplantation Network does not follow donors long term and has no prospective control group with which to compare postdonation outcomes. There is a need to establish a national living donor registry and to prospectively follow donors over their lifetimes. In addition, there is a need to better understand the reasons many potential donors who volunteer to donate do not donate and whether the reasons are justified. Therefore, the US Health Resources and Services Administration asked the Scientific Registry of Transplant Recipients to establish a national registry to address these important questions. Here, we discuss the efforts, challenges, and opportunities inherent in establishing the Living Donor Collective.


Subject(s)
Living Donors , Organ Transplantation , Registries , Tissue and Organ Procurement , Delivery of Health Care , Humans
10.
Am J Transplant ; 17(7): 1823-1832, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28497525

ABSTRACT

New federal regulations allow HIV-positive individuals to be live kidney donors; however, potential candidacy for donation is poorly understood given the increased risk of end-stage renal disease (ESRD) associated with HIV infection. To better understand this risk, we compared the incidence of ESRD among 41 968 HIV-positive participants of the North American AIDS Cohort Collaboration on Research and Design followed for a median of 5 years with the incidence of ESRD among comparable HIV-negative participants of the National Health and Nutrition Examination Survey III followed for a median of 14 years. We used risk associations from multivariable Cox proportional hazards regression to derive cumulative incidence estimates for selected HIV-positive scenarios (no history of diabetes, hypertension, AIDS, or hepatitis C virus coinfection) and compared these estimates with those from similarly selected HIV-negative scenarios. For 40-year-old HIV-positive individuals with health characteristics similar to those of age-matched kidney donors, viral load <400 copies/mL, and CD4+ count ≥500 cells/µL, the 9-year cumulative incidence of ESRD was higher than that of their HIV-negative peers, yet still low: 2.5 versus 1.1 per 10 000 among white women, 3.0 versus 1.3 per 10 000 among white men, 13.2 versus 3.6 per 10 000 among black women, and 15.8 versus 4.4 per 10 000 among black men. HIV-positive individuals with no comorbidities and well-controlled disease may be considered low-risk kidney donor candidates.
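The relative-versus-absolute contrast in this abstract can be made concrete by computing the incidence ratios directly from the reported per-10 000 figures; a minimal arithmetic sketch (the numbers are the abstract's reported values, the dictionary keys are just labels):

```python
# Nine-year cumulative ESRD incidence per 10,000, as reported above, stored
# as (HIV-positive, HIV-negative) pairs for 40-year-old donor-like individuals.
incidence_per_10k = {
    "white women": (2.5, 1.1),
    "white men": (3.0, 1.3),
    "black women": (13.2, 3.6),
    "black men": (15.8, 4.4),
}

# Ratio of HIV-positive to HIV-negative cumulative incidence in each group.
ratios = {group: round(pos / neg, 2)
          for group, (pos, neg) in incidence_per_10k.items()}
# Risks are roughly 2- to 3.6-fold higher with HIV, yet remain small in
# absolute terms: tens of events per 10,000 people over nine years.
```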


Subject(s)
Graft Rejection/epidemiology , HIV Infections/complications , Kidney Failure, Chronic/epidemiology , Kidney Transplantation/adverse effects , Living Donors , Adult , Case-Control Studies , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/etiology , Graft Survival , HIV Infections/virology , HIV Seropositivity , HIV-1/physiology , Humans , Incidence , Kidney Failure, Chronic/etiology , Kidney Function Tests , Male , Middle Aged , Nephrectomy , North America/epidemiology , Prognosis , Risk Factors , Viral Load
11.
Am J Transplant ; 17(2): 512-518, 2017 Feb.
Article in English | MEDLINE | ID: mdl-27457221

ABSTRACT

Under Share 35, deceased donor (DD) livers are offered regionally to candidates with Model for End-Stage Liver Disease (MELD) scores ≥35 before being offered locally to candidates with MELD scores <35. Using Scientific Registry of Transplant Recipients data from June 2013 to June 2015, we identified 1768 DD livers exported to regional candidates with MELD scores ≥35, who were transplanted at a median MELD score of 39 (interquartile range [IQR] 37-40) with 30-day posttransplant survival of 96%. In total, 1764 (99.8%) exports had an ABO-compatible candidate in the recovering organ procurement organization (OPO), representing 1219 unique reprioritized candidates who would have had priority over the regional candidate under pre-Share 35 allocation. Reprioritized candidates had a median waitlist MELD score of 31 (IQR 27-34) when the liver was exported. Overall, 291 (24%) reprioritized candidates had a comparable MELD score (within 3 points of the regional recipient); of these, 209 (72%) were eventually transplanted after a median of 11 days (IQR 3-38 days) using a local (50%), regional (50%) or national (<1%) liver, 60 (21%) died, 13 (4.5%) remained on the waitlist and nine (3.1%) were removed for other reasons. Of those eventually transplanted, MELD score did not increase in 57% after the export; it increased by 1-3 points in 37% and by ≥4 points in 5.7%. In three cases, OPOs exchanged regional exports within a 24-h window. The majority of comparable reprioritized candidates were not disadvantaged; however, 21% died after an export.


Subject(s)
Liver Transplantation , Needs Assessment/standards , Severity of Illness Index , Tissue Donors/supply & distribution , Tissue and Organ Procurement , Waiting Lists , Female , Follow-Up Studies , Humans , Liver Failure/physiopathology , Liver Failure/surgery , Male , Middle Aged , Prognosis , Registries
12.
Am J Transplant ; 17(2): 519-527, 2017 Feb.
Article in English | MEDLINE | ID: mdl-27456927

ABSTRACT

The impact of interferon (IFN)-free direct-acting antiviral (DAA) hepatitis C virus (HCV) treatments on utilization and outcomes associated with HCV-positive deceased donor liver transplantation (DDLT) is largely unknown. Using the Scientific Registry of Transplant Recipients, we identified 25 566 HCV-positive DDLT recipients from 2005 to 2015 and compared practices according to the introduction of DAA therapies using modified Poisson regression. The proportion of HCV-positive recipients who received HCV-positive livers increased from 6.9% in 2010 to 16.9% in 2015. HCV-positive recipients were 61% more likely to receive an HCV-positive liver after 2010 (early DAA/IFN era) (aRR 1.61, 95% CI 1.45-1.79, p < 0.001) and almost three times more likely to receive one after 2013 (IFN-free DAA era) (aRR 2.85, 95% CI 2.58-3.16, p < 0.001). Compared to HCV-negative livers, HCV-positive livers were 3 times more likely to be discarded from 2005 to 2010 (aRR 2.99, 95% CI 2.69-3.34, p < 0.001), 2.2 times more likely after 2010 (aRR 2.16, 95% CI 1.80-2.58, p < 0.001) and 1.7 times more likely after 2013 (aRR 1.68, 95% CI 1.37-2.04, p < 0.001). Donor HCV status was not associated with increased risk of all-cause graft loss (p = 0.1), and this did not change over time (p = 0.8). Use of HCV-positive livers has increased dramatically, coinciding with the advent of DAAs. However, the discard rate remains nearly double that of HCV-negative livers. Further optimization of HCV-positive liver utilization is necessary to improve access for all candidates.
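The adjusted risk ratios in this abstract come from modified Poisson regression; the unadjusted core of such an estimate is simple 2 × 2-table arithmetic. A minimal sketch with hypothetical counts (not the study's data) and a standard delta-method Wald interval, illustrating only the "aRR (95% CI)" notation, not the study's adjusted model:

```python
import math

def risk_ratio_ci(a, n1, c, n0, z=1.96):
    """Unadjusted risk ratio with a Wald 95% CI on the log scale.

    a events among n1 exposed, c events among n0 unexposed. The study
    itself adjusted for covariates via modified Poisson regression,
    which this 2x2 sketch does not attempt.
    """
    rr = (a / n1) / (c / n0)
    # Delta-method standard error of log(RR) for a 2x2 table.
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical example: 40/100 events among exposed vs. 20/100 among unexposed.
rr, lo, hi = risk_ratio_ci(40, 100, 20, 100)  # RR 2.0, 95% CI roughly 1.26-3.17
```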


Subject(s)
Hepacivirus/isolation & purification , Hepatitis C/surgery , Liver Transplantation , Tissue Donors , Tissue and Organ Procurement/statistics & numerical data , Transplants/virology , Waiting Lists , Adolescent , Adult , Aged , Antiviral Agents/therapeutic use , Female , Follow-Up Studies , Graft Survival , Hepatitis C/drug therapy , Hepatitis C/virology , Humans , Male , Middle Aged , Transplant Recipients , Treatment Outcome , Young Adult
13.
Am J Transplant ; 16(12): 3548-3553, 2016 12.
Article in English | MEDLINE | ID: mdl-27402293

ABSTRACT

The incidence of live donor transplantation has declined over the past decade, and waitlisted candidates report substantial barriers to identifying a live donor. Since asking someone to donate feels awkward and unfamiliar, candidates are hesitant to ask directly and may be more comfortable with a passive approach. In collaboration with Facebook leadership (Facebook Inc., Menlo Park, CA), we developed a mobile application-an app-that enables waitlisted candidates to create a Facebook post about their experience with organ failure and their need for a live donor. We conducted a single-center prospective cohort study of 54 adult kidney-only and liver-only waitlisted candidates using the Facebook app. Cox proportional hazards models were used to describe donor referral on behalf of candidates using the app compared with matched controls. The majority of candidates who used the app reported it to be "good" or "excellent" with regard to the installation process (82.9%), readability (88.6%), simplicity (70.6%), clarity (87.5%) and the information provided (85.3%). Compared with controls, candidates using the Facebook app were 6.61 times more likely (95% CI 2.43-17.98) to have a donor come forward on their behalf (p < 0.001). The Facebook app is an easy-to-use instrument that enables waitlisted candidates to passively communicate with their social network about their need for a live donor.


Subject(s)
Living Donors , Organ Transplantation , Smartphone/statistics & numerical data , Social Media/statistics & numerical data , Tissue and Organ Procurement/methods , Adult , Female , Humans , Male , Middle Aged , Prospective Studies
14.
Am J Transplant ; 16(12): 3540-3547, 2016 12.
Article in English | MEDLINE | ID: mdl-27287605

ABSTRACT

Inferences about late risk of end-stage renal disease (ESRD) in live kidney donors have been extrapolated from studies averaging <10 years of follow-up. Because early (<10 years) and late (≥10 years) postdonation ESRD may differ by causal mechanism, it is possible that extrapolations are misleading. To better understand postdonation ESRD, we studied patterns of common etiologies including diabetes, hypertension and glomerulonephritis (GN; as reported by providers) using donor registry data linked to ESRD registry data. Overall, 125 427 donors were observed for a median of 11.0 years (interquartile range 5.3-15.7 years; maximum 25 years). The cumulative incidence of ESRD increased from 10 events per 10 000 at 10 years after donation to 85 events per 10 000 at 25 years after donation (late vs. early ESRD, adjusted for age, race and sex: incidence rate ratio [IRR] 1.7, 95% CI 1.3-2.3). Early postdonation ESRD was predominantly reported as GN-ESRD; however, late postdonation ESRD was more frequently reported as diabetic ESRD and hypertensive ESRD (IRR 7.7, 95% CI 2.3-25.2, and IRR 2.6, 95% CI 1.4-4.6, respectively). These time-dependent patterns were not seen with GN-ESRD (IRR 0.7, 95% CI 0.4-1.2). Because ESRD in live kidney donors has traditionally been reported in studies averaging <10 years of follow-up, our findings suggest caution in extrapolating such results over much longer intervals.
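The incidence rate ratios reported in this abstract rest on a person-time calculation: events per unit of follow-up in each era. A hedged sketch with invented counts (not the registry's data), using the textbook log-scale Wald interval, to make the IRR (95% CI) figures concrete:

```python
import math

def irr_ci(events1, time1, events0, time0, z=1.96):
    """Incidence rate ratio with a Wald 95% CI on the log scale.

    events over person-time in each group; a generic epidemiology formula,
    not a reconstruction of the study's adjusted model.
    """
    irr = (events1 / time1) / (events0 / time0)
    se = math.sqrt(1 / events1 + 1 / events0)  # SE of log(IRR)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Invented example: 20 events over 10,000 person-years vs. 10 over 10,000.
irr, lo, hi = irr_ci(20, 10_000, 10, 10_000)  # IRR 2.0, 95% CI roughly 0.94-4.27
```

Note how small event counts (the 1/events terms) dominate the width of the interval, which is why the rarer diabetic-ESRD outcome above carries a CI as wide as 2.3-25.2.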


Subject(s)
Diabetes Mellitus/physiopathology , Glomerulonephritis/complications , Hypertension/complications , Kidney Failure, Chronic/etiology , Kidney Transplantation/methods , Living Donors , Nephrectomy/adverse effects , Adolescent , Adult , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection , Graft Survival , Humans , Kidney Function Tests , Male , Middle Aged , Postoperative Complications , Prognosis , Risk Factors , Young Adult
15.
Am J Transplant ; 16(12): 3532-3539, 2016 12.
Article in English | MEDLINE | ID: mdl-27172445

ABSTRACT

Live kidney donors have an increased risk of end-stage renal disease (ESRD) compared with nondonors; however, it is unknown whether undetected, subclinical kidney disease exists at donation that subsequently contributes to this risk. To indirectly test this hypothesis, the authors followed the donated kidneys, by comparing the outcomes of 257 recipients whose donors subsequently developed ESRD with a matched cohort whose donors remained ESRD free. The compared recipients were matched on donor (age, sex, race/ethnicity, donor-recipient relationship), transplant (HLA mismatch, peak panel-reactive antibody, previous transplantation, year of transplantation), and recipient (age, sex, race/ethnicity, body mass index, cause of ESRD, and time on dialysis) risk factors. Median recipient follow-up was 12.5 years (interquartile range 7.4-17.9, maximum 20 years). Recipients of allografts from donors who developed ESRD had increased death-censored graft loss (74% versus 56% at 20 years; adjusted hazard ratio [aHR] 1.7; 95% confidence interval [CI] 1.5-2.0; p < 0.001) and mortality (61% versus 46% at 20 years; aHR 1.5; 95% CI 1.2-1.8; p < 0.001) compared with matched recipients of allografts from donors who did not develop ESRD. This association was similar among related, spousal, and unrelated nonspousal donors. These findings support a novel view of the mechanisms underlying donor ESRD: that of pre-donation kidney disease. However, biopsy data may be required to confirm this hypothesis.


Subject(s)
Kidney Failure, Chronic/mortality , Kidney Transplantation/mortality , Living Donors , Nephrectomy/mortality , Tissue and Organ Harvesting/adverse effects , Adult , Allografts , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Survival , Humans , Kidney Failure, Chronic/etiology , Kidney Function Tests , Kidney Transplantation/adverse effects , Male , Middle Aged , Nephrectomy/adverse effects , Postoperative Complications , Prognosis , Risk Factors , Survival Rate , Young Adult
16.
Am J Transplant ; 16(7): 2202-7, 2016 07.
Article in English | MEDLINE | ID: mdl-26932575

ABSTRACT

Since March 26, 2012, the Kidney Donor Profile Index (KDPI) has been provided with all deceased-donor kidney offers, with the goal of improving on the expanded criteria donor (ECD) indicator. Although an improved risk index may facilitate identification and transplantation of marginal yet viable kidneys, a granular percentile system may reduce provider-patient communication flexibility, paradoxically leading to more discards (a "labeling effect"). We studied the discard rates of kidneys recovered for transplantation between March 26, 2010 and March 25, 2012 ("ECD era," N = 28 636) and between March 26, 2012 and March 25, 2014 ("KDPI era," N = 29 021) using Scientific Registry of Transplant Recipients (SRTR) data. There was no significant change in discard rate from the ECD era (18.1%) to the KDPI era (18.3%) in the entire population (adjusted odds ratio [aOR] 1.04, 95% CI 0.97-1.10, p = 0.3) or in any KDPI stratum. However, among kidneys for which the ECD and KDPI indicators were discordant, "high-risk" standard criteria donor (SCD) kidneys (with KDPI > 85) were at increased risk of discard in the KDPI era (aOR 1.42, 95% CI 1.07-1.89, p = 0.02). Yet recipients of these kidneys were at much lower risk of death (adjusted risk ratio [aRR] 0.77, 95% CI 0.56-0.94, at 2 years posttransplant) than those remaining on dialysis waiting for low-KDPI kidneys. Our findings suggest that there might be an unexpected, harmful labeling effect of reporting a high KDPI for SCD kidneys, without the expected advantage of a more granular risk index.


Subject(s)
Donor Selection , Graft Survival , Kidney Transplantation , Registries/statistics & numerical data , Tissue Donors , Tissue and Organ Procurement/statistics & numerical data , Adult , Cadaver , Female , Follow-Up Studies , Glomerular Filtration Rate , Humans , Kidney Function Tests , Male , Middle Aged , Prognosis , Risk Factors , Transplant Recipients
17.
Am J Transplant ; 16(8): 2453-62, 2016 08.
Article in English | MEDLINE | ID: mdl-26901466

ABSTRACT

Immunosuppression management in kidney transplantation has evolved to include an increasingly diverse choice of medications. Although informed by patient and donor characteristics, choice of immunosuppression regimen varies widely across transplant programs. Using a novel database integrating national transplant registry and pharmacy fill records, immunosuppression use at 6-12 and 12-24 mo after transplant was evaluated for 22 453 patients transplanted in 249 U.S. programs in 2005-2010. Use of triple immunosuppression comprising tacrolimus, mycophenolic acid or azathioprine, and steroids varied widely (0-100% of patients per program), as did use of steroid-sparing regimens (0-77%), sirolimus-based regimens (0-100%) and cyclosporine-based regimens (0-78%). Use of triple therapy was more common in highly sensitized patients, women and recipients with dialysis duration >5 years. Sirolimus use appeared to diminish over the study period. Patient and donor characteristics explained only a limited amount of the observed variation in regimen use, whereas center choice explained 30-46% of the use of non-triple-therapy immunosuppression. The majority of patients who received triple-therapy (79%), cyclosporine-based (87.6%) and sirolimus-based (84.3%) regimens continued them in the second year after transplant. This population-based study of immunosuppression practice demonstrates substantial variation in center practice beyond that explained by differences in patient and donor characteristics.


Subject(s)
Evidence-Based Medicine , Graft Rejection/drug therapy , Immunosuppression Therapy/methods , Immunosuppressive Agents/therapeutic use , Kidney Failure, Chronic/surgery , Kidney Transplantation , Transplantation Immunology/drug effects , Adult , Drug Therapy, Combination , Female , Glomerular Filtration Rate , Graft Rejection/epidemiology , Graft Survival/drug effects , Graft Survival/immunology , Humans , Kidney Function Tests , Male , Middle Aged , Postoperative Complications , Prevalence , Prognosis , Risk Factors
18.
Am J Transplant ; 16(7): 2077-84, 2016 07.
Article in English | MEDLINE | ID: mdl-26752290

ABSTRACT

Choosing between multiple living kidney donors, or evaluating offers in kidney paired donation, can be challenging because no metric currently exists for living donor quality. Furthermore, some deceased donor (DD) kidneys can result in better outcomes than some living donor kidneys, yet there is no way to compare them on the same scale. To better inform clinical decision-making, we created a living kidney donor profile index (LKDPI) on the same scale as the DD KDPI, using Cox regression and adjusting for recipient characteristics. Donor age over 50 (hazard ratio [HR] per 10 years = 1.24, 95% CI 1.15-1.33), elevated BMI (HR per 10 units = 1.09, 95% CI 1.01-1.16), African-American race (HR = 1.25, 95% CI 1.15-1.37), cigarette use (HR = 1.16, 95% CI 1.09-1.23), as well as ABO incompatibility (HR = 1.27, 95% CI 1.03-1.58), HLA B mismatches (HR = 1.08, 95% CI 1.03-1.14), and DR mismatches (HR = 1.09, 95% CI 1.04-1.15) were associated with greater risk of graft loss after living donor transplantation (all p < 0.05). Median (interquartile range) LKDPI score was 13 (1-27); 24.2% of donors had LKDPI < 0 (less risk than any DD kidney), and 4.4% of donors had LKDPI > 50 (more risk than the median DD kidney). The LKDPI is a useful tool for comparing living donor kidneys to each other and to deceased donor kidneys.


Subject(s)
Clinical Decision-Making , Graft Rejection/epidemiology , Kidney Failure, Chronic/surgery , Kidney Transplantation , Living Donors , Risk Assessment/methods , Adult , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Survival , Humans , Kidney Function Tests , Male , Middle Aged , Prognosis , United States/epidemiology
19.
Am J Transplant ; 16(1): 137-42, 2016 Jan.
Article in English | MEDLINE | ID: mdl-26561981

ABSTRACT

Allocation policies for liver transplantation underwent significant changes in June 2013 with the introduction of Share 35. We aimed to examine the effect of Share 35 on regional variation in posttransplant outcomes. We examined two patient groups from the United Network for Organ Sharing dataset: a pre-Share 35 group composed of patients transplanted between June 17, 2012, and June 17, 2013 (n = 5523), and a post-Share 35 group composed of patients transplanted between June 18, 2013, and June 18, 2014 (n = 5815). We used Kaplan-Meier and Cox multivariable analyses to compare survival. There were significant increases in allocation Model for End-stage Liver Disease (MELD) scores, laboratory MELD scores, and proportions of patients in the intensive care unit and on mechanical ventilation or organ-perfusion support at transplant post-Share 35. We also observed a significant increase in donor risk index in this group. We found no difference on a national level in survival between patients transplanted pre-Share 35 and post-Share 35 (p = 0.987). Regionally, however, posttransplantation survival was significantly worse in the post-Share 35 patients in regions 4 and 10 (p = 0.008 and p = 0.04), with no significant differences in the remaining regions. These results suggest that Share 35 has been associated with transplanting "sicker patients" with higher MELD scores, and although no difference in survival is observed on a national level, outcomes appear to be concerning in some regions.


Subject(s)
Graft Rejection/prevention & control , Liver Failure/surgery , Liver Transplantation , Policy Making , Practice Guidelines as Topic/standards , Resource Allocation/methods , Tissue and Organ Procurement/standards , Female , Graft Rejection/epidemiology , Graft Survival , Humans , Male , Middle Aged , Tissue Donors , Waiting Lists
20.
Am J Transplant ; 16(1): 292-300, 2016 Jan.
Article in English | MEDLINE | ID: mdl-26317315

ABSTRACT

The Open Payments Program (OPP) was recently implemented to publicly disclose industry payments to physicians, with the goal of enabling patient awareness of potential conflicts of interest. Awareness of OPP, its data, and its implications for transplantation is critical. We used the first wave of OPP data to describe industry payments made to transplant surgeons. Transplant surgeons (N = 297) received a total of $759 654. The median (interquartile range [IQR]) payment to a transplant surgeon was $125 ($39-1018), and the highest payment to an individual surgeon was $83 520; 122 surgeons received <$100, and 17 received >$10 000. A higher h-index was associated with a 30% higher chance of receiving >$1000 (relative risk per 10-unit h-index increase = 1.30, 95% CI 1.18-1.44, p < 0.001). The highest payment category was consulting fees, with a total of $314 448 paid in this reported category. Recipients of consulting fees had higher h-indices, median (IQR) of 20 (10-35) versus nine (3-17) (p < 0.001). Ten of 122 companies accounted for 62% of all payments. Kidney transplant and liver transplant (LT) centers that received >$1000 had higher center volumes (p < 0.001). LT centers that received payments of >$1000 had a higher percentage of private-insurance/self-pay patients (p < 0.01). Continued surveillance of industry payments may further elucidate the relationship between industry payments and physician practices.


Subject(s)
Databases, Factual/economics , Drug Industry/economics , Organ Transplantation/economics , Practice Patterns, Physicians'/economics , Surgeons/economics , Truth Disclosure , Health Expenditures , Humans , Patient Protection and Affordable Care Act/legislation & jurisprudence , Research Report