1.
Clin Transl Sci ; 16(9): 1628-1638, 2023 09.
Article in English | MEDLINE | ID: mdl-37353859

ABSTRACT

Despite complex pathways of drug disposition, clinical pharmacogenetic predictors currently rely on only a few high-effect variants. Quantification of the polygenic contribution to variability in drug disposition is necessary to prioritize target drugs for pharmacogenomic approaches and guide analytic methods. Dexmedetomidine and fentanyl, often used in postoperative care of pediatric patients, have high rates of inter-individual variability in dosing requirements. Analyzing previously generated population pharmacokinetic parameters, we used Bayesian hierarchical mixed modeling to measure narrow-sense (additive) heritability (h²SNP) of dexmedetomidine and fentanyl clearance in children and identify relative contributions of small, moderate, and large effect-size variants to h²SNP. We used genome-wide association studies (GWAS) to identify variants contributing to variation in dexmedetomidine and fentanyl clearance, followed by functional analyses to identify associated pathways. For dexmedetomidine, median clearance was 33.0 L/h (interquartile range [IQR] 23.8-47.9 L/h) and h²SNP was estimated to be 0.35 (90% credible interval 0.00-0.90), with 45% of h²SNP attributed to large-, 32% to moderate-, and 23% to small-effect variants. The fentanyl cohort had median clearance of 8.2 L/h (IQR 4.7-16.7 L/h), with estimated h²SNP of 0.30 (90% credible interval 0.00-0.84). Large-effect variants accounted for 30% of h²SNP, whereas moderate- and small-effect variants accounted for 37% and 33%, respectively. As expected, given small sample sizes, no individual variants or pathways were significantly associated with dexmedetomidine or fentanyl clearance by GWAS. We conclude that clearance of both drugs is highly polygenic, motivating the future use of polygenic risk scores to guide appropriate dosing of dexmedetomidine and fentanyl.
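The quantity estimated above has a simple variance-component definition; the partitioned form below just sums bin-specific contributions (the bin notation is illustrative, not taken from the paper):

```latex
h^2_{\mathrm{SNP}} \;=\; \frac{\sigma^2_A}{\sigma^2_A + \sigma^2_E},
\qquad
h^2_{\mathrm{SNP}} \;=\; h^2_{\mathrm{small}} + h^2_{\mathrm{moderate}} + h^2_{\mathrm{large}}
```

where σ²_A is the additive genetic variance tagged by genotyped SNPs and σ²_E the residual variance; e.g., for dexmedetomidine, roughly 0.45 × 0.35 ≈ 0.16 of clearance variability is attributed to large-effect variants.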


Subject(s)
Dexmedetomidine , Humans , Child , Fentanyl , Genome-Wide Association Study , Bayes Theorem
2.
Br J Clin Pharmacol ; 88(6): 2885-2898, 2022 06.
Article in English | MEDLINE | ID: mdl-34957589

ABSTRACT

AIMS: Our objectives were to perform a population pharmacokinetic analysis of dexmedetomidine in children using remnant specimens and electronic health records (EHRs) and to explore the impact of patient characteristics and pharmacogenetics on dexmedetomidine clearance. METHODS: Dexmedetomidine dosing and patient data were gathered from EHRs and combined with opportunistically sampled remnant specimens. Population pharmacokinetic models were developed using nonlinear mixed-effects modelling. Stage 1 developed a model without genotype variables; Stage 2 added pharmacogenetic effects. RESULTS: Our final study population included 354 post-cardiac surgery patients aged 0-22 years (median 16 months). The data were best described by a 2-compartment model with allometric scaling for weight and a Hill maturation function for age. Population parameter estimates and 95% confidence intervals were 27.3 L/h (24.0-31.1 L/h) for total clearance, 161 L (139-187 L) for central compartment volume of distribution, 26.0 L/h (22.5-30.0 L/h) for intercompartmental clearance and 7903 L (5617-11,119 L) for peripheral compartment volume of distribution. The estimated postmenstrual age at which 50% of adult clearance is achieved was 42.0 weeks (41.5-42.5 weeks) and the Hill coefficient estimate was 7.04 (6.99-7.08). Genotype was not statistically or clinically significant. CONCLUSION: Our study demonstrates the use of real-world EHR data and remnant specimens to perform a population pharmacokinetic analysis and investigate covariate effects in a large paediatric population. Weight and age were important predictors of clearance. We did not find evidence for pharmacogenetic effects of UGT1A4 or UGT2B10 genotype or CYP2A6 risk score.
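The structural clearance model described above (allometric weight scaling with a Hill maturation function of postmenstrual age) can be sketched as follows; the standard Anderson-Holford parameterization is assumed here, since the abstract reports only the parameter estimates:

```python
# Sketch of the clearance model described above: allometric weight scaling
# with a Hill (sigmoid Emax) maturation function of postmenstrual age (PMA).
# The functional form is the standard Anderson-Holford parameterization and is
# an assumption; only the parameter estimates come from the abstract.

CL_POP = 27.3   # population clearance (L/h), 70 kg reference, fully mature
TM50 = 42.0     # PMA (weeks) at which 50% of adult clearance is reached
HILL = 7.04     # Hill coefficient (steepness of maturation)

def clearance(weight_kg: float, pma_weeks: float) -> float:
    """Predicted dexmedetomidine clearance (L/h)."""
    size = (weight_kg / 70.0) ** 0.75  # fixed allometric exponent of 0.75
    maturation = pma_weeks ** HILL / (pma_weeks ** HILL + TM50 ** HILL)
    return CL_POP * size * maturation

# A term neonate (PMA ~40 wk) is well below half the size-scaled clearance,
# while a 1-year-old (PMA ~92 wk) is essentially fully mature.
print(round(clearance(3.5, 40.0), 2))
print(round(clearance(10.0, 92.0), 2))
```

With a Hill coefficient near 7, the maturation curve is steep around 42 weeks PMA, so predicted clearance rises rapidly in the weeks around term.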


Subject(s)
Cardiac Surgical Procedures , Dexmedetomidine , Adult , Child , Electronic Health Records , Glucuronosyltransferase/genetics , Humans , Hypnotics and Sedatives , Models, Biological
3.
J Am Med Inform Assoc ; 28(4): 782-790, 2021 03 18.
Article in English | MEDLINE | ID: mdl-33338223

ABSTRACT

OBJECTIVE: To develop an algorithm for building longitudinal medication dose datasets using information extracted from clinical notes in electronic health records (EHRs). MATERIALS AND METHODS: We developed an algorithm that converts medication information extracted using natural language processing (NLP) into a usable format and builds longitudinal medication dose datasets. We evaluated the algorithm on 2 medications extracted from clinical notes of Vanderbilt's EHR and externally validated the algorithm using clinical notes from the MIMIC-III clinical care database. RESULTS: For the evaluation using Vanderbilt's EHR data, the performance of our algorithm was excellent; F1-measures were ≥0.98 for both dose intake and daily dose. For the external validation using MIMIC-III, the algorithm achieved F1-measures ≥0.85 for dose intake and ≥0.82 for daily dose. DISCUSSION: Our algorithm addresses the challenge of building longitudinal medication dose data using information extracted from clinical notes. Overall performance was excellent, but the algorithm can perform poorly when incorrect information is extracted by NLP systems. Although it performed reasonably well when applied to the external data source, its performance was worse due to differences in the way the drug information was written. The algorithm is implemented in the R package, "EHR," and the extracted data from Vanderbilt's EHRs along with the gold standards are provided so that users can reproduce the results and help improve the algorithm. CONCLUSION: Our algorithm for building longitudinal dose data provides a straightforward way to use EHR data for medication-based studies. The external validation results suggest its potential for applicability to other systems.
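For reference, the F1-measure used in the evaluation above is the harmonic mean of precision and recall; a minimal sketch with illustrative counts (not the study's actual tallies):

```python
# F1-measure as used in the evaluation above: the harmonic mean of precision
# (fraction of extracted dose values that are correct) and recall (fraction
# of gold-standard dose values that were extracted). Counts are illustrative.

def f1_score(tp: int, fp: int, fn: int) -> float:
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# e.g. 98 correct extractions, 1 spurious, 2 missed:
print(round(f1_score(98, 1, 2), 3))
```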


Subject(s)
Algorithms , Electronic Health Records , Natural Language Processing , Pharmaceutical Preparations/administration & dosage , Drug Therapy , Humans , Information Storage and Retrieval/methods
4.
Clin Pharmacol Ther ; 107(4): 934-943, 2020 04.
Article in English | MEDLINE | ID: mdl-31957870

ABSTRACT

Postmarketing population pharmacokinetic (PK) and pharmacodynamic (PD) studies can be useful to capture patient characteristics affecting PK or PD in real-world settings. These studies require longitudinally measured dose, outcomes, and covariates in large numbers of patients; however, prospective data collection is cost-prohibitive. Electronic health records (EHRs) can be an excellent source for such data, but there are challenges, including accurate ascertainment of drug dose. We developed a standardized system to prepare datasets from EHRs for population PK/PD studies. Our system handles a variety of tasks involving data extraction from clinical text using a natural language processing algorithm, data processing, and data building. Applying this system, we performed a fentanyl population PK analysis, resulting in comparable parameter estimates to a prior study. This new system makes the EHR data extraction and preparation process more efficient and accurate and provides a powerful tool to facilitate postmarketing population PK/PD studies using information available in EHRs.


Subject(s)
Data Interpretation, Statistical , Electronic Health Records/statistics & numerical data , Fentanyl/pharmacokinetics , Lamotrigine/pharmacokinetics , Product Surveillance, Postmarketing/statistics & numerical data , Tacrolimus/pharmacokinetics , Adolescent , Adult , Aged , Analgesics, Opioid/pharmacokinetics , Databases, Factual/statistics & numerical data , Female , Humans , Male , Middle Aged , Product Surveillance, Postmarketing/methods , Young Adult
5.
Undersea Hyperb Med ; 43(5): 549-566, 2016.
Article in English | MEDLINE | ID: mdl-28768073

ABSTRACT

Baseline sleep characteristics were explored for 71 U.S. military service members with mild traumatic brain injury (mTBI) enrolled in a post-concussive syndrome clinical trial. The Pittsburgh Sleep Quality Index (PSQI), sleep diary, several disorder-specific questionnaires, actigraphy and polysomnographic nap were collected. Almost all (97%) reported ongoing sleep problems. The mean global PSQI score was 13.5 (SD=3.8) and 87% met insomnia criteria. Sleep maintenance efficiency was 79.1% for PSQI, 82.7% for sleep diary and 90.5% for actigraphy; total sleep time was 288, 302 and 400 minutes, respectively. There was no correlation between actigraphy and subjective questionnaires. Overall, 70% met hypersomnia conditions, 70% were at high risk for obstructive sleep apnea (OSA), 32% were symptomatic for restless legs syndrome, and 6% reported cataplexy. Nearly half (44%) reported coexisting insomnia, hypersomnia and high OSA risk. Participants with post-traumatic stress disorder (PTSD) had higher PSQI scores and increased OSA risk. Older participants and those with higher aggression, anxiety or depression also had increased OSA risk. The results confirm poor sleep quality in mTBI with insomnia, hypersomnia, and OSA risk higher than previously reported, and imply sleep disorders in mTBI may be underdiagnosed or exacerbated by comorbid PTSD.


Subject(s)
Brain Concussion/complications , Military Personnel , Sleep Initiation and Maintenance Disorders/diagnosis , Actigraphy , Adult , Cataplexy/etiology , Female , Humans , Male , Middle Aged , Narcolepsy/diagnosis , Narcolepsy/etiology , Narcolepsy/physiopathology , Polysomnography , Post-Concussion Syndrome/therapy , Restless Legs Syndrome/etiology , Sleep Apnea, Obstructive/diagnosis , Sleep Apnea, Obstructive/etiology , Sleep Initiation and Maintenance Disorders/drug therapy , Sleep Initiation and Maintenance Disorders/etiology , Sleep Initiation and Maintenance Disorders/physiopathology , Stress Disorders, Post-Traumatic/complications , Surveys and Questionnaires
6.
Transplantation ; 99(2): 360-6, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25594552

ABSTRACT

BACKGROUND: Most pediatric kidney transplant recipients eventually require retransplantation, and the most advantageous timing strategy regarding deceased and living donor transplantation in candidates with only 1 living donor remains unclear. METHODS: A patient-oriented Markov decision process model was designed to compare, for a given patient with 1 living donor, living-donor-first followed if necessary by deceased donor retransplantation versus deceased-donor-first followed if necessary by living donor (if still able to donate) or deceased donor (if not) retransplantation. Based on Scientific Registry of Transplant Recipients data, the model was designed to account for waitlist, graft, and patient survival, sensitization, increased risk of graft failure seen during late adolescence, and differential deceased donor waiting times based on pediatric priority allocation policies. Based on national cohort data, the model was also designed to account for aging or disease development, leading to ineligibility of the living donor over time. RESULTS: Given a set of candidate and living donor characteristics, the Markov model provides the expected patient survival over a time horizon of 20 years. For the most highly sensitized patients (panel reactive antibody > 80%), a deceased-donor-first strategy was advantageous, but for all other patients (panel reactive antibody < 80%), a living-donor-first strategy was recommended. CONCLUSIONS: This Markov model illustrates how patients, families, and providers can be provided information and predictions regarding the most advantageous use of deceased donor versus living donor transplantation for pediatric recipients.
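A Markov model of this kind can be sketched as a discrete-time transition matrix in which expected patient survival over a 20-year horizon is the expected number of years spent in non-absorbing "alive" states; the states and annual probabilities below are purely illustrative, not the SRTR-derived values used in the paper:

```python
# Toy discrete-time Markov chain in the spirit of the decision model above.
# States and annual transition probabilities are purely illustrative.

STATES = ["waitlist", "functioning_graft", "graft_failure", "dead"]
P = [
    [0.60, 0.30, 0.00, 0.10],  # waitlist: remain, transplanted, -, die
    [0.00, 0.92, 0.05, 0.03],  # functioning graft: remain, fail, die
    [0.20, 0.00, 0.70, 0.10],  # graft failure: relist/retransplant, remain, die
    [0.00, 0.00, 0.00, 1.00],  # dead (absorbing)
]

def step(dist):
    """Advance the state distribution by one model year."""
    return [sum(dist[i] * P[i][j] for i in range(4)) for j in range(4)]

def expected_survival(dist, years=20):
    """Expected number of years spent alive over the horizon."""
    total = 0.0
    for _ in range(years):
        dist = step(dist)
        total += 1.0 - dist[3]  # probability of being alive that year
    return total

print(round(expected_survival([1.0, 0.0, 0.0, 0.0]), 1))  # start on waitlist
```

Comparing `expected_survival` under a living-donor-first versus deceased-donor-first transition structure is the essence of the strategy comparison described above.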


Subject(s)
Decision Support Techniques , Donor Selection , Kidney Transplantation/methods , Living Donors/supply & distribution , Adolescent , Adult , Age Factors , Child , Computer Simulation , Eligibility Determination , Female , Graft Survival , HLA Antigens/immunology , Histocompatibility , Humans , Isoantibodies/blood , Kidney Transplantation/adverse effects , Kidney Transplantation/mortality , Male , Markov Chains , Middle Aged , Multivariate Analysis , Proportional Hazards Models , Registries , Reoperation , Risk Factors , Stochastic Processes , Time Factors , Treatment Outcome , United States , Waiting Lists , Young Adult
7.
Transplantation ; 99(5): 997-1002, 2015 May.
Article in English | MEDLINE | ID: mdl-25340600

ABSTRACT

BACKGROUND: Patient-level risk factors for delayed graft function (DGF) have been well described. However, the Organ Procurement and Transplantation Network definition of DGF is based on dialysis in the first week, which is subject to center-level practice patterns. It remains unclear if there are center-level differences in DGF and if measurable center characteristics can explain these differences. METHODS: Using the 2003 to 2012 Scientific Registry of Transplant Recipients data, we developed a hierarchical (multilevel) model to determine the association between center characteristics and DGF incidence after adjusting for known patient risk factors and to quantify residual variability across centers after adjustment for these factors. RESULTS: Of 82,143 deceased donor kidney transplant recipients, 27.0% developed DGF, with a range across centers of 3.2% to 63.3%. A center's proportion of preemptive transplants (odds ratio [OR], 0.83 per 5% increment; 95% confidence interval [95% CI], 0.74-0.93; P = 0.001) and kidneys with longer than 30 hr of cold ischemia time (CIT) (OR, 0.95 per 5% increment; 95% CI, 0.92-0.98; P = 0.001) were associated with less DGF. A center's proportion of donation after cardiac death donors (OR, 1.12 per 5% increment; 95% CI, 1.03-1.17; P < 0.001) and imported kidneys (OR, 1.06 per 5% increment; 95% CI, 1.03-1.10; P < 0.001) were associated with more DGF. After patient-level and center-level adjustments, only 41.8% of centers had DGF incidences consistent with the national median and 28.2% had incidences above the national median. CONCLUSION: Significant heterogeneity in DGF incidences across centers, even after adjusting for patient-level and center-level characteristics, calls into question the generalizability and validity of the current DGF definition. Enhanced understanding of center-level variability and improving the definition of DGF accordingly may improve DGF's utility in clinical care and as a surrogate endpoint in clinical trials.
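Because the model is logistic, the per-5%-increment odds ratios reported above compound multiplicatively for larger shifts (effects are additive on the log-odds scale); a minimal sketch using one of the quoted estimates:

```python
# Interpreting per-5%-increment odds ratios from the logistic model above:
# on the log-odds scale effects are additive, so the OR for a larger shift
# is the per-increment OR raised to the number of increments.

def or_for_shift(or_per_5pct: float, shift_pct: float) -> float:
    """Odds ratio implied for a shift of `shift_pct` percentage points."""
    return or_per_5pct ** (shift_pct / 5.0)

# A center performing 20 percentage points more preemptive transplants
# (OR 0.83 per 5% increment from the abstract):
print(round(or_for_shift(0.83, 20.0), 2))
```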


Subject(s)
Delayed Graft Function/etiology , Kidney Transplantation/adverse effects , Adult , Aged , Female , Humans , Logistic Models , Male , Middle Aged , Tissue Donors
8.
Pediatrics ; 133(4): 594-601, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24616363

ABSTRACT

OBJECTIVE: To investigate changes in pediatric kidney transplant outcomes over time and potential variations in these changes between the early and late posttransplant periods and across subgroups based on recipient, donor, and transplant characteristics. METHODS: Using multiple logistic regression and multivariable Cox models, graft and patient outcomes were analyzed in 17,446 pediatric kidney-only transplants performed in the United States between 1987 and 2012. RESULTS: Ten-year patient and graft survival rates were 90.5% and 60.2%, respectively, after transplantation in 2001, compared with 77.6% and 46.8% after transplantation in 1987. Primary nonfunction and delayed graft function occurred in 3.3% and 5.3%, respectively, of transplants performed in 2011, compared with 15.4% and 19.7% of those performed in 1987. Adjusted for recipient, donor, and transplant characteristics, these improvements corresponded to a 5% decreased hazard of graft loss, 5% decreased hazard of death, 10% decreased odds of primary nonfunction, and 5% decreased odds of delayed graft function with each more recent year of transplantation. Graft survival improvements were lower in adolescent and female recipients, those receiving pretransplant dialysis, and those with focal segmental glomerulosclerosis. Patient survival improvements were higher in those with elevated peak panel reactive antibody. Both patient and graft survival improvements were most pronounced in the first posttransplant year. CONCLUSIONS: Outcomes after pediatric kidney transplantation have improved dramatically over time for all recipient subgroups, especially for highly sensitized recipients. Most improvement in graft and patient survival has come in the first year after transplantation, highlighting the need for continued progress in long-term outcomes.


Subject(s)
Kidney Transplantation/trends , Adolescent , Child , Child, Preschool , Female , Graft Survival , Humans , Infant , Kidney Transplantation/mortality , Male , Survival Rate , Time Factors , Treatment Outcome , United States
9.
Transplantation ; 97(4): 446-50, 2014 Feb 27.
Article in English | MEDLINE | ID: mdl-24162248

ABSTRACT

BACKGROUND: Kidney transplantation (KT) is the treatment for end-stage renal disease in appropriate HIV-positive individuals. However, acute rejection (AR) rates are over twice those of HIV-negative recipients. METHODS: To better understand optimal immunosuppression for HIV-positive KT recipients, we studied associations between immunosuppression regimen, AR at 1 year, and survival in 516 HIV-positive and 93,027 HIV-negative adult kidney-only recipients using Scientific Registry of Transplant Recipients data from 2003 to 2011. RESULTS: Consistent with previous reports, HIV-positive patients had twofold higher risk of AR (adjusted relative risk [aRR], 1.77; 95% confidence interval [CI], 1.45-2.2; P<0.001) than their HIV-negative counterparts as well as a higher risk of graft loss (adjusted hazard ratio, 1.51; 95% CI, 1.18-1.94; P=0.001), but these differences were not seen among patients receiving antithymocyte globulin (ATG) induction (aRR for AR, 1.16; 95% CI, 0.41-3.35, P=0.77; adjusted hazard ratio for graft loss, 1.54; 95% CI, 0.73-3.25; P=0.26). Furthermore, HIV-positive patients receiving ATG induction had a 2.6-fold lower risk of AR (aRR, 0.39; 95% CI, 0.18-0.87; P=0.02) than those receiving no antibody induction. Conversely, HIV-positive patients receiving sirolimus-based therapy had a 2.2-fold higher risk of AR (aRR, 2.15; 95% CI, 1.20-3.86; P=0.01) than those receiving calcineurin inhibitor-based regimens. CONCLUSION: These findings support a role for ATG induction, and caution against the use of sirolimus-based maintenance therapy, in HIV-positive individuals undergoing KT.


Subject(s)
Graft Rejection , HIV Infections/complications , Immunosuppression Therapy/methods , Immunosuppressive Agents/therapeutic use , Kidney Failure, Chronic/therapy , Kidney Transplantation/methods , Adolescent , Adult , Aged , Antilymphocyte Serum/metabolism , Calcineurin Inhibitors , Female , Graft Survival , HIV Infections/immunology , Humans , Kidney Failure, Chronic/complications , Male , Middle Aged , Multivariate Analysis , Registries , Risk , Sirolimus/chemistry , Treatment Outcome , Young Adult
10.
Transplantation ; 96(5): 487-93, 2013 Sep 15.
Article in English | MEDLINE | ID: mdl-24002689

ABSTRACT

BACKGROUND: Living-donor kidney transplantation (KT) is encouraged for children with end-stage renal disease due to superior long-term graft survival compared with deceased-donor KT. Despite this, there has been a steady decrease in the use of living-donor KT for pediatric recipients. Due to their young age at transplantation, most pediatric recipients eventually require retransplantation, and the optimal order of donor type is not clear. METHODS: Using the Scientific Registry of Transplant Recipients, we analyzed first and second graft survival among 14,799 pediatric (<18 years old) recipients undergoing KT between 1987 and 2010. RESULTS: Living-donor grafts had longer survival compared with deceased-donor grafts, similarly among both first (adjusted hazard ratio [aHR], 0.78; 95% confidence interval [CI], 0.73-0.84; P<0.001) and second (aHR, 0.74; 95% CI, 0.64-0.84; P<0.001) transplants. Living-donor second grafts had longer survival compared with deceased-donor second grafts, similarly after living-donor (aHR, 0.68; 95% CI, 0.56-0.83; P<0.001) and deceased-donor (aHR, 0.77; 95% CI, 0.63-0.95; P=0.02) first transplants. Cumulative graft life of two transplants was similar regardless of the order of deceased-donor and living-donor transplantation. CONCLUSIONS: Deceased-donor KT in pediatric recipients followed by living-donor retransplantation does not negatively impact the living-donor graft survival advantage and provides similar cumulative graft life compared with living-donor KT followed by deceased-donor retransplantation. Clinical decision-making for pediatric patients with healthy, willing living donors should consider these findings in addition to the risk of sensitization, aging of the living donor, and deceased-donor waiting times.


Subject(s)
Kidney Transplantation , Tissue Donors , Adolescent , Child , Child, Preschool , Female , Graft Survival , Humans , Living Donors , Male , Reoperation
11.
J Pediatr Surg ; 48(6): 1277-82, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23845618

ABSTRACT

BACKGROUND/PURPOSE: Living donor kidney transplantation is encouraged for children with end-stage renal disease given the superior survival of living donor grafts, but pediatric candidates are also given preference for kidneys from younger deceased donors. METHODS: Death-censored graft survival of pediatric kidney-only transplants performed in the U.S. between 1987 and 2012 was compared across living related (LRRT, n=7741), living unrelated (LURT, n=618), and deceased donor renal transplants (DDRT, n=8945) using Kaplan-Meier analysis, multivariable Cox proportional hazards models, and matched-controls analysis. RESULTS: As expected, HLA mismatch was greater among LURT compared to LRRT (p<0.001). Unadjusted graft survival was lower, particularly long-term, for LURT compared to LRRT (p=0.009). However, LURT graft survival was still superior to DDRT graft survival, even when compared only to deceased donors under age 35 (p=0.002). The difference in graft survival between LURT and LRRT was not seen when adjusting for HLA mismatch, year of transplantation, and donor and recipient characteristics using a Cox model (aHR=1.04, 95% CI: 0.87-1.24, p=0.7) or matched controls (HR=1.02, 95% CI: 0.82-1.27, p=0.9). CONCLUSION: Survival of LURT grafts is superior to that of grafts from younger deceased donors and equivalent to that of LRRT grafts when adjusting for other factors, most notably differences in HLA mismatch.


Subject(s)
Graft Survival , Kidney Failure, Chronic/surgery , Kidney Transplantation/methods , Living Donors , Unrelated Donors , Adolescent , Age Factors , Child , Child, Preschool , Female , Follow-Up Studies , Humans , Infant , Infant, Newborn , Male , Registries , Survival Analysis , Treatment Outcome , United States
12.
Transplantation ; 95(11): 1360-8, 2013 Jun 15.
Article in English | MEDLINE | ID: mdl-23549198

ABSTRACT

BACKGROUND: More than 25% of pediatric kidney transplants are lost within 7 years, necessitating dialysis or retransplantation. Retransplantation practices and the outcomes of repeat transplantations, particularly among those with early graft loss, are not clear. METHODS: We examined retransplantation practice patterns and outcomes in 14,799 pediatric (ages <18 years) patients between 1987 and 2010. Death-censored graft survival was analyzed using extended Cox models and retransplantation using competing risks regression. RESULTS: After the first graft failure, 50.4% underwent retransplantation and 12.1% died within 5 years; after the second graft failure, 36.1% underwent retransplantation and 15.4% died within 5 years. Prior preemptive transplantation and graft loss after 5 years were associated with increased rates of retransplantation. Graft loss before 5 years, older age, non-Caucasian race, public insurance, and increased panel-reactive antibody were associated with decreased rates of retransplantation. First transplants had lower risk of graft loss compared with second (adjusted hazard ratio [aHR], 0.72; 95% confidence interval [CI], 0.64-0.80; P<0.001), third (aHR, 0.62; 95% CI, 0.49-0.78; P<0.001), and fourth (aHR, 0.44; 95% CI, 0.24-0.78; P=0.005) transplants. However, among patients receiving two or more transplants (conditioned on having lost a first transplant), second graft median survival was 8.5 years despite a median survival of 4.5 years for the first transplant. Among patients receiving three or more transplants, third graft median survival was 7.7 years despite median survivals of 2.1 and 3.1 years for the first and second transplants. CONCLUSIONS: Among pediatric kidney transplant recipients who experience graft loss, racial and socioeconomic disparities exist with regard to retransplantation, and excellent graft survival can be achieved with retransplantation despite poor survival of previous grafts.


Subject(s)
Graft Rejection/epidemiology , Kidney Transplantation/mortality , Kidney Transplantation/statistics & numerical data , Practice Patterns, Physicians'/trends , Transplantation , Adolescent , Age Factors , Child , Child, Preschool , Female , Humans , Incidence , Kidney Transplantation/ethnology , Male , Patient Selection , Racial Groups , Reoperation/statistics & numerical data , Retrospective Studies , Socioeconomic Factors , Survival Rate , Time Factors , Treatment Outcome , Young Adult
13.
J Phys Act Health ; 10(3): 323-34, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23620388

ABSTRACT

BACKGROUND: Path quality has not been well studied as a correlate of active transport to school. We hypothesize that for urban-dwelling children the environment between home and school is at least as important as the environment immediately surrounding their homes and/or schools when exploring walking-to-school behavior. METHODS: Tools from spatial statistics and geographic information systems (GIS) were applied to an assessment of street blocks to create a walking path quality measure based on physical and social disorder (termed "incivilities") for each child. Path quality was included in a multivariate regression analysis of walking-to-school status for a sample of 362 children. RESULTS: The odds ratio of walking to school for path quality was 0.88 (95% CI: 0.72-1.07), which, although not statistically significant, is in the direction supporting our hypothesis. The odds ratio of walking to school for home street block incivility suggests a counterintuitive effect (OR = 1.10, 95% CI: 1.08-1.19). CONCLUSIONS: Results suggest that urban children living in communities characterized by higher incivilities are more likely to walk to school, potentially placing them at risk for adverse health outcomes because of exposure to high-incivility areas along their route. Results also support the importance of including path quality when exploring the influence of the environment on walking-to-school behavior.


Subject(s)
Environment Design , Schools , Social Problems , Walking/statistics & numerical data , Baltimore , Child , Female , Geographic Information Systems , Humans , Male , Models, Statistical , Social Environment , Urban Population
14.
Clin J Am Soc Nephrol ; 8(6): 1019-26, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23430210

ABSTRACT

BACKGROUND AND OBJECTIVE: The risk of graft loss after pediatric kidney transplantation increases during late adolescence and early adulthood, but the extent to which this phenomenon affects all recipients is unknown. This study explored interactions between recipient factors and this high-risk age window, searching for a recipient phenotype that may be less susceptible during this detrimental age interval. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: With use of Scientific Registry of Transplant Recipients data from 1987 to 2010, risk of graft loss across recipient age was quantified using a multivariable piecewise-constant hazard rate model with time-varying coefficients for recipient risk factors. RESULTS: Among 16,266 recipients, graft loss during ages ≥17 and <24 years was greater than that for both 3-17 years (adjusted hazard ratio [aHR], 1.61; P<0.001) and ≥24 years (aHR, 1.28; P<0.001). This finding was consistent across age at transplantation, sex, race, cause of renal disease, insurance type, pretransplant dialysis history, previous transplant, peak panel-reactive antibody (PRA), and type of induction immunosuppression. The high-risk window was seen in both living-donor and deceased-donor transplant recipients, at all levels of HLA mismatch, regardless of centers' pediatric transplant volume, and consistently over time. The relationship between graft loss risk and donor type, PRA, transplant history, insurance type, and cause of renal disease was diminished upon entry into the high-risk window. CONCLUSIONS: No recipient subgroups are exempt from the dramatic increase in graft loss during late adolescence and early adulthood, a high-risk window that modifies the relationship between typical recipient risk factors and graft loss.


Subject(s)
Graft Survival , Kidney Transplantation/adverse effects , Time Factors , Adolescent , Adult , Age Factors , Child , Child, Preschool , Female , Humans , Male , Models, Statistical , Multivariate Analysis , Risk Factors , Treatment Outcome , Young Adult
15.
Transplantation ; 93(11): 1147-50, 2012 Jun 15.
Article in English | MEDLINE | ID: mdl-22461037

ABSTRACT

BACKGROUND: Lack of education and reluctance to initiate a conversation about live donor kidney transplantation is a common barrier to finding a donor. Although transplant candidates are often hesitant to discuss their illness, friends or family members are often eager to spread awareness and are empowered by advocating for the candidates. We hypothesized that separating the advocate from the patient is important in identifying live donors. METHODS: We developed an intervention to train a live donor champion (LDC; a friend, family member, or community member willing to advocate for the candidate) for this advocacy role. We compared outcomes of 15 adult kidney transplant candidates who had no prospective donors and underwent the LDC intervention with 15 matched controls from our waiting list. RESULTS: Comfort in initiating a conversation about transplantation increased over time for LDCs. Twenty-five potential donors contacted our center on behalf of LDC participants; four participants achieved live donor kidney transplantation and three additional participants have donors in evaluation, compared with zero among matched controls (P < 0.001). CONCLUSIONS: Transplant candidates are ill equipped to seek live donors; by separating the advocate from the patient, understandable concerns about initiating conversations are reduced.


Subject(s)
Communication Barriers , Kidney Transplantation , Living Donors , Patient Advocacy , Tissue and Organ Procurement/methods , Aged , Female , Health Education , Humans , Linear Models , Male , Middle Aged , Outcome Assessment, Health Care , Patient Advocacy/education , Prospective Studies
16.
Transplantation ; 93(6): 603-9, 2012 Mar 27.
Article in English | MEDLINE | ID: mdl-22290268

ABSTRACT

BACKGROUND: ABO incompatible (ABOi) kidney transplantation is an important modality to facilitate living donor transplant for incompatible pairs. To date, reports of the outcomes from this practice in the United States have been limited to single-center studies. METHODS: Using the Scientific Registry of Transplant Recipients, we identified 738 patients who underwent live-donor ABOi kidney transplantation between January 1, 1995, and March 31, 2010. These were compared with matched controls that underwent ABO compatible live-donor kidney transplantation. Subgroup analyses among ABOi recipients were performed according to donor blood type, recipient blood type, and transplant center ABOi volume. RESULTS: When compared with ABO compatible-matched controls, long-term patient survival of ABOi recipients was not significantly different between the cohorts (P=0.2). However, graft loss was significantly higher, particularly in the first 14 days posttransplant (subhazard ratio, 2.34; 95% confidence interval, 1.43-3.84; P=0.001), with little to no difference beyond day 14 (subhazard ratio, 1.28; 95% confidence interval, 0.99-1.54; P=0.058). In subgroup analyses among ABOi recipients, no differences in survival were seen by donor blood type, recipient blood type, or transplant center ABOi volume. CONCLUSIONS: These results support the use and dissemination of ABOi transplantation when a compatible live donor is not available, but caution that the highest period of risk is immediately posttransplant.


Subject(s)
ABO Blood-Group System/immunology , Blood Group Incompatibility/immunology , Kidney Failure, Chronic/surgery , Kidney Transplantation/immunology , Kidney Transplantation/mortality , Adult , Cohort Studies , Female , Follow-Up Studies , Graft Rejection/immunology , Humans , Kidney Failure, Chronic/epidemiology , Living Donors , Male , Middle Aged , Prognosis , Retrospective Studies , Risk Factors , Survival Rate , United States/epidemiology
17.
Arch Surg ; 147(2): 190-3, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22351919

ABSTRACT

The ability to predict outcomes following a kidney transplant is limited by the complex physiologic decline of kidney failure, a latent factor that is difficult to capture using conventional comorbidity assessment. The frailty phenotype is a recently described inflammatory state of increased vulnerability to stressors resulting from decreased physiologic reserve and dysregulation of multiple physiologic systems. We hypothesized that frailty would be associated with delayed graft function, based on putative associations between inflammatory cytokines and graft dysfunction. We prospectively measured frailty in 183 kidney transplant recipients between December 2008 and April 2010. Independent associations between frailty and delayed graft function were analyzed using modified Poisson regression. Preoperative frailty was independently associated with a 1.94-fold increased risk for delayed graft function (95% CI, 1.13-3.36; P = .02). The assessment of frailty may provide further insights into the pathophysiology of allograft dysfunction and may improve our ability to preoperatively risk-stratify kidney transplant recipients.


Subject(s)
Delayed Graft Function/physiopathology , Health Status , Kidney Failure, Chronic/surgery , Kidney Transplantation/pathology , Adolescent , Adult , Aged , Aged, 80 and over , Female , Hand Strength , Humans , Kidney Transplantation/physiology , Male , Middle Aged , Multivariate Analysis , Prospective Studies , Risk Assessment , Treatment Outcome , Young Adult
18.
Liver Transpl ; 18(6): 621-9, 2012 Jun.
Article in English | MEDLINE | ID: mdl-22344967

ABSTRACT

Approximately 14,000 women of reproductive age are currently living in the United States after liver transplantation (LT), and another 500 undergo LT each year. Although LT improves reproductive function in women with advanced liver disease, the associated pregnancy outcomes and maternal-fetal risks have not been quantified in a broad manner. To obtain more generalizable inferences, we performed a systematic review and meta-analysis of articles that were published between 2000 and 2011 and reported pregnancy-related outcomes for LT recipients. Eight of 578 unique studies met the inclusion criteria, and these studies represented 450 pregnancies in 306 LT recipients. The post-LT live birth rate [76.9%, 95% confidence interval (CI) = 72.7%-80.7%] was higher than the live birth rate for the US general population (66.7%) but was similar to the post-kidney transplantation (KT) live birth rate (73.5%). The post-LT miscarriage rate (15.6%, 95% CI = 12.3%-19.2%) was lower than the miscarriage rate for the general population (17.1%) but was similar to the post-KT miscarriage rate (14.0%). The rates of pre-eclampsia (21.9%, 95% CI = 17.7%-26.4%), cesarean section delivery (44.6%, 95% CI = 39.2%-50.1%), and preterm delivery (39.4%, 95% CI = 33.1%-46.0%) were higher than the rates for the US general population (3.8%, 31.9%, and 12.5%, respectively) but lower than the post-KT rates (27.0%, 56.9%, and 45.6%, respectively). Both the mean gestational age and the mean birth weight were significantly greater (P < 0.001) for LT recipients versus KT recipients (36.5 versus 35.6 weeks and 2866 versus 2420 g). Although pregnancy after LT is feasible, the complication rates are relatively high and should be considered during patient counseling and clinical decision making. More case and center reports are necessary so that information on post-LT pregnancy outcomes and complications can be gathered to improve the clinical management of pregnant LT recipients. Continued reporting to active registries is highly encouraged at the center level.


Subject(s)
Liver Failure/epidemiology , Liver Failure/surgery , Liver Transplantation/statistics & numerical data , Pregnancy Complications/epidemiology , Pregnancy Outcome/epidemiology , Female , Humans , Pregnancy
19.
Am J Kidney Dis ; 59(6): 849-57, 2012 Jun.
Article in English | MEDLINE | ID: mdl-22370021

ABSTRACT

BACKGROUND: On average, African Americans attain living donor kidney transplantation (LDKT) at decreased rates compared with their non-African American counterparts. However, center-level variations in this disparity and the role of center-level factors are unknown. STUDY DESIGN: Observational cohort study. SETTING & PARTICIPANTS: 247,707 adults registered for first-time kidney transplants from 1995-2007 as reported by the Scientific Registry of Transplant Recipients. PREDICTORS: Patient-level factors (age, sex, body mass index, insurance status, education, blood type, and panel-reactive antibody level) were adjusted for in all models. The association of center-level characteristics (number of candidates, transplant volume, LDKT volume, median time to transplant, percentage of African American candidates, percentage of prelisted candidates, and percentage of LDKT) and degree of racial disparity in LDKT was quantified. OUTCOMES: Hierarchical multivariate logistic regression models were used to derive center-specific estimates of LDKT attainment in African American versus non-African American candidates. RESULTS: Racial parity was not seen at any of the 275 transplant centers in the United States. At centers with the least racial disparity, African Americans had 35% lower odds of receiving LDKT; at centers with the most disparity, African Americans had 76% lower odds. Higher percentages of African American candidates (interaction term, 0.86; P = 0.03) and prelisted candidates (interaction term, 0.80; P = 0.001) at a given center were associated with increased racial disparity at that center. Higher rates of LDKT (interaction term, 1.25; P < 0.001) were associated with less racial disparity. LIMITATIONS: Some patient-level factors are not captured, including a given patient's pool of potential donors. Geographic disparities in deceased donor availability might affect LDKT rates. Center-level policies and practices are not captured. CONCLUSIONS: Racial disparity in attainment of LDKT exists at every transplant center in the country. Centers with higher rates of LDKT attainment for all races had less disparity; these high-performing centers may provide insights into policies that could help address this disparity.


Subject(s)
Black or African American/statistics & numerical data , Healthcare Disparities/trends , Kidney Failure, Chronic/ethnology , Kidney Transplantation/ethnology , Living Donors/statistics & numerical data , White People/statistics & numerical data , Adolescent , Adult , Aged , Cohort Studies , Donor Selection , Female , Humans , Incidence , Kidney Failure, Chronic/diagnosis , Kidney Failure, Chronic/surgery , Kidney Transplantation/statistics & numerical data , Logistic Models , Male , Middle Aged , Multivariate Analysis , Needs Assessment , Risk Factors , Treatment Outcome , United States , Young Adult
20.
Clin J Am Soc Nephrol ; 6(11): 2705-11, 2011 Nov.
Article in English | MEDLINE | ID: mdl-21940839

ABSTRACT

BACKGROUND AND OBJECTIVES: Kidney transplantation from donors after cardiac death (DCD) provides similar graft survival to donors after brain death (DBD) in adult recipients. However, outcomes of DCD kidneys in pediatric recipients remain unclear, primarily because of limited sample sizes. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: We identified 137 pediatric (<18 years old) recipients of DCD kidneys between 1994 and 2010 using Scientific Registry of Transplant Recipients data and compared outcomes with 6059 pediatric recipients of DBD kidneys during the same time period, accounting for donor, recipient, and transplant characteristics using time-varying Cox regression and matched controls. Long-term follow-up (4 years or beyond) was available for 31 DCD recipients. RESULTS: Pediatric recipients of DCD kidneys experienced a significantly higher rate of delayed graft function (22.0% versus 12.3%; P = 0.001), although lower than reported delayed graft function rates of DCD grafts in adults. Although DCD and DBD graft survival was equal in the early postoperative period, graft loss among pediatric recipients of DCD kidneys exceeded their DBD counterparts starting 4 years after transplantation. This effect was statistically significant in a multivariate Cox model (hazard ratio = 2.03; 95% confidence interval, 1.21 to 3.39; P = 0.007) and matched-controls analysis (hazard ratio = 2.36; 95% confidence interval, 1.11 to 5.03; P = 0.03). CONCLUSIONS: A significant increase in DCD graft loss starting 4 years after transplantation motivates a cautious approach to the use of DCD kidneys in children, in whom long-term graft survival is of utmost importance.


Subject(s)
Donor Selection , Graft Survival , Kidney Transplantation/adverse effects , Tissue Donors/supply & distribution , Adolescent , Adult , Age Factors , Brain Death , Chi-Square Distribution , Child , Delayed Graft Function/etiology , Female , Graft Rejection/etiology , Humans , Kaplan-Meier Estimate , Kidney Transplantation/mortality , Male , Patient Selection , Proportional Hazards Models , Registries , Risk Assessment , Risk Factors , Time Factors , Tissue and Organ Procurement , Treatment Outcome , United States , Young Adult