Results 1 - 20 of 160
1.
Transpl Infect Dis ; : e14317, 2024 Jun 09.
Article in English | MEDLINE | ID: mdl-38852064

ABSTRACT

BACKGROUND: Opportunistic infections (OIs) are a significant cause of morbidity and mortality after organ transplantation, though data in the liver transplant (LT) population are limited. METHODS: We performed a retrospective cohort study of LT recipients between January 1, 2007 and December 31, 2016 using Medicare claims data linked to the Organ Procurement and Transplantation Network database. Multivariable Cox regression models evaluated factors independently associated with hospitalizations for early (≤1 year post transplant) and late (>1 year) OIs, with a particular focus on immunosuppression. RESULTS: There were 11 320 LT recipients included in the study, of whom 13.2% had at least one OI hospitalization during follow-up. Of the 2638 OI hospitalizations, 61.9% were early post-LT. Cytomegalovirus was the most common OI (45.4% overall), although its relative frequency decreased after the first year (25.3%). Neither induction nor maintenance immunosuppression was associated with early OI hospitalization (all p > .05). The highest risk of early OI was seen with primary sclerosing cholangitis (aHR 1.74; p = .003 overall). Steroid-based and mechanistic target of rapamycin inhibitor-based immunosuppression at 1 year post LT were independently associated with increased late OI (p < .001 overall). CONCLUSION: This study found that OI hospitalizations are relatively common among LT recipients and frequently occur later than previously reported. The immunosuppression regimen may be an important modifiable risk factor for late OIs.

2.
Kidney Med ; 6(5): 100814, 2024 May.
Article in English | MEDLINE | ID: mdl-38689836

ABSTRACT

Rationale & Objective: Limited data exist on longitudinal kidney outcomes after nonsurgical obesity treatments. We investigated the effects of intensive lifestyle intervention on kidney function over 10 years. Study Design: Post hoc analysis of the Action for Health in Diabetes (Look AHEAD) randomized controlled trial. Setting & Participants: We studied 4,901 individuals with type 2 diabetes and body mass index of ≥25 kg/m2 enrolled in Look AHEAD (2001-2015). The original Look AHEAD trial excluded individuals with 4+ urine dipstick protein, serum creatinine level of >1.4 mg/dL (women) or >1.5 mg/dL (men), or dialysis dependence. Exposures: Intensive lifestyle intervention versus diabetes support and education (ie, usual care). Outcome: The primary outcome was estimated glomerular filtration rate (eGFR, mL/min/1.73 m2) slope. Secondary outcomes were mean eGFR and the slope and mean of the urine albumin-to-creatinine ratio (UACR, mg/mg). Analytical Approach: Linear mixed-effects models with random slopes and intercepts were used to evaluate the association between randomization arms and within-individual repeated measures of eGFR and UACR. We tested for effect modification by baseline eGFR. Results: At baseline, mean eGFR was 89, and 83% had a normal UACR. Over 10 years, there was no difference in eGFR slope (+0.064 per year; 95% CI: -0.036 to 0.16; P = 0.21) between arms. Neither the slope nor the mean of UACR differed between arms. Baseline eGFR, categorized as eGFR of <80, 80-100, or >100, did not modify the intervention's effect on eGFR slope or mean. Limitations: Loss of muscle mass may confound creatinine-based eGFR. Conclusions: In patients with type 2 diabetes and preserved kidney function, intensive lifestyle intervention did not change eGFR slope over 10 years. Among participants with baseline eGFR <80, lifestyle intervention was associated with a slightly higher longitudinal mean eGFR than usual care. Further studies evaluating the effects of intensive lifestyle intervention in people with kidney disease are needed.


Lifestyle interventions can improve chronic kidney disease risk factors, specifically diabetes, hypertension, and obesity. However, the effects of lifestyle intervention on change in kidney function (estimated glomerular filtration rate [eGFR]) over time are not well established. We studied Action for Health in Diabetes (Look AHEAD) trial data because all participants had diabetes and overweight or obesity. Look AHEAD randomized participants to intensive lifestyle intervention or diabetes support and education (ie, usual care). We compared eGFR change over 10 years between groups but found no difference. However, the intervention group maintained slightly higher eGFR than usual care, especially if eGFR was relatively low at baseline. Our study suggests lifestyle intervention may preserve eGFR, but dedicated studies in individuals with chronic kidney disease are needed.
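The eGFR-slope comparison described above can be sketched in a simplified two-stage form: fit each participant's least-squares eGFR slope over time, then average slopes by randomization arm. The trial itself used linear mixed-effects models with random slopes and intercepts; this stand-in uses invented data and is only illustrative.

```python
# Two-stage sketch of an eGFR-slope comparison (toy data, not Look AHEAD).

def ols_slope(times, values):
    """Slope of the least-squares line through (time, value) pairs."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# Hypothetical participants: (arm, years since baseline, eGFR measurements)
data = [
    ("intervention", [0, 5, 10], [90.0, 89.5, 89.0]),
    ("intervention", [0, 5, 10], [85.0, 85.0, 84.0]),
    ("usual_care",   [0, 5, 10], [92.0, 91.0, 90.0]),
    ("usual_care",   [0, 5, 10], [88.0, 87.5, 86.5]),
]
by_arm = {}
for arm, t, v in data:
    by_arm.setdefault(arm, []).append(ols_slope(t, v))

for arm, slopes in by_arm.items():
    print(arm, round(sum(slopes) / len(slopes), 3))
```

The mixed-model approach the trial used additionally borrows strength across participants and handles unbalanced visit schedules, which this per-subject version does not.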

3.
Stat Med ; 2024 May 23.
Article in English | MEDLINE | ID: mdl-38780593

ABSTRACT

In evaluating the performance of different facilities or centers on survival outcomes, the standardized mortality ratio (SMR), which compares observed to expected mortality, has been widely used, particularly in the evaluation of kidney transplant centers. Despite its utility, the SMR may exaggerate center effects in settings where survival probability is relatively high. An example is one-year graft survival among U.S. kidney transplant recipients. We propose a novel approach to estimate center effects in terms of differences in survival probability (ie, each center versus a reference population). An essential component of the method is a prognostic score weighting technique, which permits accurately evaluating centers without necessarily specifying a correct survival model. Advantages of our approach over existing facility-profiling methods include a metric based on survival probability (greater clinical relevance than ratios of counts/rates); direct standardization (valid for comparisons between centers, unlike indirect-standardization-based methods such as the SMR); and less reliance on correct model specification (since the assumed model is used to generate risk classes as opposed to fitted-value-based 'expected' counts). We establish the asymptotic properties of the proposed weighted estimator and evaluate its finite-sample performance under a diverse set of simulation settings. The method is then applied to evaluate U.S. kidney transplant centers with respect to graft survival probability.
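The indirect standardization behind the SMR can be sketched as follows: expected deaths are obtained by summing each patient's reference-population death probability, and the SMR is the ratio of observed to expected counts. The strata and reference probabilities below are hypothetical, and this is the conventional SMR, not the paper's proposed weighting estimator.

```python
# Toy sketch of the SMR (observed / indirectly standardized expected deaths).

def smr(patients, reference_risk):
    """SMR = observed deaths / expected deaths, where the expected count
    sums each patient's reference-population death probability."""
    observed = sum(p["died"] for p in patients)
    expected = sum(reference_risk[p["stratum"]] for p in patients)
    return observed / expected

# One hypothetical center: 3 high-risk and 2 low-risk recipients.
center = [
    {"stratum": "high", "died": 1},
    {"stratum": "high", "died": 0},
    {"stratum": "high", "died": 1},
    {"stratum": "low", "died": 0},
    {"stratum": "low", "died": 0},
]
reference = {"high": 0.30, "low": 0.05}  # reference death probabilities

print(round(smr(center, reference), 2))  # 2 observed / 1.0 expected = 2.0
```

When baseline survival is high, the expected count is small, so modest absolute differences in observed deaths translate into large SMR swings, which is the exaggeration the abstract describes.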

4.
Article in English | MEDLINE | ID: mdl-38599308

ABSTRACT

BACKGROUND & AIMS: Greater availability of less invasive biliary imaging to rule out choledocholithiasis should reduce the need for diagnostic endoscopic retrograde cholangiopancreatography (ERCP) in patients who have a remote history of cholecystectomy. The primary aims were to determine the incidence, characteristics, and outcomes of individuals who undergo first-time ERCP >1 year after cholecystectomy (late-ERCP). METHODS: Data from a commercial insurance claims database (Optum Clinformatics) identified 583,712 adults who underwent cholecystectomy, 4274 of whom underwent late-ERCP, defined as first-time ERCP for nonmalignant indications >1 year after cholecystectomy. Outcomes were exposure and temporal trends in late-ERCP, biliary imaging utilization, and post-ERCP outcomes. Multivariable logistic regression was used to examine patient characteristics associated with undergoing late-ERCP. RESULTS: Despite a temporal increase in the use of noninvasive biliary imaging (35.9% in 2004 to 65.6% in 2021; P < .001), the rate of late-ERCP increased 8-fold (0.5-4.2/1000 person-years from 2005 to 2021; P < .001). Although only 44% of patients who underwent late-ERCP had gallstone removal, there were high rates of post-ERCP pancreatitis (7.1%), hospitalization (13.1%), and new chronic opioid use (9.7%). Factors associated with late-ERCP included a concomitant disorder of gut-brain interaction (odds ratio [OR], 6.48; 95% confidence interval [CI], 5.88-6.91) and metabolic dysfunction-associated steatotic liver disease (OR, 3.27; 95% CI, 2.79-3.55), along with use of anxiolytics (OR, 3.45; 95% CI, 3.19-3.58), antispasmodics (OR, 1.60; 95% CI, 1.53-1.72), and chronic opioids (OR, 6.24; 95% CI, 5.79-6.52). CONCLUSIONS: The rate of late-ERCP postcholecystectomy is increasing significantly, particularly in patients with comorbidities associated with disorders of gut-brain interaction and mimickers of choledocholithiasis. Late-ERCPs are associated with disproportionately higher rates of adverse events, including initiation of chronic opioid use.

5.
Clin Transplant ; 38(5): e15319, 2024 May.
Article in English | MEDLINE | ID: mdl-38683684

ABSTRACT

OBJECTIVE: Longer end-stage renal disease time has been associated with inferior kidney transplant outcomes. However, the contribution of transplant evaluation is uncertain. We explored the relationship between time from evaluation to listing (ELT) and transplant outcomes. METHODS: This retrospective study included 2535 adult kidney transplants from 2000 to 2015. Kaplan-Meier survival curves, log-rank tests, and Cox regression models were used to compare transplant outcomes. RESULTS: Patient survival for both deceased donor (DD) recipients (p < .001) and living donor (LD) recipients (p < .0001) was significantly higher when ELT was less than 3 months. The risks of ELT appeared to be mediated by other risks in DD recipients, as adjusted models showed no associated risk of graft loss or death in DD recipients. For LD recipients, ELT remained a risk factor for patient death after covariate adjustment. Each month of ELT was associated with an increased risk of death (HR = 1.021, p = .04) but not graft loss in LD recipients in adjusted models. CONCLUSIONS: Kidney transplant recipients with longer ELT times had higher rates of death after transplant, and ELT was independently associated with an increased risk of death for LD recipients. Investigations on the impact of pretransplant evaluation on post-transplant outcomes can inform transplant policy and practice.


Subject(s)
Graft Survival , Kidney Failure, Chronic , Kidney Transplantation , Waiting Lists , Humans , Kidney Transplantation/mortality , Kidney Transplantation/adverse effects , Female , Male , Retrospective Studies , Middle Aged , Kidney Failure, Chronic/surgery , Follow-Up Studies , Risk Factors , Waiting Lists/mortality , Prognosis , Survival Rate , Adult , Graft Rejection/etiology , Graft Rejection/mortality , Tissue Donors/supply & distribution , Glomerular Filtration Rate , Kidney Function Tests , Living Donors/supply & distribution , Tissue and Organ Procurement , Time Factors , Postoperative Complications
6.
Am J Transplant ; 24(5): 839-849, 2024 May.
Article in English | MEDLINE | ID: mdl-38266712

ABSTRACT

Lung transplantation lags behind other solid organ transplants in donor lung utilization due, in part, to uncertainty regarding donor quality. We sought to develop an easy-to-use donor risk metric that, unlike existing metrics, accounts for a rich set of donor factors. Our study population consisted of n = 26 549 adult lung transplant recipients abstracted from the United Network for Organ Sharing Standard Transplant Analysis and Research file. We used Cox regression to model graft failure (GF; earliest of death or retransplant) risk based on donor and transplant factors, adjusting for recipient factors. We then derived and validated a Lung Donor Risk Index (LDRI) and developed a pertinent online application (https://shiny.pmacs.upenn.edu/LDRI_Calculator/). We found 12 donor/transplant factors that were independently predictive of GF: age, race, insulin-dependent diabetes, the difference between donor and recipient height, smoking, cocaine use, cytomegalovirus seropositivity, creatinine, human leukocyte antigen (HLA) mismatch, ischemia time, and donation after circulatory death. Validation showed the LDRI to have GF risk discrimination that was reasonable (C = 0.61) and higher than any of its predecessors. The LDRI is intended for use by transplant centers, organ procurement organizations, and regulatory agencies and to benefit patients in decision-making. Unlike its predecessors, the proposed LDRI could gain wide acceptance because of its granularity and similarity to the Kidney Donor Risk Index.
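A donor risk index of this kind is typically formed from a fitted Cox model: a linear predictor over donor/transplant covariates, exponentiated to give a relative hazard versus a reference donor. The coefficients and covariates below are invented placeholders for illustration, not the published LDRI weights.

```python
# Hedged sketch of an LDRI-style calculator (hypothetical coefficients).
import math

def risk_index(covariates, coefficients):
    """exp(sum of beta_j * x_j): relative graft-failure hazard versus a
    reference donor whose covariates are all zero."""
    lp = sum(coefficients[name] * value for name, value in covariates.items())
    return math.exp(lp)

# Hypothetical log hazard ratios and a hypothetical donor profile.
betas = {"age_per_10yr": 0.10, "smoking": 0.15, "dcd": 0.20}
donor = {"age_per_10yr": 4.5, "smoking": 1, "dcd": 0}  # 45-year-old smoker

print(round(risk_index(donor, betas), 3))
```

The granularity the abstract emphasizes comes from the number and richness of covariates entering the linear predictor, not from the functional form, which is the same as in the Kidney Donor Risk Index.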


Subject(s)
Graft Rejection , Graft Survival , Lung Transplantation , Tissue Donors , Tissue and Organ Procurement , Humans , Lung Transplantation/adverse effects , Female , Male , Tissue Donors/supply & distribution , Middle Aged , Risk Factors , Adult , Graft Rejection/etiology , Follow-Up Studies , Prognosis , Risk Assessment
7.
Liver Transpl ; 2024 Jan 24.
Article in English | MEDLINE | ID: mdl-38265295

ABSTRACT

Given liver transplantation organ scarcity, selection of recipients and donors to maximize post-transplant benefit is paramount. Several scores predict post-transplant outcomes by isolating elements of donor and recipient risk, including the donor risk index, the Balance of Risk score, the pre-allocation and full Survival Outcomes Following Liver Transplantation scores (P-SOFT/SOFT), the improved donor-to-recipient allocation scores for deceased donors only and for both deceased and living donors (ID2EAL-D/-DR), and survival benefit (SB) models. No studies have examined the performance of these models over time, which is critical in an ever-evolving transplant landscape. This was a retrospective cohort study of liver transplantation events in the UNOS database from 2002 to 2021. We used Cox regression to evaluate model discrimination (Harrell's C) and calibration (testing of calibration curves) for post-transplant patient and graft survival at specified post-transplant timepoints. Sub-analyses were performed in the modern transplant era (post-2014) and for key donor-recipient characteristics. A total of 112,357 transplants were included. The SB and SOFT scores had the highest discrimination for short-term patient and graft survival, including in the modern transplant era, where only the SB model had good discrimination (C ≥ 0.60) for all patient and graft outcome timepoints. However, these models had evidence of poor calibration at 3- and 5-year patient survival timepoints. The ID2EAL-DR score had lower discrimination but adequate calibration at all patient survival timepoints. In stratified analyses, SB and SOFT scores performed better in younger (< 40 y) and higher Model for End-Stage Liver Disease (≥ 25) patients. All prediction scores had declining discrimination over time, and scores relying on donor factors alone had poor performance.
Although the SB and SOFT scores had the best overall performance, all models demonstrated declining performance over time. This underscores the importance of periodically updating and/or developing new prediction models to reflect the evolving transplant field. Scores relying on donor factors alone do not meaningfully inform post-transplant risk.
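The discrimination metric used throughout this abstract, Harrell's C, can be sketched as the fraction of comparable pairs in which the higher predicted risk goes with the earlier observed event. This toy version handles censoring only by restricting comparable pairs; production implementations also weight ties and censoring more carefully.

```python
# Minimal sketch of Harrell's concordance index on toy survival data.

def harrell_c(times, events, risks):
    """Concordance over comparable pairs: the patient with the shorter
    time must have had an observed event (events[i] == 1)."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    concordant += 0.5  # tied predictions count half
    return concordant / comparable

# Perfectly ranked toy cohort: earlier events got higher predicted risk.
print(harrell_c([2, 4, 6], [1, 1, 0], [0.9, 0.4, 0.1]))  # 1.0
```

A C of 0.5 is chance-level ranking, which makes the abstract's C ≥ 0.60 threshold for "good" discrimination a fairly modest bar.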

8.
Blood Adv ; 8(5): 1272-1280, 2024 Mar 12.
Article in English | MEDLINE | ID: mdl-38163322

ABSTRACT

ABSTRACT: Hospitalized patients with inflammatory bowel disease (IBD) are at increased risk of venous thromboembolism (VTE). We aimed to evaluate the effectiveness and safety of prophylactic anticoagulation compared with no anticoagulation in hospitalized patients with IBD. We conducted a retrospective cohort study using a hospital-based database. We included patients with IBD who had a length of hospital stay ≥2 days between 1 January 2016 and 31 December 2019. We excluded patients who had other indications for anticoagulation, users of direct oral anticoagulants, warfarin, therapeutic-intensity heparin, and patients admitted for surgery. We defined exposure to prophylactic anticoagulation using charge codes. The primary effectiveness outcome was VTE. The primary safety outcome was bleeding. We used propensity score matching to reduce potential differences between users and nonusers of anticoagulants and Cox proportional-hazards regression to estimate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs). The analysis included 56 194 matched patients with IBD (users of anticoagulants, n = 28 097; nonusers, n = 28 097). In the matched sample, prophylactic use of anticoagulants (vs no use) was associated with a lower rate of VTE (HR, 0.62; 95% CI, 0.41-0.94) and with no difference in the rate of bleeding (HR, 1.05; 95% CI, 0.87-1.26). In this study of hospitalized patients with IBD, prophylactic use of heparin was associated with a lower rate of VTE without increasing bleeding risk compared with no anticoagulation. Our results suggest potential benefits of prophylactic anticoagulation to reduce the burden of VTE in hospitalized patients with IBD.
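Propensity score matching, as used in this study, pairs anticoagulant users with similar nonusers. A common variant is 1:1 greedy nearest-neighbor matching within a caliper, sketched below; the study does not specify its exact matching algorithm, and the scores and caliper here are illustrative.

```python
# Rough sketch of 1:1 greedy nearest-neighbor propensity matching.

def greedy_match(treated, controls, caliper=0.05):
    """Pair each treated score with the closest unused control score,
    discarding treated units with no control inside the caliper."""
    pairs = []
    unused = list(controls)
    for t in sorted(treated):
        if not unused:
            break
        best = min(unused, key=lambda c: abs(c - t))
        if abs(best - t) <= caliper:
            pairs.append((t, best))
            unused.remove(best)
    return pairs

treated_scores = [0.31, 0.42, 0.58]
control_scores = [0.30, 0.33, 0.44, 0.80]
print(greedy_match(treated_scores, control_scores))  # 0.58 has no match
```

After matching, outcome models (here, Cox regression for VTE and bleeding) are fit on the matched sample, relying on the matched pairs being exchangeable on measured confounders.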


Subject(s)
Inflammatory Bowel Diseases , Venous Thromboembolism , Humans , Venous Thromboembolism/prevention & control , Venous Thromboembolism/complications , Retrospective Studies , Anticoagulants/adverse effects , Hemorrhage/chemically induced , Heparin/adverse effects , Inflammatory Bowel Diseases/complications , Inflammatory Bowel Diseases/drug therapy
9.
Transplantation ; 108(3): 713-723, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-37635282

ABSTRACT

BACKGROUND: Outcomes after living-donor liver transplantation (LDLT) at high Model for End-stage Liver Disease (MELD) scores are not well characterized in the United States. METHODS: This was a retrospective cohort study using Organ Procurement and Transplantation Network data in adults listed for their first liver transplant alone between 2002 and 2021. Cox proportional hazards models evaluated the association of MELD score (<20, 20-24, 25-29, and ≥30) and patient/graft survival after LDLT and the association of donor type (living versus deceased) on outcomes stratified by MELD. RESULTS: There were 4495 LDLTs included with 5.9% at MELD 25-29 and 1.9% at MELD ≥30. LDLTs at MELD 25-29 and ≥30 LDLT have substantially increased since 2010 and 2015, respectively. Patient survival at MELD ≥30 was not different versus MELD <20: adjusted hazard ratio 1.67 (95% confidence interval, 0.96-2.88). However, graft survival was worse: adjusted hazard ratio (aHR) 1.69 (95% confidence interval, 1.07-2.68). Compared with deceased-donor liver transplant, LDLT led to superior patient survival at MELD <20 (aHR 0.92; P = 0.024) and 20-24 (aHR 0.70; P < 0.001), equivalent patient survival at MELD 25-29 (aHR 0.97; P = 0.843), but worse graft survival at MELD ≥30 (aHR 1.68, P = 0.009). CONCLUSIONS: Although patient survival remains acceptable, the benefits of LDLT may be lost at MELD ≥30.


Subject(s)
End Stage Liver Disease , Liver Transplantation , Adult , Humans , United States , Living Donors , Liver Transplantation/adverse effects , End Stage Liver Disease/diagnosis , End Stage Liver Disease/surgery , Retrospective Studies , Severity of Illness Index , Graft Survival , Treatment Outcome
10.
Pediatr Crit Care Med ; 25(1): e41-e46, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-37462429

ABSTRACT

OBJECTIVE: To determine the association of venovenous extracorporeal membrane oxygenation (VV-ECMO) initiation with changes in vasoactive-inotropic scores (VISs) in children with pediatric acute respiratory distress syndrome (PARDS) and cardiovascular instability. DESIGN: Retrospective cohort study. SETTING: Single academic pediatric ECMO center. PATIENTS: Children (1 mo to 18 yr) treated with VV-ECMO (2009-2019) for PARDS with need for vasopressor or inotropic support at ECMO initiation. MEASUREMENTS AND MAIN RESULTS: Arterial blood gas values, VIS, mean airway pressure (mPaw), and oxygen saturation (SpO2) values were recorded hourly relative to the start of ECMO flow for 24 hours pre-VV-ECMO and post-VV-ECMO cannulation. A sharp kink discontinuity regression analysis clustered by patient tested the difference in VISs and regression line slopes immediately surrounding cannulation. Thirty-two patients met inclusion criteria: median age 6.6 years (interquartile range [IQR] 1.5-11.7), 22% immunocompromised, and 75% had pneumonia or sepsis as the cause of PARDS. Pre-ECMO characteristics included: median oxygenation index 45 (IQR 35-58), mPaw 32 cm H2O (IQR 30-34), 97% on inhaled nitric oxide, and 81% on an advanced mode of ventilation. Median VIS immediately before VV-ECMO cannulation was 13 (IQR 8-25) with an overall increasing VIS trajectory over the hours before cannulation. VISs decreased and the slope of the regression line reversed immediately surrounding the time of cannulation (robust p < 0.0001). There were pre-ECMO to post-ECMO cannulation decreases in mPaw (32 vs 20 cm H2O, p < 0.001) and arterial PCO2 (64.1 vs 50.1 mm Hg, p = 0.007) and increases in arterial pH (7.26 vs 7.38, p = 0.001), arterial base excess (2.5 vs 5.2, p = 0.013), and SpO2 (91% vs 95%, p = 0.013). CONCLUSIONS: Initiation of VV-ECMO was associated with an immediate and sustained reduction in VIS in PARDS patients with cardiovascular instability. This VIS reduction was associated with decreased mPaw and reduced respiratory and/or metabolic acidosis as well as improved oxygenation.
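The VIS itself is conventionally computed as a dose-weighted sum of vasoactive infusions (Gaies et al.). The abstract does not restate the formula, so the helper below reflects that conventional definition rather than this study's code; doses are assumed in ug/kg/min, with vasopressin in U/kg/min.

```python
# Conventional vasoactive-inotropic score (VIS) calculation.

def vis(dopamine=0.0, dobutamine=0.0, epinephrine=0.0,
        milrinone=0.0, vasopressin=0.0, norepinephrine=0.0):
    """Dose-weighted sum; missing drugs default to zero."""
    return (dopamine + dobutamine
            + 100 * epinephrine
            + 10 * milrinone
            + 10_000 * vasopressin
            + 100 * norepinephrine)

# Example: dopamine 5, epinephrine 0.05, milrinone 0.5 -> VIS of 15
print(vis(dopamine=5, epinephrine=0.05, milrinone=0.5))
```

The large multipliers normalize potency across agents, so a median pre-cannulation VIS of 13 corresponds to modest doses of one or two agents.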


Subject(s)
Extracorporeal Membrane Oxygenation , Respiratory Distress Syndrome , Respiratory Insufficiency , Humans , Child , Retrospective Studies , Respiratory Distress Syndrome/therapy , Respiratory Insufficiency/therapy , Arteries
12.
Stat Methods Med Res ; 32(12): 2386-2404, 2023 12.
Article in English | MEDLINE | ID: mdl-37965684

ABSTRACT

The hazard ratio (HR) remains the most frequently employed metric in assessing treatment effects on survival times. However, the difference in restricted mean survival time (RMST) has become a popular alternative to the HR when the proportional hazards assumption is considered untenable. Moreover, independent of the proportional hazards assumption, many comparative effectiveness studies aim to base contrasts on survival probability rather than on the hazard function. Causal effects based on RMST are often estimated via inverse probability of treatment weighting (IPTW). However, this approach generally results in biased results when the assumed propensity score model is misspecified. Motivated by the need for more robust techniques, we propose an empirical likelihood-based weighting approach that allows for specifying a set of propensity score models. The resulting estimator is consistent when the postulated model set contains a correct model; this property has been termed multiple robustness. In this report, we derive and evaluate a multiply robust estimator of the causal between-treatment difference in RMST. Simulation results confirm its robustness. Compared with the IPTW estimator from a correct model, the proposed estimator tends to be less biased and more efficient in finite samples. Additional simulations reveal biased results from a direct application of machine learning estimation of propensity scores. Finally, we apply the proposed method to evaluate the impact of intrapartum group B streptococcus antibiotic prophylaxis on the risk of childhood allergic disorders using data derived from electronic medical records from the Children's Hospital of Philadelphia and census data from the American Community Survey.
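The RMST contrasted in this paper is the area under the survival curve up to a horizon tau. The toy sketch below integrates a Kaplan-Meier step function for one arm; the paper's estimator additionally applies empirical-likelihood-based weights across candidate propensity models, which this omits entirely.

```python
# Restricted mean survival time (RMST) as area under a Kaplan-Meier curve.

def km_curve(times, events):
    """Kaplan-Meier survival estimates at each event time (no tied times)."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t, d in sorted(zip(times, events)):
        if d:  # event observed: multiply by conditional survival
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1  # censored observations only shrink the risk set
    return curve

def rmst(curve, tau):
    """Area under the KM step function from 0 to tau."""
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in curve:
        if t >= tau:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    return area + prev_s * (tau - prev_t)

# Toy arm: events at t=1 and t=3, censoring at t=2 and t=4.
print(rmst(km_curve([1, 2, 3, 4], [1, 0, 1, 0]), tau=4))  # 2.875
```

The between-treatment causal effect is then the difference of two such areas, one per (weighted) arm, which stays interpretable even when hazards cross.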


Subject(s)
Models, Statistical , Child , Humans , Likelihood Functions , Survival Rate , Proportional Hazards Models , Computer Simulation , Propensity Score
13.
Prog Transplant ; 33(4): 283-292, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37941335

ABSTRACT

Introduction: Organ recovery facilities address the logistical challenges of hospital-based deceased organ donor management. While more organs are transplanted from donors in facilities, differences in donor management and donation processes are not fully characterized. Research Question: Does deceased donor management and organ transport distance differ between organ procurement organization (OPO)-based recovery facilities versus hospitals? Design: Retrospective analysis of Organ Procurement and Transplant Network data, including adults after brain death in 10 procurement regions (April 2017-June 2021). The primary outcomes were ischemic times of transplanted hearts, kidneys, livers, and lungs. Secondary outcomes included transport distances (between the facility or hospital and the transplant program) for each transplanted organ. Results: Among 5010 deceased donors, 51.7% underwent recovery in an OPO-based recovery facility. After adjustment for recipient and system factors, mean differences in ischemic times of any transplanted organ were not significantly different between donors in facilities and hospitals. Transplanted hearts recovered from donors in facilities were transported further than hearts from hospital donors (median 255 mi [IQR 27, 475] versus 174 [IQR 42, 365], P = .002); transport distances for livers and kidneys were significantly shorter (P < .001 for both). Conclusion: Organ recovery procedures performed in OPO-based recovery facilities were not associated with differences in ischemic times in transplanted organs from organs recovered in hospitals, but differences in organ transport distances exist. Further work is needed to determine whether other observed differences in donor management and organ distribution meaningfully impact donation and transplantation outcomes.


Subject(s)
Organ Transplantation , Tissue and Organ Procurement , Adult , Humans , Retrospective Studies , Tissue Donors , Hospitals
14.
Hepatol Commun ; 7(10)2023 Oct 01.
Article in English | MEDLINE | ID: mdl-37916863

ABSTRACT

Liver transplantation is a life-saving option for decompensated cirrhosis. Liver transplant recipients require advanced self-management skills, intact cognitive skills, and care partner support to improve long-term outcomes. Gaps remain in understanding post-liver transplant cognitive and health trajectories, and patient factors such as self-management skills, care partner support, and sleep. Our aims are to (1) assess pre-liver transplant to post-liver transplant cognitive trajectories and identify risk factors for persistent cognitive impairment; (2) evaluate associations between cognitive function and self-management skills, health behaviors, functional health status, and post-transplant outcomes; and (3) investigate potential mediators and moderators of associations between cognitive function and post-liver transplant outcomes. LivCog is a longitudinal, prospective observational study that will enroll 450 adult liver transplant recipients and their caregivers/care partners. The duration of the study is 5 years with 24 additional months of patient follow-up. Data will be collected from participants at 1, 3, 12, and 24 months post-transplant. Limited pre-liver transplant data will also be collected from waitlisted candidates. Data collection methods include interviews, surveys, cognitive assessments, and actigraphy/sleep diary measures. Patient measurements include sociodemographic characteristics, pretransplant health status, cognitive function, physical function, perioperative measures, medical history, transplant history, self-management skills, patient-reported outcomes, health behaviors, and clinical outcomes. Caregiver measures assess sociodemographic variables, health literacy, health care navigation skills, self-efficacy, care partner preparedness, nature and intensity of care, care partner burden, and community participation. 
By elucidating various health trajectories from pre-liver transplant to 2 years post-liver transplant, LivCog will be able to better characterize recipients at higher risk of cognitive impairment and compromised self-management. Findings will inform interventions targeting health behaviors, self-management, and caregiver supports to optimize outcomes.


Subject(s)
Cognitive Dysfunction , Liver Transplantation , Self-Management , Adult , Humans , Liver Transplantation/adverse effects , Prospective Studies , Cognition , Cognitive Dysfunction/etiology
15.
BMC Oral Health ; 23(1): 763, 2023 10 17.
Article in English | MEDLINE | ID: mdl-37848867

ABSTRACT

BACKGROUND: Long-term antiretroviral therapy (ART) perpetually suppresses HIV load and has dramatically altered the prognosis of HIV infection, such that HIV is now regarded as a chronic disease. Side effects of ART in patients with HIV (PWH) have introduced new challenges, including "metabolic" (systemic) and oral complications. Furthermore, inflammation persists despite effective viral load suppression and normal CD4+ cell counts. The impact of ART on the spectrum of oral diseases among PWH is often overlooked relative to other systemic complications. There is a paucity of data on oral complications associated with ART use in PWH. This is in part due to limited prospective longitudinal studies designed to better understand the range of oral abnormalities observed in PWH on ART. METHODS: We describe here the study design, including processes associated with subject recruitment and retention, study visit planning, oral health assessments, bio-specimen collection and preprocessing procedures, and the data management and statistical plan. DISCUSSION: We present a procedural roadmap that could be modelled to assess the extent and progression of oral diseases associated with ART in PWH. We also highlight the rigors and challenges associated with our ongoing participant recruitment and retention. A rigorous prospective longitudinal study requires proper planning and execution. A major benefit is that large data sets are collected and the biospecimen repository can be used to answer further questions in future studies, including genetic, microbiome, and metabolome-based studies. TRIAL REGISTRATION: National Institutes of Health Clinical Trials Registration (NCT) #: NCT04645693.


Subject(s)
Anti-HIV Agents , HIV Infections , Humans , HIV Infections/complications , HIV Infections/drug therapy , Anti-HIV Agents/adverse effects , Longitudinal Studies , Prospective Studies , Viral Load , Outcome Assessment, Health Care
16.
Res Sq ; 2023 Oct 04.
Article in English | MEDLINE | ID: mdl-37886466

ABSTRACT

Long-term antiretroviral therapy (ART) perpetually suppresses HIV load and has dramatically altered the prognosis of HIV infection, such that HIV is now regarded as a chronic disease. Side effects of ART in patients with HIV (PWH) have introduced new challenges, including "metabolic" (systemic) and oral complications. Furthermore, inflammation persists despite effective viral load suppression and normal CD4+ cell counts. The impact of ART on the spectrum of oral diseases among PWH is often overlooked relative to other systemic complications. There is a paucity of data on oral complications associated with ART use in PWH. This is in part due to limited prospective longitudinal studies designed to better understand the range of oral abnormalities observed in PWH on ART. Our group designed and implemented a prospective observational longitudinal study to address this gap. We present a procedural roadmap that could be modelled to assess the extent and progression of oral diseases associated with ART in PWH. We describe here the processes associated with subject recruitment and retention, study visit planning, oral health assessments, bio-specimen collection and preprocessing procedures, and data management. We also highlight the rigors and challenges associated with participant recruitment and retention.

17.
BMJ Open ; 13(9): e075172, 2023 09 18.
Article in English | MEDLINE | ID: mdl-37723108

ABSTRACT

BACKGROUND AND AIMS: Liver transplantation is a life-saving procedure for end-stage liver disease. However, post-transplant medication regimens are complex and non-adherence is common. Post-transplant medication non-adherence is associated with graft rejection, which can have long-term adverse consequences. Transplant centres are equipped with clinical staff who monitor patients post-transplant; however, digital health tools and proactive immunosuppression adherence monitoring have the potential to improve outcomes. METHODS AND ANALYSIS: This is a patient-randomised prospective clinical trial at three transplant centres in the Northeast, Midwest and South to investigate the effects of a remotely administered adherence programme compared with usual care. The programme monitors potential non-adherence largely by leveraging text message prompts, and phenotypes the nature of the non-adherence as cognitive, psychological, medical, social or economic. Additional reminders for medications, clinical appointments and routine self-management support are incorporated to promote adherence to the entire medical regimen. The primary study outcome is medication adherence via 24-hour recall; secondary outcomes include additional medication adherence measures (ASK-12 self-reported scale, regimen knowledge scales, tacrolimus values), quality of life, functional health status and clinical outcomes (eg, days hospitalised). Study implementation, acceptability, feasibility, costs and potential cost-effectiveness will also be evaluated. ETHICS AND DISSEMINATION: The University of Pennsylvania Institutional Review Board has approved the study as the single IRB of record (protocol # 849575, V.1.4). Results will be published in peer-reviewed journals and summaries will be provided to study funders. TRIAL REGISTRATION NUMBER: NCT05260268.


Subject(s)
End Stage Liver Disease , Liver Transplantation , Humans , Prospective Studies , Quality of Life , Treatment Adherence and Compliance
18.
Gastroenterology ; 165(5): 1197-1205.e2, 2023 11.
Article in English | MEDLINE | ID: mdl-37481117

ABSTRACT

BACKGROUND & AIMS: We sought to estimate the incidence, prevalence, and racial-ethnic distribution of physician-diagnosed inflammatory bowel disease (IBD) in the United States. METHODS: The study used 4 administrative claims data sets: a 20% random sample of national fee-for-service Medicare data (2007 to 2017); Medicaid data from Florida, New York, Pennsylvania, Ohio, and California (1999 to 2012); and commercial health insurance data from Anthem beneficiaries (2006 to 2018) and Optum's deidentified Clinformatics Data Mart (2000 to 2017). We used validated combinations of medical diagnoses, diagnostic procedures, and prescription medications to identify incident and prevalent diagnoses. We computed pooled age-, sex-, and race/ethnicity-specific insurance-weighted estimates and pooled estimates standardized to 2018 United States Census estimates with 95% confidence intervals (CIs). RESULTS: The age- and sex-standardized incidence of IBD per 100,000 person-years was 10.9 (95% CI, 10.6-11.2). The incidence of IBD peaked in the third decade of life, decreased to a relatively stable level across the fourth to eighth decades, and declined further thereafter. The age-, sex-, and insurance-standardized prevalence of IBD was 721 per 100,000 population (95% CI, 717-726). Extrapolated to the 2020 United States Census, an estimated 2.39 million Americans are diagnosed with IBD. The prevalence of IBD per 100,000 population was 812 (95% CI, 802-823) in White, 504 (95% CI, 482-526) in Black, 403 (95% CI, 373-433) in Asian, and 458 (95% CI, 440-476) in Hispanic Americans. CONCLUSIONS: IBD is diagnosed in >0.7% of Americans. The incidence peaks in early adulthood and then plateaus at a lower rate. The disease is less commonly diagnosed in Black, Asian, and Hispanic Americans.
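The census extrapolation reported in this abstract can be checked with simple arithmetic. The sketch below assumes the 2020 Census apportionment total of 331,449,281 residents (a figure not stated in the abstract) and applies the standardized prevalence of 721 per 100,000:

```python
# Rough check of the prevalence extrapolation reported in the abstract.
# The 2020 Census total (331,449,281) is an assumed input, not from the abstract.
PREVALENCE_PER_100K = 721
CENSUS_2020 = 331_449_281

diagnosed = PREVALENCE_PER_100K / 100_000 * CENSUS_2020
print(f"Estimated Americans diagnosed with IBD: {diagnosed / 1e6:.2f} million")
```

This reproduces the abstract's figure of roughly 2.39 million diagnosed Americans.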


Subject(s)
Inflammatory Bowel Diseases , Medicare , Humans , United States/epidemiology , Aged , Adult , Prevalence , Incidence , Inflammatory Bowel Diseases/diagnosis , Inflammatory Bowel Diseases/epidemiology , Florida
19.
J Heart Lung Transplant ; 42(12): 1735-1742, 2023 12.
Article in English | MEDLINE | ID: mdl-37437825

ABSTRACT

BACKGROUND: Whether functional status is associated with survival to pediatric lung transplant is unknown. We hypothesized that completely dependent functional status at waitlist registration, defined using Lansky Play Performance Scale (LPPS), would be associated with worse outcomes. METHODS: Retrospective cohort study of pediatric lung transplant registrants utilizing United Network for Organ Sharing's Standard Transplant Analysis and Research files (2005-2020). Primary exposure was completely dependent functional status, defined as LPPS score of 10-40. Primary outcome was waitlist removal for death/deterioration with cause-specific hazard ratio (CSHR) regression. Subdistribution hazard regression (SHR, Fine and Gray) was used for the secondary outcome of waitlist removal due to transplant/improvement with a competing risk of death/deterioration. Confounders included: sex, age, race, diagnosis, ventilator dependence, extracorporeal membrane oxygenation, year, and listing center volume. RESULTS: A total of 964 patients were included (63.5% ≥ 12 years, 50.2% cystic fibrosis [CF]). Median waitlist days were 95; 20.1% were removed for death/deterioration and 68.2% for transplant/improvement. Completely dependent functional status was associated with removal due to death/deterioration (adjusted CSHR 5.30 [95% CI 2.86-9.80]). This association was modified by age (interaction p = 0.0102), with a larger effect for age ≥12 years, and particularly strong for CF. In the Fine and Gray model, completely dependent functional status did not affect the risk of removal due to transplant/improvement with a competing risk of death/deterioration (adjusted SHR 1.08 [95% CI 0.77-1.49]). CONCLUSIONS: Pediatric lung transplant registrants with the worst functional status had worse pretransplant outcomes, especially for adolescents and CF patients. Functional status at waitlist registration may be a modifiable risk factor to improve survival to lung transplant.
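The competing-risks framing used in the study above (removal for transplant/improvement competing with removal for death/deterioration) can be illustrated with a minimal Aalen-Johansen cumulative incidence estimator. This is a generic sketch on made-up data, not the Fine and Gray subdistribution regression the authors fit:

```python
# Minimal Aalen-Johansen cumulative incidence sketch for competing events.
# Toy data: event 1 = death/deterioration, event 2 = transplant/improvement,
# event 0 = censored. Illustrates the estimand only, not the authors' model.
def cumulative_incidence(times, events, cause):
    data = sorted(zip(times, events))
    surv = 1.0           # overall event-free survival just before each time
    cif = 0.0            # cumulative incidence of the cause of interest
    at_risk = len(data)
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = sum(1 for tt, ee in data if tt == t and ee == cause)
        d_any = sum(1 for tt, ee in data if tt == t and ee != 0)
        removed = sum(1 for tt, ee in data if tt == t)
        cif += surv * d_cause / at_risk   # mass assigned to this cause at t
        surv *= 1 - d_any / at_risk       # update overall survival
        at_risk -= removed
        i += removed
    return cif

times = [1, 2, 3, 4]
events = [1, 2, 1, 0]
print(cumulative_incidence(times, events, cause=1))  # ~0.5 on this toy data
print(cumulative_incidence(times, events, cause=2))  # ~0.25 on this toy data
```

Unlike a naive Kaplan-Meier that censors competing events, this estimator never lets the cause-specific cumulative incidences sum above one.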


Subject(s)
Cystic Fibrosis , Lung Transplantation , Adolescent , Humans , Child , Retrospective Studies , Functional Status , Risk Factors , Waiting Lists
20.
Inflamm Bowel Dis ; 29(12): 1993-1996, 2023 Dec 05.
Article in English | MEDLINE | ID: mdl-37043675

ABSTRACT

BACKGROUND: To facilitate inflammatory bowel disease (IBD) research in the United States, we developed and validated claims-based definitions to identify incident and prevalent IBD diagnoses using administrative healthcare claims data among multiple payers. METHODS: We used data from Medicare, Medicaid, and the HealthCore Integrated Research Database (Anthem commercial and Medicare Advantage claims). The gold standard for validation was review of medical records. We evaluated 1 incidence and 4 prevalence algorithms based on a combination of International Classification of Diseases codes, National Drug Codes, and Current Procedural Terminology codes. The claims-based incident diagnosis date needed to be within ±90 days of that recorded in the medical record to be valid. RESULTS: We reviewed 111 charts of patients with a potentially incident diagnosis. The positive predictive value (PPV) of the claims algorithm was 91% (95% confidence interval [CI], 81%-97%). We reviewed 332 charts to validate prevalent case definition algorithms. The PPV was 94% (95% CI, 86%-98%) for ≥2 IBD diagnoses and presence of prescriptions for IBD medications, 92% (95% CI, 85%-97%) for ≥2 diagnoses without any medications, 78% (95% CI, 67%-87%) for a single diagnosis and presence of an IBD medication, and 35% (95% CI, 25%-46%) for 1 physician diagnosis and no IBD medications. CONCLUSIONS: Through a combination of diagnosis, procedural, and medication codes in insurance claims data, we were able to identify incident and prevalent IBD cases with high accuracy. These algorithms can be useful for the ascertainment of IBD cases in future studies.
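A positive predictive value with a 95% confidence interval, as reported in the validation above, is straightforward to compute from chart-review counts. The sketch below uses hypothetical counts and a Wilson score interval, which is one common choice and not necessarily the interval method the study used:

```python
import math

# PPV from chart-review counts with a Wilson score 95% CI.
# Counts are hypothetical; the study's interval method may differ.
def ppv_with_ci(confirmed, reviewed, z=1.96):
    p = confirmed / reviewed
    denom = 1 + z**2 / reviewed
    center = (p + z**2 / (2 * reviewed)) / denom
    half = z * math.sqrt(p * (1 - p) / reviewed + z**2 / (4 * reviewed**2)) / denom
    return p, center - half, center + half

ppv, lo, hi = ppv_with_ci(confirmed=101, reviewed=111)
print(f"PPV {ppv:.0%} (95% CI, {lo:.0%}-{hi:.0%})")
```

Note how the interval widens as the number of reviewed charts shrinks, which is why the single-diagnosis, no-medication algorithm above carries the widest CI relative to its point estimate.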


Subject(s)
Inflammatory Bowel Diseases , Medicare , Humans , Aged , United States/epidemiology , Insurance Claim Review , Inflammatory Bowel Diseases/diagnosis , Inflammatory Bowel Diseases/epidemiology , International Classification of Diseases , Databases, Factual , Algorithms