Results 1 - 20 of 253
1.
Am J Kidney Dis ; 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38876272

ABSTRACT

RATIONALE & OBJECTIVE: Exposure to extreme heat events has been linked to increased morbidity and mortality in the general population. Patients receiving maintenance dialysis may be vulnerable to greater risks from these events, but this is not well understood. We sought to characterize the association between extreme heat events and the risk of death among patients receiving dialysis in the United States. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: Data from the United States Renal Data System were used to identify adults living in US urban settlements prone to extreme heat who initiated maintenance dialysis between 1997 and 2016. EXPOSURE: An extreme heat event was defined as a time-updated heat index (a humid-heat metric) exceeding 40.6°C for ≥2 days or 46.1°C for ≥1 day. OUTCOME: Death. ANALYTICAL APPROACH: Cox proportional hazards regression to estimate the elevation in risk of death during a humid-heat event, adjusted for age, sex, year of dialysis initiation, dialysis modality, poverty level, and climate region. Interactions between humid-heat exposure and these same factors were explored. RESULTS: Among 945,251 adults in 245 urban settlements, the mean age was 63 years and 44% were female. During a median follow-up of 3.6 years, 498,049 adults were exposed to at least one of 7,154 extreme humid-heat events, and 500,025 deaths occurred. In adjusted models, there was an increased risk of death (hazard ratio 1.18; 95% confidence interval 1.15-1.20) during extreme humid-heat exposure. Relative mortality risk was higher among patients living in the Southeast (P<0.001) compared with the Southwest. LIMITATIONS: Possible exposure misclassification; land use and air pollution co-exposures were not accounted for. CONCLUSIONS: This study suggests that patients receiving dialysis face an increased risk of death during extreme humid-heat exposure.
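The analytical approach above (Cox regression with a time-updated exposure) can be sketched with the lifelines library. This is a minimal sketch on synthetic start-stop interval data with the exposure as a single time-varying flag and no covariates; the actual analysis adjusted for age, sex, and the other listed factors, and the data layout here is an assumption, not the USRDS format.

```python
# Sketch: Cox model with a time-updated extreme-heat exposure flag,
# using lifelines' CoxTimeVaryingFitter on synthetic interval data.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
rows = []
for pid in range(300):
    t_end = rng.exponential(4.0)            # follow-up time (years)
    death = int(rng.random() < 0.5)         # died vs. censored at t_end
    t_heat = rng.uniform(0, 5)              # onset of extreme-heat exposure
    if t_heat < t_end:
        rows.append((pid, 0.0, t_heat, 0, 0))        # unexposed interval
        rows.append((pid, t_heat, t_end, 1, death))  # exposed interval
    else:
        rows.append((pid, 0.0, t_end, 0, death))     # never exposed
df = pd.DataFrame(rows, columns=["id", "start", "stop", "heat_event", "death"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="death", start_col="start", stop_col="stop")
ctv.print_summary()  # exp(coef) on heat_event is the (here crude) hazard ratio
```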

2.
Article in English | MEDLINE | ID: mdl-38693617

ABSTRACT

BACKGROUND: Social determinants of health (SDoH) likely contribute to outcome disparities in lupus nephritis (LN). Understanding the overall burden and the contribution of each domain could guide future health equity-focused interventions to improve outcomes and reduce disparities in LN. The objectives of this meta-analysis were to: 1) determine the association of overall SDoH and specific SDoH domains with LN outcomes, and 2) develop a framework for the multidimensional impact of SDoH on LN outcomes. METHODS: We performed a comprehensive search of studies measuring associations between SDoH and LN outcomes. We examined pooled odds of poor LN outcomes, including mortality, end-stage kidney disease, or cardiovascular disease, in patients with and without adverse SDoH. Additionally, we calculated pooled odds ratios of outcomes by four SDoH domains: individual (e.g., insurance), healthcare (e.g., fragmented care), community (e.g., neighborhood socioeconomic status), and health behaviors (e.g., smoking). RESULTS: Among 531 screened studies, 31 met inclusion criteria and 13 studies with raw data were included in the meta-analysis. Pooled odds of poor outcomes were 1.47-fold higher in patients with any adverse SDoH. Patients with adverse SDoH in the individual and healthcare domains had 1.64-fold and 1.77-fold higher odds of poor outcomes, respectively. We found a multiplicative impact of having ≥2 adverse SDoH on LN outcomes. Black patients with public insurance and fragmented care had 12-fold higher odds of poor LN outcomes. CONCLUSION: Adverse SDoH are associated with poor LN outcomes. Having ≥2 adverse SDoH, particularly across different SDoH domains, had a multiplicative impact leading to worse LN outcomes, widening disparities.
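The pooled odds ratios above come from an inverse-variance meta-analysis. Below is a minimal sketch of random-effects pooling (DerSimonian-Laird) on the log-OR scale; the study estimates are placeholders, not the paper's data.

```python
# Sketch: DerSimonian-Laird random-effects pooling of study odds ratios.
import numpy as np

or_est = np.array([1.3, 1.8, 1.5])                      # study ORs (placeholders)
ci_lo = np.array([1.0, 1.2, 0.9])
ci_hi = np.array([1.7, 2.7, 2.5])

y = np.log(or_est)                                      # work on the log-OR scale
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)       # SE back-computed from CI
w = 1 / se**2                                           # fixed-effect weights

# DerSimonian-Laird between-study variance tau^2
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (se**2 + tau2)                               # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"pooled OR {np.exp(y_re):.2f} "
      f"(95% CI {np.exp(y_re - 1.96 * se_re):.2f}-{np.exp(y_re + 1.96 * se_re):.2f})")
```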

3.
Am J Nephrol ; 2024 May 16.
Article in English | MEDLINE | ID: mdl-38754385

ABSTRACT

INTRODUCTION: The Centers for Medicare & Medicaid Services (CMS) introduced an End-Stage Renal Disease (ESRD) Prospective Payment System (PPS) in 2011 to increase the utilization of home dialysis modalities, including peritoneal dialysis (PD). Several studies have shown a significant increase in PD utilization after PPS implementation. However, its impact on patients with kidney allograft failure remains unknown. METHODS: We conducted an interrupted time series (ITS) analysis using data from the United States Renal Data System (USRDS) that included all adult kidney transplant recipients with allograft failure who started dialysis between 2005 and 2019. We compared PD utilization in the pre-PPS period (2005-2010) to the fully implemented post-PPS period (2014-2019) for early (within 90 days) and late (91-365 days) PD experience. RESULTS: A total of 27,507 adult recipients with allograft failure started dialysis during the study period. There was no difference in early PD utilization between the pre-PPS and post-PPS periods in either immediate change (0.3% increase; 95%CI: -1.95%, 2.54%; p=0.79) or rate of change over time (0.28% increase per year; 95%CI: -0.16%, 0.72%; p=0.18). Subgroup analyses revealed a trend toward higher PD utilization post-PPS in for-profit and large-volume dialysis units. There was a significant increase in PD utilization in the post-PPS period in units with low PD experience in the pre-PPS period. Similar findings were seen for the late PD experience. CONCLUSION: PPS did not significantly increase the overall utilization of PD in patients initiating dialysis after allograft failure.
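An interrupted time series like the one above is commonly fit as a segmented regression, where one coefficient captures the immediate level change and another the change in slope after the intervention. A minimal sketch with statsmodels on a synthetic yearly series; the variable names and effect sizes are arbitrary illustrations.

```python
# Sketch: segmented regression for an interrupted time series.
# "post" captures the immediate level change at full PPS implementation;
# "time_post" captures the change in slope afterward.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"year": np.arange(2005, 2020)})
df["time"] = df["year"] - 2005                      # secular trend
df["post"] = (df["year"] >= 2014).astype(int)       # fully implemented PPS
df["time_post"] = df["post"] * (df["year"] - 2014)  # post-period slope change
df["pd_pct"] = (8 + 0.1 * df["time"] + 0.5 * df["post"]
                + 0.3 * df["time_post"] + rng.normal(0, 0.4, len(df)))

fit = smf.ols("pd_pct ~ time + post + time_post", data=df).fit()
print(fit.summary())  # coefficients on post / time_post = level / slope changes
```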

4.
Lupus ; 33(8): 804-815, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38631342

ABSTRACT

OBJECTIVE: In systemic lupus erythematosus, poor disease outcomes occur in young adults, patients identifying as Black or Hispanic, and socioeconomically disadvantaged patients. These identities and social factors differentially shape care access and quality, which contribute to lupus health disparities in the US. Thus, our objective was to measure markers of care access and quality, including rheumatology visits (longitudinal care retention) and lupus-specific serology testing, by race and ethnicity, neighborhood disadvantage, and geographic context. METHODS: This cohort study used a geo-linked 20% national sample of young adult Medicare beneficiaries (ages 18-35) with lupus-coded encounters and a 1-year assessment period. Retention in lupus care required a rheumatology visit in each 6-month period, and serology testing required ≥1 complement or dsDNA antibody test within the year. Multivariable logistic regression models were fit for visit-based retention and serology testing to determine associations with race and ethnicity, neighborhood disadvantage, and geography. RESULTS: Among 1,036 young adults with lupus, 39% saw a rheumatologist every 6 months and 28% had serology testing. White beneficiaries from the least disadvantaged quintile of neighborhoods had higher visit-based retention than other beneficiaries (64% vs 30%-60%). Serology testing decreased with increasing neighborhood disadvantage quintile (aOR 0.80; 95% CI 0.71, 0.90) and in the Midwest (aOR 0.46; 95% CI 0.30, 0.71). CONCLUSION: Disparities in care, measured by rheumatology visits and serology testing, exist by neighborhood disadvantage, race and ethnicity, and region among young adults with lupus, despite uniform Medicare coverage. These findings support evaluating lupus care quality measures and their impact on US lupus outcomes.


Subject(s)
Healthcare Disparities , Lupus Erythematosus, Systemic , Medicare , Rheumatology , Humans , Lupus Erythematosus, Systemic/therapy , United States , Adult , Male , Female , Young Adult , Adolescent , Healthcare Disparities/statistics & numerical data , Retention in Care/statistics & numerical data , Health Services Accessibility/statistics & numerical data , Cohort Studies , Logistic Models , Black or African American/statistics & numerical data
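A hedged sketch of the multivariable logistic models described in the abstract above, using the statsmodels formula interface; the data are synthetic and the variable names (retained, race_eth, adi_quintile, region) are assumptions, not the study's coding.

```python
# Sketch: multivariable logistic regression for visit-based retention,
# reporting adjusted odds ratios. Synthetic data, hypothetical names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "retained": rng.integers(0, 2, n),       # rheumatology visit in both halves
    "race_eth": rng.choice(["White", "Black", "Hispanic"], n),
    "adi_quintile": rng.integers(1, 6, n),   # 1 = least disadvantaged
    "region": rng.choice(["Northeast", "Midwest", "South", "West"], n),
})

fit = smf.logit("retained ~ C(race_eth) + adi_quintile + C(region)",
                data=df).fit()
print(np.exp(fit.params))                    # adjusted odds ratios
```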
5.
Transplant Direct ; 10(4): e1600, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38550773

ABSTRACT

Background: Recurrence of glomerulonephritis (GN) is a significant contributor to long-term allograft failure among kidney transplant recipients (KTRs) whose kidney failure was due to GN. Accumulating evidence has revealed a role for vitamin D in both innate and adaptive immunity. Although vitamin D deficiency is common among KTRs, the association between 25-hydroxyvitamin D (25[OH]D) and GN recurrence in KTRs remains unclear. Methods: We analyzed data from KTRs with kidney failure due to GN who received a transplant at our center from 2000 to 2019 and had at least 1 valid posttransplant serum 25(OH)D measurement. Survival analyses were performed using a competing risk regression model, treating other causes of allograft failure, including death, as competing risk events. Results: A total of 67 cases of GN recurrence were identified in 947 recipients with GN followed for a median of 7.0 y after transplant. Each 1 ng/mL lower serum 25(OH)D was associated with a 4% higher hazard of recurrence (subdistribution hazard ratio [HR]: 1.04; 95% confidence interval [CI], 1.01-1.06). Vitamin D deficiency (≤20 ng/mL) was associated with a 2.99-fold (subdistribution HR: 2.99; 95% CI, 1.56-5.73) higher hazard of recurrence compared with vitamin D sufficiency (≥30 ng/mL). Results were similar after further adjusting for concurrent urine protein-creatinine ratio, serum albumin, and estimated glomerular filtration rate (eGFR). Conclusions: Posttransplant vitamin D deficiency is associated with a higher hazard of GN recurrence in KTRs. Further prospective observational studies and clinical trials are needed to determine any causal role of vitamin D in the recurrence of GN after kidney transplantation, and in vitro and in vivo experiments would help clarify its effects on autoimmune and inflammatory processes.
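The competing-risk analysis above reports subdistribution hazard ratios; Fine-Gray subdistribution regression itself is most commonly fit with R's cmprsk::crr. As a related sketch in Python, the Aalen-Johansen estimator below computes only the cumulative incidence of the event of interest in the presence of competing risks, on synthetic data with hypothetical event codes.

```python
# Sketch: cumulative incidence of GN recurrence with death/other graft
# failure as a competing risk (Aalen-Johansen estimator, lifelines).
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(2)
n = 500
years = rng.exponential(8.0, n)                    # follow-up times
# event codes: 0 = censored, 1 = GN recurrence, 2 = competing event
event = rng.choice([0, 1, 2], n, p=[0.6, 0.1, 0.3])

ajf = AalenJohansenFitter()
ajf.fit(years, event, event_of_interest=1)
print(ajf.cumulative_density_.tail())              # CIF of recurrence over time
```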

6.
Transplant Direct ; 10(4): e1607, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38464426

ABSTRACT

Background: Posttransplant erythrocytosis (PTE) is a well-known complication of kidney transplantation. However, the risk and outcomes of PTE among simultaneous pancreas-kidney transplant (SPKT) recipients are poorly described. Methods: We analyzed all SPKT recipients at our center between 1998 and 2021. PTE was defined as at least 2 consecutive hematocrit levels of >51% within the first 2 y of transplant. Controls were selected at a ratio of 3:1 at the time of PTE occurrence using event density sampling. Risk factors for PTE and post-PTE graft survival were identified. Results: Of 887 SPKT recipients, 108 (12%) developed PTE at a median of 273 d (interquartile range, 160-393) after transplantation. The incidence rate of PTE was 7.5 per 100 person-years. Multivariate analysis found pretransplant dialysis (hazard ratio [HR]: 3.15; 95% confidence interval [CI], 1.67-5.92; P < 0.001), non-White donor (HR: 2.14; 95% CI, 1.25-3.66; P = 0.01), female donor (HR: 1.50; 95% CI, 1.0-2.26; P = 0.05), and male recipient (HR: 2.33; 95% CI, 1.43-3.70; P = 0.001) to be associated with increased risk. The 108 cases of PTE were compared with 324 controls. PTE was not associated with subsequent pancreas graft failure (HR: 1.36; 95% CI, 0.51-3.68; P = 0.53) or kidney graft failure (HR: 1.16; 95% CI, 0.40-3.42; P = 0.78). Conclusions: PTE is a common complication among SPKT recipients, even in the modern era of immunosuppression. PTE among SPKT recipients was not associated with adverse graft outcomes, likely due to appropriate management.

7.
Transplant Direct ; 10(2): e1575, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38264296

ABSTRACT

Background: Kidney transplant outcomes have dramatically improved since the first successful transplant in 1954. In its early years, kidney transplantation was viewed more skeptically. Today it is considered the treatment of choice among patients with end-stage kidney disease. Methods: Our program performed its first kidney transplant in 1966 and recently performed our 12 000th kidney transplant. Here, we review and describe our experience with these 12 000 transplants. Transplant recipients were analyzed by decade of transplant: 1966-1975, 1976-1985, 1986-1995, 1996-2005, 2006-2015, and 2016-2022. Death-censored graft failure and mortality were the outcomes of interest. Results: Of 12 000 kidneys, 247 were transplanted from 1966 to 1975, 1147 from 1976 to 1985, 2194 from 1986 to 1995, 3147 from 1996 to 2005, 3046 from 2006 to 2015, and 2219 from 2016 to 2022. Compared with 1966-1975, there were statistically significant and progressively lower risks of death-censored graft failure at 1 y, 5 y, and at last follow-up in all subsequent eras. Although mortality at 1 y was lower in all eras after 1986-1995, there was no difference in mortality at 5 y or at last follow-up between eras. Conclusions: In this large cohort of 12 000 kidneys from a single center, we observed significant improvement in outcomes over time. Kidney transplantation remains a robust, ever-growing, and improving field.

8.
Arthritis Care Res (Hoboken) ; 76(2): 241-250, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37667434

ABSTRACT

OBJECTIVE: Recent data show that lower hydroxychloroquine (HCQ) doses are associated with a two- to six-fold higher risk of lupus flares. Thus, establishing an effective reference range of HCQ blood levels, with upper and lower bounds for efficacy, may support individualizing HCQ dosing to prevent flares. METHODS: HCQ levels in whole blood and the Systemic Lupus Erythematosus Disease Activity Index (SLEDAI) were measured at the baseline visit and again at a standard-of-care routine follow-up visit. Active lupus at the cross-sectional baseline assessment was defined as SLEDAI ≥6; a within-subject flare was defined as a subsequent three-point increase in SLEDAI with clinical symptoms requiring a therapy change. We examined associations between active lupus and HCQ blood levels at baseline, and between flares and HCQ levels during 6- to 12-month routine lupus follow-up visits, using mixed regression analysis. RESULTS: Among 158 baseline patient visits, 19% had active lupus. Odds of active lupus were 71% lower in patients with levels within a 750 to 1,200 ng/mL range (adjusted odds ratio 0.29, 95% confidence interval 0.08-0.96). Using a convenience sampling strategy during the pandemic, we longitudinally followed 42 patients, 17% of whom flared during their follow-up visit. Maintaining HCQ levels within 750 to 1,200 ng/mL reduced the odds of a flare by 26% over a nine-month median follow-up. CONCLUSION: An effective reference range of HCQ blood levels, 750 to 1,200 ng/mL, was associated with 71% lower odds of active lupus, and maintaining levels within this range reduced the odds of flares by 26%. These findings could guide clinicians in individualizing HCQ doses to maintain levels within this range and maximize efficacy.


Subject(s)
Antirheumatic Agents , Lupus Erythematosus, Systemic , Humans , Hydroxychloroquine , Cross-Sectional Studies , Reference Values , Lupus Erythematosus, Systemic/diagnosis , Lupus Erythematosus, Systemic/drug therapy
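A sketch approximating the repeated-visit analysis in the abstract above. The paper's mixed regression is replaced here with a GEE using an exchangeable working correlation, a related population-averaged approach; the data are synthetic and the effect sizes and column names are assumptions.

```python
# Sketch: odds of active lupus vs. whether the HCQ blood level falls in
# the 750-1,200 ng/mL window, with repeated visits clustered by patient.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 316                                        # ~2 visits x 158 patients
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(158), 2),
    "hcq_ng_ml": rng.normal(1000, 350, n).clip(50),
})
df["in_window"] = df["hcq_ng_ml"].between(750, 1200).astype(int)
df["active"] = rng.binomial(1, np.where(df["in_window"] == 1, 0.08, 0.25))

fit = smf.gee("active ~ in_window", groups="patient_id", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(fit.params["in_window"]))         # OR for being in the range
```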
9.
Clin Transplant ; 38(1): e15217, 2024 01.
Article in English | MEDLINE | ID: mdl-38078682

ABSTRACT

BACKGROUND: While presumably less common with modern molecular diagnostic and imaging techniques, fever of unknown origin (FUO) remains a challenge in kidney transplant recipients (KTRs). Additionally, the impact of FUO on patient and graft survival is poorly described. METHODS: A cohort of adult KTRs transplanted between January 1, 1995 and December 31, 2018 was followed at the University of Wisconsin Hospital. Patients transplanted from January 1, 1995 to December 31, 2005 were included in the "early era"; patients transplanted from January 1, 2006 to December 31, 2018 were included in the "modern era". The primary objective was to describe the epidemiology and etiology of FUO diagnoses over time. Secondary outcomes included rejection, graft survival, and patient survival. RESULTS: There were 5590 kidney transplants at our center during the study window. FUO was identified in 323 patients, an overall incidence rate of .8/100 person-years. Considering only the first 3 years after transplant, the incidence of FUO was significantly lower in the modern era than in the early era, with an incidence rate ratio (IRR) per 100 person-years of .48; 95% CI: .35-.63; p < .001. A total of 102 (31.9%) of 323 patients had an etiology determined within 90 days after FUO diagnosis: 100 were infectious, and two were malignancies. In the modern era, FUO remained significantly associated with rejection (HR = 44.1; 95% CI: 16.6-102; p < .001) but not with graft failure (HR = 1.21; 95% CI: .68-2.18; p = .52), total graft loss (HR = 1.17; 95% CI: .85-1.62; p = .34), or death (HR = 1.17; 95% CI: .79-1.76; p = .43). CONCLUSIONS: FUO is less common in KTRs during the modern era. Our study suggests infection remains the most common etiology. FUO remains associated with significant increases in the risk of rejection, warranting further inquiry into the management of immunosuppressive medications in SOT recipients in the setting of FUO.


Subject(s)
Fever of Unknown Origin , Kidney Transplantation , Neoplasms , Adult , Humans , Incidence , Kidney Transplantation/adverse effects , Fever of Unknown Origin/epidemiology , Fever of Unknown Origin/etiology , Fever of Unknown Origin/diagnosis
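The incidence rate ratio comparison above reduces to event counts over person-time. A small sketch with placeholder counts (the abstract does not report the era-specific denominators), using the standard large-sample Wald interval on the log-IRR.

```python
# Sketch: crude incidence rates and an era-to-era incidence rate ratio.
# Counts and person-years below are placeholders, not the study's data.
import numpy as np

events_early, py_early = 210, 16000       # hypothetical early-era counts
events_modern, py_modern = 113, 18000     # hypothetical modern-era counts

rate_early = 100 * events_early / py_early      # per 100 person-years
rate_modern = 100 * events_modern / py_modern
irr = rate_modern / rate_early

se = np.sqrt(1 / events_early + 1 / events_modern)   # SE of log(IRR)
lo, hi = np.exp(np.log(irr) + np.array([-1.96, 1.96]) * se)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")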
10.
Transplant Direct ; 9(9): e1526, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37654682

ABSTRACT

Background: Delayed graft function (DGF) among deceased donor kidney transplant recipients (DDKTRs) is a well-known risk factor for allograft rejection, decreased graft survival, and increased cost. Although DGF is associated with an increased risk of rejection, it is unclear whether it also increases the risk of infection. Methods: We reviewed all adult DDKTRs at our center between 2010 and 2018. The primary outcomes of interest were BK viremia, cytomegalovirus viremia, pneumonia, and urinary tract infection (UTI) within the first year of transplant. An additional analysis censored follow-up at the time of allograft rejection. Results: A total of 1512 DDKTRs were included, of whom 468 (31%) had DGF. As expected, several recipient, donor, and baseline immunological characteristics differed by DGF status. After adjustment, DGF was significantly associated with an increased risk of BK viremia (hazard ratio: 1.34; 95% confidence interval, 1.0-1.81; P = 0.049) and UTI (hazard ratio: 1.70; 95% confidence interval, 1.31-2.19; P < 0.001) but not cytomegalovirus viremia or pneumonia. Associations were similar in models censored at the time of rejection. Conclusions: DGF is associated with an increased risk of early infectious complications, mainly UTI and BK viremia. Close monitoring and appropriate management are warranted for better outcomes in this unique population.

12.
Nat Med ; 29(5): 1211-1220, 2023 05.
Article in English | MEDLINE | ID: mdl-37142762

ABSTRACT

For three decades, the international Banff classification has been the gold standard for kidney allograft rejection diagnosis, but this system has become complex over time with the integration of multimodal data and rules, leading to misclassifications that can have deleterious therapeutic consequences for patients. To improve diagnosis, we developed a decision-support system, based on an algorithm covering all classification rules and diagnostic scenarios, that automatically assigns kidney allograft diagnoses. We then tested its ability to reclassify rejection diagnoses for adult and pediatric kidney transplant recipients in three international multicentric cohorts and two large prospective clinical trials, including 4,409 biopsies from 3,054 patients (62.05% male and 37.95% female) followed in 20 transplant referral centers in Europe and North America. In the adult kidney transplant population, the Banff Automation System reclassified 83 out of 279 (29.75%) antibody-mediated rejection cases and 57 out of 105 (54.29%) T cell-mediated rejection cases, whereas 237 out of 3,239 (7.32%) biopsies diagnosed as non-rejection by pathologists were reclassified as rejection. In the pediatric population, the reclassification rates were 8 out of 26 (30.77%) for antibody-mediated rejection and 12 out of 39 (30.77%) for T cell-mediated rejection. Finally, we found that reclassification of the initial diagnoses by the Banff Automation System was associated with improved risk stratification of long-term allograft outcomes. This study demonstrates the potential of an automated histological classification to improve transplant patient care by correcting diagnostic errors and standardizing allograft rejection diagnoses. ClinicalTrials.gov registration: NCT05306795.


Subject(s)
Kidney Transplantation , Kidney , Adult , Humans , Male , Female , Child , Prospective Studies , Kidney/pathology , Kidney Transplantation/adverse effects , Transplantation, Homologous , Allografts , Graft Rejection/diagnosis , Biopsy
13.
Prostate ; 83(11): 1046-1059, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37154584

ABSTRACT

BACKGROUND: Cholesterol reduction is considered a mechanism through which cholesterol-lowering drugs, including statins, are associated with a reduced risk of aggressive prostate cancer. While prior cohort studies found positive associations between total cholesterol and more advanced stage and grade in White men, it is unknown whether associations for total cholesterol, low (LDL)- and high (HDL)-density lipoprotein cholesterol, apolipoprotein B (LDL particle) and A1 (HDL particle), and triglycerides are similar for fatal prostate cancer and in Black men, who experience a disproportionate burden of total and fatal prostate cancer. METHODS: We conducted a prospective study of 1553 Black and 5071 White cancer-free men attending visit 1 (1987-1989) of the Atherosclerosis Risk in Communities Study. A total of 885 incident prostate cancer cases were ascertained through 2015, and 128 prostate cancer deaths through 2018. We estimated multivariable-adjusted hazard ratios (HRs) of total and fatal prostate cancer per 1-standard deviation increments and for tertiles (T1-T3) of time-updated lipid biomarkers, overall and in Black and White men. RESULTS: Greater total cholesterol concentration (HR per 1 SD = 1.25; 95% CI = 1.00-1.58) and LDL cholesterol (HR per 1 SD = 1.26; 95% CI = 0.99-1.60) were associated with higher fatal prostate cancer risk in White men only. Apolipoprotein B was nonlinearly associated with fatal prostate cancer overall (T2 vs. T1: HR = 1.66; 95% CI = 1.05-2.64) and in Black men (HR = 3.59; 95% CI = 1.53-8.40) but not White men (HR = 1.13; 95% CI = 0.65-1.97). Tests for interaction by race were not statistically significant. CONCLUSIONS: These findings may improve the understanding of lipid metabolism in prostate carcinogenesis by disease aggressiveness and by race, while emphasizing the importance of cholesterol control.


Subject(s)
Cholesterol , Prostatic Neoplasms , Male , Humans , Triglycerides , Cholesterol, HDL , Prospective Studies , Apolipoproteins , Prostatic Neoplasms/epidemiology , Risk Factors
14.
Stat Med ; 42(13): 2101-2115, 2023 06 15.
Article in English | MEDLINE | ID: mdl-36938960

ABSTRACT

Joint modeling and landmark modeling are two mainstream approaches to dynamic prediction in longitudinal studies, that is, the prediction of a clinical event using longitudinally measured predictor variables available up to the time of prediction. Understanding which approach produces more accurate predictions is an important question both for methodological research and for practical users. Few previous studies have addressed this topic, and the majority of results seemed to favor joint modeling. However, these studies were conducted in scenarios where the data were simulated from the joint models, partly due to the widely recognized methodological difficulty of determining whether there exists a general joint distribution of longitudinal and survival data such that the landmark models, which consist of infinitely many working regression models for survival, hold simultaneously. As a result, the landmark models always worked under misspecification, which made the comparison difficult to interpret. In this paper, we solve this problem by using a novel algorithm to generate longitudinal and survival data that satisfy the working assumptions of the landmark models. This innovation makes possible a "fair" comparison of joint modeling and landmark modeling in terms of model specification. Our simulation results demonstrate that the relative performance of these two modeling approaches depends on the data settings, and one does not always dominate the other in terms of prediction accuracy. These findings stress the importance of methodological development for both approaches. The related methodology is illustrated with a kidney transplantation dataset.


Subject(s)
Models, Statistical , Humans , Computer Simulation , Longitudinal Studies
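The landmark approach discussed in the abstract above can be sketched concretely: at each landmark time s, keep subjects still at risk, carry forward their most recent biomarker value, and fit a working Cox model to the residual survival time. A minimal sketch on synthetic data; the data layout and variable names are assumptions.

```python
# Sketch: landmark dynamic prediction with a working Cox model per landmark.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 400
surv = pd.DataFrame({
    "id": np.arange(n),
    "event_time": rng.exponential(5.0, n),
    "event": rng.integers(0, 2, n),
})
long_df = pd.DataFrame({
    "id": np.repeat(np.arange(n), 4),
    "meas_time": np.tile([0.0, 1.0, 2.0, 3.0], n),
    "biomarker": rng.normal(0, 1, 4 * n),
})

for s in [1.0, 2.0, 3.0]:                           # landmark times
    at_risk = surv[surv["event_time"] > s].copy()   # still event-free at s
    last = (long_df[long_df["meas_time"] <= s]      # latest value at or before s
            .sort_values("meas_time").groupby("id").last()["biomarker"])
    lm = at_risk.join(last, on="id").dropna()
    lm["t_resid"] = lm["event_time"] - s            # residual survival time
    cph = CoxPHFitter().fit(lm[["t_resid", "event", "biomarker"]],
                            duration_col="t_resid", event_col="event")
    print(f"landmark {s}: HR per unit biomarker = "
          f"{cph.hazard_ratios_['biomarker']:.2f}")
```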
15.
Ann Am Thorac Soc ; 20(8): 1107-1115, 2023 08.
Article in English | MEDLINE | ID: mdl-36812384

ABSTRACT

Rationale: Population-based data on the epidemiology of nontuberculous mycobacterial (NTM) infections are limited, particularly with respect to variation in NTM infection among racial groups and socioeconomic strata. Wisconsin is one of a handful of states where mycobacterial disease is notifiable, allowing large, population-based analyses of the epidemiology of NTM infection in this state. Objectives: To estimate the incidence of NTM infection in Wisconsin adults, describe the geographic distribution of NTM infection across the state, identify the frequency and type of infection caused by different NTM species, and investigate associations between NTM infection and demographics and socioeconomic status. Methods: We conducted a retrospective cohort study using laboratory reports of all NTM isolates from Wisconsin residents submitted to the Wisconsin Electronic Disease Surveillance System from 2011 to 2018. For the analyses of NTM frequency, multiple reports from the same individual were enumerated as separate isolates when nonidentical, collected from different sites, or collected more than one year apart. Results: A total of 8,135 NTM isolates from 6,811 adults were analyzed. Mycobacterium avium complex accounted for 76.4% of respiratory isolates. The M. chelonae-abscessus group was the most common species isolated from skin and soft tissue. The annual incidence of NTM infection was stable over the study period (from 22.1 per 100,000 to 22.4 per 100,000). The cumulative incidence of NTM infection among Black (224 per 100,000) and Asian (244 per 100,000) individuals was significantly higher than among their White counterparts (97 per 100,000). Total NTM infections were significantly more frequent (P < 0.001) in individuals from disadvantaged neighborhoods, and racial disparities in the incidence of NTM infection generally remained consistent when stratified by measures of neighborhood disadvantage. Conclusions: More than 90% of NTM infections were from respiratory sites, with the vast majority caused by M. avium complex. Rapidly growing mycobacteria predominated as skin and soft tissue pathogens and were important minor respiratory pathogens. We found a stable annual incidence of NTM infection in Wisconsin between 2011 and 2018. NTM infection occurred more frequently in non-White racial groups and in individuals experiencing social disadvantage, suggesting that NTM disease may be more frequent in these groups as well.


Subject(s)
Mycobacterium Infections, Nontuberculous , Nontuberculous Mycobacteria , Adult , Humans , Wisconsin/epidemiology , Retrospective Studies , Mycobacterium Infections, Nontuberculous/epidemiology , Mycobacterium Infections, Nontuberculous/microbiology , Mycobacterium avium Complex
16.
J Vasc Access ; 24(6): 1398-1406, 2023 Nov.
Article in English | MEDLINE | ID: mdl-35259945

ABSTRACT

BACKGROUND: Arteriovenous fistulae (AVF) are considered the preferred hemodialysis access, but up to 50% of all AVF created in the United States never mature. Doppler ultrasound (DUS) is useful for predicting fistula maturity and impending fistula failure, but DUS is resource-intensive and is associated with poor compliance rates in dialysis patients, ranging from 12% to 33%. METHODS: EchoSure is an FDA-cleared 3D Doppler ultrasound device that automatically delivers quantitative blood flow and anatomic vascular information. The technology can be used at the bedside by personnel without formal sonographic training, nullifying limitations of traditional duplex ultrasound imaging. This study compared the EchoSure system in the hands of inexpert personnel with traditional expert-operated DUS for rapid assessment of a benchtop model vascular system with the flow, diameter, and depth expected in a human AVF. RESULTS: Both duplex and EchoSure performed within the expected tolerance of ultrasound readings (35%) for volume flow, with the average error (AE) between the observed measurement and the ground truth being 8% for both duplex and EchoSure. However, the average coefficient of variation (CV) for duplex pooled over all flow rate measurements was 17% versus 4% for EchoSure. Regarding diameter, duplex measurements had an AE of 15% with an average CV of 6% across all measurements, versus an EchoSure AE of 4% and average CV of 2%. Duplex and EchoSure measurements over all depths had the same AE of 2%. The two modalities were not statistically different for depth measurement (p = 0.05), but EchoSure measured closer to the ground truth for flow rate and vessel diameter (flow: p = 0.028, ρ = -0.07; diameter: p < 0.001, ρ = 0.69). The inexpert personnel using EchoSure acquired data 62% faster than the expert sonographers using duplex ultrasound (141 min for duplex vs 87 min for EchoSure). CONCLUSIONS: EchoSure may offer an accurate and convenient alternative for imaging fistulas in the clinic.


Subject(s)
Arteriovenous Fistula , Arteriovenous Shunt, Surgical , Humans , Arteriovenous Shunt, Surgical/adverse effects , Ultrasonography , Renal Dialysis/methods , Ultrasonography, Doppler, Duplex , Vascular Patency , Treatment Outcome
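The accuracy metrics used in the abstract above, average error (AE) against ground truth and coefficient of variation (CV) across repeated readings, reduce to a few lines of arithmetic. The readings below are placeholders, not the study's measurements.

```python
# Sketch: AE relative to ground truth and CV across repeated measurements.
import numpy as np

truth = 600.0                                 # ground-truth flow, mL/min
readings = np.array([580, 640, 610, 555])     # repeated readings (placeholders)

ae = abs(readings.mean() - truth) / truth * 100        # average error, %
cv = readings.std(ddof=1) / readings.mean() * 100      # coefficient of variation, %
print(f"AE {ae:.1f}%  CV {cv:.1f}%")
```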
17.
J Rheumatol ; 50(3): 359-367, 2023 03.
Article in English | MEDLINE | ID: mdl-35970523

ABSTRACT

OBJECTIVE: Recent studies suggest young adults with systemic lupus erythematosus (SLE) have high 30-day readmission rates, which may necessitate tailored readmission reduction strategies. To aid in risk stratification for future strategies, we measured 30-day rehospitalization and mortality rates among Medicare beneficiaries with SLE and determined rehospitalization predictors by age. METHODS: In a 2014 20% national Medicare sample of hospitalizations, rehospitalization risk and mortality within 30 days of discharge were calculated for young (aged 18-35 yrs), middle-aged (aged 36-64 yrs), and older (aged 65+ yrs) beneficiaries with and without SLE. Multivariable generalized estimating equation models were used to predict rehospitalization rates among patients with SLE by age group using patient, hospital, and geographic factors. RESULTS: Among 1.39 million Medicare hospitalizations, 10,868 involved beneficiaries with SLE. Hospitalized young adult beneficiaries with SLE were more racially diverse, were living in more disadvantaged areas, and had more comorbidities than older beneficiaries with SLE and those without SLE. Thirty-day rehospitalization was 36% among young adult beneficiaries with SLE, 40% higher than their peers without SLE and 85% higher than older beneficiaries with SLE. Longer length of stay and a higher comorbidity risk score increased the odds of rehospitalization in all age groups, whereas specific comorbid condition predictors and their effects varied. Our models, which incorporated neighborhood-level socioeconomic disadvantage, had moderate-to-good predictive value (C statistics 0.67-0.77), outperforming administrative data models in other conditions that lack comprehensive social determinants. CONCLUSION: Young adults with SLE on Medicare had very high 30-day rehospitalization at 36%. Considering socioeconomic disadvantage and comorbidities provided good prediction of rehospitalization risk, particularly in young adults. Young beneficiaries with SLE and comorbidities should be a focus of programs aimed at reducing rehospitalizations.


Subject(s)
Lupus Erythematosus, Systemic , Patient Readmission , Middle Aged , Young Adult , Humans , Aged , United States , Medicare , Cohort Studies , Retrospective Studies , Hospitalization
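The C statistics reported above measure a model's discrimination and are equivalent to the ROC AUC for a binary outcome. A hedged sketch of computing one for a rehospitalization model; the predictors loosely echo those named in the abstract, but the data, coding, and model are illustrative only.

```python
# Sketch: C statistic (ROC AUC) for a 30-day readmission model on
# synthetic, numerically coded predictors.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 2000
X = np.column_stack([
    rng.integers(0, 3, n),       # age group (coded 0-2)
    rng.exponential(5, n),       # length of stay, days
    rng.normal(2, 1, n),         # comorbidity risk score
    rng.integers(1, 6, n),       # neighborhood disadvantage quintile
])
logit = -2.0 + 0.15 * X[:, 1] + 0.5 * X[:, 2]        # synthetic true model
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))        # 30-day readmission

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))  # C statistic
```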
18.
Clin Transplant ; 37(2): e14862, 2023 02.
Article in English | MEDLINE | ID: mdl-36380446

ABSTRACT

INTRODUCTION: Serum albumin is an indicator of overall health status, but it remains unclear how pre-transplant hypoalbuminemia is associated with early post-transplant outcomes. METHODS: This study included all adult kidney transplant recipients (KTRs) at our center from 01/01/2001 to 12/31/2017 with serum albumin measured within 30 days before transplantation. KTRs were grouped based on pretransplant albumin level: normal (≥4.0 g/dL), mild (3.5 to <4.0 g/dL), moderate (3.0 to <3.5 g/dL), or severe hypoalbuminemia (<3.0 g/dL). Outcomes of interest included length of hospital stay (LOS), readmission within 30 days, delayed graft function (DGF), and re-operation related to post-transplant surgical complications. We also analyzed rejection, graft failure, and death within 6 months post-transplant. RESULTS: A total of 2807 KTRs were included: 43.6% had normal serum albumin, 35.3% mild, 16.6% moderate, and 4.5% severe hypoalbuminemia. Mild and moderate hypoalbuminemia were associated with a shorter LOS by 1.22 (p < 0.001) and 0.80 days (p = 0.01), respectively, compared to normal albumin. Moderate (HR: 0.58; 95% CI: 0.37-0.91; p = 0.02) and severe hypoalbuminemia (HR: 0.21; 95% CI: 0.07-0.68; p = 0.01) were associated with significantly lower rates of acute rejection within 6 months post-transplant. CONCLUSION: Patients with pre-transplant hypoalbuminemia have post-transplant outcomes similar to those with normal serum albumin, but with a lower risk of acute rejection that varies with the degree of hypoalbuminemia.


Subject(s)
Hypoalbuminemia , Kidney Transplantation , Adult , Humans , Hypoalbuminemia/complications , Kidney Transplantation/adverse effects , Retrospective Studies , Serum Albumin , Transplant Recipients , Risk Factors , Graft Rejection/etiology
19.
Clin Transplant ; 37(1): e14852, 2023 01.
Article in English | MEDLINE | ID: mdl-36354280

ABSTRACT

PURPOSE: Studies conducted in the northern United States have found that cytomegalovirus (CMV) disease after liver transplantation follows a seasonal pattern, with increased incidence in fall and winter. This has not been evaluated in kidney transplant recipients. An improved understanding of CMV seasonality may help guide the use of preventative therapies. METHODS: We evaluated adult patients receiving a kidney transplant at our center in Wisconsin from January 1, 1995 to December 31, 2018. A CMV event was defined as quantifiable viral replication with clinical signs or symptoms suspicious for CMV per current consensus recommendations. Seasons were divided as follows: winter (December-February), spring (March-May), summer (June-August), and fall (September-November). The primary objective was to evaluate the annual distribution of CMV disease and determine whether this differed by season. RESULTS: There were 6151 kidney transplants in the study period. A total of 913 patients had 1492 episodes of CMV. Median time from transplant to first detection was 5.51 months (interquartile range [IQR] 2.87-11.7). The observed overall incidence exceeded the expected incidence in winter (+.7%), spring (+5.5%), and fall (+3.4%) and was less than expected in summer (-9.5%) (p = .18). The incidence of CMV during summer, however, was 21% less than expected (p = .001) in recipients who were CMV-positive (R+) at the time of transplantation. No such difference was observed in CMV-negative recipients (R-; p = .58). CONCLUSION: CMV after kidney transplant appears to be less common during the summer in patients who were R+ at transplant but does not follow seasonal variation in R- patients. The reasons for this are unclear but are likely related to CMV-specific cell-mediated immunity. These findings may have clinical implications, particularly for the use of non-pharmacologic strategies to improve response to antiviral therapy.


Subject(s)
Cytomegalovirus Infections , Kidney Transplantation , Adult , Humans , Seasons , Cytomegalovirus , Kidney Transplantation/adverse effects , Antiviral Agents/therapeutic use , Cytomegalovirus Infections/drug therapy , Cytomegalovirus Infections/epidemiology , Cytomegalovirus Infections/etiology , Transplant Recipients
20.
J Vasc Access ; : 11297298221136592, 2022 Nov 14.
Article in English | MEDLINE | ID: mdl-36377049

ABSTRACT

BACKGROUND: Clinical monitoring is the recommended standard for identifying dialysis access dysfunction; however, clinical monitoring requires skill and training, which is challenging for understaffed clinics and overburdened healthcare personnel. A vascular access risk stratification score was recently proposed to assist in detecting dialysis access dysfunction. PURPOSE: Our objective was to evaluate the utility of vascular access risk scores for assessing venous stenosis in hemodialysis vascular accesses. METHODS: We prospectively enrolled adult patients who were receiving hemodialysis through an arteriovenous access and who had a risk score ≤3 (low-risk) or ≥8 (high-risk). We compared the occurrence of access stenosis (>50% on ultrasonography or angiography) between the low-risk and high-risk groups and assessed clinical monitoring results for each group. RESULTS: Of the 38 patients analyzed (18 low-risk; 20 high-risk), 16 (42%) had significant stenosis. Clinical monitoring results were positive in 39% of the low-risk and 60% of the high-risk group (p = 0.19). The high-risk group had a significantly higher occurrence of stenosis than the low-risk group (65% vs 17%; p = 0.003). The sensitivity and specificity of a high score for identifying stenosis were 81% and 68%, respectively. The positive predictive value of a high-risk score was 65%, and the negative predictive value was 80%. Only 11 (58%) of 19 subjects with positive clinical monitoring had significant stenosis. In a multivariable model, the high-risk group had seven-fold higher odds of stenosis than the low-risk group (aOR = 7.38; 95% CI, 1.44-37.82; p = 0.02). Positive clinical monitoring results and previous stenotic history were not associated with stenosis. Every unit increase in the score was associated with 34% higher odds of stenosis (aOR = 1.34; 95% CI, 1.05-1.70; p = 0.02). CONCLUSIONS: A calculated risk score may help predict the development of hemodialysis vascular access stenosis and may provide a simple and reliable objective measure for risk stratification.
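The diagnostic metrics above follow from a 2x2 table of risk group against confirmed stenosis. The counts below are reconstructed approximately from the abstract's percentages (65% of 20 high-risk and 17% of 18 low-risk patients with stenosis); small rounding differences from the reported values are expected.

```python
# Sketch: sensitivity, specificity, PPV, and NPV from 2x2 counts
# approximately reconstructed from the abstract's percentages.
tp = 13   # high-risk with stenosis
fn = 3    # low-risk with stenosis (missed)
fp = 7    # high-risk without stenosis
tn = 15   # low-risk without stenosis

sens = tp / (tp + fn)
spec = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sens {sens:.0%}  spec {spec:.0%}  PPV {ppv:.0%}  NPV {npv:.0%}")
```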
