1.
Article in English | MEDLINE | ID: mdl-38766839

ABSTRACT

Apolipoprotein ɛ4 (APOE ɛ4) may be a genetic risk factor for reduced bone mineral density (BMD) and muscle function, which could have implications for fall and fracture risk. We examined the association between APOE ɛ4 status and long-term fall- and fracture-related hospitalization risk in older women. A total of 1276 community-dwelling women from the Perth Longitudinal Study of Aging Women (mean age ± SD = 75.2 ± 2.7 years) were included. At baseline, women underwent APOE genotyping and detailed phenotyping for covariates including prevalent falls and fractures, as well as health and lifestyle factors. The association between APOE ɛ4 and fall-, any fracture-, and hip fracture-related hospitalizations, obtained over 14.5 years from linked health records, was examined using multivariable-adjusted Cox-proportional hazard models. Over 14.5 years, 507 (39.7%) women experienced a fall-related hospitalization and 360 (28.2%) women experienced a fracture-related hospitalization, including 143 (11.2%) attributed to a hip fracture. In multivariable-adjusted models, compared to noncarriers, APOE ɛ4 carriers (n = 297, 23.3%) had greater risk for a fall- (hazard ratio [HR] 1.48, 95% CI: 1.22-1.81), fracture- (HR 1.28, 95% CI: 1.01-1.63), or hip fracture-related hospitalization (HR 1.83, 95% CI: 1.29-2.61). The estimates remained similar when specific fall and fracture risk factors (fear of falling, plasma 25-hydroxyvitamin D, grip strength, timed up-and-go, hip BMD, vitamin K status, prevalent diabetes, HbA1c, cholesterol, and abbreviated mental test score) were added to the multivariable model. In conclusion, APOE ɛ4 is a potential risk factor for fall- and fracture-related hospitalization in community-dwelling older women. Screening for APOE ɛ4 could provide clinicians an opportunity to direct higher-risk individuals to appropriate intervention strategies.
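
Editor's note: for readers who want to reproduce this style of analysis, the sketch below shows a multivariable-adjusted Cox proportional hazards fit in Python with the lifelines library. The column names and synthetic data are illustrative assumptions, not the Perth Longitudinal Study of Aging Women dataset.

```python
# Minimal sketch of a multivariable-adjusted Cox proportional hazards model,
# as described in the abstract. The column names and synthetic data below
# are illustrative assumptions only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "time_to_event_yrs": rng.exponential(10, n).clip(max=14.5),  # follow-up capped at 14.5 y
    "fall_hospitalization": rng.integers(0, 2, n),               # 1 = event, 0 = censored
    "apoe_e4_carrier": rng.integers(0, 2, n),
    "age": rng.normal(75.2, 2.7, n),
    "grip_strength": rng.normal(20, 4, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_event_yrs", event_col="fall_hospitalization")
print(cph.hazard_ratios_["apoe_e4_carrier"])  # HR for carriers vs noncarriers
cph.print_summary()  # includes 95% CIs, analogous to the reported HR 1.48 (1.22-1.81)
```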


Subject(s)
Accidental Falls , Apolipoprotein E4 , Hospitalization , Humans , Female , Accidental Falls/statistics & numerical data , Aged , Hospitalization/statistics & numerical data , Longitudinal Studies , Risk Factors , Apolipoprotein E4/genetics , Fractures, Bone/epidemiology , Fractures, Bone/genetics , Bone Density/genetics , Genotype , Hip Fractures/epidemiology , Hip Fractures/genetics , Aged, 80 and over , Independent Living , Aging/genetics , Australia/epidemiology
2.
Stereotact Funct Neurosurg ; 102(3): 141-155, 2024.
Article in English | MEDLINE | ID: mdl-38636468

ABSTRACT

INTRODUCTION: Deep brain stimulation (DBS) is a well-established surgical therapy for patients with Parkinson's disease (PD). Traditionally, DBS surgery for PD is performed under local anesthesia, with the patient awake to facilitate intraoperative neurophysiological confirmation of the intended target using microelectrode recordings. General anesthesia may allow for improved patient comfort without sacrificing anatomic precision or clinical outcomes. METHODS: We performed a systematic review and meta-analysis of patients undergoing DBS for PD. Published randomized controlled trials, prospective and retrospective studies, and case series comparing asleep and awake techniques in patients undergoing DBS for PD were included. A total of 19 studies and 1,900 patients were included in the analysis. RESULTS: We analyzed (i) clinical effectiveness: postoperative UPDRS III scores, levodopa-equivalent daily doses, and DBS stimulation requirements; (ii) surgical and anesthesia-related complications, number of lead insertions, and operative time; and (iii) patients' quality of life, mood, and cognition using PDQ-39, MDRS, and MMSE scores. There was no significant difference between the awake and asleep groups for any outcome other than operative time, for which there was significant heterogeneity. CONCLUSION: With the advent of newer technology, differences in outcomes between awake and asleep DBS are likely to narrow. The more important considerations are therefore the patient's comfort and clinical status, as well as the operative team's familiarity with the procedure, to ensure seamless transition and care.


Subject(s)
Deep Brain Stimulation , Parkinson Disease , Wakefulness , Deep Brain Stimulation/methods , Humans , Parkinson Disease/therapy , Parkinson Disease/surgery , Anesthesia, General/methods , Treatment Outcome , Anesthesia/methods
3.
Article in English | MEDLINE | ID: mdl-38663982

ABSTRACT

Murine fur mites are commonly excluded from modern research animal programs, yet infestations persist due to challenges in detection and control. Because all diagnostic methods and treatment options have limitations, programs must make many operational decisions when trying to eradicate these ectoparasites. The primary aim of this study was to assess various durations of treatment with an ivermectin-compounded diet for eliminating Radfordia affinis in mice, as determined by PCR testing and pelt examination. A shorter treatment duration would be highly advantageous compared with the current 8-wk regimen, as it would minimize cost and time for animal management programs, impediments to research, and ivermectin drug effects on infested animals. Five experimental groups of R. affinis-positive mice received dietary ivermectin for 0, 2, 4, 6, or 8 wk. A fur mite-negative, naïve mouse was added to each group every 8 wk to perpetuate the infestation and amplify any remaining populations of fur mites. At 16 wk after the respective treatment end, PCR testing was performed for all treated groups in conjunction with the positive control group (no treatment). Visual examination of pelts for mites and eggs via direct microscopy was also performed at each time point. All treated mice were free of R. affinis at 16 wk after the end of treatment, as confirmed by both PCR testing and pelt examination. These findings indicate that a dietary ivermectin treatment duration of as little as 2 wk is effective in eliminating R. affinis, making successful eradication initiatives more achievable.

4.
Transplantation ; 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38685196

ABSTRACT

BACKGROUND: The number of donors from donation after circulatory determination of death (DCDD) has increased by at least 4-fold over the past decade. This study evaluated the association between the antecedent cardiac arrest status of controlled DCDD donors and the risk of delayed graft function (DGF). METHODS: Using data from the Australia and New Zealand Dialysis and Transplant Registry, the associations between antecedent cardiac arrest status of DCDD donors before withdrawal of cardiorespiratory support and DGF, posttransplant estimated glomerular filtration rate (eGFR), and allograft loss were examined using adjusted logistic regression, linear mixed modeling, and Cox regression, respectively. Among donors who experienced cardiac arrest, we evaluated the association between the duration and unwitnessed status of arrest and DGF. RESULTS: A total of 1173 kidney transplant recipients received DCDD kidneys from 646 donors in Australia between 2014 and 2019. Of these donors, 335 had an antecedent cardiac arrest. Compared with recipients of kidneys from donors without antecedent cardiac arrest, the adjusted odds ratio (95% confidence interval) for DGF was 0.85 (0.65-1.11) among those with kidneys from donors with cardiac arrest. There was no association between antecedent cardiac arrest and posttransplant eGFR or allograft loss. The duration of cardiac arrest and unwitnessed status were not associated with DGF. CONCLUSIONS: This focused analysis in an Australian population showed that allograft outcomes were similar whether or not DCDD donors had experienced a prior cardiac arrest, with no associations between the duration or unwitnessed status of arrest and the risk of DGF. This study thus provides important reassurance to transplant programs, and to the patients they counsel, that kidneys from donors through the DCDD pathway can be accepted irrespective of a prior cardiac arrest.

7.
Clin Kidney J ; 17(3): sfad245, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38468698

ABSTRACT

Background: Diabetes mellitus (DM) is associated with a greater risk of mortality in kidney transplant patients, primarily driven by a greater risk of cardiovascular disease (CVD)-related mortality. However, the associations between diabetes status at the time of first allograft loss and mortality on dialysis remain unknown. Methods: All patients with failed first kidney allografts transplanted in Australia and New Zealand between 2000 and 2020 were included. The associations between diabetes status at first allograft loss and all-cause and cause-specific mortality were examined using competing risk analyses, separating patients with diabetes into those with pre-transplant DM and those with post-transplant diabetes mellitus (PTDM). Results: Of 3782 patients with a median (IQR) follow-up duration of 2.7 (1.1-5.4) years, 539 (14%) had pre-transplant DM and 390 (10%) developed PTDM. During follow-up, 1336 (35%) patients died, with 424 (32%), 264 (20%), and 199 (15%) deaths attributed to CVD, dialysis withdrawal, and infection, respectively. Compared to patients without DM, the adjusted subdistribution HRs (95% CI) for pre-transplant DM and PTDM for all-cause mortality on dialysis were 1.47 (1.17-1.84) and 1.47 (1.23-1.76), respectively; for CVD-related mortality, 0.81 (0.51-1.29) and 1.02 (0.70-1.47); for infection-related mortality, 1.84 (1.02-3.35) and 2.70 (1.73-4.20); and for dialysis withdrawal-related mortality, 1.71 (1.05-2.77) and 1.51 (1.02-2.22). Conclusions: Patients with diabetes at the time of kidney allograft loss have a significant survival disadvantage, with the excess mortality risk attributed to infection and dialysis withdrawal.
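
Editor's note: the subdistribution hazard ratios above come from competing-risk regression (Fine-Gray style), which is most often fit in R. As a rough Python illustration of the competing-risk idea only, the sketch below estimates the cumulative incidence of one cause of death in the presence of the others using lifelines' Aalen-Johansen estimator; the event coding and data are assumptions, and this nonparametric estimator is a stand-in, not the regression model the study used.

```python
# Cumulative incidence under competing risks via the Aalen-Johansen estimator.
# Synthetic data and event coding below are assumptions for demonstration only.
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(1)
n = 300
durations = rng.exponential(3.0, n)  # years on dialysis after allograft loss
# 0 = censored, 1 = CVD death, 2 = infection death, 3 = withdrawal death
event = rng.choice([0, 1, 2, 3], size=n, p=[0.5, 0.2, 0.15, 0.15])

ajf = AalenJohansenFitter()
ajf.fit(durations, event, event_of_interest=2)  # infection-related mortality
print(ajf.cumulative_density_.tail())           # CIF accounting for competing deaths
```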

9.
Transplantation ; 108(6): 1422-1429, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38361237

ABSTRACT

BACKGROUND: Uncontrolled donation after circulatory death (uDCD) is a potential additional source of donor kidneys. This study reviewed uDCD kidney transplant outcomes to determine whether they are comparable to those of controlled donation after circulatory death (cDCD). METHODS: MEDLINE, Cochrane, and Embase databases were searched. Data on demographics and transplant outcomes were extracted from included studies. Meta-analyses were performed, and risk ratios (RR) were estimated to compare transplant outcomes from uDCD to cDCD. RESULTS: Nine cohort studies, comprising 2178 uDCD kidney transplants, were included. There was a moderate degree of bias, as 4 studies did not account for potential confounding factors. The median incidence of primary nonfunction in uDCD was 12.3% versus 5.7% for cDCD (RR, 1.85; 95% confidence interval, 1.06-3.23; P = 0.03; I² = 75%). The median rate of delayed graft function was 65.1% for uDCD and 52.0% for cDCD. The median 1-y graft survival for uDCD was 82.7% compared with 87.5% for cDCD (RR, 1.43; 95% confidence interval, 1.02-2.01; P = 0.04; I² = 71%). The median 5-y graft survival was 70% for both uDCD and cDCD. Notably, the use of normothermic regional perfusion improved primary nonfunction rates in uDCD grafts. CONCLUSIONS: Although uDCD outcomes may be inferior in the short term, long-term outcomes are comparable to cDCD.
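
Editor's note: the pooled risk ratios and I² values above come from a random-effects meta-analysis. A minimal sketch of DerSimonian-Laird pooling of log risk ratios follows; the per-study counts are hypothetical placeholders, not the nine included studies' data.

```python
# Random-effects (DerSimonian-Laird) pooling of risk ratios with I².
import numpy as np

# (events_uDCD, n_uDCD, events_cDCD, n_cDCD) per study -- hypothetical counts
studies = [(12, 100, 6, 100), (20, 150, 10, 140), (8, 90, 5, 95)]

log_rr, var = [], []
for a, n1, c, n2 in studies:
    log_rr.append(np.log((a / n1) / (c / n2)))
    var.append(1/a - 1/n1 + 1/c - 1/n2)          # variance of log RR
log_rr, var = np.array(log_rr), np.array(var)

w = 1 / var                                      # fixed-effect weights
Q = np.sum(w * (log_rr - np.sum(w * log_rr) / w.sum())**2)
k = len(studies) - 1                             # degrees of freedom
I2 = max(0.0, (Q - k) / Q) * 100                 # heterogeneity, %
tau2 = max(0.0, (Q - k) / (w.sum() - np.sum(w**2) / w.sum()))
w_re = 1 / (var + tau2)                          # random-effects weights
pooled = np.exp(np.sum(w_re * log_rr) / w_re.sum())
print(f"pooled RR = {pooled:.2f}, I^2 = {I2:.0f}%")
```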


Subject(s)
Graft Survival , Kidney Transplantation , Tissue Donors , Humans , Kidney Transplantation/adverse effects , Kidney Transplantation/methods , Tissue Donors/supply & distribution , Treatment Outcome , Delayed Graft Function/etiology , Risk Factors , Tissue and Organ Procurement/methods
10.
Arterioscler Thromb Vasc Biol ; 44(2): e54-e64, 2024 02.
Article in English | MEDLINE | ID: mdl-38095109

ABSTRACT

BACKGROUND: Abdominal aortic calcification (AAC), a marker of vascular disease, is associated with disease in other vascular beds, including the gastrointestinal arteries. We investigated whether AAC is related to rapid weight loss over 5 years and whether rapid weight loss is associated with 9.5-year all-cause mortality in community-dwelling older women. METHODS: Lateral spine images from dual-energy x-ray absorptiometry (1998/1999) were used to assess AAC (24-point AAC scoring method) in 929 older women. Over 5 years, body weight was assessed at 12-month intervals. Rapid weight loss was defined as a >5% decrease in body weight within any 12-month interval. Multivariable-adjusted logistic regression was used to assess the association between AAC and rapid weight loss, and Cox regression to assess the relationship between rapid weight loss and 9.5-year all-cause mortality. RESULTS: Mean ± SD age of the women was 75.0 ± 2.6 years. During the initial 5 years, 366 (39%) women presented with rapid weight loss. Compared with women with low AAC (24-point AAC score 0-1), those with moderate (24-point AAC score 2-5: odds ratio, 1.36 [95% CI, 1.00-1.85]) and extensive (24-point AAC score 6+: odds ratio, 1.59 [95% CI, 1.10-2.31]) AAC had higher odds of presenting with rapid weight loss. Results remained similar after further adjustment for dietary factors (alcohol, protein, fat, and carbohydrates), diet quality, blood pressure, and cholesterol measures. The estimates were similar in subgroups of women who met protein intake (n=599) and physical activity (n=735) recommendations (extensive AAC: odds ratios, 1.81 [95% CI, 1.12-2.92] and 1.58 [95% CI, 1.02-2.44], respectively). Rapid weight loss was associated with all-cause mortality over the next 9.5 years (hazard ratio, 1.49 [95% CI, 1.17-1.89]; P=0.001). CONCLUSIONS: AAC extent was associated with greater risk of rapid weight loss over 5 years in older women, which was in turn a risk factor for all-cause mortality. Since the association was unchanged after taking nutritional intake into account, these data support the possibility that vascular disease may play a role in the maintenance of body weight.


Subject(s)
Aortic Diseases , Vascular Calcification , Vascular Diseases , Humans , Female , Aged , Male , Risk Factors , Longitudinal Studies , Vascular Calcification/etiology , Aging , Body Weight , Weight Loss , Aorta, Abdominal/diagnostic imaging , Aortic Diseases/etiology
11.
Nephrology (Carlton) ; 29(1): 34-38, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37605476

ABSTRACT

Kidney transplantation in people living with HIV (PLWHIV) is occurring with increasing frequency. Limited international data suggest comparable patient and graft survival in kidney transplant recipients with and without HIV. All PLWHIV aged ≥18 years who received a kidney transplant between 2000 and 2020 were identified from data initially extracted retrospectively from the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA), with additional HIV-specific clinical data extracted from linked local health-care records. Twenty-five PLWHIV with kidney failure received their first kidney transplant in Australia between January 2000 and December 2020. The majority were male (85%), with a median age of 54 years (interquartile range, IQR 43-57). Focal segmental glomerulosclerosis was the most common primary kidney disease (20%), followed by polycystic kidney disease (16%). Eighty percent of patients underwent induction with basiliximab and none with anti-thymocyte globulin (ATG). Participants were followed for a median of 3.5 years (IQR 2.0-6.5). Acute rejection occurred in 24% of patients. Two patients lost their allografts and three died. Virological escape occurred in 28% of patients, with a maximum viral load of 190 copies/mL. In conclusion, kidney transplantation in PLWHIV in Australia is occurring with increasing frequency. Acute rejection is more common than in Australia's general transplant population, but this does not appear to be associated with higher rates of graft failure or mortality out to 4 years.


Subject(s)
HIV Infections , Kidney Transplantation , Humans , Male , Female , Adolescent , Adult , Middle Aged , Immunosuppressive Agents/adverse effects , Kidney Transplantation/adverse effects , HIV , Retrospective Studies , Graft Rejection/prevention & control , Renal Dialysis , Australia/epidemiology , HIV Infections/complications , HIV Infections/diagnosis , HIV Infections/drug therapy , Graft Survival
12.
Clin Kidney J ; 16(11): 1908-1916, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37915927

ABSTRACT

Kidney transplantation is the optimal treatment for most patients with kidney failure. For patients with a prior history of treated cancer, listing and transplant eligibility decisions are complex. Patients and health professionals are obliged to consider the time period between cancer cure and transplantation, the risk of cancer recurrence under the influence of immunosuppression, and the anti-cancer treatment options available if the disease recurs. Cancer recurrence is associated with a high mortality rate, thus potentially reducing the projected survival benefit of transplantation and dampening the utility of scarce organs. In view of the uncertain risk of harm, clinicians may consider transplantation for candidates with a prior cancer history only after an extended cancer-free interval, as the fear of disease recurrence and shortened life expectancy may outweigh the benefits of receiving a kidney transplant compared with dialysis. Over the past decade, the evolution of novel anti-cancer therapies, coupled with improved understanding of cancer genomics, has led to considerable improvement in cancer-free survival. It is therefore justifiable to make individualized transplant suitability decisions based on the joint effects of cancer biology, available therapeutic options, and prognostic covariates on clinical outcomes. In this review, we first summarize the cancer epidemiology of kidney transplant recipients. We then explore how the probability of cancer cure, the risk of recurrence, and outcomes in candidates with a prior cancer history may influence the decision to transplant. Finally, we discuss the role of shared decision-making between health professionals and patients regarding the optimal management options, with consideration of patients' preferences and values.

13.
Transpl Int ; 36: 11883, 2023.
Article in English | MEDLINE | ID: mdl-38020745

ABSTRACT

Cancer transmission from deceased donors is an exceedingly rare but potentially fatal complication in transplant recipients. We aimed to quantify the likelihood of non-utilization of kidneys for transplantation from donors with a prior cancer history. We included all intended and actual deceased donors in Australia and New Zealand between 1989 and 2017. The association between prior cancer history and non-utilization of donor kidneys was examined using adjusted logistic regression. Of 9,485 deceased donors, 345 (4%) had a prior cancer history, of whom 197 (57%) were utilized for transplantation. Donor characteristics of age, sex, and comorbidities were similar between utilized and non-utilized donors with prior cancer, as was the time from cancer to organ donation, irrespective of cancer subtype. Kidneys from donors with a prior cancer history were more likely to go unutilized than kidneys from donors without prior cancer [adjusted OR for non-utilization (95% CI) 2.29 (1.68-3.13)]. Among actual donors, the adjusted OR for non-utilization among those with prior cancer was 2.36 (1.58-3.53). Non-melanoma skin cancer was the most frequent prior cancer type among both utilized and non-utilized potential donors. In summary, donors with prior cancers were less likely to be utilized for transplantation, with no discernible differences in cancer characteristics between utilized and non-utilized donors.


Subject(s)
Kidney Transplantation , Neoplasms , Tissue and Organ Procurement , Humans , Tissue Donors , Kidney
14.
Kidney Int Rep ; 8(10): 1978-1988, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37850002

ABSTRACT

Introduction: Gastrointestinal (GI) symptoms after kidney transplant are common and debilitating. We aimed to ascertain patients' preferences for GI symptom management options to help future interventions align with treatment priorities. Methods: A discrete choice experiment was conducted with kidney transplant recipients in 3 Australian nephrology units. A multinomial logit model was used to quantify preferences and trade-offs between 5 characteristics: cost, formulation, symptom burden, dietary changes, and medication quantities. Results: Seventy patients participated (mean age ± SD: 47 ± 15 years, 56% female); 57% had GI symptoms. Patients preferred interventions that would achieve complete resolution of GI symptoms compared to no improvement (odds ratio [95% confidence interval]: 15.3 [1.80, 129.50]), were delivered as a tablet rather than a sachet (1.6 [1.27, 2.08]), retained their current diet rather than eliminating food groups (6.0 [2.19, 16.27]), reduced medication burden (1.4 [1.06, 1.79]), and had lower costs (0.98 [0.96, 1.00]). Participants would be willing to pay AUD$142.20 [$83.90, $200.40] monthly to achieve complete resolution of GI symptoms, or AUD$100.90 [$9.60, $192.10] for moderate improvement in symptoms. Conclusions: Interventions that are highly effective in relieving all GI symptoms, without the need for substantive dietary changes, and in tablet form are most preferred by kidney transplant recipients.
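
Editor's note: in a discrete choice experiment, willingness to pay (WTP) is typically derived as the negative ratio of an attribute's logit coefficient to the cost coefficient. Assuming the reported cost odds ratio is per AUD$1 of monthly cost, the sketch below reconstructs an approximate WTP from the published odds ratios; the small gap to the reported AUD$142.20 plausibly reflects rounding of the published ORs.

```python
# WTP from discrete-choice logit coefficients: WTP = -beta_attr / beta_cost.
# Coefficients are reconstructed as beta = ln(OR) from the reported ORs.
import math

or_resolution = 15.3  # complete symptom resolution vs no improvement (reported)
or_cost = 0.98        # assumed to be per AUD$1 of monthly cost (reported OR)

beta_resolution = math.log(or_resolution)
beta_cost = math.log(or_cost)        # negative: higher cost is less preferred
wtp = -beta_resolution / beta_cost
print(f"WTP ~ AUD${wtp:.0f}/month")  # ~135, vs the reported AUD$142.20
```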

15.
Medicine (Baltimore) ; 102(40): e35067, 2023 Oct 06.
Article in English | MEDLINE | ID: mdl-37800761

ABSTRACT

PURPOSE: To evaluate the cost-effectiveness of phacoemulsification simulation training on a virtual reality simulator and in a wet laboratory, measured by operating theater performance. METHODS: Residents were randomized to a combination of virtual reality and wet laboratory phacoemulsification training or wet laboratory phacoemulsification training alone. A reference control group consisted of trainees who had wet laboratory training without phacoemulsification. All trainees were assessed on operating theater performance in 3 sequential cataract patients. International Council of Ophthalmology Surgical Competency Assessment Rubric-phacoemulsification (ICO OSCAR phaco) scores from 2 masked independent graders and cost data were used to determine the incremental cost-effectiveness ratio (ICER). A decision model was constructed to indicate the most cost-effective simulation training strategy based on the willingness to pay (WTP) per ICO OSCAR phaco point gained. RESULTS: Twenty-two trainees who performed phacoemulsification in 66 patients were analyzed. Trainees who had additional virtual reality simulation achieved higher mean ICO OSCAR phaco scores than trainees who had wet laboratory phacoemulsification alone and controls (49.5 ± standard deviation [SD] 9.8 vs 39.0 ± 15.8 vs 32.5 ± 12.1, P < .001). Compared with the control group, the ICER per ICO OSCAR phaco point of wet laboratory phacoemulsification was $13,473 for capital cost and $2,209 for recurring cost. Compared with wet laboratory phacoemulsification, the ICER per ICO OSCAR phaco point of additional virtual reality simulator training was US $23,778 for capital cost and $1,879 for recurring cost. The threshold WTP values per ICO OSCAR phaco point for combined virtual reality simulator and wet laboratory phacoemulsification to be most cost-effective were $22,500 for capital cost and $1,850 for recurring cost. CONCLUSIONS: Combining virtual reality simulator training with wet laboratory phacoemulsification training is effective for skills transfer in the operating theater. Despite the high capital cost of a virtual reality simulator, its relatively low recurring cost is favorable for cost-effectiveness.
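
Editor's note: the ICER figures above follow the standard definition, incremental cost divided by incremental effectiveness. The sketch below applies it per ICO OSCAR phaco point gained; only the mean scores are taken from the abstract, and the cost figures are hypothetical placeholders chosen for illustration.

```python
# ICER = (cost_new - cost_old) / (effect_new - effect_old),
# here expressed per ICO OSCAR phaco point gained.
def icer(cost_new: float, cost_old: float,
         effect_new: float, effect_old: float) -> float:
    """Incremental cost per unit of effectiveness gained."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Mean ICO OSCAR phaco scores from the abstract; costs are assumed values.
score_vr_wetlab, score_wetlab = 49.5, 39.0
cost_vr_wetlab, cost_wetlab = 330_000.0, 80_000.0  # hypothetical capital costs

print(icer(cost_vr_wetlab, cost_wetlab, score_vr_wetlab, score_wetlab))
# ~23,800 per point gained with these assumed costs -- compare the
# reported capital-cost ICER of $23,778
```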


Subject(s)
Cataract , Internship and Residency , Ophthalmology , Phacoemulsification , Simulation Training , Virtual Reality , Humans , Cost-Benefit Analysis , Clinical Competence , Computer Simulation
16.
Digit Health ; 9: 20552076231205747, 2023.
Article in English | MEDLINE | ID: mdl-37808235

ABSTRACT

Objective: Wound image analysis tools hold promise in helping patients to monitor their wounds. We aimed to perform a novel feasibility study of the efficacy of a patient-owned wound surveillance system for diabetic foot ulcer (DFU) care. Methods: This two-institution, prospective, single-arm pilot study examined patients with DFU. An artificial intelligence-enabled image analysis app that calculates wound surface area was installed, and patients or caregivers were instructed to photograph wounds during dressing changes. Patients were followed until wound deterioration, wound healing, or wound stability at 6 months; outcomes of interest included study adherence, algorithm performance, and user experience. Results: Between January 2021 and December 2021, 39 patients were enrolled, with a mean age of 61.6 ± 8.6 years; 69% (n = 27) of subjects were male. All patients had documented diabetes and 85% (n = 33) had peripheral arterial disease. Mean follow-up for those completing the study was 12.0 ± 8.5 weeks. At the conclusion of the study, 80% of patients (n = 20) had primary wound healing while 20% (n = 5) had wound deterioration. The study completion rate was 64% (n = 25). Use of the app for surveillance of DFU healing, compared with physician evaluation, yielded a sensitivity of 100%, specificity of 20%, positive predictive value of 83%, and negative predictive value of 100%. Of those who provided user experience feedback, 59% (n = 10) felt the app was easy to use and 47% (n = 8) would recommend the wound analysis app to others, but only 6% (n = 1) would pay for the app out of pocket. Conclusion: Implementation of a patient-owned wound surveillance system is feasible. Most patients were able to effectively monitor wounds using a smartphone app-based solution. The image analysis algorithm demonstrates strong performance in identifying wound healing and is capable of detecting deterioration before interval evaluation by a physician. Patients generally found the app easy to use but were reluctant to pay out of pocket for the solution.
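
Editor's note: the reported sensitivity, specificity, PPV, and NPV are all functions of a 2x2 confusion matrix. The cell counts below are an illustrative reconstruction inferred from the abstract's percentages (20 healed, 5 deteriorated, PPV 83%), not data taken directly from the paper.

```python
# Diagnostic metrics from a 2x2 confusion matrix; counts are inferred from
# the abstract's reported percentages, not taken from the paper directly.
tp, fp, fn, tn = 20, 4, 0, 1  # "positive" = app classifies wound as healing

sensitivity = tp / (tp + fn)  # 1.00 -> 100%
specificity = tn / (tn + fp)  # 0.20 -> 20%
ppv = tp / (tp + fp)          # 0.83 -> 83%
npv = tn / (tn + fn)          # 1.00 -> 100%
print(sensitivity, specificity, ppv, npv)
```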

17.
J Med Virol ; 95(8): e28993, 2023 08.
Article in English | MEDLINE | ID: mdl-37526404

ABSTRACT

Myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS) is estimated to affect 0.4%-2.5% of the global population. Most cases are unexplained; however, some patients describe an antecedent viral infection or response to antiviral medications. We report here a multicenter study for the presence of viral nucleic acid in blood, feces, and saliva of patients with ME/CFS using polymerase chain reaction and high-throughput sequencing. We found no consistent group-specific differences other than a lower prevalence of anelloviruses in cases compared to healthy controls. Our findings suggest that future investigations into viral infections in ME/CFS should focus on adaptive immune responses rather than surveillance for viral gene products.


Subject(s)
Fatigue Syndrome, Chronic , Humans , Fatigue Syndrome, Chronic/epidemiology , Saliva , Virome , Feces
18.
Clin Kidney J ; 16(7): 1170-1179, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37398694

ABSTRACT

Background: Kidneys donated after circulatory death suffer a period of functional warm ischaemia before death, which may lead to early ischaemic injury. The effect of haemodynamic trajectories during the agonal phase on delayed graft function (DGF) is unknown. We aimed to predict the risk of DGF using patterns of trajectories of systolic blood pressure (SBP) decline in Maastricht category 3 kidney donors. Methods: We conducted a cohort study of all kidney transplant recipients in Australia who received kidneys from donation after circulatory death donors, divided into a derivation cohort (transplants between 9 April 2014 and 2 January 2018 [462 donors]) and a validation cohort (transplants between 6 January 2018 and 24 December 2019 [324 donors]). Patterns of SBP decline, identified using latent class models, were evaluated against the odds of DGF using a two-stage linear mixed effects model. Results: In the derivation cohort, 462 donors were included in the latent class analyses and 379 donors in the mixed effects model. Of the 696 eligible transplant recipients, 380 (54.6%) experienced DGF. Ten trajectories with distinct patterns of SBP decline were identified. Compared with recipients from donors with the slowest decline in SBP after withdrawal of cardiorespiratory support, the adjusted odds ratio (aOR) for DGF was 5.5 [95% confidence interval (CI) 1.38-28.0] for recipients from donors with a steeper decline and the lowest SBP [mean 49.5 mmHg (standard deviation 12.5)] at the time of withdrawal. For every 1 mmHg/min reduction in the rate of decline of SBP, the respective aORs for DGF were 0.95 (95% CI 0.91-0.99) and 0.98 (95% CI 0.93-1.0) in the random forest and least absolute shrinkage and selection operator models. In the validation cohort, the respective aORs were 0.95 (95% CI 0.91-1.0) and 0.99 (95% CI 0.94-1.0). Conclusion: Trajectories of SBP decline and their determinants are predictive of DGF. These results support a trajectory-based assessment of haemodynamic changes in donors after circulatory death during the agonal phase when evaluating donor suitability and post-transplant outcomes.
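
Editor's note: because logistic regression effects are additive on the log-odds scale, the reported per-unit aOR can be compounded to gauge larger differences between donor trajectories; the extrapolation below is illustrative only.

```python
# Compounding a per-unit odds ratio: for a k-unit change, OR_k = OR**k,
# since log-odds effects are additive. Uses the reported aOR of 0.95
# per 1 mmHg/min reduction in the rate of SBP decline.
per_unit_or = 0.95
for k in (1, 5, 10):
    print(f"{k} mmHg/min slower decline: OR = {per_unit_or**k:.2f}")
# 5 mmHg/min -> ~0.77; 10 mmHg/min -> ~0.60 (illustrative extrapolation only)
```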

19.
Bone ; 176: 116861, 2023 11.
Article in English | MEDLINE | ID: mdl-37524293

ABSTRACT

Lipocalin-2 (LCN2) is released by several cell types including osteoblasts and adipocytes and has been suggested as a marker of renal dysfunction, metabolic syndrome (MetS) and type 2 diabetes (T2D). Whether LCN2 is linked to these diseases in older women remains unknown. This study investigated whether LCN2 is related to features of MetS and T2D in older women. This cross-sectional study included 705 non-diabetic women (mean age 75.1 ± 2.6 years) for MetS analysis and 76 women (mean age 75.4 ± 2.8 years) with T2D. Total circulating LCN2 levels were analysed using a two-step chemiluminescent microparticle monoclonal immunoassay. MetS was determined by a modified National Cholesterol Education Program Adult Treatment Panel III classification. Multivariable-adjusted logistic regression analysis was used to assess odds ratios between LCN2 quartiles and MetS. Women in the highest LCN2 quartile had approximately 3 times greater risk for MetS compared to women in the lowest quartile (OR 3.05; 95%CI 1.86-5.02). Women with T2D or MetS scores of ≥ 3 had higher LCN2 levels compared to women with a MetS score of 0 (p < 0.05). Higher LCN2 correlated with higher body mass index, fat mass, triglycerides and glycated haemoglobin and lower high-density lipoprotein cholesterol and estimated glomerular filtration rate (p < 0.05). Higher circulating levels of LCN2 are associated with worsened cardio-metabolic risk factors and increased odds of MetS and T2D in older women. Whether it can be used as a biomarker for identifying those at risk for MetS and T2D should be explored further.


Subject(s)
Diabetes Mellitus, Type 2 , Metabolic Syndrome , Aged , Female , Humans , Cholesterol , Cross-Sectional Studies , Diabetes Mellitus, Type 2/complications , Independent Living , Lipocalin-2 , Risk Factors
20.
Transplantation ; 107(11): 2424-2432, 2023 Nov 01.
Article in English | MEDLINE | ID: mdl-37322595

ABSTRACT

BACKGROUND: Antibody-mediated rejection (AMR) is a major cause of kidney allograft failure and demonstrates different properties depending on whether it occurs early (<6 mo) or late (>6 mo) posttransplantation. We aimed to compare graft survival and treatment approaches for early and late AMR in Australia and New Zealand. METHODS: Transplant characteristics were obtained for patients with an AMR episode reported to the Australia and New Zealand Dialysis and Transplant Registry from January 2003 to December 2019. The primary outcome of time to graft loss from AMR diagnosis, with death considered a competing risk, was compared between early and late AMR using flexible parametric survival models. Secondary outcomes included treatments used, response to treatment, and time from AMR diagnosis to death. RESULTS: After adjustment for other explanatory factors, late AMR was associated with twice the risk of graft loss relative to early AMR. The risk was nonproportional over time, with early AMR having an increased early risk. Late AMR was also associated with an increased risk of death. Early AMR was treated more aggressively than late with more frequent use of plasma exchange and monoclonal/polyclonal antibodies. There was substantial variation in treatments used by transplant centers. Early AMR was reported to be more responsive to treatment than late. CONCLUSIONS: Late AMR is associated with an increased risk of graft loss and death compared with early AMR. The marked heterogeneity in the treatment of AMR highlights the need for effective, new therapeutic options for these conditions.
