Results 1 - 20 of 457
1.
PLoS One ; 19(6): e0304351, 2024.
Article in English | MEDLINE | ID: mdl-38838037

ABSTRACT

INTRODUCTION: Almost all patient-reported outcome measures (PROMs) are text-based, which impedes accurate completion by patients with low or limited literacy. Few PROMs are designed or validated to be self-administered, in either clinical or research settings, by patients of all literacy levels. We aimed to adapt the Patient Reported Outcomes Measurement Information System Upper Extremity Short Form (PROMIS-UE) into a multimedia version (mPROMIS-UE) that can be self-administered by hand and upper extremity patients of all literacy levels. METHODS: We applied the Multimedia Adaptation Protocol across seven phases completed in a serial, iterative fashion: planning with our community advisory board; direct observation; discovery interviews with patients, caregivers, and clinic staff; ideation; prototyping; member-checking interviews; and feedback. Direct observations were documented in memos that underwent rapid thematic analysis. Interviews were audio-recorded and documented using analytic memos; a rapid, framework-guided thematic analysis with both inductive and deductive themes was performed. Themes were distilled into design challenges to guide ideation and prototyping by our multidisciplinary research team. To assess completeness, credibility, and acceptability, we completed additional interviews with member-checking of initial findings and consulted our community advisory board. RESULTS: We conducted 12 hours of observations. We interviewed 17 adult English-speaking participants (12 patients, 3 caregivers, 2 staff) of mixed literacy. Our interviews revealed two distinct user personas and three distinct literacy personas; we developed the mPROMIS-UE with these personas in mind. Themes from interviews were distilled into four broad design challenges surrounding literacy, customizability, convenience, and shame.
We identified features (audio, animations, icons, avatars, progress indicator, illustrated response scale) that addressed the design challenges. The last 6 interviews included member-checking; participants felt that the themes, design challenges, and corresponding features resonated with them. These features were synthesized into an mPROMIS-UE prototype that underwent rounds of iterative refinement, the last of which was guided by recommendations from our community advisory board. DISCUSSION: We successfully adapted the PROMIS-UE to an mPROMIS-UE that addresses the challenges identified by a mixed literacy hand and upper extremity patient cohort. This demonstrates the feasibility of adapting PROMs to multimedia versions. Future research will include back adaptation, usability testing via qualitative evaluation, and psychometric validation of the mPROMIS-UE. A validated mPROMIS-UE will expand clinicians' and investigators' ability to capture patient-reported outcomes in mixed literacy populations.


Subject(s)
Literacy , Multimedia , Patient Reported Outcome Measures , Humans , Female , Male , Middle Aged , Adult , Aged , Health Literacy
2.
Molecules ; 29(12)2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38930837

ABSTRACT

In this work, a novel formaldehyde sensor was constructed based on nanoporous, flower-like, Pb-containing Pd-Au nanoparticles deposited on the cathode in a double-cabin galvanic cell (DCGC) with a Cu plate as the anode, a multiwalled carbon nanotube-modified glassy carbon electrode as the cathode, a 0.1 M HClO4 aqueous solution as the anolyte, and a 3.0 mM PdCl2 + 1.0 mM HAuCl4 + 5.0 mM Pb(ClO4)2 + 0.1 M HClO4 aqueous solution as the catholyte. Electrochemical studies reveal that the stripping of bulk Cu can induce underpotential deposition (UPD) of Pb during the galvanic replacement reaction (GRR) process, which affects the composition and morphology of the Pb-containing Pd-Au nanoparticles. The electrocatalytic activity of the Pb-containing nanoparticles toward formaldehyde oxidation was examined in alkaline solution; the experimental results showed that formaldehyde mainly underwent direct oxidation on the surface of the Pb-containing Pd-Au nanoparticles, which largely inhibited the formation of the CO poisoning intermediate. The proposed formaldehyde sensor exhibits a linear amperometric response to formaldehyde concentrations from 0.01 mM to 5.0 mM, with a sensitivity of 666 µA mM-1 cm-2, a limit of detection (LOD) of 0.89 µM at a signal-to-noise ratio of 3, rapid response, high anti-interference ability, and good repeatability.
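The LOD quoted above follows the usual 3σ convention: three times the baseline noise divided by the calibration slope. A minimal sketch of that arithmetic (the noise current density used below is a hypothetical value chosen for illustration, not a figure reported in the abstract):

```python
def lod_micromolar(noise_ua_per_cm2: float, sensitivity_ua_per_mm_per_cm2: float) -> float:
    """Limit of detection via the 3*sigma/slope convention.

    noise_ua_per_cm2: baseline current noise density (uA cm^-2)
    sensitivity_ua_per_mm_per_cm2: calibration slope (uA mM^-1 cm^-2)
    Returns the LOD in micromolar (1 mM = 1000 uM).
    """
    lod_mm = 3.0 * noise_ua_per_cm2 / sensitivity_ua_per_mm_per_cm2  # LOD in mM
    return lod_mm * 1000.0  # convert mM -> uM

# With the reported slope of 666 uA mM^-1 cm^-2, an assumed baseline noise of
# ~0.2 uA cm^-2 reproduces an LOD on the order of the reported 0.89 uM.
print(round(lod_micromolar(0.2, 666.0), 2))  # prints 0.9
```

The sketch only shows how a slope and a noise estimate combine into an LOD; the actual noise measurement protocol is not described in the abstract.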

3.
Article in English | MEDLINE | ID: mdl-38944568

ABSTRACT

BACKGROUND: Liver transplantation (LT) is a pivotal treatment for end-stage liver disease. However, bloodstream infections (BSI) in the post-operative period present a significant threat to patient survival. This study aims to identify risk factors for post-LT BSI and crucial prognostic indicators for mortality among affected patients. METHODS: We conducted a retrospective study of adults diagnosed with end-stage liver disease who underwent their initial LT between 2010 and 2021. Those who developed BSI post-LT during the same hospital admission were classified into the BSI group. RESULTS: In this cohort of 1049 patients, 89 (8.4%) developed BSI post-LT, while 960 (91.5%) did not contract any infection. Among the BSI cases, 17 (19.1%) patients died. The average time to BSI onset was 48 days, with 46% occurring within the first month post-LT. Of the 123 isolated microorganisms, 97 (78.8%) were gram-negative bacteria. BSI patients had significantly longer stays in the intensive care unit and hospital compared to non-infected patients. The 90-day and in-hospital mortality rates for recipients with BSI were significantly higher than for those without infections. Multivariate analysis indicated heightened BSI risk in patients with blood loss >3000 mL during LT (odds ratio [OR] 2.128), re-operation within 30 days (OR 2.341), post-LT bile leakage (OR 3.536), and graft rejection (OR 2.194). Additionally, chronic kidney disease (OR 6.288) and each 1000 mL increase in intraoperative blood loss (OR 1.147) significantly raised the mortality risk in BSI patients, whereas each 0.1 mg/dL increase in albumin levels correlated with a lower risk of death from BSI (OR 0.810). CONCLUSIONS: This study underscores the need for careful monitoring and management in the post-LT period, especially for patients at higher risk of BSI. It also suggests that serum albumin levels could serve as a valuable prognostic indicator for outcomes in LT recipients experiencing BSI.
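For a per-unit odds ratio such as the OR of 1.147 per 1000 mL of blood loss above, the odds multiplier for a larger increase is obtained by exponentiation, because logistic regression is linear in log-odds. A quick illustrative sketch of that scaling (not a re-analysis of the study's data):

```python
def scaled_odds_ratio(per_unit_or: float, units: float) -> float:
    """Odds ratio for a change of `units` on a continuous predictor.

    Log-odds add linearly in a logistic model, so the per-unit OR
    is raised to the number of units.
    """
    return per_unit_or ** units

# An OR of 1.147 per 1000 mL implies roughly a 1.5-fold odds increase
# for an additional 3000 mL of intraoperative blood loss.
print(round(scaled_odds_ratio(1.147, 3), 2))  # prints 1.51
```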

4.
Hepatobiliary Surg Nutr ; 13(3): 425-443, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38911194

ABSTRACT

Background: Liver retransplantation is the only option to save a patient with liver graft failure. However, it is controversial due to its poorer survival outcomes compared with primary transplantation. Insufficient deceased organ donation in Taiwan leads to high waitlist mortality; hence, living-donor grafts offer a valuable alternative for retransplantation. This study aims to analyze a single center's outcomes in living donor liver retransplantation (re-LDLT) and deceased donor liver retransplantation (re-DDLT), as well as survival-related confounding risk factors. Methods: This is a single-center retrospective study including 32 adults who underwent liver retransplantation (re-LT) from June 2002 to April 2020. The cohort was divided into a re-LDLT and a re-DDLT group, and survival outcomes were analyzed. Patient outcomes over different periods, the effect of timing on survival, and multivariate analysis for risk factors were also examined. Results: Of the 32 retransplantations, the re-LDLT group (n=11) received grafts from younger donors (31.3 vs. 43.75 years, P=0.016), with lower graft weights (688 vs. 1,457.2 g, P<0.001) and shorter cold ischemia times (CIT) (45 vs. 313 min, P<0.001). The 5-year survival was significantly better in the re-LDLT group than in the re-DDLT group (100% vs. 70.8%, P=0.02). This difference diminished when only retransplantations after 2010 were analyzed. Further analysis showed that the timing of retransplantation (early vs. late) did not affect patient survival. Multivariate analysis revealed that prolonged warm ischemia time (WIT) and intraoperative blood transfusion were related to poor long-term survival. Conclusions: Retransplantation with living donor grafts demonstrated good long-term outcomes with acceptable complications for both recipient and donor, and may serve as an option in areas lacking deceased donors. The timing of retransplantation did not affect long-term survival. Further effort should be made to reduce WIT and massive blood transfusion, as they contributed to poor survival after retransplantation.

5.
Am J Transplant ; 2024 Jun 22.
Article in English | MEDLINE | ID: mdl-38914281

ABSTRACT

Decreasing the graft size in living donor liver transplantation (LDLT) increases the risk of early allograft dysfunction; a graft-to-recipient weight ratio (GRWR) of 0.8 is considered the threshold. There is evidence that smaller grafts may also provide equally good outcomes, but the safe cut-off remains unknown. In this retrospective multi-center study, 92 adult LDLT recipients with a final GRWR <=0.6, transplanted at 12 international liver transplant (LT) centers over a 3-year period, were included. Perioperative data, including preoperative status, portal flow hemodynamics (PFH), portal flow modulation (PFM), development of small-for-size syndrome (SFSS), morbidity, and mortality, were collated and analyzed. Thirty-two (36.7%) patients developed SFSS, which was associated with increased 30-day, 90-day, and one-year mortality. Preoperative MELD score and inpatient status were independent predictors of SFSS (p<0.05). Pre-LT renal dysfunction was an independent predictor of survival (hazard ratio 3.1; 95% CI 1.1-8.9; p=0.035). Neither PFH nor PFM was predictive of SFSS or survival. We report the largest multi-center study to date of LDLT outcomes using ultralow-GRWR grafts and, for the first time, validate the ILTS-iLDLT-LTSI consensus definition and grading of SFSS. Preoperative recipient condition, rather than GRWR and PFH, independently predicted SFSS. Algorithms to predict SFSS and LT outcomes should incorporate recipient factors along with GRWR.
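GRWR itself is simple arithmetic: graft weight in grams over recipient body weight in grams, expressed as a percentage. A minimal sketch of the calculation used to screen grafts against thresholds such as the conventional 0.8 or the 0.6 studied here (example weights are hypothetical):

```python
def grwr_percent(graft_weight_g: float, recipient_weight_kg: float) -> float:
    """Graft-to-recipient weight ratio, as a percentage.

    GRWR (%) = graft weight (g) / recipient body weight (g) * 100.
    """
    return graft_weight_g / (recipient_weight_kg * 1000.0) * 100.0

# A 480 g graft in an 80 kg recipient gives a GRWR of 0.6,
# the ultralow cut-off examined in this study.
print(round(grwr_percent(480, 80), 2))  # prints 0.6
```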

6.
Int J Surg ; 2024 Jun 13.
Article in English | MEDLINE | ID: mdl-38870007

ABSTRACT

BACKGROUND: Active vaccination has been utilized to prevent de novo hepatitis B virus infection (DNHB) in recipients of anti-HBc (+) grafts after liver transplantation (LT). However, the long-term efficacy of active vaccination and the graft/patient outcomes of anti-HBc (+) grafts have yet to be comprehensively investigated. MATERIALS AND METHODS: Among 204 pediatric patients enrolled in the study, 82 recipients received anti-HBc (+) grafts. For DNHB prevention, active vaccination was repeatedly administered prior to transplant. Antiviral therapy was given for 2 years to patients with pre-transplant anti-HBs <1000 IU/mL (non-robust response) and discontinued once post-transplant anti-HBs exceeded 1000 IU/mL; antiviral therapy was not given to patients with an anti-HBs titer over 1000 IU/mL. The primary outcome was the long-term efficacy of active vaccination, while the secondary outcomes included graft and patient survival rates. RESULTS: Among the 82 anti-HBc (+) transplant patients, 68% of recipients achieved a robust immune response, thus not requiring antiviral therapy. Two patients (2.4%) developed DNHB infection, one of which was due to an escape mutant. With a median follow-up of 150 months, the overall 10-year patient and graft survival rates were significantly worse in recipients of anti-HBc (+) grafts than in recipients of anti-HBc (-) grafts (85.2% vs 93.4%, P=0.026; 85.1% vs 93.4%, P=0.034, respectively). The 10-year patient and graft outcomes of anti-HBc (+) graft recipients remained significantly worse after excluding early mortality and non-graft-related mortality (90.8% vs 96.6%, P=0.036; 93.0% vs 98.3%, P=0.011, respectively). CONCLUSION: Our long-term follow-up study demonstrates that active vaccination is a simple, cost-effective strategy against DNHB infection in anti-HBc (+) graft patients that removes the need for life-long antiviral therapy. Notably, both anti-HBc (+) grafts and their recipients exhibited inferior long-term survival rates, although the exact mechanisms remain unclear.

7.
Eur J Radiol ; 177: 111551, 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38875747

ABSTRACT

BACKGROUND: Liver transplantation is an effective treatment for preventing hepatocellular carcinoma (HCC) recurrence. This retrospective study aimed to quantitatively evaluate tumor attenuation in Hounsfield units (HU) on contrast-enhanced computed tomography (CECT) as a prognostic factor for HCC recurrence following liver transplantation, to optimize its predictive ability for early tumor recurrence, and to compare it with another imaging modality, positron emission tomography (PET). METHODS: Of 618 cases of living donor liver transplantation (LDLT) for HCC, only 131 patients with measurable viable HCC on preoperative CECT and preoperative PET evaluations were included, with a minimum follow-up period of 6 years. Cox regression models were developed to identify predictors of postoperative recurrence. Performance metrics for both CT and PET were assessed, and the correlation between the two imaging modalities was evaluated. Survival analyses were conducted using time-dependent receiver operating characteristic (ROC) curve analysis and the area under the curve (AUC) to assess accuracy and determine optimized cut-off points. RESULTS: Univariate and multivariate analyses revealed that both arterial-phase preoperative tumor attenuation (HU) and PET positivity were independent prognostic factors for recurrence-free survival. Both lower arterial tumor enhancement on CT (cut-off value = 59.2, AUC 0.88) and PET positivity (AUC 0.89) increased the risk of early tumor recurrence on 0.5-year time-dependent ROC analysis. The composite of HU < 59.2 and a positive PET result exhibited significantly higher diagnostic accuracy in detecting early tumor recurrence (AUC = 0.96). CONCLUSION: Relatively low arterial tumor enhancement values on CECT effectively predict early HCC recurrence after LDLT. The integration of CT and PET imaging may serve as an imaging marker of early tumor recurrence in HCC patients after LDLT.

8.
Article in English | MEDLINE | ID: mdl-38812356

ABSTRACT

Bionic porous structures are widely used in bone implants because they can imitate the topological structure of bone, reduce the elastic modulus of metal implants, and meet the mechanical and mass-transport requirements after implantation. This paper mainly studies the effects of different bionic porous structures on the mechanical and mass-transport properties of bone scaffolds. First, under the same porosity, 12 groups of bionic porous structures with different shapes were designed, including G-, P-, D-, and I-type triply periodic minimal surface (TPMS) structures and Voronoi porous structures with different degrees of irregularity. ABAQUS was then used to perform mechanical finite element simulations of the different structures; the scaffolds were fabricated from Ti-6Al-4V alloy by laser powder bed fusion (LPBF) and subjected to compression experiments. In parallel, COMSOL was used to simulate flow characteristics and analyze permeability, which was further verified through in vivo cell experiments. The results show that mechanical properties and permeability differ among the scaffolds. Topologically, TPMS structures resemble trabecular bone and have relatively high compressive strength, whereas Voronoi scaffolds have a lower elastic modulus and can provide sufficient mechanical support while reducing stress shielding. In addition, the permeability of TPMS scaffolds is better than that of Voronoi scaffolds, which helps promote cell proliferation and bone ingrowth. Each bionic porous structure has its own advantages; therefore, when designing porous structures for bone implants, the appropriate structure should be selected according to the specific implantation requirements. This research should help promote the clinical application of porous structures in bone implantation and provide theoretical support for the design of bone implant structures.

9.
Am J Transplant ; 2024 Apr 29.
Article in English | MEDLINE | ID: mdl-38692411

ABSTRACT

The publisher regrets that this article has been temporarily removed. A replacement will appear as soon as possible in which the reason for the removal of the article will be specified, or the article will be reinstated. The full Elsevier Policy on Article Withdrawal can be found at https://www.elsevier.com/about/policies/article-withdrawal.

10.
Cell Calcium ; 121: 102895, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38703416

ABSTRACT

Liver fibrosis is characterized by excessive deposition of extracellular matrix (ECM) as a wound healing process. Activated hepatic stellate cells (HpSCs) are the major producer of the ECM and play a central role in liver fibrogenesis. It has been widely accepted that elimination of activated HpSCs or reversion to a quiescent state can be a feasible strategy for resolving the disease, further highlighting the urgent need for novel therapeutic targets. Calreticulin (CRT) is a molecular chaperone that normally resides in the endoplasmic reticulum (ER), important in protein folding and trafficking through the secretory pathway. CRT also plays a critical role in calcium (Ca2+) homeostasis, with its Ca2+ storage capacity. In the current study, we aimed to demonstrate its function in directing HpSC activation. In a mouse liver injury model, CRT was up-regulated in HpSCs. In cellular experiments, we further showed that this activation was through modulating the canonical TGF-ß signaling. As down-regulation of CRT in HpSCs elevated intracellular Ca2+ levels through a form of Ca2+ influx, named store-operated Ca2+ entry (SOCE), we examined whether moderating SOCE affected TGF-ß signaling. Interestingly, blocking SOCE had little effect on TGF-ß-induced gene expression. In contrast, inhibition of ER Ca2+ release using the inositol trisphosphate receptor inhibitor 2-APB increased TGF-ß signaling. Treatment with 2-APB did not alter SOCE but decreased intracellular Ca2+ at the basal level. Indeed, adjusting Ca2+ concentrations by EGTA or BAPTA-AM chelation further enhanced TGF-ß-induced signaling. Our results suggest a crucial role of CRT in the liver fibrogenic process through modulating Ca2+ concentrations and TGF-ß signaling in HpSCs, which may provide new information and help advance the current discoveries for liver fibrosis.


Subject(s)
Calreticulin , Hepatic Stellate Cells , Signal Transduction , Smad Proteins , Transforming Growth Factor beta , Hepatic Stellate Cells/metabolism , Hepatic Stellate Cells/drug effects , Calreticulin/metabolism , Animals , Transforming Growth Factor beta/metabolism , Signal Transduction/drug effects , Smad Proteins/metabolism , Mice , Humans , Calcium/metabolism , Liver Cirrhosis/metabolism , Liver Cirrhosis/pathology , Male , Calcium Signaling/drug effects , Mice, Inbred C57BL
11.
Diagnostics (Basel) ; 14(8)2024 Apr 11.
Article in English | MEDLINE | ID: mdl-38667453

ABSTRACT

Acute cellular rejection (ACR) is a significant immune complication among recipients following liver transplantation. Although diffusion-weighted magnetic resonance imaging (DWI) is widely used for diagnosing liver disease, it has not yet been utilized for monitoring ACR in patients after liver transplantation. Therefore, the aim of this study was to evaluate the efficacy of DWI in monitoring treatment response among recipients with ACR. This study enrolled 25 recipients with highly suspected ACR, and all subjects underwent both biochemistry tests and DWI scans before and after treatment. A pathological biopsy was performed 4 to 24 h after the first MRI examination to confirm ACR and the degree of rejection. All patients were followed up and underwent a repeat MRI scan when their liver function returned to the normal range. After data acquisition, the DWI data were post-processed to obtain the apparent diffusion coefficient (ADC) map on a voxel-by-voxel basis. Five regions of interest were placed on the liver parenchyma to measure the mean ADC values for each patient. Finally, the mean ADC values and biochemical markers were statistically compared between the ACR and non-ACR groups. A receiver operating characteristic (ROC) curve was constructed to evaluate the performance of the ADC and biochemical data in detecting ACR, and correlation analysis was used to assess the relationship between the ADC values, biochemical markers, and the degree of rejection. The histopathologic results revealed that 20 recipients had ACR, including 10 mild, 9 moderate, and 1 severe rejection. The ACR patients had significantly lower hepatic ADC values than patients without ACR. After treatment, the hepatic ADC values in ACR patients significantly increased to levels similar to those in non-ACR patients. The ROC analysis showed that the sensitivity and specificity for detecting ACR were 80% and 95%, respectively. Furthermore, the correlation analysis revealed that the mean ADC value and the alanine aminotransferase level had strong and moderate negative correlations with the degree of rejection, respectively (r = -0.72 and -0.47). The ADC values were useful for detecting hepatic ACR and monitoring treatment response after immunosuppressive therapy.
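Sensitivity and specificity figures like those reported above reduce to simple ratios over the confusion counts at a chosen ADC cut-off. A generic sketch (the counts below are hypothetical, not taken from this study):

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of diseased cases the test flags: TP / (TP + FN)."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of disease-free cases the test clears: TN / (TN + FP)."""
    return true_neg / (true_neg + false_pos)

# Hypothetical counts: 16 of 20 rejections detected, 19 of 20 controls cleared,
# yielding 80% sensitivity and 95% specificity.
print(sensitivity(16, 4), specificity(19, 1))  # prints 0.8 0.95
```

Sweeping the cut-off and plotting sensitivity against (1 - specificity) is what produces the ROC curve referenced in the abstract.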

12.
Transplant Proc ; 56(3): 625-633, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38519269

ABSTRACT

BACKGROUND: Advancements in surgical techniques, immunosuppression regimens, and perioperative and postoperative care have resulted in marked improvement in outcomes after pediatric living donor liver transplantation (PLDLT). Despite these developments, infectious complications remain a major cause of morbidity and mortality. METHODS: This is a retrospective cohort analysis of pediatric recipients from January 2004 to December 2018. Patients were classified into infected and non-infected groups based on the occurrence of bacterial infection during the first 3 months after transplant. Perioperative risk factors for early post-transplant bacterial infections and postoperative outcomes were investigated. RESULTS: Seventy-two of 221 children (32.6%) developed early bacterial infection. The first episodes of bacterial infection most frequently occurred in the second week after PLDLT (37.5%). In multivariate analysis, active infection before transplant and complications of Clavien-Dindo grade >3 were the only independent risk factors. Early bacterial infections were independently associated with longer intensive care unit stays, longer hospital stays, and a higher incidence of readmission for bacterial infection during the first year after transplant. Additionally, the overall patient survival rate was significantly higher in the non-infected group (P = .001). Other candidate risk factors, such as age, weight, disease severity, ABO incompatibility, and operative factors, were not identified as independent risk factors. CONCLUSION: We have demonstrated that there are similarities and disparities in the epidemiology and risk factors for early bacterial infection after transplant between centers. Identification and better characterization of these predisposing factors are essential for modifying current preventive strategies and treatment protocols to improve outcomes for this highly vulnerable group.


Subject(s)
Bacterial Infections , Liver Transplantation , Living Donors , Humans , Liver Transplantation/adverse effects , Risk Factors , Retrospective Studies , Male , Female , Bacterial Infections/epidemiology , Child , Child, Preschool , Infant , Postoperative Complications/epidemiology , Adolescent , Length of Stay
13.
Transplant Proc ; 56(3): 596-601, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38472083

ABSTRACT

AIM: To compare the effectiveness of drug-eluting bead transarterial chemoembolization (DEB-TACE) with different particle sizes for bridging and downstaging in pretransplant hepatocellular carcinoma patients, and to assess recurrence and survival rates after living donor liver transplantation (LDLT). METHODS: Retrospective review of 580 patients who underwent TACE using DEB from August 2012 to June 2020 at Kaohsiung Chang Gung Memorial Hospital, Taiwan. Pre- and post-TACE computed tomography images of the liver were reviewed, and treatment responses were assessed using the modified Response Evaluation Criteria in Solid Tumors. Patients were divided into those who met (n = 342) or were beyond (n = 238) the University of California San Francisco (UCSF) criteria for evaluation of successful bridging and downstaging rates. Each group was divided into subgroups according to DEB particle size (group A: <100 µm, group B: 100-300 µm, group C: 300-500 µm, and group D: 500-700 µm) to compare objective response rates and post-LDLT survival rates. RESULTS: The overall successful bridging and downstaging rates were 97.1% and 58.4%, respectively, in patients who met (n = 332) and were beyond (n = 139) the UCSF criteria. Group B (100-300 µm) had a higher successful bridging rate (99.5%, P = .003) and downstaging rate (63.8%, P = .443). This subgroup also demonstrated a higher objective response rate in single tumors (93.2%, P = .038), multiple tumors (83.3%, P = .001), and tumors smaller than 5 cm (93.9%, P = .005). There were no significant differences in post-LDLT overall survival rates between particle sizes. CONCLUSION: TACE with 100-300 µm DEB particles is associated with a better chance of bridging and downstaging hepatocellular carcinoma patients to LDLT.


Subject(s)
Carcinoma, Hepatocellular , Chemoembolization, Therapeutic , Liver Neoplasms , Liver Transplantation , Living Donors , Particle Size , Humans , Carcinoma, Hepatocellular/therapy , Carcinoma, Hepatocellular/pathology , Carcinoma, Hepatocellular/mortality , Liver Neoplasms/therapy , Liver Neoplasms/pathology , Liver Neoplasms/mortality , Retrospective Studies , Male , Female , Middle Aged , Treatment Outcome , Adult , Neoplasm Staging , Microspheres , Aged
14.
Am J Transplant ; 2024 Feb 28.
Article in English | MEDLINE | ID: mdl-38428639

ABSTRACT

In living-donor liver transplantation, biliary complications including bile leaks and biliary anastomotic strictures remain significant challenges, with incidences varying across different centers. This multicentric retrospective study (2016-2020) included 3633 adult patients from 18 centers and aimed to identify risk factors for these biliary complications and their impact on patient survival. Incidences of bile leaks and biliary strictures were 11.4% and 20.6%, respectively. Key risk factors for bile leaks included multiple bile duct anastomoses (odds ratio, [OR] 1.8), Roux-en-Y hepaticojejunostomy (OR, 1.4), and a history of major abdominal surgery (OR, 1.4). For biliary anastomotic strictures, risk factors were ABO incompatibility (OR, 1.4), blood loss >1 L (OR, 1.4), and previous abdominal surgery (OR, 1.7). Patients experiencing biliary complications had extended hospital stays, increased incidence of major complications, and higher comprehensive complication index scores. The impact on graft survival became evident after accounting for immortal time bias using time-dependent covariate survival analysis. Bile leaks and biliary anastomotic strictures were associated with adjusted hazard ratios of 1.7 and 1.8 for graft survival, respectively. The study underscores the importance of minimizing these risks through careful donor selection and preoperative planning, as biliary complications significantly affect graft survival, despite the availability of effective treatments.

15.
Biosensors (Basel) ; 14(3)2024 Feb 23.
Article in English | MEDLINE | ID: mdl-38534228

ABSTRACT

Development of an efficient technique for accurate and sensitive dibutyl phthalate (DBP) determination is crucial for food safety and environmental protection. An ultrasensitive molecularly imprinted polymer (MIP) voltammetric sensor was engineered for the specific determination of DBP using a poly-l-lysine/poly(3,4-ethylenedioxythiophene)/porous graphene nanocomposite (PLL/PEDOT-PG) and a poly(o-phenylenediamine)-imprinted film as a label-free sensing platform. Fabrication of the PEDOT-PG nanocomposites was achieved through simple liquid-liquid interfacial polymerization. Subsequently, poly-l-lysine (PLL) functionalization was employed to enhance the dispersibility and stability of the prepared PEDOT-PG and to promote its adhesion to the sensor surface. In the presence of DBP, the imprinted poly(o-phenylenediamine) film was formed on the surface of the PLL/PEDOT-PG. Investigation of the physical properties and electrochemical behavior of the MIP/PLL/PEDOT-PG indicates that incorporating PG into PEDOT, with PLL uniformly wrapping its surface, significantly enhanced conductivity, carrier mobility, and stability, and provided a larger surface area for specific recognition sites. Under optimal experimental conditions, the electrochemical response exhibited a linear relationship with the logarithm of the DBP concentration within the range of 1 fM to 5 µM, with a detection limit as low as 0.88 fM. The method demonstrated exceptional stability and repeatability and was successfully applied to quantify DBP in plastic packaging materials.


Subject(s)
Bridged Bicyclo Compounds, Heterocyclic , Graphite , Molecular Imprinting , Nanocomposites , Phenylenediamines , Polymers , Dibutyl Phthalate , Molecularly Imprinted Polymers , Electrochemical Techniques/methods , Graphite/chemistry , Polylysine , Porosity , Nanocomposites/chemistry , Molecular Imprinting/methods , Limit of Detection , Electrodes
16.
Transplant Proc ; 56(3): 573-580, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38326205

ABSTRACT

PURPOSE: Despite technological and immunologic innovations, some living-donor liver transplant (LDLT) recipients still face poor liver regeneration. Sarcopenia is often recognized as a biomarker for poor outcomes in surgical patients. This study aimed to evaluate associations between sarcopenia and liver regeneration in LDLT recipients. MATERIALS AND METHODS: This retrospective review included consecutive patients who received LDLT at Chang Gung Memorial Hospital between 2005 and 2017. Sarcopenia was assessed using the psoas muscle index (PMI) on cross-sectional images. Receiver operating characteristic (ROC) curve analysis was used to determine the ability of PMI to predict relatively poor survival rates. Correlations between liver regeneration and sarcopenia were evaluated using regression analysis. RESULTS: A total of 109 LDLT recipients were included. The 1-, 3-, 5-, 10-, and 15-year survival rates were 93.7%, 84.8%, 79.7%, 74.7%, and 73.3% in males and 93.3%, 83.3%, 83.3%, 71.4%, and 71.4% in females. PMIs differed significantly by 10- and 15-year overall survival (P = .001 and P < .001) in male patients. ROC curve analysis revealed a PMI cutoff point of 6.7 cm2/m2 (sensitivity = 48.3%, specificity = 81%, area under the ROC curve = 0.685) based on 10-year survival. Linear regression analysis revealed that PMI was significantly associated with liver regeneration in males (P = .013). CONCLUSIONS: Sarcopenia and low PMI are associated with poor liver regeneration and long-term survival after LDLT in male patients. Further studies, incorporating sarcopenia into conventional scores, may help to more reliably predict liver regeneration and mortality among LDLT patients with hepatocellular carcinoma.


Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Regeneration , Liver Transplantation , Living Donors , Sarcopenia , Humans , Sarcopenia/mortality , Male , Female , Retrospective Studies , Middle Aged , Liver Neoplasms/surgery , Liver Neoplasms/mortality , Carcinoma, Hepatocellular/surgery , Carcinoma, Hepatocellular/mortality , Adult , Survival Rate
17.
JAMA Netw Open ; 7(2): e240118, 2024 Feb 05.
Article in English | MEDLINE | ID: mdl-38381432

ABSTRACT

Importance: The No Surprises Act implemented in 2022 aims to protect patients from surprise out-of-network (OON) bills, but it does not include ground ambulance services. Understanding ground ambulance OON and balance billing patterns from previous years could guide legislation aimed to protect patients following ground ambulance use. Objective: To characterize OON billing from ground ambulance services by evaluating whether OON billing risk differs by the site of ambulance origination (home, hospital, nonhospital medical facility, or scene of incident). Design, Setting, and Participants: Cross-sectional study of the Merative MarketScan dataset between January 1, 2015, and December 31, 2020, using claims-based data from employer-based private health insurance plans in the US. Participants included patients who utilized ground ambulances during the study period. Data were analyzed from June to December 2023. Exposure: Medical encounter requiring ground ambulance transportation. Main Outcomes and Measures: Ground ambulance OON billing prevalence was calculated. Linear probability models adjusted for state-level mixed effects were fit to evaluate OON billing probability across ambulance origins. Secondary outcomes included the allowed payment, patient cost-sharing amounts, and potential balance bills for OON ambulances. Results: Among 2 031 937 ground ambulance services (1 375 977 unique patients) meeting the inclusion and exclusion criteria, 1 072 791 (52.8%) rides transported men, and the mean (SD) patient age was 41 (18) years. Of all services, 1 113 676 (54.8%) were billed OON. OON billing probabilities for ambulances originating from home or scene were higher by 12.0 percentage points (PP) (95% CI, 11.8-12.2 PP; P < .001 for home; 95% CI, 11.7-12.2 PP; P < .001 for scene) vs those originating from hospitals.
Mean (SD) total financial burden, including cost-sharing and potential balance bills per ambulance service, was $434.70 ($415.99) per service billed OON vs $132.21 ($244.92) per service billed in-network. Conclusions and Relevance: In this cross-sectional study of over 2 million ground ambulance services, ambulances originating from home, the scene of an incident, and nonhospital medical facilities were more likely to result in OON bills. Legislation is needed to protect patients from surprise billing following use of ground ambulances, more than half of which resulted in OON billing. Future legislation should at minimum offer protections for these subsets of patients often calling for an ambulance in urgent or emergent situations.
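The linear probability models above regress a 0/1 OON indicator on covariates by ordinary least squares, so coefficients read directly as percentage-point differences. With a single binary covariate, the OLS slope reduces to the difference in group proportions. A toy sketch with invented counts (not the study's data or model specification, which also included state-level mixed effects):

```python
def lpm_slope(x, y):
    """OLS slope of y on x (with intercept), via the closed form cov/var."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

# x = 1 if the ride originated from home, 0 if from a hospital
# y = 1 if the ride was billed out-of-network
x = [1, 1, 1, 1, 0, 0, 0, 0]
y = [1, 1, 1, 0, 1, 0, 0, 0]  # 75% OON at home vs 25% at hospital

print(lpm_slope(x, y))  # 0.5, i.e. a 50-percentage-point difference
```

The convenience of the linear probability model is exactly this readability; its known trade-off is that predicted probabilities are not constrained to [0, 1].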


Subject(s)
Ambulances , Cost Sharing , Male , Humans , Adult , Cross-Sectional Studies , Financial Stress , Health Facilities
18.
J Hand Surg Am ; 49(3): 203-211, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38069952

ABSTRACT

PURPOSE: Current guidelines recommend bone mineral density (BMD) testing after fragility fractures in patients aged 50 years or older. This study aimed to assess BMD testing and subsequent fragility fractures after low-energy distal radius fractures (DRFs) among patients aged 50-59 years. METHODS: We used the 2010-2020 MarketScan dataset to identify patients aged 50 to 59 years with initial DRFs. We assessed the 1-year BMD testing rate and 3-year non-DRF fragility fracture rate. We created Kaplan-Meier plots to depict fragility fracture-free probabilities over time and used log-rank tests to compare the Kaplan-Meier curves. RESULTS: Among 78,389 patients aged 50-59 years with DRFs, 24,589 patients met our inclusion criteria, and most patients were women (N = 17,580, 71.5%). The BMD testing rate within 1 year after the initial DRF was 12.7% (95% CI, 12.3% to 13.2%). In addition, 1-year BMD testing rates for the age groups of 50-54 and 55-59 years were 10.4% (95% CI, 9.9% to 11.0%) and 14.9% (95% CI, 14.2% to 15.6%), respectively. Only 1.8% (95% CI, 1.5% to 2.1%) of men, compared with 17.1% (95% CI, 16.5% to 17.7%) of women, underwent BMD testing within 1 year after the initial fracture. The overall 3-year fragility fracture rate was 6.0% (95% CI, 5.6% to 6.3%). The subsequent fragility fracture rate was lower for those with any BMD testing (4.4%; 95% CI, 3.7% to 5.2%), compared with those without BMD testing (6.2%; 95% CI, 5.9% to 6.6%; P < .05). CONCLUSIONS: We report a low BMD testing rate for patients aged between 50 and 59 years after initial isolated DRFs, especially for men and patients aged between 50 and 54 years. Patients who received BMD testing had a lower rate of subsequent fracture within 3 years. We recommend that providers follow published guidelines and initiate an osteoporosis work-up for patients with low-energy DRFs to ensure early diagnosis.
This provides an opportunity to initiate treatment that may prevent subsequent fractures. TYPE OF STUDY/LEVEL OF EVIDENCE: Prognosis II.
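The fracture-free probabilities above come from Kaplan-Meier curves, which step down at each event time by the fraction of at-risk patients who have the event, while censored patients simply leave the risk set. A minimal estimator with invented follow-up data (times in years; event = 1 means a subsequent fragility fracture, 0 means censoring):

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each time where at least one event occurs."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data[i:] if tt == t and e == 1)  # events at t
        leaving = sum(1 for tt, e in data[i:] if tt == t)       # events + censored at t
        if d > 0:
            s *= 1 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= leaving
        i += leaving
    return curve

# 4 patients: fracture at year 1, censored at 2, censored at 3, fracture at 3
print(kaplan_meier([1, 2, 3, 3], [1, 0, 0, 1]))
```

The log-rank test mentioned in the abstract then compares two such curves by pooling observed versus expected events across all event times.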


Subject(s)
Fractures, Bone , Osteoporosis , Osteoporotic Fractures , Radius Fractures , Wrist Fractures , United States/epidemiology , Male , Humans , Aged , Female , Middle Aged , Bone Density , Radius Fractures/diagnostic imaging , Radius Fractures/therapy , Medicare , Osteoporosis/complications , Osteoporosis/diagnosis , Osteoporotic Fractures/prevention & control
19.
Clin Transplant ; 38(1): e15163, 2024 01.
Article in English | MEDLINE | ID: mdl-37823247

ABSTRACT

BACKGROUND AND AIM: Limited data are available regarding pre-liver transplantation (LT) bacteremia in adults with end-stage liver disease. In this study, we investigated the risk factors independently associated with pre-LT bacteremia and their effects on clinical outcomes of LT. METHODS: This retrospective study performed between 2010 and 2021 included 1287 LT recipients. The study population was categorized into patients with pre-LT bacteremia and those without pre-LT infection. Pre-LT bacteremia was defined as bacteremia detected within 90 days before LT. RESULTS: Among 1287 LT recipients, 92 (7.1%) developed pre-LT bacteremia. The mean interval between bacteremia and LT was 28.3 ± 19.5 days. Of these 92 patients, seven (7.6%) died after LT. Of the 99 microorganisms isolated in this study, gram-negative bacteria were the most common microbes (72.7%). Bacteremia was mainly attributed to spontaneous bacterial peritonitis. The most common pathogen isolated was Escherichia coli (25.2%), followed by Klebsiella pneumoniae (18.2%) and Staphylococcus aureus (15.1%). Multivariate analysis showed that massive ascites (adjusted odds ratio [OR] 1.67, 95% confidence interval [CI] 1.048-2.687) and a prolonged international normalized ratio for prothrombin time (adjusted OR 1.13, 95% CI 1.074-1.257) were independent risk factors for pre-LT bacteremia in patients with end-stage liver disease. Intensive care unit and in-hospital stays were significantly longer, and in-hospital mortality was significantly higher, among LT recipients with pre-LT bacteremia than among those without pre-LT infection. CONCLUSIONS: This study highlights predictors of pre-LT bacteremia in patients with end-stage liver disease. Pre-LT bacteremia increases the post-transplantation mortality risk.
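Adjusted odds ratios like those above come from a multivariable logistic regression, where a coefficient b maps to OR = exp(b); a crude (unadjusted) OR can also be read off a 2x2 table. A sketch with invented counts, purely to show the arithmetic behind the reported quantities:

```python
import math

def crude_odds_ratio(a, b, c, d):
    """Crude OR from a 2x2 table:
    a = exposed with event, b = exposed without,
    c = unexposed with event, d = unexposed without.
    """
    return (a * d) / (b * c)

# Made-up counts: massive ascites vs pre-LT bacteremia
print(round(crude_odds_ratio(30, 170, 62, 1025), 2))  # ~2.92 (crude, unadjusted)

# A logistic coefficient of b = 0.123 per unit INR corresponds to
print(round(math.exp(0.123), 2))  # 1.13, the scale of the adjusted OR above
```

The gap between a crude and an adjusted OR is the point of the multivariate analysis: the adjusted estimate holds the other covariates fixed.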


Subject(s)
Bacteremia , End Stage Liver Disease , Liver Transplantation , Adult , Humans , Liver Transplantation/adverse effects , Retrospective Studies , End Stage Liver Disease/complications , End Stage Liver Disease/surgery , Risk Factors , Bacteremia/epidemiology
20.
Molecules ; 28(24)2023 Dec 11.
Article in English | MEDLINE | ID: mdl-38138526

ABSTRACT

Bisphenol A is one of the most widely used industrial compounds. Over the years, it has raised serious concern as a potential hazard to the human endocrine system and the environment. Developing robust and easy-to-use sensors for bisphenol A is important in various areas, such as controlling and monitoring water purification and sewage systems, food safety monitoring, etc. Here, we report an electrochemical method to fabricate a bisphenol A (BPA) sensor based on a modified Au nanoparticle/multiwalled carbon nanotube composite electrocatalyst electrode (AuCu-UPD/MWCNTs/GCE). First, the Au-Cu alloy was prepared via convenient and controllable Cu underpotential/bulk Au co-electrodeposition on a multiwalled carbon nanotube-modified glassy carbon electrode (GCE). Then, the AuCu-UPD/MWCNTs/GCE was obtained via electrochemical anodic stripping of the Cu underpotential deposition (UPD). The as-prepared sensor enables high-electrocatalytic, high-performance sensing of BPA. Under optimal conditions, the modified electrode showed a two-segment linear response from 0.01 to 1 µM and 1 to 20 µM with a limit of detection (LOD) of 2.43 nM based on differential pulse voltammetry (DPV). Determination of BPA in real water samples using AuCu-UPD/MWCNTs/GCE yielded satisfactory results. The proposed electrochemical sensor is promising for the development of a simple, low-cost water quality monitoring system for the detection of BPA in ambient water samples.
