Results 1 - 20 of 3,783
1.
Intensive Crit Care Nurs ; 86: 103819, 2024 Sep 09.
Article in English | MEDLINE | ID: mdl-39255615

ABSTRACT

OBJECTIVES: Nurse practitioner-led MET calls have been shown to improve clinical outcomes versus ICU registrar-led MET calls. However, the cost implications of a nurse practitioner-led MET call system are not known. We conducted a cost analysis from the healthcare service perspective to compare the costs of nurse practitioner- and ICU registrar-led MET calls. RESEARCH METHODOLOGY: A retrospective study of MET calls between 1 June 2016 and 9 March 2018 including patients with a first MET call during their hospital admission. The cost analysis compared MET calls attended by nurse practitioners against those attended by ICU registrars. MAIN OUTCOME MEASURES: Inpatient costs for nurse practitioner- and ICU registrar-led MET calls. RESULTS: 1,343 MET calls were included in the full dataset, with mean costs per ICU registrar-led and nurse practitioner-led MET call of AU$19,836 (95% CI: AU$15,778 - AU$23,895) and AU$16,404 (95% CI: AU$14,988 - AU$17,820) respectively, a difference of AU$3,432 (95% CI: -AU$38 - AU$6,903, p = 0.053). In the propensity-score matched analysis, the mean costs per ICU registrar-led and nurse practitioner-led MET call were AU$19,009 (95% CI: AU$15,439 - AU$22,578) and AU$13,937 (95% CI: AU$12,038 - AU$15,835) respectively, a difference of AU$5,072 (95% CI: AU$1,061 - AU$9,082, p = 0.013). A 24-hour nurse practitioner-led MET call service would break even at 101 MET calls leading to ICU admissions per year. CONCLUSION: Nurse practitioner-led MET calls saved significant costs compared to ICU registrar-led MET calls. Assuming that the difference in costs is due to shorter ICU length of stay, a health service that receives more than 101 MET calls leading to ICU admissions per year can save costs with a 24-hour nurse practitioner-led MET call service. IMPLICATIONS FOR CLINICAL PRACTICE: This study helps identify the healthcare services where nurse practitioner-led MET systems could be implemented to generate cost savings from a health service perspective.
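
To make the break-even reasoning above concrete, the sketch below recomputes the threshold from an assumed annual staffing cost; only the AU$5,072 matched cost difference comes from the abstract, and the service cost is a hypothetical placeholder chosen purely for illustration.

```python
# Minimal break-even sketch for a 24-hour nurse practitioner-led MET call service.
# The per-call saving is the propensity-matched cost difference reported in the
# abstract; the annual staffing cost is a hypothetical placeholder, not a study figure.

saving_per_icu_admission_call = 5_072     # AU$, matched mean cost difference per MET call
assumed_annual_service_cost = 512_000     # AU$, hypothetical cost of running the 24-hour service

break_even_calls = assumed_annual_service_cost / saving_per_icu_admission_call
print(f"Break-even at ~{break_even_calls:.0f} MET calls leading to ICU admission per year")
# A service attending more such calls per year than this threshold would be expected to save costs.
```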

2.
Clin Ophthalmol ; 18: 2481-2485, 2024.
Article in English | MEDLINE | ID: mdl-39246557

ABSTRACT

Purpose: To estimate the economic and environmental impact of single-use instruments (SUIs) used to perform standard cataract surgery in six ophthalmology centers located in Europe and in the United States. Setting: Online survey and interview. Design: Comparative cost analysis based on an online survey with a follow-up questionnaire and interview. The carbon footprint calculation was made by ClimatePartner. Methods: Annual costs of reusable instruments (RUIs) were calculated based on data provided by the centers. Annual costs of SUIs were estimated based on the average selling price of a single-use cataract set of 5 instruments and the reported annual volume of cataract surgery. The carbon footprint calculation for a cataract instrument covered the whole life cycle from production to end-of-life. Results: Annual costs for SUIs were found to be lower than or similar to the annual costs for RUIs for 4 of the 6 centers included in this study. The centers where SUIs were demonstrated to be the most cost-effective were also those with the highest sterilization costs per instrument. The carbon footprint of 5-year usage of a cataract instrument was found to be 5478.2 kg CO2 eq for SUIs without recycling, 4639.9 kg CO2 eq for SUIs with recycling, and 20.6 kg CO2 eq for RUIs. Conclusion: The study demonstrated that SUIs can be an alternative to RUIs in multispecialty hospitals with high sterilization costs.

3.
JMIR Public Health Surveill ; 10: e54750, 2024 Sep 06.
Article in English | MEDLINE | ID: mdl-39240545

ABSTRACT

Background: The COVID-19 pandemic highlighted the need for pathogen surveillance systems to augment both early warning and outbreak monitoring/control efforts. Community wastewater samples provide a rapid and accurate source of environmental surveillance data to complement direct patient sampling. Due to its global presence and critical missions, the US military is a leader in global pandemic preparedness efforts. Clinical testing for COVID-19 on US Air Force (USAF) bases (AFBs) was effective but costly, in terms of both direct monetary costs and indirect costs from lost time. To remain operating at peak capacity, such bases sought a more passive surveillance option and piloted wastewater surveillance (WWS) at 17 AFBs from May 2021 to January 2022 to demonstrate feasibility, safety, utility, and cost-effectiveness. Objective: We model the costs of a wastewater surveillance program for pathogens of public health concern within the specific context of US military installations, using assumptions based on the results of the USAF and Joint Program Executive Office for Chemical, Biological, Radiological and Nuclear Defense pilot program. The objective was to determine the cost of deploying WWS to all AFBs relative to clinical swab testing surveillance regimes. Methods: A WWS cost projection model was built based on subject matter expert input and actual costs incurred during the WWS pilot program at USAF AFBs. Several SARS-CoV-2 circulation scenarios were considered, and the costs of both WWS and clinical swab testing were projected. Analysis was conducted to determine the break-even point and to assess how a reduction in swab testing could unlock funds to enable WWS to occur in parallel. Results: Our model confirmed that WWS is complementary and highly cost-effective when compared to existing alternative forms of biosurveillance. We found that WWS was between US $10.5 million and $18.5 million less expensive annually in direct costs than clinical swab testing surveillance. When the indirect cost of lost work was incorporated, including lost work associated with required clinical swab testing, we estimated that over two-thirds of clinical swab testing could be maintained with no additional costs upon implementation of WWS. Conclusions: Our results support the adoption of WWS across US military installations as part of a more comprehensive early warning system that will enable adaptive monitoring during disease outbreaks in a more cost-effective manner than swab testing alone.
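
As an illustration of the budget-neutrality idea in the results above, the sketch below shows how the retainable share of swab testing follows from the ratio of the WWS cost to the current swab-testing budget; both dollar figures are hypothetical placeholders, not the study's inputs.

```python
# Budget-neutrality sketch: if wastewater surveillance (WWS) is added, what share of
# clinical swab testing can be retained while holding total surveillance spending flat?
# Both dollar figures are hypothetical placeholders; only the relationship
# retained_fraction = 1 - (WWS cost / swab budget) is being illustrated.

current_swab_budget = 45_000_000   # US$/year, hypothetical force-wide swab testing spend
wws_annual_cost = 12_000_000       # US$/year, hypothetical cost of WWS at all bases

retained_fraction = 1 - wws_annual_cost / current_swab_budget
print(f"Swab testing retainable at no added cost: {retained_fraction:.0%}")
# With these placeholder figures roughly three-quarters of swab testing could continue,
# consistent in spirit with the abstract's "over two-thirds" estimate.
```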


Subject(s)
COVID-19 , Wastewater , Humans , United States/epidemiology , COVID-19/epidemiology , COVID-19/prevention & control , Pilot Projects , Military Personnel/statistics & numerical data , Military Facilities , Costs and Cost Analysis , Cost-Benefit Analysis
4.
Surg Today ; 2024 Sep 04.
Article in English | MEDLINE | ID: mdl-39227396

ABSTRACT

PURPOSE: Japan has adopted its own reimbursement system, which differs from those of other countries in its use of the Diagnosis Procedure Combination (DPC) method. However, there are few reports on the cost analysis of open repair and endovascular aneurysm repair (EVAR) for abdominal aortic aneurysms in Japan. We aimed to evaluate the long-term outcomes and cost-effectiveness of these two procedures. METHODS: This study included patients who underwent open repair (n = 224) and EVAR (n = 87) between January 2012 and December 2022. After propensity score matching, we compared the two groups. RESULTS: The costs of drugs and blood products, procedures, and DPC payments were significantly higher in the open repair group than in the EVAR group (p < 0.001). The surgical equipment and total costs were significantly higher in the EVAR group than in the open repair group (p < 0.001). There was no significant difference between the two groups in the 5-year survival rate (88.5% in the open repair group vs. 72.0% in the EVAR group; p = 0.33) or in freedom from re-intervention at 5 years (93.1% in the open repair group vs. 89.9% in the EVAR group; p = 0.15). CONCLUSIONS: Open repair is more cost-effective than EVAR. The cost-effectiveness of EVAR may therefore depend on the cost of the endograft.
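
For readers unfamiliar with the matching step, the following is a minimal sketch of 1:1 nearest-neighbour propensity-score matching on synthetic data; the covariates, caliper, and cost values are illustrative assumptions and do not reproduce the study's dataset or model.

```python
# Minimal 1:1 nearest-neighbour propensity-score matching sketch on synthetic data.
# Covariates, caliper, and costs are illustrative assumptions, not the study's dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 311                                                    # 224 open repair + 87 EVAR
treated = np.r_[np.zeros(224), np.ones(87)].astype(int)    # 1 = EVAR, 0 = open repair
age = rng.normal(75, 8, n)                                 # hypothetical covariates
comorbidity = rng.poisson(2, n)
cost = rng.normal(3_000_000, 800_000, n) + 1_000_000 * treated   # synthetic total costs (JPY)

# Estimate propensity scores (probability of receiving EVAR given covariates).
X = np.column_stack([age, comorbidity])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbour matching on the propensity score within a simple caliper.
caliper = 0.2 * ps.std()
controls_available = set(np.where(treated == 0)[0])
pairs = []
for t in np.where(treated == 1)[0]:
    dist, best = min((abs(ps[t] - ps[c]), c) for c in controls_available)
    if dist <= caliper:
        pairs.append((t, best))
        controls_available.remove(best)

matched_treated = [t for t, _ in pairs]
matched_controls = [c for _, c in pairs]
diff = cost[matched_treated].mean() - cost[matched_controls].mean()
print(f"{len(pairs)} matched pairs; mean cost difference (EVAR - open): {diff:,.0f} JPY")
```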

5.
Orthopadie (Heidelb) ; 2024 Sep 04.
Article in German | MEDLINE | ID: mdl-39230676

ABSTRACT

BACKGROUND: The Achilles tendon is the strongest tendon in humans and is frequently injured, especially in the physically active young to middle-aged population. An increasing frequency of Achilles tendon ruptures (ATR) has been reported in several studies. However, there is no international consensus regarding non-operative (N-OP) versus operative (surgical) treatment (OP). OBJECTIVES: The aim of this article is to semi-quantitatively compare both treatment options for ATR by analyzing the results reported in the literature. MATERIAL AND METHODS: Relevant categories were identified, and the literature was then evaluated in a PubMed analysis. Ten meta-analyses and two cost analyses were included. The data were extracted according to the categories and evaluated comparatively. RESULTS: OP and N-OP for acute ATR can lead to equally good restitution of clinical function if early functional rehabilitation is applied. The lower re-rupture rate is an advantage of OP, whereas the lower rate of general complications speaks in favor of N-OP. The minimally invasive or percutaneous surgical technique (M-OP) appears advantageous over the open surgical technique (O-OP), although studies show an increased rate of lesions of the sural nerve. CONCLUSION: There is no consensus regarding the superiority of OP or N-OP for acute ATR, as several studies conducted since the introduction of early mobilization protocols have shown similar results for these two interventions. Results and complications of M-OP and O-OP are also comparable. Considering the available data on the various surgical procedures, the authors prefer the M-OP technique with adequate sural nerve protection for repair of acute ATR, combined with an early mobilization protocol.

6.
Nephrol Dial Transplant ; 39(Supplement_2): ii11-ii17, 2024 Sep 05.
Article in English | MEDLINE | ID: mdl-39235197

ABSTRACT

BACKGROUND: Hemodialysis (HD) is the most commonly utilized modality for kidney replacement therapy worldwide. This study assesses the organizational structures, availability, accessibility, affordability, and quality of HD care worldwide. METHODS: This cross-sectional study relied on desk research as well as survey data from stakeholders (clinicians, policymakers, and patient advocates) in countries affiliated with the International Society of Nephrology, collected from July to September 2022. RESULTS: Overall, 167 countries or jurisdictions participated in the survey. In-center HD was available in 98% of countries, with a median global prevalence of 322.7 [interquartile range (IQR) 76.3-648.8] per million population (pmp), ranging from 12.2 (IQR 3.9-103.0) pmp in Africa to 1575 (IQR 282.2-2106.8) pmp in North and East Asia. Overall, home HD was available in 30% of countries, mostly in Western Europe (82%). In 74% of countries, more than half of people with kidney failure were able to access HD. The density of HD centers increased with country income level, from 0.31 pmp in low-income countries to 9.31 pmp in high-income countries. Overall, the annual cost of in-center HD was US$19 380.3 (IQR 11 817.6-38 005.4); it was highest in North America and the Caribbean (US$39 825.9) and lowest in South Asia (US$4310.2). In 19% of countries, HD services could not be accessed by children. CONCLUSIONS: This study shows significant variations, which have remained consistent over the years, in the availability, access, and affordability of HD across countries, with severe limitations in lower-resourced countries.


Subject(s)
Global Health , Renal Dialysis , Humans , Renal Dialysis/economics , Renal Dialysis/statistics & numerical data , Cross-Sectional Studies , Health Services Accessibility/statistics & numerical data , Kidney Failure, Chronic/therapy , Kidney Failure, Chronic/epidemiology
7.
Lancet Reg Health West Pac ; 50: 101162, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39219627

ABSTRACT

Background: School-based targeted preventive chemotherapy (PC), the primary strategy for soil-transmitted helminth (STH) control, typically focuses on primary schoolchildren and was expanded to secondary school students in the Philippines in 2016. This program still excludes adults, who may also suffer from considerable morbidity and can be a significant reservoir of infection. Mass drug administration (MDA), in which the entire population is treated, would bring additional health benefits but would also increase implementation costs. The incremental cost of implementing MDA for STH control compared with expanded school-based targeted PC, however, is unknown. Methods: A cost survey was conducted in the Zamboanga Peninsula region in 2021 to estimate the economic and financial cost of implementing MDA compared with expanded school-based targeted PC from a government payer perspective. A budget impact analysis was conducted to estimate the financial cost to the government of implementing MDA over a five-year timeframe. Monte Carlo simulation was used to account for uncertainty in cost estimates. Costs were reported in 2021 United States dollars ($). Findings: The economic cost of MDA was $809,000 per year (95% CI: $679,000-$950,000) or $0.22 per person targeted (95% CI: $0.19-$0.26), while the expanded school-based targeted PC would cost $625,000 (95% CI: $549,000-$706,000) or $0.57 per person targeted (95% CI: $0.50-$0.64). Over five years, the financial cost to the government for MDA would be $3,113,000 (95% CI: $2,475,000-$3,810,000), which is $740,000 (95% CI: $486,000-$1,019,000) higher than for expanded school-based targeted PC. Interpretation: Implementing MDA in the region would increase the economic and financial costs by 29% and 31%, respectively, compared with expanded school-based targeted PC. Implementing MDA would require the Department of Health to increase its total expenditure for STH control by 0.2% and could be key in addressing the ongoing STH burden. Funding: The project was funded by the Australian Centre for the Control and Elimination of Neglected Tropical Diseases (NHMRC GA19028), and JPCDT was supported by a UNSW Scientia PhD Scholarship. SVN is funded by an NHMRC Investigator Grant (APP 2018220).
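
The sketch below illustrates the kind of Monte Carlo uncertainty analysis the methods describe: per-person cost components are drawn from assumed distributions and the total is summarised with a 95% percentile interval. The components, ranges, and target population are hypothetical, not the study's inputs.

```python
# Monte Carlo sketch of cost uncertainty: per-person cost components are drawn from
# assumed distributions and the programme total is summarised with a 95% interval.
# The components, ranges, and target population are hypothetical, not the study's inputs.
import numpy as np

rng = np.random.default_rng(42)
N_SIM = 10_000
people_targeted = 3_600_000                      # hypothetical MDA target population

# Hypothetical per-person cost components (US$), each with an assumed uncertainty range.
drug_cost = rng.uniform(0.05, 0.09, N_SIM)
delivery_cost = rng.uniform(0.08, 0.14, N_SIM)
training_admin = rng.uniform(0.03, 0.06, N_SIM)

cost_per_person = drug_cost + delivery_cost + training_admin
total_cost = cost_per_person * people_targeted

lo, mid, hi = np.percentile(total_cost, [2.5, 50, 97.5])
print(f"Total annual cost: ${mid:,.0f} (95% interval ${lo:,.0f} - ${hi:,.0f})")
print(f"Mean cost per person targeted: ${cost_per_person.mean():.2f}")
```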

8.
Arq. bras. cardiol ; 121(8): e20230672, Aug. 2024. tab, graf
Article in Portuguese | LILACS-Express | LILACS | ID: biblio-1568815

ABSTRACT



Abstract Background Refractory cardiogenic shock (CS) is associated with high mortality rates, and the use of venoarterial extracorporeal membrane oxygenation (VA-ECMO) as a therapeutic option has generated discussion. Its cost-effectiveness, especially in low- and middle-income countries like Brazil, remains uncertain. Objectives To conduct a cost-utility analysis from the Brazilian Unified Health System perspective to assess the cost-effectiveness of VA-ECMO combined with standard care compared with standard care alone in adult patients with refractory CS. Methods We followed a cohort of refractory CS patients treated with VA-ECMO in tertiary care centers in southern Brazil. We collected data on hospital outcomes and costs. We conducted a systematic review to supplement our data and used a Markov model to estimate incremental cost-effectiveness ratios (ICERs) per quality-adjusted life year (QALY) and per life-year gained. Results In the base-case analysis, VA-ECMO yielded an ICER of Int$ 37,491 per QALY. Sensitivity analyses identified hospitalization cost, relative risk of survival, and VA-ECMO group survival as key drivers of results. Probabilistic sensitivity analysis favored VA-ECMO, with a 78% probability of cost-effectiveness at the recommended willingness-to-pay threshold. Conclusions Our study suggests that, within the Brazilian Health System framework, VA-ECMO may be a cost-effective therapy for refractory CS. However, limited efficacy data and recent trials questioning its benefit in specific patient subsets highlight the need for further research. Rigorous clinical trials, encompassing diverse patient profiles, are essential to confirm cost-effectiveness and ensure equitable access to advanced medical interventions within healthcare systems, particularly in socio-economically diverse countries like Brazil.
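
A stylised sketch of the Markov-model cost-utility comparison described above follows; the survival fractions, transition probabilities, costs, and utility are illustrative assumptions, so the ICER it prints is not the study's estimate.

```python
# Stylised Markov cohort sketch of the cost-utility comparison (VA-ECMO plus standard
# care vs standard care alone). Survival, costs, and utility are illustrative
# assumptions, so the printed ICER is not the study's estimate.

CYCLES = 20        # annual cycles over a lifetime horizon (assumption)
DISCOUNT = 0.05    # annual discount rate (assumption)

def run_strategy(p_survive_to_discharge, annual_mortality, upfront_cost, annual_cost, utility):
    """Accumulate discounted costs and QALYs for the surviving fraction of the cohort."""
    alive = p_survive_to_discharge
    cost, qaly = float(upfront_cost), 0.0
    for t in range(CYCLES):
        disc = 1 / (1 + DISCOUNT) ** t
        qaly += alive * utility * disc
        cost += alive * annual_cost * disc
        alive *= 1 - annual_mortality
    return cost, qaly

# Hypothetical inputs for the two strategies.
cost_ecmo, qaly_ecmo = run_strategy(0.45, 0.08, 60_000, 2_000, 0.75)
cost_std, qaly_std = run_strategy(0.25, 0.08, 20_000, 2_000, 0.75)

icer = (cost_ecmo - cost_std) / (qaly_ecmo - qaly_std)
print(f"Incremental cost: {cost_ecmo - cost_std:,.0f}; incremental QALYs: {qaly_ecmo - qaly_std:.2f}")
print(f"ICER: {icer:,.0f} per QALY")
```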

9.
BMC Infect Dis ; 24(1): 775, 2024 Aug 02.
Article in English | MEDLINE | ID: mdl-39095714

ABSTRACT

INTRODUCTION: HIV treatment currently consists of daily oral antiretroviral therapy (ART). Cabotegravir + rilpivirine long-acting (CAB + RPV LA) is the first ART available in Spain administered every 2 months by intramuscular injection given by a healthcare professional (HCP). The objective of this analysis was to assess the potential healthcare resource use (HRU) and cost impact of implementing CAB + RPV LA vs. daily oral ART at National Health System (NHS) hospitals. METHODS: Online quantitative interviews and a cost analysis were performed. Infectious disease specialists (IDS), hospital pharmacists (HP), and nurses were asked about their perception of potential differences in HRU between CAB + RPV LA and daily oral ART, among other concepts of interest. Spanish official tariffs were applied as unit costs to the HRU estimates (€2022). RESULTS: 120 respondents (n = 40 IDS, n = 40 HP, n = 40 nurses) estimated average numbers of annual visits per patient by specialty (IDS, HP, and nurse, respectively) of 3.3 vs. 3.7, 4.4 vs. 6.2, and 6.1 vs. 3.9 for CAB + RPV LA vs. daily oral ART, and of 3.0 vs. 3.2, 4.8 vs. 5.8, and 6.9 vs. 4.9, respectively, when adjusting by the corresponding specialists' responses. Estimation based on the total sample led to an annual total cost per patient of €2,076 vs. €2,473, which became €2,032 vs. €2,237 after adjusting by the corresponding HCP, for CAB + RPV LA vs. daily oral ART. CONCLUSIONS: These results suggest that the implementation of CAB + RPV LA in NHS hospitals would not incur increased HRU-related costs compared with current daily oral ART, and could be cost-neutral or even cost-saving.


Subject(s)
Anti-HIV Agents , HIV Infections , Pyridones , Rilpivirine , Humans , HIV Infections/drug therapy , HIV Infections/economics , Rilpivirine/therapeutic use , Rilpivirine/economics , Rilpivirine/administration & dosage , Spain , Anti-HIV Agents/therapeutic use , Anti-HIV Agents/economics , Anti-HIV Agents/administration & dosage , Pyridones/economics , Pyridones/therapeutic use , Pyridones/administration & dosage , Administration, Oral , Injections, Intramuscular , Health Care Costs/statistics & numerical data , Health Resources/economics , Health Resources/statistics & numerical data , Diketopiperazines
10.
Global Spine J ; : 21925682241274373, 2024 Aug 08.
Article in English | MEDLINE | ID: mdl-39116341

ABSTRACT

STUDY DESIGN: Retrospective Cohort Study. OBJECTIVE: The aim of this study was to compare the efficacy of CT-based computer-assisted navigation (CAN) with conventional pedicle screw placement for patients with Adolescent Idiopathic Scoliosis (AIS). METHODS: This retrospective cohort study drew data from the National Readmissions Database, years 2016-2019. Patients undergoing posterior fusion for AIS, either via CAN or fluoroscopic-guided procedures, were identified via ICD-10 codes. Multivariate regression was performed to compare outcomes between operative techniques. Negative binomial regression was used to assess discharge disposition, while Gamma regression was performed to assess length of stay (LOS) and total charges. Patient demographics and comorbidities, measured via the Elixhauser comorbidity index, were controlled for in the regression analysis. RESULTS: 28,868 patients, of whom 2,095 (7.3%) underwent a CAN procedure, were included in our analysis. Patients undergoing CAN procedures had increased surgical complications (odds ratio (OR) 2.23; P < 0.001), namely blood transfusions (OR 2.47; P < 0.001). Discharge disposition and LOS were similar, as were reoperation and readmission rates; however, total charges were significantly greater in the CAN group (OR 1.37; P < 0.001). Mean charges were 191,489.42 (119,302.30) USD for conventional surgery vs. 268,589.86 (105,636.78) USD for the CAN cohort. CONCLUSION: CAN in posterior fusion for AIS does not appear to decrease postoperative complications and is associated with an increased need for blood transfusions. Given the much higher total cost of care that was also seen with CAN, this study calls into question whether the use of CAN is justified in this setting.
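
The charge model described in the methods can be sketched as a Gamma GLM with a log link, as below; the simulated cohort and coefficients are illustrative assumptions and are not drawn from the National Readmissions Database.

```python
# Gamma GLM with a log link for right-skewed hospital charges, the type of model the
# abstract describes for total charges. The simulated cohort and coefficients are
# illustrative assumptions, not data from the National Readmissions Database.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2_000
nav = rng.binomial(1, 0.073, n)        # 1 = CT-based navigation, 0 = fluoroscopy-guided
elixhauser = rng.poisson(1.2, n)       # synthetic comorbidity burden
age = rng.normal(14, 2, n)

# Synthetic charges: log-linear in the covariates with Gamma-distributed noise.
mu = np.exp(12.0 + 0.3 * nav + 0.08 * elixhauser + 0.01 * age)
charges = rng.gamma(shape=4.0, scale=mu / 4.0)

X = sm.add_constant(np.column_stack([nav, elixhauser, age]))
fit = sm.GLM(charges, X, family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(f"Adjusted charge ratio, navigation vs fluoroscopy: {np.exp(fit.params[1]):.2f}")
```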

11.
Eur J Obstet Gynecol Reprod Biol ; 301: 105-113, 2024 Jul 27.
Article in English | MEDLINE | ID: mdl-39116478

ABSTRACT

BACKGROUND: As a minimally invasive technique, robot-assisted hysterectomy (RAH) offers surgical advantages and a significant reduction in morbidity compared with open surgery. Despite the increasing use of RAH in benign gynaecology, there are limited data on its cost-effectiveness, especially in a European context. Our goal was to assess the costs of the different hysterectomy approaches, to describe their clinical outcomes, and to evaluate the impact of the introduction of RAH on the rates of the different types of hysterectomy. METHODS: A retrospective single-centre cost analysis was performed for patients undergoing a hysterectomy for benign indications. Abdominal hysterectomy (AH), vaginal hysterectomy (VH), laparoscopic hysterectomy (LH), laparoscopically assisted vaginal hysterectomy (LAVH), and RAH were included. We considered the costs of the operating room and hospital stay for the different hysterectomy techniques using the "Activity Centre-Care program model". We report intra- and postoperative complications for the different approaches as well as their cost relationship. RESULTS: Between January 2014 and December 2021, 830 patients were operated on: 67 underwent VH (8%), 108 LAVH (13%), 351 LH (42%), 148 RAH (18%), and 156 AH (19%). After the implementation and learning curve of a dedicated RAH program in 2018, the AH rate declined from 27.3% in 2014-2017 to 22.1% in 2018 and 6.9% in 2019-2021. The reintervention rate was 3-4% for all surgical techniques. Pharmacological interventions and blood transfusions were performed after AH in 28% of cases, compared with 17-22% for the other approaches. AH had the highest hospital stay cost, with an average of €2236.40. The mean cost of the hospital stay ranged from €1136.77 to €1560.66 for the minimally invasive techniques. The average total cost for RAH was €6528.10, compared with €4400.95 for AH. CONCLUSION: The implementation of RAH resulted in a substantial decrease in the rate of open surgery. However, RAH remains the most expensive technique in our cohort, mainly due to high material and depreciation costs. Therefore, RAH should not be considered for every patient, but for those who would otherwise need more invasive surgery with a higher risk of complications. Future prospective studies should focus on societal costs and patient-reported outcomes in order to perform a cost-benefit analysis and further evaluate the exact value of RAH in the current healthcare setting.

12.
J Surg Oncol ; 2024 Aug 19.
Article in English | MEDLINE | ID: mdl-39155702

ABSTRACT

BACKGROUND AND OBJECTIVES: Surgical treatment of soft tissue sarcoma (STS) involves wide resection of the tumor, which can necessitate soft tissue reconstruction with local or free tissue flaps. This retrospective study compares cost, surgical and oncologic outcomes between patients undergoing reconstruction with immediate versus delayed flap coverage following STS resection. METHODS: Thirty-four patients who underwent planned flap reconstruction following resection of primary STS were identified retrospectively. Twenty-four (71%) received immediate reconstruction during the index surgery and 10 (29%) underwent planned delayed reconstruction. Preoperative patient-specific metrics, tumor characteristics, and surgical and patient outcomes were collected. Total hospital charges associated with every encounter during the perioperative period were obtained. RESULTS: Patient demographics, comorbidities, tumor metrics, and surgical characteristics were equivalent between groups. Postoperative wound complications, reoperations, readmissions, and disease-specific survival did not differ between cohorts. Costs associated with each reconstruction strategy were equivalent on bivariate and multivariate testing, when accounting for operating room time, hospital length of stay, and reoperation rate. CONCLUSIONS: Our study identifies no significant difference in patient outcome measures or cost between planned immediate and delayed flap reconstruction following STS resection. These results support the implementation of either treatment strategy in keeping with patient-centered, multidisciplinary care principles.

13.
Article in English | MEDLINE | ID: mdl-39095537

ABSTRACT

PURPOSE: The resection of lymph nodes/neck dissection is a typical part of the surgical treatment of head and neck malignancies. The aim of this study was to compare subcutaneous closure using single knotted, braided suture (Vicryl™, standard arm) with continuous self-locking, monofilament barbed suture (V-Loc™, experimental arm). METHODS: Neck Lock was a randomized clinical trial at a single tertiary referral center. It was conducted from 2016 to 2022 with a follow-up period of 3 months. Assessment of safety and aesthetic outcome was double-blinded. 68 patients were randomized after application of the exclusion criteria. Subcutaneous wound closure was performed in an intrapatient-randomized fashion with respect to suture technique. The primary endpoint was the duration of subcutaneous suturing. Wound healing and scar formation were recorded at multiple postoperative intervals as secondary endpoints. RESULTS: The median age was 61 years, and 89.7% of patients were male. 92.6% suffered from a squamous cell carcinoma. There was a significant difference in median subcutaneous suture time (p = 0.024) between the experimental (6:11 ± 2:30 min) and standard (7:01 ± 2:42 min) arms. There was no significant difference in safety when assessing adverse events (AEs): at least one AE occurred in 14.7% vs. 5.9% for barbed and smooth sutures, respectively (p = 0.16). CONCLUSION: For neck dissection in head and neck malignancies, subcutaneous wound closure with self-locking sutures offers significant time savings over the single-knot technique with similar safety and aesthetic results. TRIAL REGISTRATION INFORMATION: The trial was registered with the WHO-acknowledged primary registry "German Clinical Trials Register" under the ID DRKS00025831 ( https://drks.de/search/de/trial/DRKS00025831 ).

14.
Cost Eff Resour Alloc ; 22(1): 56, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-39090621

ABSTRACT

BACKGROUND: The Health Complex Model was implemented to provide primary health care services in urban areas, especially in slums. As a pilot at the provincial level, Chamran Health Complex offers healthcare to more than 57,000 residents of Tabriz. Despite the necessity of cost information for healthcare decision-making, there was limited knowledge about the unit cost of services. This study aims to analyze the cost and efficiency of its health centers. METHODS: An Activity-Based Costing method with direct and step-down allocation was adopted. We estimated unit costs in a hypothetical scenario according to national standards to quantify the gap between current and standard practice. Input-oriented Data Envelopment Analysis was applied to measure the efficiency of the health centers. RESULTS: The total cost of the complex was $2,841,897, of which 67% ($1,910,373) were direct costs and 33% ($931,523) were indirect costs. The vaccination center had the lowest average unit cost ($9), and the occupational health center had the highest ($76). The average technical efficiency of the health centers was 0.519, with HC1 and HC3 showing the best performance. CONCLUSION: There is remarkable variability in service costs across health centers, which must be addressed in performance management and contracting practices. Although we found a gap between current and standard practice in terms of staff and facilities according to national standards, Chamran Health Complex has untapped capacity that could be utilized with better planning and without incurring additional costs. This raises the need for the Iranian Ministry of Health to revise the national standards.
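
A minimal input-oriented CCR DEA sketch, the efficiency method named above, is shown below using synthetic health-centre inputs and outputs; the data are hypothetical, so the resulting scores do not correspond to the study's centres.

```python
# Minimal input-oriented CCR DEA sketch (the efficiency method named in the abstract)
# on synthetic health-centre data; inputs, outputs, and the resulting scores are
# hypothetical and do not correspond to the study's centres.
import numpy as np
from scipy.optimize import linprog

# Rows = health centres (DMUs); inputs = staff count, annual budget ($1,000s); output = visits.
inputs = np.array([[12, 180], [9, 150], [15, 240], [11, 160], [14, 210]], float)
outputs = np.array([[5200], [5600], [6100], [4300], [5900]], float)

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR efficiency of DMU o: minimise theta such that a weighted mix
    of peers uses at most theta times DMU o's inputs while producing at least its outputs."""
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta; lambdas have zero cost
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])      # sum_j lambda_j x_ij - theta x_io <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])       # -sum_j lambda_j y_rj <= -y_ro
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(len(inputs)):
    print(f"Centre {o + 1}: technical efficiency = {ccr_efficiency(o, inputs, outputs):.3f}")
```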

15.
Braz J Cardiovasc Surg ; 39(5): e20240205, 2024 Aug 02.
Article in English | MEDLINE | ID: mdl-39094093

ABSTRACT

INTRODUCTION: Blood transfusion is one of the most common medical practices worldwide. However, the current scientific literature has shown that the immunomodulatory effects of blood transfusion are associated with an increased likelihood of infection, prolonged hospitalization, and morbidity and mortality. It also imposes high costs on healthcare systems. METHODS: In this context, and acknowledging that blood transfusions are essentially heterologous cell transplantations, a set of therapeutic alternatives collectively known as the patient blood management (PBM) program has gained strength. PBM is an approach based on three main pillars: (1) treating anemias and coagulopathies in an optimized manner, especially in the preoperative period; (2) optimizing perioperative hemostasis and using blood recovery systems to avoid the loss of the patient's blood; and (3) anemia tolerance, with improved oxygen delivery and reduced oxygen demand, particularly in the postoperative period. RESULTS: Current scientific evidence supports the effectiveness of PBM in reducing the need for blood transfusions, decreasing associated complications, and promoting more efficient and safer blood management. Thus, PBM not only improves clinical outcomes for patients but also contributes to the economic sustainability of healthcare systems. CONCLUSION: The aim of this review was to summarize PBM strategies in a comprehensive, evidence-based approach through a systematic and structured model for PBM implementation in tertiary hospitals. The recommendations proposed herein come from researchers and experts at a high-complexity university hospital in the network of the Sistema Único de Saúde and can be followed as a guideline for PBM implementation in other settings.


Subject(s)
Anemia , Blood Transfusion , Humans , Blood Transfusion/standards , Anemia/therapy , Anemia/prevention & control , Blood Coagulation Disorders/therapy , Blood Coagulation Disorders/prevention & control
16.
Healthc (Amst) ; 12(4): 100750, 2024 Aug 13.
Article in English | MEDLINE | ID: mdl-39142233

ABSTRACT

BACKGROUND: Remdesivir is FDA-approved for the treatment of hospitalized patients with severe COVID-19. Many patients improve clinically enough to allow for hospital dismissal before completing the 5-day course. In prior work, patients who continued remdesivir in an outpatient setting experienced better 28-day clinical outcomes. Here, we assessed patients' perspectives and the economic impact of this outpatient practice. METHODS: Hospitalized patients who received remdesivir for COVID-19 at Mayo Clinic, Rochester, from 11/6/2020 to 11/5/2021 and were dismissed to continue remdesivir in the outpatient setting were surveyed. The cost of care was compared between those who remained hospitalized and those who were dismissed. RESULTS: Of 470 eligible patients, 93 (19.8%) responded to the electronic survey. Responders were older than non-responders. The majority (70.5%) had symptoms resolved by the time of the survey. Ten (11.4%) patients had persistent symptoms attributed to long COVID-19. The majority were satisfied with the quality of care (82.3%) and overall experience (76.0%) in the infusion clinic. After adjusting for gender, comorbidity score, and WHO severity scale, the predicted costs were $16,544 (inpatient) and $9,097 (outpatient) per patient (difference of $7,447; p < .01). An estimated 1,077 hospital bed-days were made available to other patients as a result of this transition to outpatient care. CONCLUSION: An outpatient remdesivir program that allowed for early dismissal was perceived favorably by patients. The program resulted in significant cost and resource savings, the latter in terms of the availability of hospital beds for other patients needing critical services.

17.
Materials (Basel) ; 17(15)2024 Aug 01.
Article in English | MEDLINE | ID: mdl-39124442

ABSTRACT

Engineered geopolymer composites (EGCs) exhibit excellent tensile ductility and crack-control ability, making them promising for concrete structure repair. However, their widespread use is limited by the high cost of reinforcement fibers and an insufficient understanding of the EGC-concrete interface bonding mechanism. This study investigated a hybrid PE/PVA fiber-reinforced EGC using domestically produced unoiled PVA fibers to replace commonly used PE fibers. The bond performance of the EGC-concrete interface was evaluated through direct tensile and slant shear tests, focusing on the effects of PE fiber content (1%, 2%, and 3%), fiber hybrid ratios (2.0:0.0, 1.5:0.5, 1.0:1.0, 0.5:1.5, and 0.0:2.0), concrete substrate strength (C30, C50, and C70), and the ratio of fly ash (FA) to ground granulated blast furnace slag (GGBS) (6:4, 7:3, and 8:2) on interface bond strength. Results showed that the EGCs' compressive strength ranged from 77.1 to 108.9 MPa, with increased GGBS content significantly enhancing the compressive strength and elastic modulus. Most of the specimens exhibited strain-hardening behavior after initial cracking. Interface bonding tests revealed that a PE/PVA ratio of 1.0 increased tensile bond strength by 8.5% compared with using 2.0% PE fiber alone. Increasing the PE fiber content, PVA/PE ratio, GGBS content, and concrete substrate strength all improved the shear bond strength. This improvement was attributed to the flexible fibers' ability to restrict thermo-hydro damage and to deflect and blunt microcracks, enhancing the interface's failure resistance. Cost analysis showed that replacing 50% of the PE fiber in the EGC with unoiled PVA fiber reduced costs by 44.2% compared with using PE fiber alone, offering the best cost-performance ratio. In summary, hybrid PE/PVA fiber EGC has promising prospects for improving economic efficiency while maintaining tensile ductility and crack-control ability. Future optimization of fiber ratios and interface design could further enhance its potential for concrete repair applications.

18.
Ophthalmol Ther ; 2024 Aug 10.
Article in English | MEDLINE | ID: mdl-39126559

ABSTRACT

INTRODUCTION: This study evaluated the cost-effectiveness of anti-vascular endothelial growth factor (VEGF) therapies for subtypes of neovascular age-related macular degeneration (nAMD) from the societal perspective, and for any nAMD from the patient perspective, in Japan. METHODS: A Markov model was developed to simulate the lifetime transitions of a cohort of patients with nAMD through various health states based on the involvement of nAMD, the treatment status, and decimal best-corrected visual acuity. Ranibizumab biosimilar was compared with aflibercept from the societal perspective, regardless of treatment regimen, for the analysis of three subtypes (typical nAMD, polypoidal choroidal vasculopathy (PCV), and retinal angiomatous proliferation (RAP)). Two analyses from the patient perspective focusing on treat-and-extend regimens were performed, one with a cap on patients' copayments and one without. Ranibizumab biosimilar was compared with branded ranibizumab, aflibercept, aflibercept as the loading dose switching to ranibizumab biosimilar during maintenance (aflibercept switching to ranibizumab biosimilar), and best supportive care (BSC) for patients with any nAMD. RESULTS: In the subtype analyses, ranibizumab biosimilar compared with aflibercept resulted in incremental quality-adjusted life years (QALYs) of -0.015, 0.026, and 0.009, and incremental costs of Japanese yen (JPY) -50,447, JPY -997,243, and JPY -1,286,570 for typical nAMD, PCV, and RAP, respectively. From the patient perspective, ranibizumab biosimilar had incremental QALYs of 0.015, 0.009, and 0.307 compared with aflibercept, aflibercept switching to ranibizumab biosimilar, and BSC, respectively. The incremental costs for ranibizumab biosimilar over a patient lifetime, excluding the cap on copayments, were estimated to be JPY -138,948, JPY -391,935, JPY -209,099, and JPY -6,377,345 compared with branded ranibizumab, aflibercept, aflibercept switching to ranibizumab biosimilar, and BSC, respectively. CONCLUSIONS: Ranibizumab biosimilar was shown to be a cost-saving option compared with aflibercept across all subtypes of nAMD, irrespective of the perspective considered.
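
The small helper below illustrates how incremental results such as those above are read on the cost-effectiveness plane; the JPY 5,000,000-per-QALY willingness-to-pay threshold is a commonly cited Japanese reference value used here as an assumption, not a figure from the study.

```python
# Helper for reading incremental results on the cost-effectiveness plane: a negative
# incremental cost with non-negative incremental QALYs is dominant (cost-saving);
# otherwise the ICER is compared with a willingness-to-pay threshold. The JPY
# 5,000,000/QALY threshold is an assumed reference value, not a figure from the study.

def interpret(delta_cost, delta_qaly, wtp=5_000_000):
    if delta_cost <= 0 and delta_qaly >= 0:
        return "dominant (cheaper and at least as effective)"
    if delta_cost > 0 and delta_qaly <= 0:
        return "dominated (costlier and no more effective)"
    icer = delta_cost / delta_qaly
    if delta_cost <= 0 and delta_qaly < 0:
        # South-west quadrant: savings per QALY forgone must exceed the threshold.
        verdict = "acceptable" if icer > wtp else "not acceptable"
        return f"cost-saving with QALY loss; savings per QALY forgone {icer:,.0f} ({verdict})"
    verdict = "cost-effective" if icer <= wtp else "not cost-effective"
    return f"ICER {icer:,.0f} per QALY ({verdict})"

# Example from the abstract: PCV subtype, ranibizumab biosimilar vs aflibercept.
print(interpret(-997_243, 0.026))   # -> dominant (cheaper and at least as effective)
```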

19.
Diagnostics (Basel) ; 14(15)2024 Aug 01.
Article in English | MEDLINE | ID: mdl-39125538

ABSTRACT

BACKGROUND: This study investigated and compared the efficacy, safety, radiation exposure, and financial compensation of two modalities for percutaneous radiologic gastrostomy (PRG): multislice computed tomography biopsy mode (MS-CT BM)-guided and fluoroscopy-guided (FPRG). The aim was to provide insights into optimizing radiologically assisted gastrostomy procedures. METHODS: We conducted a retrospective analysis of PRG procedures performed at a single center from January 2018 to January 2024. The procedures were divided into two groups based on the imaging modality used. We compared patient demographics, intervention parameters, complication rates, and procedural times. Financial compensation was evaluated based on the tariff structure for outpatient medical services in Switzerland (TARMED). Statistical differences were determined using Fisher's exact test and the Mann-Whitney U test. RESULTS: The study cohort included 133 patients: 55 with MS-CT BM-PRG and 78 with FPRG. The cohort comprised 35 women and 98 men, with a mean age of 64.59 years (±11.91). Significant differences were observed between the modalities in effective dose (MS-CT BM-PRG: 10.95 mSv ± 11.43 vs. FPRG: 0.169 mSv ± 0.21, p < 0.001) and procedural time (MS-CT BM-PRG: 41.15 min ± 16.14 vs. FPRG: 28.71 min ± 16.03, p < 0.001). Major complications were significantly more frequent with FPRG (10% vs. 0% for MS-CT BM-PRG, p = 0.039, φ = 0.214). A high single-digit number of MS-CT BM-guided PRG procedures was initially required to reduce the procedure duration by 10 min. The financial comparison revealed that only 4% of MS-CT BM-guided PRGs achieved reimbursement equivalent to the most frequent comparable examination according to TARMED. CONCLUSIONS: Based on our experience in this retrospective, single-center study, performing PRG with MS-CT BM guidance rather than FPRG is currently justified only in challenging cases, despite its lower incidence of major complications. Further well-designed prospective multicenter studies are needed to determine the efficacy, safety, and cost-effectiveness of these two modalities.

20.
Prev Vet Med ; 230: 106284, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39089162

ABSTRACT

BACKGROUND: Because foreign animal diseases (FADs) are low-probability events, United States producers, value chain actors, and veterinary services (VS) have limited experience with identifying them, which can allow FADs to spread undetected. Point-of-care (POC) diagnostic testing may help reduce the time from detecting an initial suspect case to implementing actionable interventions compared with the current approach of relying only on laboratory diagnostic testing for disease diagnosis and confirmation. To evaluate the value of the reduced response time, we compare the associated costs of the two diagnostic approaches while accounting for the uncertainty surrounding the size of an FAD event. METHODS: We apply a state-contingent approach (SCA) to model the uncertainty surrounding an FAD through alternative events, where each event defines the scale of the outbreak and its duration. We apply this approach within a cost-benefit analysis (CBA) framework to determine the economic value of the two testing investment strategies, to help explain the policymaker's response (and costs) to alternative FAD events while also considering the cost impacts on producers from each event. RESULTS: Compared with the current laboratory strategy, a POC strategy that reduces response time by 0.5 days (swine, cattle scenarios) or 1.5 days (poultry scenario) may provide cost savings to both producers and public response efforts. The benefit-cost analysis further suggests that, despite the higher fixed costs of adopting the POC strategy, the swine and cattle sectors may benefit, while the benefits may not be as pronounced in the poultry sector. DISCUSSION: POC testing that can reduce the time between detection and response during an FAD event may be a sound strategy for public expenditure and provide cost savings for producers, especially when minimal fixed costs are incurred. However, to fully determine the value of POC testing, the consequences (costs) associated with potential actions if something goes wrong (e.g., false positive results) should be considered in future studies.


Subject(s)
Cost-Benefit Analysis , Point-of-Care Testing , Animals , United States , Cattle , Point-of-Care Testing/economics , Swine , Swine Diseases/diagnosis , Swine Diseases/economics , Communicable Diseases, Imported/veterinary , Communicable Diseases, Imported/diagnosis , Communicable Diseases, Imported/prevention & control , Communicable Diseases, Imported/economics , Cattle Diseases/diagnosis , Cattle Diseases/economics , Poultry Diseases/diagnosis , Poultry Diseases/economics , Point-of-Care Systems/economics , Poultry , Disease Outbreaks/veterinary , Disease Outbreaks/prevention & control , Disease Outbreaks/economics , Time Factors