1.
Work ; 2024 Jul 17.
Article in English | MEDLINE | ID: mdl-39031425

ABSTRACT

BACKGROUND: Due to the negative effects of occupational fatigue on health, absenteeism, and economic cost, it is essential to control and manage its risk factors effectively. OBJECTIVE: This study seeks to draw researchers' attention to the research requirements concerning occupational fatigue. METHODS: The study briefly explores the consequences of occupational fatigue and discusses tools for its assessment. It then addresses the challenge of integrating risk factors and identifying efficient interventions. Lastly, it emphasizes the importance of addressing occupational fatigue related to new technologies. RESULTS: Wearable sensors, biomarkers in biological samples, and image processing are valuable tools for accurately assessing occupational fatigue. Artificial intelligence (AI) models can integrate multiple risk factors, while economic evaluations can help assess the effectiveness of control measures. Employers and researchers should be prepared to manage and monitor occupational fatigue resulting from interactions with new technologies. CONCLUSIONS: This commentary highlights the research gaps in the field of occupational fatigue so that this phenomenon can be better managed in today's evolving world.

2.
Sci Total Environ ; 948: 174873, 2024 Jul 20.
Article in English | MEDLINE | ID: mdl-39038673

ABSTRACT

Carbon Capture and Utilisation (CCU) technologies play a significant role in climate change mitigation, as these platforms aim to capture and convert CO2 that would otherwise be emitted into the atmosphere. Effective and economically sustainable technologies are crucial to support the transition to renewable and low-carbon energy sources by 2030 and beyond. Currently, studies exploring the financial viability of CCU technologies alongside joint analyses of life-cycle costs and environmental and social impacts are still limited. In this context, the study developed and validated an innovative and integrated methodology, called Life Cycle Cost and Sustainability Assessment (LCC-SA), which allows the joint assessment of (i) project life-cycle costs and (ii) socio-cultural and environmental externalities. This tool was validated through application to an algal photobioreactor (PBR) plant and allowed assessment of economic and environmental sustainability as well as identification of the main critical issues to be addressed during the transition from a pilot-scale plant to industrial application. The methodology's implementation estimated benefits in two main areas: (i) environmental, including CO2 removal and avoidance through biodiesel production instead of fossil-derived diesel; (ii) socio-cultural, encompassing new patents, knowledge spillovers, human capital formation, and knowledge outputs. The main result of the analysis is that the present value of the social externalities amounts to around EUR 550,000 and the present value of the costs to approximately EUR 60,000. The Economic Net Present Value (ENPV) is EUR 487,394, which shows the significance of the extra-financial effects generated by the research project. At full-scale application, environmental benefits include capturing 187 to 1867 tons of CO2 per year and avoiding 1.7 to 16.7 tons of CO2 annually through biodiesel production instead of fossil-derived diesel.
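The ENPV reported here is, in essence, the discounted stream of social benefits minus the discounted stream of costs. A minimal sketch of that calculation, using hypothetical cash flows and an assumed discount rate (none of these inputs are taken from the study):

```python
def npv(flows, rate):
    """Present value of a list of annual cash flows (year 0 undiscounted)."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical 5-year schedules (EUR), for illustration only:
benefits = [110_000] * 5   # annual social and environmental externalities
costs = [12_000] * 5       # annual project life-cycle costs
rate = 0.03                # assumed social discount rate

enpv = npv(benefits, rate) - npv(costs, rate)  # positive => worthwhile
```

A positive ENPV, as in the study, indicates that the extra-financial benefits outweigh the life-cycle costs.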

3.
Front Oncol ; 14: 1340081, 2024.
Article in English | MEDLINE | ID: mdl-39040451

ABSTRACT

Introduction: Advancements in rectal cancer (RC) treatment have not only saved more lives but also improved quality of life (QoL). Notwithstanding these benefits, RC treatment comes at the price of gastrointestinal morbidity in many patients. Health economic modelling offers an opportunity to explore the societal burden of such side effects. This study aims to quantify radiation-induced late small bowel (SB) toxicity in survivors of RC for Three-Dimensional Conformal Radiation Therapy (3D-CRT), Intensity Modulated Radiation Therapy (IMRT) and Intensity Modulated Radiation Therapy with Image Guided Radiation Therapy (IMRT/IGRT). Materials and methods: A model-based health economic evaluation was performed. The theoretical cohort consists of a case-mix of survivors of RC aged 25-99 years according to Belgian age-specific incidence rates. A societal perspective was adopted. The base case analysis was complemented with one-way deterministic analyses, deterministic scenario analyses and probabilistic sensitivity analysis (1,000 iterations). Results were presented as mean lifetime incremental cost (€) and utility (QALYs) per patient. Results: The analyses showed that innovative radiotherapy (RT) improves lifetime QoL in survivors of RC by 0.11 QALYs (IMRT/IGRT) and 0.05 QALYs (IMRT) relative to 3D-CRT. The use of IMRT/IGRT and IMRT results in an incremental cost-saving of €3,820 and €1,863 per patient, respectively, attributable solely to radiation-induced SB toxicity, compared to 3D-CRT. Discussion and conclusion: It is important to consider late toxicity effects in decisions regarding investments and reimbursement, as our analysis highlighted the potential long-term cost-savings and improved QoL of novel RT techniques in patients with rectal cancer.
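In cost-utility terms, a strategy that both saves money and gains QALYs relative to its comparator is said to dominate it. A small illustrative check, using the incremental figures reported in the abstract:

```python
def dominates(d_cost, d_qaly):
    """True when a strategy is dominant: incremental cost <= 0 (cost-saving)
    and incremental QALYs >= 0 (at least as effective)."""
    return d_cost <= 0 and d_qaly >= 0

# Incremental results versus 3D-CRT from the abstract:
imrt_igrt = dominates(-3820, 0.11)  # cost-saving and QALY-gaining
imrt = dominates(-1863, 0.05)
```

Both strategies come out dominant over 3D-CRT on these figures, which is why no cost-effectiveness threshold needs to be invoked.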

4.
Int J Gynecol Cancer ; 2024 Jul 23.
Article in English | MEDLINE | ID: mdl-39043573

ABSTRACT

Observational and cohort studies using large databases have made important contributions to gynecologic oncology. Knowledge of the advantages and potential limitations of commonly used databases benefits both readers and reviewers. In this review, researchers familiar with the National Cancer Database (NCDB), Surveillance, Epidemiology, and End Results Program (SEER), SEER-Medicare, MarketScan, Healthcare Cost and Utilization Project (HCUP), National Surgical Quality Improvement Program (NSQIP), and Premier describe each database, its included data, access, management, storage, highlights, and limitations. A better understanding of these commonly used datasets can help readers, reviewers, and researchers more effectively interpret and apply study results, evaluate new research studies, and develop compelling and practice-changing research.

5.
Ann Surg Oncol ; 2024 Jul 10.
Article in English | MEDLINE | ID: mdl-38987370

ABSTRACT

INTRODUCTION: Extreme oncoplastic breast-conserving surgery (eOBCS) describes the application of OBCS to patients who would otherwise need a mastectomy, and its safety has been previously described. OBJECTIVE: We aimed to compare the costs of eOBCS and mastectomy. METHODS: We reviewed our institutional database to identify breast cancer patients treated surgically from 2018 to 2023. We included patients with a large disease span (≥5 cm) and multifocal/multicentric disease. Patients were grouped by their surgical approach, i.e. eOBCS or mastectomy. The direct costs of care were determined and compared; however, indirect costs were not included. RESULTS: Eighty-six patients met the inclusion criteria: 10 (11.6%) underwent mastectomy and 76 (88.4%) underwent eOBCS. Six mastectomy patients (60%) had reconstruction and 6 (60%) underwent external beam radiation therapy (EBRT). Reconstructions were completed in a staged fashion, and the mean cost of the index operation (mastectomy and tissue expander) was $17,816. These patients had one to three subsequent surgeries to complete their reconstruction, at a mean cost of $45,904. The mean cost of EBRT was $5542. Thirty-four eOBCS patients (44.7%) underwent 44 margin re-excisions, including 6 (7.9%) who underwent mastectomy. Sixty (78.9%) of the eOBCS patients had EBRT. The mean cost of their index operation was $6345; the mean cost of a re-excision was $3615; the mean cost of their mastectomies with reconstruction was $49,400; and the mean cost of EBRT was $6807. The cost of care for eOBCS patients remained lower than that for mastectomy patients, i.e. $17,318 versus $57,416. CONCLUSION: eOBCS is associated with a lower cost than mastectomy and had a low conversion rate to mastectomy.
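The pathway costs above combine an index operation with rate-weighted downstream events. A rough reconstruction of the eOBCS arm from the abstract's figures (this simplified sketch will not exactly reproduce the reported $17,318, which was derived from patient-level data):

```python
def expected_pathway_cost(index_cost, events):
    """Expected per-patient cost: index operation plus rate-weighted
    downstream events given as (events_per_patient, unit_cost) pairs."""
    return index_cost + sum(rate * cost for rate, cost in events)

# eOBCS arm, using the abstract's per-event figures:
eobcs = expected_pathway_cost(6345, [
    (44 / 76, 3615),    # margin re-excisions
    (6 / 76, 49400),    # conversion to mastectomy with reconstruction
    (60 / 76, 6807),    # external beam radiation therapy
])
```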

6.
Health Technol Assess ; 28(35): 1-169, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39056437

ABSTRACT

Background: Estimation of glomerular filtration rate using equations based on creatinine is widely used to manage chronic kidney disease. In the UK, the Chronic Kidney Disease Epidemiology Collaboration creatinine equation is recommended. Other published equations using cystatin C, an alternative marker of kidney function, have not gained widespread clinical acceptance. Given the higher cost of cystatin C, its clinical utility should be validated before widespread introduction into the NHS. Objectives: Primary objectives were to: (1) compare accuracy of glomerular filtration rate equations at baseline and longitudinally in people with stage 3 chronic kidney disease, and test whether accuracy is affected by ethnicity, diabetes, albuminuria and other characteristics; (2) establish the reference change value for significant glomerular filtration rate changes; (3) model disease progression; and (4) explore comparative cost-effectiveness of kidney disease monitoring strategies. Design: A longitudinal, prospective study was designed to: (1) assess accuracy of glomerular filtration rate equations at baseline (n = 1167) and their ability to detect change over 3 years (n = 875); (2) model disease progression predictors in 278 individuals who received additional measurements; (3) quantify glomerular filtration rate variability components (n = 20); and (4) develop a measurement model analysis to compare different monitoring strategy costs (n = 875). Setting: Primary, secondary and tertiary care. Participants: Adults (≥ 18 years) with stage 3 chronic kidney disease. Interventions: Estimated glomerular filtration rate using the Chronic Kidney Disease Epidemiology Collaboration and Modification of Diet in Renal Disease equations.
Main outcome measures: Measured glomerular filtration rate was the reference against which estimating equations were compared with accuracy being expressed as P30 (percentage of values within 30% of reference) and progression (variously defined) studied as sensitivity/specificity. A regression model of disease progression was developed and differences for risk factors estimated. Biological variation components were measured and the reference change value calculated. Comparative costs of monitoring with different estimating equations modelled over 10 years were calculated. Results: Accuracy (P30) of all equations was ≥ 89.5%: the combined creatinine-cystatin equation (94.9%) was superior (p < 0.001) to other equations. Within each equation, no differences in P30 were seen across categories of age, gender, diabetes, albuminuria, body mass index, kidney function level and ethnicity. All equations showed poor (< 63%) sensitivity for detecting patients showing kidney function decline crossing clinically significant thresholds (e.g. a 25% decline in function). Consequently, the additional cost of monitoring kidney function annually using a cystatin C-based equation could not be justified (incremental cost per patient over 10 years = £43.32). Modelling data showed association between higher albuminuria and faster decline in measured and creatinine-estimated glomerular filtration rate. Reference change values for measured glomerular filtration rate (%, positive/negative) were 21.5/-17.7, with lower reference change values for estimated glomerular filtration rate. Limitations: Recruitment of people from South Asian and African-Caribbean backgrounds was below the study target. Future work: Prospective studies of the value of cystatin C as a risk marker in chronic kidney disease should be undertaken. Conclusions: Inclusion of cystatin C in glomerular filtration rate-estimating equations marginally improved accuracy but not detection of disease progression. 
Our data do not support cystatin C use for monitoring of glomerular filtration rate in stage 3 chronic kidney disease. Trial registration: This trial is registered as ISRCTN42955626. Funding: This award was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme (NIHR award ref: 11/103/01) and is published in full in Health Technology Assessment; Vol. 28, No. 35. See the NIHR Funding and Awards website for further award information.
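P30, the accuracy measure used throughout the study, is simply the share of estimated GFR values falling within 30% of the measured reference. A minimal sketch:

```python
def p30(estimated, measured):
    """Percentage of GFR estimates within 30% of the measured reference."""
    within = sum(abs(e - m) <= 0.3 * m for e, m in zip(estimated, measured))
    return 100 * within / len(measured)

# Toy data: three of four estimates fall within 30% of the reference.
accuracy = p30([50, 62, 80, 90], [55, 60, 70, 130])  # 75.0
```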


Chronic kidney disease, which affects approximately 14% of the adult population, often has no symptoms but, in some people, may later develop into kidney failure. Kidney disease is most often detected using a blood test called creatinine. Creatinine does not identify everyone with kidney disease, or those most likely to develop more serious kidney disease. An alternative blood test called cystatin C may be more accurate, but it is more expensive than the creatinine test. We compared the accuracy of these two tests in more than 1000 people with moderate kidney disease. Participants were tested over 3 years to see if the tests differed in their ability to detect worsening kidney function. We also wanted to identify risk factors associated with loss of kidney function, and how much the tests normally vary, to better understand what results mean. We compared the accuracy and costs of monitoring people with the two markers. Cystatin C was found to be slightly more accurate than the creatinine test at estimating kidney function when comparing the baseline single measurements (95% accurate compared to 90%), but not at detecting worsening function over time. This means that the additional cost of monitoring people over time with cystatin C to detect kidney disease progression could not be justified. Kidney test results could vary by up to 20% between tests without necessarily implying changes in underlying kidney function; this is the normal level of individual variation. Cystatin C marginally improved the accuracy of kidney function testing but not the ability to detect worsening kidney function. Cystatin C improves identification of moderate chronic kidney disease, but our results do not support its use for routine monitoring of kidney function in such patients.


Subject(s)
Creatinine , Cystatin C , Disease Progression , Glomerular Filtration Rate , Renal Insufficiency, Chronic , Humans , Cystatin C/blood , Creatinine/blood , Male , Female , Renal Insufficiency, Chronic/physiopathology , Middle Aged , Aged , Prospective Studies , Longitudinal Studies , Biomarkers , Cost-Benefit Analysis , Adult , United Kingdom , Albuminuria
7.
Health Econ Rev ; 14(1): 47, 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38958775

ABSTRACT

BACKGROUND: Significant gaps in scholarship on the cost-benefit analysis of haemodialysis exist in low-middle-income countries, including Nigeria. The study, therefore, assessed the cost-benefit of haemodialysis compared with comprehensive conservative care (CCC) to determine if haemodialysis is socially worthwhile and justifies public funding in Nigeria. METHODS: The study setting is Abuja, Nigeria. The study used a mixed-method design involving primary data collection and analysis of secondary data from previous studies. We adopted an ingredient-based costing approach. The mean costs and benefits of haemodialysis were derived from previous studies. The mean costs and benefits of CCC were obtained from a primary cross-sectional survey. We estimated the benefit-cost ratios (BCR) and net benefits to determine the social value of the two interventions. RESULTS: The net benefit of haemodialysis (2,251.30) was positive, while that of CCC was negative (-1,197.19). The benefit-cost ratio of haemodialysis was 1.09, while that of CCC was 0.66. The probabilistic and one-way sensitivity analyses results demonstrate that haemodialysis was more cost-beneficial than CCC, and the BCRs of haemodialysis remained above one in most scenarios, unlike CCC's BCR. CONCLUSION: The benefit of haemodialysis outweighs its cost, making it cost-beneficial to society and justifying public funding. However, the National Health Insurance Authority requires additional studies, such as budget impact analysis, to establish the affordability of full coverage of haemodialysis.
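The two headline metrics here have simple definitions: the benefit-cost ratio is total benefit divided by total cost, and the net benefit is their difference; a BCR above 1 (equivalently, a positive net benefit) marks an intervention as socially worthwhile. A sketch with hypothetical figures chosen only to mirror the reported magnitudes, not taken from the study:

```python
def bcr(benefit, cost):
    """Benefit-cost ratio: > 1 means the intervention is socially worthwhile."""
    return benefit / cost

def net_benefit(benefit, cost):
    return benefit - cost

# Hypothetical per-patient figures (illustrative only):
hd_benefit, hd_cost = 27_265, 25_014
ratio = bcr(hd_benefit, hd_cost)        # ~1.09
net = net_benefit(hd_benefit, hd_cost)  # 2251
```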

9.
J Environ Manage ; 365: 121562, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38944959

ABSTRACT

Awareness of the subsurface and its multitude of resources is generally low and decisions on access to subsurface resources are often guided by a 'first come, first served principle'. Although not yet fully developed, the concept of geosystem services has been put forward to make subsurface resources more visible and acknowledged in decision-making. This study (1) illustrates a systematic mapping of effects on geosystem services using a process-oriented perspective in two conceptual case studies; (2) translates the mapped effects into costs and benefits items in a qualitative cost-benefit analysis (CBA) context; and (3) presents a systematic review of economic valuation studies of geosystem services to investigate the available support for a quantitative CBA. The findings suggest that systematic mapping of effects on multiple geosystem services can inform different types of assessment methods and decision-makers on trade-offs and provide a basis for well-informed and responsible decisions on subsurface use. Combining such mapping with a CBA can further strengthen decision support through indications of the net effects on human well-being. However, although economic valuation of non-market geosystem services is possible using established valuation methods, such studies are scarce in scientific literature. Thus, although a CBA can provide a basis for supporting decisions on subsurface use from a consequentialist perspective, full quantification of all effects may require great efforts, and it needs to be complemented with other methods to capture the full range of values the subsurface can provide. This study also highlights that depending on the context, supporting and regulating geosystem services can be either intermediate or final services. 
Therefore, if geosystem services are to be included in the abiotic extension of CICES, in which supporting services are by definition excluded, reclassification of the supporting geosystem services should be considered so that they are not overlooked in economic valuation and CBA.


Subject(s)
Cost-Benefit Analysis , Decision Making , Conservation of Natural Resources/economics , Humans
10.
Med Decis Making ; 44(5): 512-528, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38828516

ABSTRACT

BACKGROUND: The COVID-19 pandemic underscored the criticality and complexity of decision making for novel treatment approval and further research. Our study aims to assess potential decision-making methodologies, an evaluation vital for refining future public health crisis responses. METHODS: We compared 4 decision-making approaches to drug approval and research: the Food and Drug Administration's policy decisions, cumulative meta-analysis, a prospective value-of-information (VOI) approach (using information available at the time of decision), and a reference standard (retrospective VOI analysis using information available in hindsight). Possible decisions were to reject, accept, provide emergency use authorization, or allow access to new therapies only in research settings. We used monoclonal antibodies provided to hospitalized COVID-19 patients as a case study, examining the evidence from September 2020 to December 2021 and focusing on each method's capacity to optimize health outcomes and resource allocation. RESULTS: Our findings indicate a notable discrepancy between policy decisions and the reference standard retrospective VOI approach with expected losses up to $269 billion USD, suggesting suboptimal resource use during the wait for emergency use authorization. Relying solely on cumulative meta-analysis for decision making results in the largest expected loss, while the policy approach showed a loss up to $16 billion and the prospective VOI approach presented the least loss (up to $2 billion). CONCLUSION: Our research suggests that incorporating VOI analysis may be particularly useful for research prioritization and treatment implementation decisions during pandemics. While the prospective VOI approach was favored in this case study, further studies should validate the ideal decision-making method across various contexts. 
This study's findings not only enhance our understanding of decision-making strategies during a health crisis but also provide a potential framework for future pandemic responses. HIGHLIGHTS: This study reviews discrepancies between a reference standard (retrospective VOI, using hindsight information) and 3 conceivable real-time approaches to research-treatment decisions during a pandemic, suggesting suboptimal use of resources. Of all prospective decision-making approaches considered, VOI closely mirrored the reference standard, yielding the least expected value loss across our study timeline. This study illustrates the possible benefit of VOI results and the need for evidence accumulation accompanied by modeling in health technology assessment for emerging therapies.
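The expected losses quoted above follow from a standard value-of-information idea: a decision's expected loss is the gap between the best available strategy's expected net benefit and the chosen strategy's. A minimal sketch; the strategy names follow the abstract, but the numbers are hypothetical:

```python
def expected_loss(chosen, expected_net_benefit):
    """Expected value forgone by choosing one strategy over the best one."""
    return max(expected_net_benefit.values()) - expected_net_benefit[chosen]

# Hypothetical expected net benefits (billion USD) for the four decisions:
enb = {"reject": 0.0, "accept": -16.0,
       "emergency_use": -2.0, "only_in_research": 1.5}
loss_accept = expected_loss("accept", enb)           # 17.5
loss_best = expected_loss("only_in_research", enb)   # 0.0
```

The best strategy by construction has zero expected loss, which is why the retrospective VOI analysis serves as the reference standard.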


Subject(s)
COVID-19 Drug Treatment , COVID-19 , Decision Making , Drug Approval , SARS-CoV-2 , Humans , Uncertainty , COVID-19/epidemiology , United States , Pandemics , United States Food and Drug Administration , Antibodies, Monoclonal/therapeutic use
11.
Health Technol Assess ; 28(28): 1-238, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38938145

ABSTRACT

Background: To limit the use of antimicrobials without disincentivising the development of novel antimicrobials, there is interest in establishing innovative models that fund antimicrobials based on an evaluation of their value as opposed to the volumes used. The aim of this project was to evaluate the population-level health benefit of cefiderocol in the NHS in England, for the treatment of severe aerobic Gram-negative bacterial infections when used within its licensed indications. The results were used to inform the National Institute for Health and Care Excellence guidance in support of commercial discussions regarding contract value between the manufacturer and NHS England. Methods: The health benefit of cefiderocol was first derived for a series of high-value clinical scenarios. These represented uses that were expected to have a significant impact on patients' mortality risks and health-related quality of life. The clinical effectiveness of cefiderocol relative to its comparators was estimated by synthesising evidence on susceptibility of the pathogens of interest to the antimicrobials in a network meta-analysis. Patient-level costs and health outcomes of cefiderocol under various usage scenarios compared with alternative management strategies were quantified using decision modelling. Results were reported as incremental net health effects expressed in quality-adjusted life-years, which were scaled to 20-year population values using infection number forecasts based on data from Public Health England. The outcomes estimated for the high-value clinical scenarios were extrapolated to other expected uses for cefiderocol. Results: Among Enterobacterales isolates with the metallo-beta-lactamase resistance mechanism, the base-case network meta-analysis found that cefiderocol was associated with a lower susceptibility relative to colistin (odds ratio 0.32, 95% credible intervals 0.04 to 2.47), but the result was not statistically significant. 
The other treatments were also associated with lower susceptibility than colistin, but the results were not statistically significant. In the metallo-beta-lactamase Pseudomonas aeruginosa base-case network meta-analysis, cefiderocol was associated with a lower susceptibility relative to colistin (odds ratio 0.44, 95% credible intervals 0.03 to 3.94), but the result was not statistically significant. The other treatments were associated with no susceptibility. In the base case, patient-level benefit of cefiderocol was between 0.02 and 0.15 quality-adjusted life-years, depending on the site of infection, the pathogen and the usage scenario. There was a high degree of uncertainty surrounding the benefits of cefiderocol across all subgroups. There was substantial uncertainty in the number of infections that are suitable for treatment with cefiderocol, so population-level results are presented for a range of scenarios for the current infection numbers, the expected increases in infections over time and rates of emergence of resistance. The population-level benefits varied substantially across the base-case scenarios, from 896 to 3559 quality-adjusted life-years over 20 years. Conclusion: This work has provided quantitative estimates of the value of cefiderocol within its areas of expected usage within the NHS. Limitations: Given existing evidence, the estimates of the value of cefiderocol are highly uncertain. Future work: Future evaluations of antimicrobials would benefit from improvements to NHS data linkages; research to support appropriate synthesis of susceptibility studies; and application of routine data and decision modelling to assess enablement value. Study registration: No registration of this study was undertaken. 
Funding: This award was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment Policy Research Programme (NIHR award ref: NIHR135591), conducted through the Policy Research Unit in Economic Methods of Evaluation in Health and Social Care Interventions, PR-PRU-1217-20401, and is published in full in Health Technology Assessment; Vol. 28, No. 28. See the NIHR Funding and Awards website for further award information.


This project tested new methods for estimating the value to the NHS of an antimicrobial, cefiderocol, so its manufacturer could be paid fairly even if very little drug is used in order to reduce the risk of bacteria becoming resistant to the product. Clinicians said that the greatest benefit of cefiderocol is when used for complicated urinary tract infections and pneumonia acquired within hospitals caused by two types of bacteria (called Enterobacterales and Pseudomonas aeruginosa) with a resistance mechanism called metallo-beta-lactamase. Because there were no relevant clinical trial data, we estimated how effective cefiderocol and alternative treatments were by doing a systematic literature review of studies that grew bacteria from infections in the laboratory and tested the drugs on them. We linked this to data estimating the long-term health and survival of patients. Some evidence was obtained by asking clinicians detailed questions about what they thought the effects would be, based on their experience and the available evidence. We included the side effects of the alternative treatments, some of which can cause kidney damage. We estimated how many infections there would be in the UK, whether they would increase over time and how resistance to treatments may change over time. Clinicians told us that they would also use cefiderocol to treat intra-abdominal and bloodstream infections, and some infections caused by another bacterium called Stenotrophomonas. We estimated how many of these infections there would be, and assumed the same health benefits as for other types of infections. The total value to the NHS was calculated using these estimates. We also considered whether we had missed any additional elements of value. We estimated that the value to the NHS was £18 to £71 million over 20 years.
This reflects the maximum the NHS could pay for use of cefiderocol if the health lost as a result of making these payments rather than funding other NHS services is not to exceed the health benefits of using this antimicrobial. However, these estimates are uncertain due to limitations with the evidence used to produce them and assumptions that had to be made.
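At its simplest, the population-level figure is the patient-level QALY gain scaled by forecast infection numbers over the 20-year horizon. A deliberately simplified sketch; the report's actual model also varies infection growth and resistance over time, and both inputs below are assumed, not taken from the report:

```python
def population_qalys(per_patient_qaly, infections_per_year, years=20):
    """Scale a per-patient QALY gain to a population total over a horizon."""
    return per_patient_qaly * infections_per_year * years

# Illustrative inputs: 0.09 QALYs per treated patient, 1,200 eligible
# infections per year.
total = population_qalys(0.09, 1200)  # ≈ 2160 QALYs over 20 years
```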


Subject(s)
Anti-Bacterial Agents , Cefiderocol , Cephalosporins , Cost-Benefit Analysis , Gram-Negative Bacterial Infections , Quality-Adjusted Life Years , Technology Assessment, Biomedical , Humans , Cephalosporins/therapeutic use , Anti-Bacterial Agents/therapeutic use , Anti-Bacterial Agents/economics , England , Gram-Negative Bacterial Infections/drug therapy , State Medicine , Quality of Life
12.
BMJ Open Qual ; 13(2)2024 Jun 05.
Article in English | MEDLINE | ID: mdl-38839395

ABSTRACT

OBJECTIVES: In many countries, the healthcare sector is dealing with important challenges such as increased demand for healthcare services, capacity problems in hospitals and rising healthcare costs. Therefore, one of the aims of the Dutch government is to move care from in-hospital to out-of-hospital care settings. An example of an innovation where care is moved from a more specialised setting to a less specialised setting is the performance of antenatal cardiotocography (aCTG) in primary midwife-led care. The aim of this study was to assess the budget impact of implementing aCTG for healthy pregnant women in midwife-led care compared with usual obstetrician-led care in the Netherlands. METHODS: A budget impact analysis was conducted to estimate the actual costs and reimbursement of aCTG performed in midwife-led care and obstetrician-led care (ie, base-case analysis) from the Dutch healthcare perspective. Epidemiological and healthcare utilisation data describing both care pathways were obtained from a prospective cohort, a survey and national databases. Different implementation rates of aCTG in midwife-led care were explored. A probabilistic sensitivity analysis was conducted to estimate the uncertainty surrounding the budget impact estimates. RESULTS: Shifting aCTG from obstetrician-led care to midwife-led care would increase actual costs by €311 763 (97.5% CI €188 574 to €426 072) and €1 247 052 (97.5% CI €754 296 to €1 704 290) for implementation rates of 25% and 100%, respectively, while it would decrease reimbursement by €7 538 335 (97.5% CI €4 559 661 to €10 302 306) and €30 153 342 (97.5% CI €18 238 645 to €41 209 225) for the same implementation rates. The sensitivity analysis results were consistent with those of the main analysis. CONCLUSIONS: From the Dutch healthcare perspective, we estimated that implementing aCTG in midwife-led care may increase the associated actual costs.
At the same time, it might lower the healthcare reimbursement.
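The abstract's estimates scale linearly with the implementation rate: the 25% figure is exactly one quarter of the 100% figure. A minimal sketch of that scaling, using the reported full-rollout cost increase:

```python
def budget_impact(full_rollout_amount, implementation_rate):
    """Linear scaling of a budget impact by the share of aCTGs actually
    shifted from obstetrician-led to midwife-led care."""
    return full_rollout_amount * implementation_rate

# Reported cost increase at 100% implementation (EUR):
cost_increase_25 = budget_impact(1_247_052, 0.25)  # 311763.0
```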


Subject(s)
Budgets , Cardiotocography , Midwifery , Humans , Female , Netherlands , Pregnancy , Midwifery/statistics & numerical data , Midwifery/economics , Midwifery/methods , Cardiotocography/methods , Cardiotocography/statistics & numerical data , Cardiotocography/economics , Cardiotocography/standards , Budgets/statistics & numerical data , Budgets/methods , Adult , Prospective Studies , Prenatal Care/statistics & numerical data , Prenatal Care/economics , Prenatal Care/methods
13.
Cost Eff Resour Alloc ; 22(1): 37, 2024 May 05.
Article in English | MEDLINE | ID: mdl-38705990

ABSTRACT

BACKGROUND: Prostate cancer (PCa) causes a substantial health and financial burden worldwide, underscoring the need for efficient mass screening approaches. This study evaluates the Net Cost-Benefit Index (NCBI) of PCa screening in Iran to offer insights for informed decision-making and resource allocation. METHOD: The NCBI was calculated for four age groups (40 years and above) using a decision-analysis model. Two screening strategies, prostate-specific antigen (PSA) alone and PSA with Digital Rectal Examination (DRE), were evaluated from the health system perspective. A retrospective assessment of 1402 prostate cancer patients' profiles was conducted, and direct medical and non-medical costs were calculated based on the 2021 official tariff rates, patient records, and interviews. The monetary value of mass screening was determined through Willingness to Pay (WTP) assessments, which served as the measure of benefit. RESULT: The combined PSA and DRE screening strategy is cost-effective, yields savings of up to $3 per case, and emerges as the dominant strategy over PSA alone. Screening for men aged 70 and above is not economically justified, as indicated by a negative NCBI. The 40-49 age group exhibits the highest net benefit: $13.81 based on basic information and $13.54 based on comprehensive information. Sensitivity analysis strongly supports the cost-effectiveness of the combined screening approach. CONCLUSION: This study advocates prostate cancer screening with PSA and DRE, which is economically justified for men aged 40-69. The results recommend that policymakers prioritize resource allocation for PCa screening programs based on age and budget constraints. Men's willingness to pay, especially in the 40-49 age group, which had the highest net benefit, supports their financial participation in screening services.
Additionally, screening services for other age groups, such as 50-54 or 55-59, can be provided either for free or at a reduced cost.
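The decision logic behind a net cost-benefit index of the kind described above can be sketched as benefit (measured by willingness to pay) minus screening cost per person. This is a minimal illustration with hypothetical figures, not the study's actual model or numbers:

```python
def net_cost_benefit_index(wtp_benefit: float, screening_cost: float) -> float:
    """Net benefit per screened person: willingness-to-pay benefit minus
    screening cost. A positive value supports screening for that age
    group; a negative value argues against it."""
    return wtp_benefit - screening_cost

# Hypothetical figures for illustration only (not taken from the study).
younger_group = net_cost_benefit_index(wtp_benefit=20.0, screening_cost=6.0)
older_group = net_cost_benefit_index(wtp_benefit=8.0, screening_cost=12.0)
print(younger_group, older_group)  # 14.0 -4.0
```

In this sketch the younger group's positive index would justify screening while the older group's negative index would not, mirroring the age-dependent conclusion reported in the abstract.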

14.
BMC Health Serv Res ; 24(1): 694, 2024 May 31.
Article in English | MEDLINE | ID: mdl-38822341

ABSTRACT

BACKGROUND: For many countries, especially those outside the USA without incentive payments, implementing and maintaining electronic medical records (EMR) is expensive and can be controversial given the large amounts of investment involved. Evaluating the value of EMR implementation is necessary to understand whether such investment, especially when it comes from public sources, is an efficient allocation of healthcare resources. Nonetheless, most countries have struggled to measure the return on EMR investment due to the lack of appropriate evaluation frameworks. METHODS: This paper outlines the development of an evidence-based digital health cost-benefit analysis (eHealth-CBA) framework to calculate the total economic value of EMR implementation over time. A net positive benefit indicates that such investment represents improved efficiency, while a net negative benefit is considered a wasteful use of public resources. RESULTS: We developed a three-stage process that takes into account the complexity of the healthcare system and its stakeholders, investment appraisal and evaluation practice, and existing knowledge of EMR implementation. The three stages are (1) literature review, (2) stakeholder consultation, and (3) CBA framework development. The framework maps the impacts of the EMR to the quadruple aim of healthcare and establishes a clear method for value assessment. CONCLUSIONS: The proposed framework is a first step toward a comprehensive evaluation framework for EMRs that informs health decision-makers about the economic value of digital investments rather than just their financial value.


Subject(s)
Cost-Benefit Analysis , Electronic Health Records , Cost-Benefit Analysis/methods , Humans , Electronic Health Records/economics
15.
Bioresour Technol ; 402: 130781, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38701986

ABSTRACT

Textile and medical effluents that cause bioaccumulation and biomagnification have been successfully biodegraded by fungal laccases. Here, a decision-making tool was developed and applied to evaluate 45 different laccase production strategies, identifying the best potential source from a techno-economic perspective. Laccase production cost was calculated for a fixed output of 10^9 enzymatic units per batch (USD per 10^9 U), and a sensitivity analysis was performed. Results indicate that optimization of enzymatic kinetics for each organism is essential to avoid exceeding the fermentation time point at which production titer peaks and, therefore, to avoid higher production costs. Overall, the most cost-effective laccase-producing strategy used Pseudolagarobasidium acaciicola, with a base production cost of USD $42.46 per 10^9 U. This work serves as a platform for decision-making to find the optimal laccase production strategy based on techno-economic parameters.
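The cost metric used above, production cost normalised to a fixed enzymatic output, can be sketched as a one-line helper. This is a hypothetical simplification; the study's actual techno-economic model also accounts for fermentation time, titer kinetics, and other inputs:

```python
def cost_per_1e9_units(batch_cost_usd: float, units_per_batch: float) -> float:
    """Production cost normalised to a fixed output of 1e9 enzymatic
    units, so strategies with different batch sizes are comparable."""
    return batch_cost_usd / (units_per_batch / 1e9)

# Hypothetical batch: $85 to produce 2e9 U -> $42.50 per 1e9 U.
print(cost_per_1e9_units(85.0, 2e9))  # 42.5
```

Normalising to a fixed output lets strategies with very different batch sizes and titers be ranked on a single comparable figure.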


Subject(s)
Laccase , Laccase/metabolism , Decision Support Techniques , Biotechnology/methods , Biotechnology/economics , Fungi/enzymology , Kinetics , Fermentation
16.
Sci Total Environ ; 934: 173137, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-38740207

ABSTRACT

Non-conventional water recovery, recycling, and reuse are considered imperative approaches to addressing water scarcity in China. The objective of this study was to evaluate the technical and economic feasibility of a Water Reclamation Plant (WRP) based on an anaerobic-anoxic-oxic membrane bioreactor (A2O-MBR) system for the treatment and reuse of unconventional water resources in towns (domestic sewage and rainwater). Rainwater is collected and stored in a reservoir via the rainwater pipe network, then transported to the WRP for treatment and reuse through a rainwater reuse pumping station during peak water demand periods. Over a year of operation and evaluation, a total of 610,000 cubic meters of rainwater was reused, accounting for 10.4 % of the treated wastewater. In A2O-MBR operation, the average effluent concentrations of COD (chemical oxygen demand), NH4+-N (ammonium), TN (total nitrogen), and TP (total phosphorus) were 14.23 ± 4.07 mg/L, 0.22 ± 0.26 mg/L, 11.97 ± 1.54 mg/L, and 0.13 ± 0.09 mg/L, respectively. The effluent quality met standards suitable for reuse as industrial cooling water or for direct discharge. The WRP demonstrates a positive financial outlook, with combined capital and operating costs of 0.16 $/m3. A comprehensive cost-benefit analysis indicates a positive net present value for the WRP, with an estimated annualized net profit of 0.024 $/m3. This research achieved near-zero discharge of wastewater and effective allocation of rainwater resources across time and space.

17.
BMC Oral Health ; 24(1): 534, 2024 May 09.
Article in English | MEDLINE | ID: mdl-38724990

ABSTRACT

OBJECTIVES: The objectives of this study were to evaluate the cost-effectiveness and cost-benefit of fluoride varnish (FV) interventions for preventing caries in the first permanent molars (FPMs) among children in rural areas in Guangxi, China. METHODS: This study constituted a secondary analysis of data from a randomised controlled trial, analysed from a social perspective. A total of 1,335 children aged 6-8 years in remote rural areas of Guangxi were enrolled in this three-year follow-up controlled study. Children in the experimental group (EG) and the control group (CG) received oral health education and were provided with a toothbrush and toothpaste once every six months. Additionally, FV was applied in the EG. A decision tree model was developed, and single-factor and probabilistic sensitivity analyses were conducted. RESULTS: After three years of intervention, the prevalence of caries in the EG was 50.85%, with an average decayed, missing, and filled teeth (DMFT) index score of 1.12; in the CG, the prevalence was 59.04%, with a DMFT index score of 1.36. The total cost of caries intervention and post-caries treatment was 42,719.55 USD for the EG and 46,622.13 USD for the CG. The incremental cost-effectiveness ratio (ICER) of the EG was 25.36 USD per caries case prevented, and the cost-benefit ratio (CBR) was 1.74 USD in benefits per 1 USD of cost. The sensitivity analyses showed that the increase in the average DMFT index score was the variable with the largest effect on the ICER and CBR. CONCLUSIONS: Compared to oral health education alone, a comprehensive intervention combining FV application with oral health education is more cost-effective and beneficial for preventing caries in the FPMs of children living in economically disadvantaged rural areas. These findings could provide a basis for policy-making and clinical choices to improve children's oral health.
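The two summary measures reported above, the ICER and the cost-benefit ratio, can be sketched as follows. The inputs are hypothetical; reproducing the trial's own figures would require its per-child costs and caries counts:

```python
def icer(cost_new: float, cost_old: float,
         effect_new: float, effect_old: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per extra
    unit of effect (here, one caries case prevented)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

def cost_benefit_ratio(total_benefit: float, total_cost: float) -> float:
    """Benefit gained per unit of cost; > 1 means benefits exceed costs."""
    return total_benefit / total_cost

# Hypothetical numbers for illustration only (not from the trial).
print(icer(cost_new=42_000.0, cost_old=40_000.0,
           effect_new=180.0, effect_old=100.0))  # 25.0 USD per case prevented
print(cost_benefit_ratio(total_benefit=70_000.0, total_cost=40_000.0))  # 1.75
```

Note that a "dominant" intervention (cheaper and more effective, as reported for the EG here) makes the ICER numerator negative, which is usually reported as cost savings per case prevented rather than as a raw ratio.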


Subject(s)
Cariostatic Agents , Cost-Benefit Analysis , DMF Index , Dental Caries , Fluorides, Topical , Humans , Dental Caries/prevention & control , Dental Caries/economics , China , Fluorides, Topical/therapeutic use , Fluorides, Topical/economics , Child , Cariostatic Agents/therapeutic use , Cariostatic Agents/economics , Male , Female , Health Education, Dental/economics , Toothbrushing/economics , Toothpastes/therapeutic use , Toothpastes/economics , Follow-Up Studies , Molar , Decision Trees
18.
Health Technol Assess ; 28(23): 1-121, 2024 May.
Article in English | MEDLINE | ID: mdl-38767959

ABSTRACT

Background: Pelvic organ prolapse is common, causes unpleasant symptoms and negatively affects women's quality of life. In the UK, most women with pelvic organ prolapse attend clinics for pessary care. Objectives: To determine the clinical effectiveness and cost-effectiveness of vaginal pessary self-management on prolapse-specific quality of life for women with prolapse compared with clinic-based care; and to assess intervention acceptability and contextual influences on effectiveness, adherence and fidelity. Design: A multicentre, parallel-group, superiority randomised controlled trial with a mixed-methods process evaluation. Participants: Women attending UK NHS outpatient pessary services, aged ≥ 18 years, using a pessary of any type/material (except shelf, Gellhorn or Cube) for at least 2 weeks. Exclusions: women with limited manual dexterity, with cognitive deficit (prohibiting consent or self-management), pregnant or non-English-speaking. Intervention: The self-management intervention involved a 30-minute teaching appointment, an information leaflet, a 2-week follow-up telephone call and a local clinic telephone helpline number. Clinic-based care involved routine appointments determined by centres' usual practice. Allocation: Remote web-based application; minimisation was by age, pessary user type and centre. Blinding: Participants, those delivering the intervention and researchers were not blinded to group allocation. Outcomes: The patient-reported primary outcome (measured using the Pelvic Floor Impact Questionnaire-7) was prolapse-specific quality of life, and the cost-effectiveness outcome was incremental cost per quality-adjusted life-year (a specifically developed health Resource Use Questionnaire was used) at 18 months post randomisation. Secondary outcome measures included self-efficacy and complications. Process evaluation data were collected by interview, audio-recording and checklist. Analysis was by intention to treat. 
Results: Three hundred and forty women were randomised (self-management, n = 169; clinic-based care, n = 171). At 18 months post randomisation, 291 questionnaires with valid primary outcome data were available (self-management, n = 139; clinic-based care, n = 152). Baseline economic analysis was based on 264 participants (self-management, n = 125; clinic-based care, n = 139) with valid quality of life and resource use data. Self-management was an acceptable intervention. There was no group difference in prolapse-specific quality of life at 18 months (adjusted mean difference -0.03, 95% confidence interval -9.32 to 9.25). There was fidelity to intervention delivery. Self-management was cost-effective at a willingness-to-pay threshold of £20,000 per quality-adjusted life-year gained, with an estimated incremental net benefit of £564.32 and an 80.81% probability of cost-effectiveness. At 18 months, more pessary complications were reported in the clinic-based care group (adjusted mean difference 3.83, 95% confidence interval 0.81 to 6.86). There was no group difference in general self-efficacy, but self-managing women were more confident in pessary self-management activities. In both groups, contextual factors affected adherence and effectiveness. There were no reported unexpected serious adverse reactions. There were 32 serious adverse events (self-management, n = 17; clinic-based care, n = 14), all unrelated to the intervention. Skew in the baseline data for the Pelvic Floor Impact Questionnaire-7, the influence of the global COVID-19 pandemic, the potential effects of crossover and the lack of ethnic diversity in the recruited sample were possible limitations. Conclusions: Self-management was acceptable and cost-effective, led to fewer complications and did not improve or worsen quality of life for women with prolapse compared with clinic-based care. 
Future research is needed to develop a quality-of-life measure that is sensitive to the changes women desire from treatment. Study registration: This study is registered as ISRCTN62510577. Funding: This award was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme (NIHR award ref: 16/82/01) and is published in full in Health Technology Assessment; Vol. 28, No. 23. See the NIHR Funding and Awards website for further award information.


Pelvic organ prolapse is a common and distressing condition experienced by large numbers of women. Prolapse is when the organs that are usually in the pelvis drop down into the vagina. Women experience a feeling of something coming down into the vagina, along with bowel, bladder and sexual problems. One possible treatment is a vaginal pessary. The pessary is a device that is inserted into the vagina and holds the pelvic organs back in their usual place. Women who use a vaginal pessary usually come back to clinic every 6 months to have their pessary removed and replaced; this is called clinic-based care. However, it is possible for a woman to look after the pessary herself; this is called self-management. This study compared self-management with clinic-based care. Three hundred and forty women with prolapse took part; 171 received clinic-based care and 169 undertook self-management. Each woman had an equal chance of being in either group. Women in the self-management group received a 30-minute teaching appointment, an information leaflet, a 2-week follow-up telephone call and a telephone number for their local centre. Women in the clinic-based care group returned to clinic as advised by the treating healthcare professional. Self-management was found to be acceptable. Women self-managed their pessary in ways that suited their lifestyle. After 18 months, there was no difference between the groups in women's quality of life. Women in the self-management group experienced fewer pessary complications than women who received clinic-based care. Self-management costs less to deliver than clinic-based care. In summary, self-management did not improve women's quality of life more than clinic-based care, but it did lead to women experiencing fewer complications and cost less to deliver in the NHS. The findings support self-management as a treatment pathway for women using a pessary for prolapse.


Subject(s)
Cost-Benefit Analysis , Pelvic Organ Prolapse , Pessaries , Quality of Life , Self-Management , Humans , Female , Pelvic Organ Prolapse/therapy , Self-Management/methods , Middle Aged , Aged , United Kingdom , Quality-Adjusted Life Years , Adult
19.
Adv Exp Med Biol ; 1447: 91-104, 2024.
Article in English | MEDLINE | ID: mdl-38724787

ABSTRACT

Atopic dermatitis (AD) is a chronic inflammatory disorder that affects over 30 million people in the United States. Given the large and growing prevalence of AD, the associated economic burden is significant; it has been estimated that AD costs over $5 billion annually. These costs include both direct and indirect costs. Direct costs include prescription medicines, visits to health-care providers, hospitalizations, and transportation. Indirect costs include missed days or lost productivity at work or school, career modification, and reduced quality of life. Understanding and measuring these costs can be accomplished through rigorous economic evaluation, the organized process of considering the inputs and outcomes of various activities. Economic evaluation has been used to contextualize the burden of AD in society and to inform patients, providers, and other stakeholders on how to deliver care in the most evidence-based, efficient way possible. Understanding the economic impact of atopic dermatitis is an important aspect of delivering high-quality care.


Subject(s)
Cost of Illness , Dermatitis, Atopic , Health Care Costs , Quality of Life , Dermatitis, Atopic/economics , Humans , United States/epidemiology
20.
Vaccine ; 2024 May 27.
Article in English | MEDLINE | ID: mdl-38806354

ABSTRACT

BACKGROUND: Human adenovirus (HAdV) is a prevalent causative agent of acute respiratory disease (ARD) and is frequently responsible for outbreaks, particularly in military environments. Current vaccines do not effectively cover the HAdV subtypes commonly found among Korean military personnel, highlighting the need for a new targeted vaccine. This study presents a cost-benefit analysis to evaluate the economic viability of developing and implementing such a vaccine within a military context. METHODS: We adopted a societal perspective for this cost-benefit analysis, estimating the costs associated with vaccine development, production, and distribution over a projected timeline. We assumed a development period of five years, after which vaccine production and administration were initiated in the sixth year. The benefits were calculated based on both direct and indirect cost savings from preventing HAdV infections through vaccination. All financial figures were expressed in 2023 US dollars. A sensitivity analysis was conducted to explore the impact of varying factors such as vaccination rate, incidence of infection, vaccine efficacy, and discount rate. RESULTS: For the base case scenario, we assumed a vaccination rate of 100 %, an incidence rate of 0.02, and a vaccine efficacy of 95 %, applying a 3 % discount rate. Initially, in the sixth year, the benefit-cost ratio stood at 0.71, indicating a cost disadvantage at the onset of vaccination. However, this ratio improved to 1.32 in the following years, indicating a cost benefit from the seventh year onward. The cumulative benefit-cost ratio over a decade reached 2.72. The outcomes of the sensitivity analysis were consistent with these findings. 
CONCLUSION: Our cost-benefit analysis demonstrates that the introduction of an HAdV vaccine for the Korean military is economically advantageous, with substantial cost benefits accruing from the seventh year after the commencement of vaccination.
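The benefit-cost ratios above rest on discounting each year's flows back to present value at the stated rate. A minimal sketch of that calculation, using hypothetical cash flows rather than the study's inputs:

```python
def present_value(flows, rate=0.03):
    """Discount a list of annual flows (year 0 is the first element)."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(flows))

def benefit_cost_ratio(benefits, costs, rate=0.03):
    """Cumulative BCR: PV of benefits over PV of costs; > 1 is a net gain."""
    return present_value(benefits, rate) / present_value(costs, rate)

# Hypothetical profile: heavy development costs up front, benefits
# starting only once vaccination begins in year 6 (index 5).
costs = [5.0, 5.0, 5.0, 5.0, 5.0, 3.0, 1.0, 1.0, 1.0, 1.0]
benefits = [0.0, 0.0, 0.0, 0.0, 0.0, 4.0, 10.0, 10.0, 10.0, 10.0]
print(round(benefit_cost_ratio(benefits, costs), 2))
```

The shape mirrors the abstract's finding: early years are all cost, so the ratio starts below 1 and only crosses 1 once post-launch benefits accumulate.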
