Results 1 - 8 of 8
1.
Conserv Biol ; : e14260, 2024 Apr 18.
Article in English | MEDLINE | ID: mdl-38638064

ABSTRACT

Aquatic invasive species (AIS) are one of the greatest threats to the functioning of aquatic ecosystems worldwide. Once an invasive species has been introduced to a new region, many governments develop management strategies to reduce further spread. Nevertheless, managing AIS in a new region is challenging because of the vast areas that need protection and limited resources. Spatial heterogeneity in invasion risk is driven by environmental suitability and propagule pressure, which can be used to prioritize locations for surveillance and intervention activities. To better understand invasion risk across aquatic landscapes, we developed a simulation model to estimate the likelihood of a waterbody becoming invaded with an AIS. The model included waterbodies connected via a multilayer network that included boater movements and hydrological connections. In a case study of Minnesota, we used zebra mussels (Dreissena polymorpha) and starry stonewort (Nitellopsis obtusa) as model species. We simulated the impacts of management scenarios developed by stakeholders and created a decision-support tool available through an online application provided as part of the AIS Explorer dashboard. Our baseline model revealed that 89% of new zebra mussel invasions and 84% of new starry stonewort invasions occurred through boater movements, establishing it as a primary pathway of spread and offering insights beyond risk estimates generated by traditional environmental suitability models alone. Our results highlight the critical role of interventions applied to boater movements to reduce AIS dispersal.
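The simulation idea in this abstract can be sketched as a two-layer spread process. The lakes, edge lists, and per-edge probabilities below are invented for illustration; the actual model is calibrated to Minnesota boater-movement and hydrological data.

```python
import random

# Hypothetical two-layer invasion spread: a waterbody can be colonized via
# boater movements or hydrological connections, each layer with its own
# assumed per-edge annual transmission probability.
BOATER = {"A": ["B", "C"], "B": ["C"], "C": []}   # directed boater trips
HYDRO = {"A": ["B"], "B": [], "C": []}            # downstream flow
P_BOATER, P_HYDRO = 0.15, 0.05                    # illustrative risks

def simulate_once(seed_lake, years=10, rng=None):
    """One stochastic realization: return the set of invaded waterbodies."""
    rng = rng or random.Random()
    invaded = {seed_lake}
    for _ in range(years):
        new = set()
        for lake in invaded:
            for layer, p in ((BOATER, P_BOATER), (HYDRO, P_HYDRO)):
                for nbr in layer.get(lake, []):
                    if nbr not in invaded and rng.random() < p:
                        new.add(nbr)
        invaded |= new
    return invaded

def invasion_probability(target, seed_lake="A", runs=5000):
    """Monte Carlo estimate of the chance that `target` becomes invaded."""
    rng = random.Random(42)
    hits = sum(target in simulate_once(seed_lake, rng=rng) for _ in range(runs))
    return hits / runs

p_invade_C = invasion_probability("C")
```

Per-pathway attribution (the 89%/84% figures in the abstract) would come from tagging each successful colonization event with the layer that produced it.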



2.
MDM Policy Pract ; 8(2): 23814683231202716, 2023.
Article in English | MEDLINE | ID: mdl-37841496

ABSTRACT

Background. To support proactive decision making during the COVID-19 pandemic, mathematical models have been leveraged to identify surveillance indicator thresholds at which strengthening nonpharmaceutical interventions (NPIs) is necessary to protect health care capacity. Understanding tradeoffs between different adaptive COVID-19 response components is important when designing strategies that balance public preference and public health goals. Methods. We considered 3 components of an adaptive COVID-19 response: 1) the threshold at which to implement the NPI, 2) the time needed to implement the NPI, and 3) the effectiveness of the NPI. Using a compartmental model of SARS-CoV-2 transmission calibrated to Minnesota state data, we evaluated different adaptive policies in terms of the peak number of hospitalizations and the time spent with the NPI in force. Scenarios were compared with a reference strategy, in which an NPI with an 80% contact reduction was triggered when new weekly hospitalizations surpassed 8 per 100,000 population, with a 7-day implementation period. Assumptions were varied in sensitivity analysis. Results. All adaptive response scenarios substantially reduced peak hospitalizations relative to no response. Among adaptive response scenarios, slower NPI implementation resulted in somewhat higher peak hospitalization and a longer time spent under the NPIs than the reference scenario. A stronger NPI response resulted in slightly less time with the NPIs in place and smaller hospitalization peak. A higher trigger threshold resulted in greater peak hospitalizations with little reduction in the length of time under the NPIs. Conclusions. An adaptive NPI response can substantially reduce infection circulation and prevent health care capacity from being exceeded. However, population preferences as well as the feasibility and timeliness of compliance with reenacting NPIs should inform response design. 
Highlights: This study uses a mathematical model to compare adaptive nonpharmaceutical intervention (NPI) strategies for COVID-19 management across 3 dimensions: the threshold at which the NPI should be implemented, the time it takes to implement the NPI, and the effectiveness of the NPI.
All adaptive NPI response scenarios considered substantially reduced peak hospitalizations compared with no response.
Slower NPI implementation results in a somewhat higher hospitalization peak and a longer time spent with the NPI in place, but may make an adaptive strategy more feasible by allowing the population sufficient time to prepare for changing restrictions.
A stronger, more effective NPI response results in a modest reduction in the time spent under the NPIs and slightly lower peak hospitalizations.
A higher threshold for triggering the NPI delays the time at which the NPI starts but results in a higher hospitalization peak and does not substantially reduce the time the NPI remains in force.
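A minimal sketch of the adaptive-trigger mechanism, assuming a simple discrete-time SIR model. All parameter values below are invented stand-ins; nothing here reproduces the study's calibrated compartmental model of Minnesota.

```python
# Toy adaptive NPI response: an NPI that cuts contacts by 80% switches on
# when new weekly hospitalizations per 100,000 exceed a trigger threshold,
# after an implementation delay, and relaxes when they fall well below it.
def run(threshold=8.0, delay_days=7, contact_reduction=0.80,
        beta=0.25, gamma=0.10, hosp_frac=0.03, pop=5_700_000, days=365):
    s, i = pop - 100.0, 100.0          # susceptible, infectious
    start_day, npi_on = None, False
    peak_weekly_hosp, days_with_npi = 0.0, 0
    for day in range(days):
        eff_beta = beta * (1.0 - contact_reduction) if npi_on else beta
        new_inf = eff_beta * s * i / pop
        weekly_hosp = 7.0 * new_inf * hosp_frac / pop * 100_000  # per 100k
        peak_weekly_hosp = max(peak_weekly_hosp, weekly_hosp)
        if start_day is None and weekly_hosp > threshold:
            start_day = day + delay_days        # implementation delay
        if start_day is not None and day >= start_day:
            npi_on = weekly_hosp > threshold / 2  # relax when well below trigger
        days_with_npi += npi_on
        s, i = s - new_inf, i + new_inf - gamma * i
    return peak_weekly_hosp, days_with_npi

peak_ref, time_ref = run()                       # reference-like scenario
peak_none, _ = run(threshold=float("inf"))       # no response
peak_slow, time_slow = run(delay_days=14)        # slower implementation
```

Comparing `run()` against `run(threshold=float("inf"))` reproduces the qualitative findings: any trigger cuts the peak sharply, and a longer implementation delay raises it.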

3.
PLoS One ; 18(8): e0288961, 2023.
Article in English | MEDLINE | ID: mdl-37535647

ABSTRACT

PURPOSE: To facilitate use of timely, granular, and publicly available data on COVID-19 mortality, we provide a method for imputing suppressed COVID-19 death counts in the National Center for Health Statistics' 2020 provisional mortality data by quarter, county, and age. METHODS: We used a Bayesian approach to impute suppressed COVID-19 death counts by quarter, county, and age in provisional data for 3,138 US counties. Our model accounts for multilevel data structures; numerous zero death counts among persons aged <50 years, in rural counties, and in early quarters of 2020; highly right-skewed distributions; and different levels of data granularity (county, state or locality, and national levels). We compared three models with different prior assumptions for the suppressed COVID-19 death counts: noninformative priors (M1), the same weakly informative priors for all age groups (M2), and weakly informative priors that differ by age (M3). After imputation, we assessed the three prior assumptions at the national, state/locality, and county levels. Finally, we compared US counties by two types of COVID-19 death rates, crude death rates (CDR) and age-standardized death rates (ASDR), which can be estimated only by imputing suppressed death counts. RESULTS: Without imputation, the total COVID-19 death counts estimated from the raw data underestimated the reported national COVID-19 deaths by 18.60%. Using imputed data, we overestimated the national COVID-19 deaths by 3.57% (95% CI: 3.37%-3.80%) in model M1, 2.23% (95% CI: 2.04%-2.43%) in model M2, and 2.96% (95% CI: 2.76%-3.16%) in model M3 compared with the national report. The top 20 counties most affected by COVID-19 mortality differed between the CDR and ASDR rankings.
CONCLUSIONS: Bayesian imputation of suppressed county-level, age-specific COVID-19 deaths in US provisional data can improve county ASDR estimates and aid public health officials in identifying disparities in deaths from COVID-19.
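The core constraint that makes imputation possible can be shown in toy form: suppressed cells hold counts of 1-9, and a published marginal total pins down their sum. This sketch uses a flat 1-9 prior with rejection sampling, an assumption standing in for the paper's multilevel Bayesian model; all counts below are invented.

```python
import random

def impute(reported, state_total, n_suppressed, draws=100_000, seed=0):
    """Posterior-mean estimates for suppressed cells under a uniform 1-9
    prior, conditioned on the cells summing to the known remainder."""
    rng = random.Random(seed)
    remainder = state_total - sum(reported)
    kept = []
    for _ in range(draws):
        sample = [rng.randint(1, 9) for _ in range(n_suppressed)]
        if sum(sample) == remainder:     # keep only draws matching the margin
            kept.append(sample)
    if not kept:
        raise ValueError("no draw matched the remainder")
    return [sum(col) / len(kept) for col in zip(*kept)]

# Two counties report 120 and 45 deaths; three counties are suppressed;
# the state reports 180 in total, so the suppressed cells must sum to 15.
means = impute(reported=[120, 45], state_total=180, n_suppressed=3)
```

The imputed means sum to the remainder by construction, which is the property that lets county-level rates be computed without contradicting the published state totals.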


Subject(s)
COVID-19 , Humans , United States/epidemiology , COVID-19/epidemiology , Bayes Theorem
4.
Cancer Causes Control ; 34(3): 205-212, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36449145

ABSTRACT

PURPOSE: We report the prevalence and economic cost of skin cancer treatment, compared with other cancers overall, in the USA from 2012 to 2018. METHODS: Using the Medical Expenditure Panel Survey full-year consolidated data files and associated medical conditions and medical events files, we estimate the prevalence, total costs, and per-person costs of treatment for melanoma and non-melanoma skin cancer among adults aged ≥ 18 years in the USA. To understand the changes in treatment prevalence and treatment costs of skin cancer in the context of overall cancer treatment, we also estimate the prevalence, total costs, and per-person costs of treatment for non-skin cancer among US adults. RESULTS: During 2012-2015 and 2016-2018, the average annual number of adults treated for any skin cancer was 5.8 (95% CI: 5.2, 6.4) and 6.1 (95% CI: 5.6, 6.6) million, respectively, while the average annual number of adults treated for non-skin cancers rose from 10.8 (95% CI: 10.0, 11.5) to 11.9 (95% CI: 11.2, 12.6) million. The overall estimated annual costs rose from $8.0 billion (in 2012-2015) to $8.9 billion (in 2016-2018) for skin cancer treatment and from $70.2 billion to $79.4 billion for non-skin cancer treatment. CONCLUSION: The prevalence and economic cost of skin cancer treatment modestly increased in recent years. Given the substantial cost of skin cancer treatment, continued public health attention to implementing evidence-based sun-safety interventions to reduce skin cancer risk may help prevent skin cancer and the associated treatment costs.
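The basic estimator behind these prevalence figures is a survey-weighted share: each MEPS respondent represents a known number of US adults. A minimal sketch with invented records:

```python
# (sampling_weight, treated_for_skin_cancer) - fabricated example records;
# real weights come from the MEPS full-year consolidated files.
records = [
    (15_000, True), (22_000, False), (18_000, False),
    (30_000, True), (12_000, False),
]

def weighted_total(rows):
    """Number of adults represented by respondents who were treated."""
    return sum(w for w, treated in rows if treated)

def weighted_prevalence(rows):
    """Weighted share of the represented population that was treated."""
    return weighted_total(rows) / sum(w for w, _ in rows)

prevalence = weighted_prevalence(records)
```

Summing `weighted_total` over the full sample, rather than counting respondents, is what yields population figures such as "6.1 million adults treated."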


Subject(s)
Melanoma , Skin Neoplasms , Adult , Humans , United States/epidemiology , Health Expenditures , Financial Stress , Skin Neoplasms/epidemiology , Skin Neoplasms/therapy , Health Care Costs , Melanoma/epidemiology , Melanoma/therapy , Cost of Illness
5.
Public Health Rep ; 138(1): 190-199, 2023.
Article in English | MEDLINE | ID: mdl-36200805

ABSTRACT

OBJECTIVE: State-issued behavioral policy interventions (BPIs) can limit community spread of COVID-19, but their effects on COVID-19 transmission may vary by level of social vulnerability in the community. We examined the association between the duration of BPIs and the incidence of COVID-19 across levels of social vulnerability in US counties. METHODS: We used COVID-19 case counts from USAFacts and policy data on BPIs (face mask mandates, stay-at-home orders, gathering bans) in place from April through December 2020 and the 2018 Social Vulnerability Index (SVI) from the Centers for Disease Control and Prevention. We conducted multilevel linear regression to estimate the associations between duration of each BPI and monthly incidence of COVID-19 (cases per 100 000 population) by SVI quartiles (grouped as low, moderate low, moderate high, and high social vulnerability) for 3141 US counties. RESULTS: Having a BPI in place for longer durations (ie, ≥2 months) was associated with lower incidence of COVID-19 compared with having a BPI in place for <1 month. Compared with having no BPI in place or a BPI in place for <1 month, differences in marginal mean monthly incidence of COVID-19 per 100 000 population for a BPI in place for ≥2 months ranged from -4 cases in counties with low SVI to -401 cases in counties with high SVI for face mask mandates, from -31 cases in counties with low SVI to -208 cases in counties with high SVI for stay-at-home orders, and from -227 cases in counties with low SVI to -628 cases in counties with high SVI for gathering bans. CONCLUSIONS: Establishing COVID-19 prevention measures for longer durations may help reduce COVID-19 transmission, especially in communities with high levels of social vulnerability.
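The grouping step of the analysis can be sketched directly: cut counties into SVI quartiles, then compare mean monthly incidence between long and short BPI durations within each quartile. The county rows and their clean monotone pattern are fabricated, and the simple difference in means stands in for the paper's multilevel regression:

```python
from statistics import mean, quantiles

# (svi_score, months_bpi_in_place, monthly_cases_per_100k) - invented data
counties = [
    (0.10, 0, 900), (0.15, 3, 880), (0.35, 0, 950), (0.40, 3, 800),
    (0.60, 0, 1000), (0.65, 3, 750), (0.85, 0, 1100), (0.90, 3, 500),
]

def svi_quartile(svi, cuts):
    """0 = low vulnerability ... 3 = high vulnerability."""
    return sum(svi > c for c in cuts)

def mean_difference_by_quartile(rows):
    """Mean incidence difference, BPI >= 2 months vs < 1 month, per quartile."""
    cuts = quantiles([svi for svi, _, _ in rows], n=4)  # three cut points
    diffs = {}
    for q in range(4):
        grp = [r for r in rows if svi_quartile(r[0], cuts) == q]
        long_bpi = [inc for _, m, inc in grp if m >= 2]
        short_bpi = [inc for _, m, inc in grp if m < 1]
        diffs[q] = mean(long_bpi) - mean(short_bpi)
    return diffs

diffs = mean_difference_by_quartile(counties)
```

In this toy data the reduction grows with vulnerability, echoing the paper's pattern of larger incidence differences in high-SVI counties.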


Subject(s)
COVID-19 , Humans , COVID-19/epidemiology , COVID-19/prevention & control , Incidence , Policy , Social Vulnerability , United States/epidemiology
6.
Sex Transm Dis ; 47(2): 71-79, 2020 02.
Article in English | MEDLINE | ID: mdl-31935206

ABSTRACT

BACKGROUND: It is well established that network structure strongly influences infectious disease dynamics. However, little is known about how the network structure impacts the cost-effectiveness of disease control strategies. We evaluated partner management strategies to address bacterial sexually transmitted infections (STIs) as a case study to explore the influence of the network structure on the optimal disease management strategy. METHODS: We simulated a hypothetical bacterial STI spread through 4 representative network structures: random, community-structured, scale-free, and empirical. We simulated disease outcomes (prevalence, incidence, total infected person-months) and cost-effectiveness of 4 partner management strategies in each network structure: routine STI screening alone (no partner management), partner notification, expedited partner therapy, and contact tracing. We determined the optimal partner management strategy following a cost-effectiveness framework and varied key compliance parameters of partner management in sensitivity analysis. RESULTS: For the same average number of contacts and disease parameters in our setting, community-structured networks had the lowest incidence, prevalence, and total infected person-months, whereas scale-free networks had the highest without partner management. The highly connected individuals were more likely to be reinfected in scale-free networks than in the other network structures. The cost-effective partner management strategy depended on the network structures, the compliance in partner management, the willingness-to-pay threshold, and the rate of external force of infection. CONCLUSIONS: Our findings suggest that contact network structure matters in determining the optimal disease control strategy in infectious diseases. Information on a population's contact network structure may be valuable for informing optimal investment of limited resources.
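The structural contrast driving these results can be sketched with two toy generators of comparable edge count: a preferential-attachment (Barabási-Albert-style) graph grows hubs, while a uniform-random graph does not. Everything below is an assumption-laden illustration, not the paper's STI simulation:

```python
import random

def random_graph(n, m_edges, rng):
    """Uniform-random graph with exactly m_edges distinct undirected edges."""
    edges = set()
    while len(edges) < m_edges:
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v:
            edges.add((min(u, v), max(u, v)))
    return edges

def scale_free_graph(n, m_per_node, rng):
    """Barabasi-Albert-style growth: each new node attaches to nodes chosen
    in proportion to their current degree, producing a few high-degree hubs."""
    targets = list(range(m_per_node))
    repeated = []            # each node appears once per edge endpoint
    edges = set()
    for new in range(m_per_node, n):
        for t in set(targets):
            edges.add((t, new))
            repeated.extend([t, new])
        targets = [rng.choice(repeated) for _ in range(m_per_node)]
    return edges

def max_degree(edges, n):
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return max(deg)

rng = random.Random(1)
sf = scale_free_graph(500, 3, rng)
er = random_graph(500, len(sf), rng)   # same edge count, so same mean degree
```

The hubs in the scale-free graph correspond to the highly connected individuals the abstract identifies as most likely to be reinfected, which is why the payoff of partner management differs across network structures.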


Subject(s)
Community Networks , Computer Simulation , Contact Tracing , Cost-Benefit Analysis , Sexual Partners/psychology , Sexually Transmitted Diseases/prevention & control , Bacterial Infections/prevention & control , Bacterial Infections/transmission , Communicable Disease Control/economics , Communicable Disease Control/methods , Community Networks/economics , Contact Tracing/methods , Contact Tracing/statistics & numerical data , Humans , Sexually Transmitted Diseases/diagnosis , Sexually Transmitted Diseases/psychology
7.
Pharmacoeconomics ; 37(11): 1329-1339, 2019 11.
Article in English | MEDLINE | ID: mdl-31549359

ABSTRACT

The use of open-source programming languages, such as R, in health decision sciences is growing and has the potential to facilitate model transparency, reproducibility, and shareability. However, realizing this potential can be challenging. Models are complex and primarily built to answer a research question, with model sharing and transparency relegated to being secondary goals. Consequently, code is often neither well documented nor systematically organized in a comprehensible and shareable approach. Moreover, many decision modelers are not formally trained in computer programming and may lack good coding practices, further compounding the problem of model transparency. To address these challenges, we propose a high-level framework for model-based decision and cost-effectiveness analyses (CEA) in R. The proposed framework consists of a conceptual, modular structure and coding recommendations for the implementation of model-based decision analyses in R. This framework defines a set of common decision model elements divided into five components: (1) model inputs, (2) decision model implementation, (3) model calibration, (4) model validation, and (5) analysis. The first four components form the model development phase. The analysis component is the application of the fully developed decision model to answer the policy or the research question of interest, assess decision uncertainty, and/or to determine the value of future research through value of information (VOI) analysis. In this framework, we also make recommendations for good coding practices specific to decision modeling, such as file organization and variable naming conventions. We showcase the framework through a fully functional, testbed decision model, which is hosted on GitHub for free download and easy adaptation to other applications. The use of this framework in decision modeling will improve code readability and model sharing, paving the way to an ideal, open-source world.
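The five components map naturally onto a modular code skeleton. It is sketched here in Python for brevity (the framework itself targets R), and every function name and number is invented:

```python
def model_inputs():                        # (1) model inputs
    return {"p_sick": 0.30, "cost_treat": 1200.0, "cost_none": 200.0,
            "qaly_treat": 0.95, "qaly_none": 0.80}

def run_model(params, treat):              # (2) decision model implementation
    cost = params["cost_treat"] if treat else params["cost_none"]
    qaly = params["qaly_treat"] if treat else params["qaly_none"]
    return cost, qaly

def calibrate(params, target_prevalence):  # (3) model calibration (trivial here)
    return {**params, "p_sick": target_prevalence}

def validate(params):                      # (4) model validation
    assert 0.0 <= params["p_sick"] <= 1.0

def analysis(params):                      # (5) analysis: incremental ratio
    c1, q1 = run_model(params, treat=True)
    c0, q0 = run_model(params, treat=False)
    return (c1 - c0) / (q1 - q0)           # ICER, cost per QALY gained

params = calibrate(model_inputs(), 0.25)
validate(params)
icer = analysis(params)
```

Keeping components (1)-(4) separate from (5) mirrors the framework's split between the model development phase and the analysis that answers the policy question.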


Subject(s)
Decision Making , Decision Support Techniques , Software , Cost-Benefit Analysis , Humans , Reproducibility of Results
8.
Prev Vet Med ; 159: 1-11, 2018 Nov 01.
Article in English | MEDLINE | ID: mdl-30314771

ABSTRACT

In the United States, slaughter surveillance combined with other measures has effectively maintained a very low prevalence of bovine tuberculosis (bTB). However, bTB continues to be sporadically detected, causing substantial economic burden to the government and cattle producers. To detect the infection earlier and reduce sudden economic losses, additional risk-based surveillance of live animals might be more cost-effective than slaughter surveillance alone to detect and prevent bTB infection. The objective of this study was to evaluate alternative risk-based surveillance strategies targeting high-risk herds to complement slaughter surveillance in a region with very low bTB prevalence. We developed an integrated within- and between-herd bTB transmission model with simulated premises-level cattle movements among beef and dairy herds in Minnesota for 10 years. We constructed ten risk-based surveillance strategies for beef herds and dairy herds, and predicted the epidemiological outcomes and costs for each strategy in combination with slaughter surveillance. Our models showed that slaughter surveillance alone resulted in low risk of between-herd transmission with typically small outbreak sizes, and also cost less compared to alternative risk-based surveillance measures. However, risk-based surveillance strategies could reduce the time to detect infection and the time to reach disease freedom by up to 9 months. At a higher initial prevalence, alternative risk-based surveillance could reduce the number of infected herds and shorten the time to disease freedom by almost 3 years (34-35 months). Our findings suggest that risk-based surveillance could detect infection more quickly and allow affected regions to reach disease freedom faster. If the bTB status of the affected regions changes after an outbreak happens, the reduced time to disease freedom could reduce the economic impact on the affected region.
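The headline effect (risk-based testing detects infection sooner) reduces to a simple property of detection hazards, sketched here with invented monthly probabilities rather than the paper's calibrated transmission model:

```python
import random

def months_to_detect(p_slaughter, p_risk_based=0.0, high_risk=False,
                     rng=None, max_months=600):
    """Months until an infected herd is first detected, given per-month
    detection probabilities for each surveillance stream (assumed values)."""
    rng = rng or random.Random()
    p = p_slaughter + (p_risk_based if high_risk else 0.0)
    for month in range(1, max_months + 1):
        if rng.random() < p:
            return month
    return max_months

def mean_detection_time(p_slaughter, p_risk_based, high_risk, runs=20_000):
    rng = random.Random(7)
    return sum(months_to_detect(p_slaughter, p_risk_based, high_risk, rng)
               for _ in range(runs)) / runs

baseline = mean_detection_time(0.03, 0.00, True)   # slaughter surveillance only
combined = mean_detection_time(0.03, 0.05, True)   # plus risk-based testing
```

With these made-up hazards, stacking a risk-based stream on top of slaughter surveillance cuts the expected detection time from roughly 1/0.03 to roughly 1/0.08 months, the kind of gap behind the earlier detection and faster return to disease freedom the study reports.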


Subject(s)
Cost-Benefit Analysis , Disease Outbreaks/veterinary , Epidemiological Monitoring/veterinary , Tuberculosis, Bovine/epidemiology , Animal Husbandry , Animals , Cattle , Dairying , Minnesota/epidemiology , Models, Economic , Models, Theoretical , Population Surveillance/methods , Prevalence , Risk Assessment , Tuberculosis, Bovine/microbiology